Data-driven Design: an Indeed.com Case Study
Instead of relying on opinions to make product decisions, at Indeed we rely heavily on A/B testing. With 75 million monthly unique visitors and more than 1.5 billion monthly searches, data provides empirical answers to (nearly) all of our product questions.
The foundation of these tests is continuously expanding measurement of our interfaces. We collect hundreds of metrics on each search results page, and we use those metrics to compare test groups and evaluate the results.
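As a rough illustration of what comparing a single metric between test groups can look like (this is a generic sketch, not Indeed's actual methodology or numbers), a two-proportion z-test checks whether a difference in, say, click-through rate between a control and a test group is larger than chance would explain:

```python
from math import sqrt

def two_proportion_z(clicks_a: int, views_a: int, clicks_b: int, views_b: int) -> float:
    """Z-statistic for the difference between two click-through rates."""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    # Pooled rate under the null hypothesis that both groups convert equally
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    return (p_b - p_a) / se

# Hypothetical example: control 5,000 clicks in 100,000 views,
# test 5,300 clicks in 100,000 views
z = two_proportion_z(5000, 100_000, 5300, 100_000)
# |z| > 1.96 would indicate significance at the conventional 5% level
```

In practice a large test program also has to worry about multiple comparisons, sample-size planning, and peeking at interim results, which is part of what makes the "things to look out for" question below interesting.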
In this session, we'll take a look at what Indeed measures, provide specific examples of how small, overlooked changes have impacted our metrics in significant and unexpected ways, and show how implementing seemingly "no-brainer" updates on our site decreased key business metrics and caused us to reevaluate our solutions.
If you'd like insight into how Indeed has become one of the world's most popular job search websites while keeping it simple, this session is for you.
Additional Supporting Materials
- What are specific examples of features where A/B testing has led to significant positive results that could not have been achieved without testing?
- When should you do evolutionary UI changes vs. drastic redesigns?
- What are some things to look out for when performing A/B tests?
- What types of changes does Indeed test? Every change?
- What role does creativity play in the data-driven design process?
J Christopher Garcia, User Experience Lead, Indeed