Oh, the stories I could tell of pitched battles over idea validation.
I’ve seen the debate swing like a pendulum between test and trust through the years. In the pre-digital days – back when dinosaurs and mad men roamed the savannah – creatives like myself battled researchers over the validity of focus groups and mall intercepts. Usually, we’d be given leeway to trust our intuition on ideas… until a concept bombed. Then, like Dad busting up the keg party, researchers would be brought in to restore a bit of sober adult thinking to the proceedings.
Fast forward to the birth of the internet. In the raucous pre-bubble days, we witnessed the rise of dot-coms with more investment money than sense. Insane advertising (remember the E*Trade monkey?) made us laugh, but not buy. The battle between testing and intuition is as old as ideas themselves – but today, it has entered a whole new phase.
As the pendulum inevitably swings, so too do the trends of idea validation. Which brings us to today, where data gathering and testing rule the roost.
Kevin Indig caught my eye with a recent LinkedIn post in which he took the position that far too much faith is being placed in testing. This, coming from the person responsible for all things SEO at G2 (and previously at Atlassian). I knew we had to talk.
Here’s the recording of our chat – definitely a thought-provoking podcast for anyone with a vested interest in bringing new ideas or innovations to market!
Idea validation show notes
- Kevin started as a huge proponent of testing and data, and believed that with enough testing and data, all answers would be revealed. This may be true for big companies with reams of data and the means to effectively analyze that data – but it isn’t true for the average SME or startup.
- You need to use the data you have to train your gut, so you can make better decisions based on intuition and research.
- If you were testing medicine, you’d never settle for one result. But marketers rush to early findings and bet the ranch on them.
- This is due to natural impatience, but also to the acceleration of new product innovation – which leads to a drive for a faster cure-all solution.
- Testing is ideal for optimization, but less suited to innovation. As Steve Jobs said, you can’t connect the dots looking forward, only backward. With revolutionary new ideas, it’s difficult – if not impossible – to gauge consumer reaction. Thus, when you innovate, you have to be prepared for failure.
- What makes Kevin crazy? Researchers who get up on their experimentation high horses. It's difficult to run successful experiments, and far-fetched to believe you can test everything.
- Innovation isn’t for everyone. The Steve Jobs / Elon Musk personality is rare. To Apple’s credit, they didn’t try to find a ‘new’ Jobs, but focused on optimizing the incredible innovations he launched.
- Kevin sees a clear distinction between successful and unsuccessful startups – successful startups ask people what they want, and what they’re using.
Common sense is critical when it comes to validating new ideas. Read more in my story Use your common sense before you think positioning.
- Multivariate testing is incredibly complex, and not realistic for smaller companies – there are simply too many variables to control for all the combinations. This opens you up to drawing the conclusions you want, rather than the true conclusions.
- With testing, people often object vigorously when disruptive research comes out, as was seen with the research surrounding Facebook’s newsfeed. People were publicly appalled, but the data proved the feature’s popularity.
- As machine learning gets better, the system will know us better than we know ourselves. But intuition will not go away as a valid idea validation tool. We as marketers need to realize there is a yin and yang between intuition and testing. We can’t throw either out the window – we need to find a healthy middle ground.
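To make the multivariate point concrete, here's a rough back-of-the-envelope sketch in Python. The element counts, baseline conversion rate, and target lift are all made-up illustrative numbers (not figures from the conversation), and the sample-size formula is the standard two-proportion approximation:

```python
import math

# Hypothetical multivariate test: each on-page element under test
# multiplies the number of variant combinations you must fill with traffic.
elements = {"headline": 3, "hero_image": 2, "cta_copy": 4, "button_color": 2}

combinations = math.prod(elements.values())  # 3 * 2 * 4 * 2 = 48 variants

# Rough per-variant sample size to detect a lift from a 5% baseline
# conversion rate to 6%, at ~95% confidence and 80% power
# (z-values hardcoded for the sketch).
p1, p2 = 0.05, 0.06
z_alpha, z_beta = 1.96, 0.84
p_bar = (p1 + p2) / 2
n_per_variant = ((z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                  + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
                 / (p2 - p1) ** 2)

print(combinations)                          # 48 variant combinations
print(round(n_per_variant))                  # ≈ 8148 visitors per combination
print(round(combinations * n_per_variant))   # ≈ 391k visitors total
```

For a site that doesn't see hundreds of thousands of visitors in a test window, the full grid simply can't be filled – which is exactly why smaller companies end up cherry-picking the combinations (and conclusions) they like.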
Practical idea validation tips
- As a small company, you shouldn’t start with the naive assumption you can test everything.
- Start with assumptions, and data points to back those assumptions. Then come back later to see how those assumptions worked out – did the data inform your gut well? Or did it lead you off course?
- It’s imperative that you study people in their native environment. A/B testing is great, but its results can be wildly inaccurate depending on how much engagement people actually have with your product.
- The first impression is NOT what’s really happening – don’t draw conclusions too quickly.
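Why are first impressions so treacherous? A small early lead usually isn't statistically distinguishable from noise. Here's a minimal sketch of a standard two-proportion z-test (the visitor and conversion counts are invented for illustration):

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B result (normal approximation)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# An "early lead" on small numbers: B converts 50% better than A...
z_early = two_proportion_z(10, 200, 15, 200)

# ...but the z-score (~1.03) is well under the 1.96 needed for 95%
# confidence, so the difference is indistinguishable from noise.
print(round(z_early, 2))
```

Stopping the test at this point and betting the ranch on B is exactly the premature conclusion warned about above – and repeatedly peeking at results until one check happens to look significant makes the problem worse, not better.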
Recommended reading
- Consumer Heterogeneity and Paid Search Effectiveness: A Large-Scale Field Experiment
- Evan Miller – How Not to Run an A/B Test
- Eric Ries – The Lean Startup
- Avinash Kaushik – Experimentation and Testing: A Primer
- Seth Stephens-Davidowitz – Everybody Lies
Enjoyed this story? Here are more you’ll like:
To get my ideas straight to your inbox, subscribe to my newsletter below.
And please, don’t forget to share this story!