Conversion Conference 2016 day 1

After a successful debut last year, I really didn't want to miss Conversion Conference this year. And after just one day, I can tell you the early rise was worth it!

Stephen Pavlovich started the day with an elaborate explanation of how to formulate a hypothesis. This is how he does it:

We know that “quantitative” and “qualitative” data
We believe that “lever” by testing “concept”
on “area” for “audience” will result in “goal”
We'll know by observing “KPI” for “duration”
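The template above can be sketched as a small fill-in-the-blanks helper. The field names mirror the template, but this helper and the example values are my own illustration, not Pavlovich's:

```javascript
// Sketch of the hypothesis template as a fill-in-the-blanks function.
// All field names and example values below are hypothetical.
function buildHypothesis(h) {
  return [
    `We know that ${h.quantitative} and ${h.qualitative}.`,
    `We believe that ${h.lever} by testing ${h.concept}`,
    `on ${h.area} for ${h.audience} will result in ${h.goal}.`,
    `We'll know by observing ${h.kpi} for ${h.duration}.`
  ].join('\n');
}

// Example usage with made-up values:
const hypothesis = buildHypothesis({
  quantitative: '40% of mobile carts are abandoned at the shipping step',
  qualitative: 'session recordings show hesitation around shipping costs',
  lever: 'reducing cost anxiety',
  concept: 'showing shipping costs up front',
  area: 'the cart page',
  audience: 'mobile visitors',
  goal: 'more completed checkouts',
  kpi: 'checkout completion rate',
  duration: 'two weeks'
});
console.log(hypothesis);
```

Writing the hypothesis this way forces you to fill in every blank before you build the test, which is exactly the point of the template.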

He also said that we should view ‘testing’ much more broadly than just A/B tests on a website. You can run tests throughout the whole customer journey: on TV, in SEO, SEA, in store, in outdoor ads, and so on. There are companies that are testing two different AdWords campaigns with different bidding strategies.


The next session I attended was by Rudger de Groot from Mintminds. This session was about the set-up of A/B testing tools. He gave a few tips I’m definitely going to keep in mind in the future:

  • Keep your Optimizely snippet small
  • Stop using the WYSIWYG editor
  • No more DOM ready
  • Use a content type tag for testing the same pages
  • Quality assurance through user scenarios
  • Use cookies for QA instead of the preview mode
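That last tip can be sketched roughly: instead of relying on the tool's preview mode, you set a QA cookie in your own browser and gate the test audience on it. The cookie name and the helper below are my own illustration, not an Optimizely API:

```javascript
// Hypothetical helper: check whether a QA cookie is present, so a test
// can be shown only to QA sessions. The cookie name "qa_audience" is
// made up; in the browser you would pass document.cookie.
function hasQaCookie(cookieString, name = 'qa_audience') {
  return cookieString
    .split(';')
    .map(c => c.trim())
    .some(c => c.startsWith(name + '='));
}

// Example: a QA session carries the cookie, a normal visitor does not.
const isQa = hasQaCookie('sessionid=abc123; qa_audience=1');
const isVisitor = hasQaCookie('sessionid=def456');
```

The advantage over preview mode is that the QA session behaves exactly like a real visitor session, cross-page and cross-device, as long as the cookie is set.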

Then off to the two speakers from Google, with mixed feelings: their presentations are usually quite superficial. But luckily for me, this time was different.

They talked mostly about the need to examine whether the target audience differs per device (mobile, desktop, tablet, ...). For example, kids search for a skateboard on their mobile, but it will most likely be the parents who buy that skateboard on- or offline. This calls for a different approach per device.


After lunch, for many of us a less attentive moment, André Morys captured the audience's attention with his fascinating presentation. What often happens is that employees get the go-ahead to start A/B testing. Overexcited, they start testing and testing and testing. This isn't the best approach. But what is, then? The image below explains it best:

You start by drawing up a strategy / plan. Secondly, you analyse your website through user research, analytics, CRM, and so on. Next, you prioritise the most important takeaways from your analysis. With these, you get started setting up your A/B tests. Finally, you measure the results so you can learn from them. This is the flow you should follow for your tests.


Next up was Els Aerts. She spoke passionately about user testing: moderated and unmoderated testing. If you want to know more about this, you can read that here.

What was most interesting to me was the ‘five second test’ of a homepage. During this test, you show the homepage to five users and ask them these questions:

  1. What is it that this business does?
  2. What’s the name of this company?
  3. What’s the price?
  4. What did you like the most?
  5. What did you dislike the most?

Ask each user a maximum of three of these questions. Based on their answers, you can tell whether or not the homepage gets your message across. A very interesting test!



Andreas Remes | 12 October 2016
