Saturday, August 30, 2014

Why Your A/B Tests Are Failing



In a 2013 study by eConsultancy & RedEye, which surveyed almost 1,000 client-side and agency marketers, 60% of respondents rated A/B testing as “quite valuable” to their business.


Yet only just over a quarter (28%) reported being satisfied with their conversion rates.


[Chart: eConsultancy/RedEye, Most Valuable Methods for Improving Conversion Rates, Nov 2013]


What’s interesting is that another study by VWO found that only 1 in every 7 A/B tests is a statistically significant winner. In our own research at Convert.com, we analysed 700 experiments and again found that only 1 out of 7 (14%) of the experiments run by clients who did not use an agency had a positive impact on conversion rate.


But here’s where it gets interesting: for the clients who did use an agency that specialized in conversion optimization, 1 out of every 3 tests (33%) drove a statistically significant result.


Yes, you are reading that right: specialized conversion optimization agencies are almost three times as likely to produce a winning A/B test as other clients using the same A/B testing tool.
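For context, “statistically significant winner” here refers to the standard frequentist check most testing tools run under the hood. Below is a minimal sketch in Python of a two-proportion z-test on hypothetical visitor numbers; actual tools may implement this differently.

    from math import sqrt
    from statistics import NormalDist

    def ab_significance(conversions_a, visitors_a, conversions_b, visitors_b):
        """Two-tailed two-proportion z-test: did variant B convert
        differently from control A beyond what chance would explain?"""
        p_a = conversions_a / visitors_a
        p_b = conversions_b / visitors_b
        pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
        se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
        z = (p_b - p_a) / se
        p_value = 2 * (1 - NormalDist().cdf(abs(z)))
        return z, p_value

    # Hypothetical numbers: 10,000 visitors per variant
    z, p = ab_significance(300, 10_000, 360, 10_000)
    print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 is the usual bar for a "winner"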


What makes these agencies win? I’ll share what we learned from our research.


A/B Testing is Just 20% of the Conversion Process


As CEO of Convert.com, I see every single stat from every single test.


One thing I can tell you quite honestly is that customers who come to our tool (or any other, for that matter) expecting the A/B testing platform to work magic for them rarely see the success they were hoping for.


A/B testing in and of itself is the last part of the process, not the first. So before you become a client of an A/B testing platform, please have at a minimum the following:



  • some kind of usability testing tool

  • heat mapping software

  • some level of understanding of your analytics


[Image: pricing page heatmap]


It’s only once you understand behavior that you stand a chance of implementing more successful tests. Here is what we see the best clients of Convert do with their time:



  • 60% of the total conversion optimization project time goes to managing the project and navigating client internal politics.

  • 20% of the time is focused on understanding the problems on a page and getting insights about the visitor behavior.

  • 20% is for designing, developing, testing and reporting.


The last 20% of the time involves A/B testing and the actual test setup. In actuality, the development of challenger pages might only be 10% of the total project time. So please don’t get carried away when people like Peep or me give one of these “stand-up” landing page evaluations at a conference or on Page Fights.


These are fun, but they only scratch the surface of what goes into the actual conversion optimization process. Critiques and expert opinions get clients interested in the topic of conversion rate optimization, but they work against us all once the process actually starts, since it’s not nearly as glamorous as we make it sound.


CRO is a process, most of which involves staring at analytics, looking at heatmaps and watching user sessions. This is what watching sessions is like using a tool like Mouseflow.



Exciting, right? Here’s what I see our most successful agencies do to get such great results:


Understanding Your Analytics


When MarketingSherpa released their E-commerce Benchmark Study, the graph that drew my attention was the one below.


Of the 1,657 companies surveyed, the top revenue-improving businesses in five out of six groups doubled their chances of optimization success by using extensive historical data rather than relying on intuition.


The same MECLabs study showed that the likelihood of success is far higher when using historical data, yet the majority of companies rely on intuition and best practices for optimization. Only 13% of companies use extensive historical data as part of their testing strategy. There are tons of great examples on this blog of companies using their analytics data to formulate successful tests.


One that stands out is from Casey Armstrong’s recent article on reducing churn, where Groove used what they call “Red Flag Metrics” to identify churn behavior and proactively win back customers as they started to show the same signs as those who had previously canceled.
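Groove’s actual metrics and thresholds aren’t spelled out here, so the sketch below only illustrates the general pattern: profile the behavior of customers who canceled, encode it as rules, and flag live customers who match. All names and thresholds are made up.

    from dataclasses import dataclass

    @dataclass
    class Usage:
        customer_id: str
        avg_session_sec: float   # average session length over the last week
        logins_last_30d: int

    # Hypothetical red flags, derived from profiling customers who later canceled
    RED_FLAGS = [
        ("very short sessions", lambda u: u.avg_session_sec < 90),
        ("rarely logs in",      lambda u: u.logins_last_30d < 4),
    ]

    def flag_at_risk(customers):
        """Return (customer_id, matched_flags) for behavior matching pre-churn patterns."""
        at_risk = []
        for u in customers:
            hits = [name for name, rule in RED_FLAGS if rule(u)]
            if hits:
                at_risk.append((u.customer_id, hits))  # hand off to a win-back campaign
        return at_risk

    print(flag_at_risk([Usage("c42", avg_session_sec=35.0, logins_last_30d=2)]))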


Takeaway: For the agencies that run more successful tests, that success is primarily attributable to tests being a direct response to data observed within specific segments of traffic on the site.
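“Data observed within specific segments” often starts as something as simple as breaking conversion rate down by traffic source and device. Here is a rough sketch of that kind of segment report, using pandas on a hypothetical analytics export.

    import pandas as pd

    # Hypothetical analytics export: one row per session
    sessions = pd.DataFrame({
        "source":    ["organic", "organic", "paid", "paid", "email", "email"],
        "device":    ["mobile", "desktop", "mobile", "desktop", "mobile", "desktop"],
        "converted": [0, 1, 0, 0, 1, 1],
    })

    # Conversion rate per segment: a weak segment is a test hypothesis, not a guess
    report = (sessions
              .groupby(["source", "device"])["converted"]
              .agg(sessions="count", rate="mean"))
    print(report.sort_values("rate"))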


Understanding Your Market




You will be surprised how many companies don’t understand who their clients are. Sure, you may understand the keywords that got them to you, but do you know:



  • Why did they choose you over the competition?

  • What is it about your value proposition that made you unique in their mind?

  • Why do your most loyal customers keep coming back?

  • Why do your new customers leave?


Furthermore, have you mined social media to discover if the problem you’re trying to solve actually exists? Are you looking at any & all publicly available information to see how potential buyers in your market describe the problem?


This is where digging deep to find that product/market fit is great, but it’s also a way to formulate test hypotheses that aren’t coming out of nowhere.


Takeaway: Many of the most successful agencies using our platform spend an extensive amount of time researching consumer behavior to understand what is most important to the customer & why certain behaviors take place.


This way, when they conduct a test, there’s already a fairly clear understanding of what will work & why, rather than blindly firing off a test & hoping for some kind of positive outcome.


Call Customers & Ask “Why?” 5 Times


[Image: 5 Whys lean manufacturing example]


I’m a fan of Eric Ries’ Lean Startup movement and of Running Lean (the book by Ash Maurya) as tools for understanding clients. An important part of the lean methodology is asking “why” five times to understand the problem and see whether you have a fitting solution. This was adapted from a methodology developed by Taiichi Ohno of Toyota.


In this article on FastCo, Eric shows an example of how this works:


“When confronted with a problem, have you ever stopped and asked why five times? It is difficult to do even though it sounds easy. For example, suppose a machine stopped functioning:



  1. Why did the machine stop? (There was an overload and the fuse blew.)

  2. Why was there an overload? (The bearing was not sufficiently lubricated.)

  3. Why was it not lubricated sufficiently? (The lubrication pump was not pumping sufficiently.)

  4. Why was it not pumping sufficiently? (The shaft of the pump was worn and rattling.)

  5. Why was the shaft worn out? (There was no strainer attached and metal scrap got in.)”


Applying this “5 Whys” framework to a conversion project, I suggest you pick up the phone, meet clients in person, and schedule Skype calls to get actual face time with your clients and learn directly why they picked your service over the competition.


Something else you can do is mine your customer service calls (if you have them) to explore the “why” behind cancellations and purchases.


Takeaway: Similar to the previous section, many of the agencies that use our tool go out of their way to get face time with real clients & dig to the root of the problem.


But what’s interesting about face-to-face meetings is that you may also notice similar posturing, or body language, that is specific to your customers.


When taken into account, these extra little cues can be built into the overall tonality of a site. This kind of emotional design can cover everything from copy and images to layout and more.


Using Surveys To Better Formulate A Hypothesis


[Screenshot: survey invitation email, “Help us out for a chance to win a 500 gift certificate”]


Because customer interviews are difficult to scale, it’s also wise to send out customer surveys that use open-ended questions.


Ott wrote an excellent article on collecting qualitative feedback, and one of the things I enjoyed most is how he recommends codifying the answers to get an “at a glance” view of the feedback.
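Ott’s exact coding scheme isn’t reproduced here, but “codifying” open-ended answers generally means tagging each response with one or more themes and then counting the tags. A toy sketch, with hypothetical themes and answers:

    from collections import Counter

    # Hypothetical theme keywords; in practice a human assigns the codes after reading
    THEMES = {
        "price":    ["expensive", "cost", "price"],
        "trust":    ["scam", "too good to be true", "legit"],
        "shipping": ["shipping", "delivery", "arrive"],
    }

    def code_answer(answer):
        """Tag one free-text answer with every matching theme."""
        text = answer.lower()
        return {theme for theme, kws in THEMES.items() if any(k in text for k in kws)}

    answers = [
        "Honestly it sounded too good to be true",
        "Delivery took forever and shipping cost extra",
        "Way too expensive compared to others",
    ]
    counts = Counter(theme for a in answers for theme in code_answer(a))
    print(counts.most_common())  # the "at a glance" view of the feedback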


Beyond that though, a well designed customer survey will provide insights into questions you may not have known the answers to before. For example, Conversion Rate Experts client TopCashback.co.uk found that many potential customers thought the offer was “too good to be true” – which, of course, ended up being addressed in a test variation.


Takeaway: Reading potential customers’ hesitations about signing up helps you formulate tests that directly respond to their concerns. For more on customer surveys, check out:


Creating The Wireframes




After collecting and analyzing as much research as possible, many of the agencies I’ve spoken with about their process will create wireframe mockups to quickly give the client a sense of what the challenger pages are going to look like.


In some cases, those mockups will also be shown to customers, who’ll be taken through some basic usability testing. By walking through mockups with actual customers, the agency can find potential problems with their challenger design & iterate before ever getting into a far more costly design and development process.


Takeaway: Instead of wasting time & resources implementing a design solution that may not work, smart agencies test their hypotheses early. Wireframing, by comparison, is a faster & more flexible solution that gives everyone involved in the process something tangible to evaluate.


A/B Testing




It’s only after collecting and analyzing as much research as possible, and doing some basic hypothesis testing with wireframes, that the agencies get into the actual A/B testing process.


There is a lot to be said about running a proper A/B test, but that’s an article all on its own.
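One part worth flagging even in passing: decide your sample size before the test starts, so you don’t stop at the first flicker of significance. Below is a rough sketch of the standard two-proportion sample-size approximation; the baseline rate and minimum lift are assumptions you would set per test.

    from math import ceil, sqrt
    from statistics import NormalDist

    def visitors_per_variant(base_rate, min_rel_lift, alpha=0.05, power=0.80):
        """Approximate visitors needed per variant to detect a relative lift
        in conversion rate (two-sided z-test, normal approximation)."""
        p1 = base_rate
        p2 = base_rate * (1 + min_rel_lift)
        z_a = NormalDist().inv_cdf(1 - alpha / 2)   # significance threshold
        z_b = NormalDist().inv_cdf(power)           # power threshold
        pooled = (p1 + p2) / 2
        n = ((z_a * sqrt(2 * pooled * (1 - pooled))
              + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2) / (p2 - p1) ** 2
        return ceil(n)

    # Assumed: 3% baseline, and we care about a 20% relative lift or larger
    print(visitors_per_variant(0.03, 0.20))  # roughly 14,000 visitors per variant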


Conclusion


We have found many more reasons why agencies get better conversion rates, but in this post I wanted to focus on the preparation before the experiment even starts.


There is no secret formula in conversion rate optimization and no magic big-data tool you can just plug in. It’s hard work: lots and lots of research up front, and then A/B testing to validate it all.






