Are you an innovator or an iterator? Don’t take it personally, but you might not be cut out for iteration and may need to innovate instead. We’re not talking about innovation in the overused buzzword sense, though; we’re talking about how you go about making improvements to your website.
Quite simply, innovative and iterative testing are two distinct methods for using A/B tests to make continual improvements to your website. They are essentially two sides of the same coin.
What is A/B Testing?
Within conversion rate optimization as a practice, A/B testing is the act of serving two different versions of a web page and seeing which yields the better conversion rate. Using a platform such as VWO, Optimizely or Crazy Egg, setting up an A/B test requires you to define a control version of the page you’d like to test as well as one (or more) variations.
As you plan and launch A/B tests, it’s important to establish a specific conversion improvement goal for each test and also identify the number of visits and conversions that will be required for the test to reach statistical significance. There are a number of free tools available, like this A/B Split and Multivariate Test Duration Calculator, that actually make this very simple.
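The arithmetic behind those duration calculators can be sketched with the standard two-proportion sample-size formula. Below is a minimal Python illustration; the function name and default values are our own, and real calculators may use slightly different statistical assumptions:

```python
from statistics import NormalDist

def sample_size_per_variant(p1, p2, alpha=0.05, power=0.8):
    """Approximate visitors needed per variant to detect a lift from
    conversion rate p1 (control) to p2 (variation).

    Uses the common normal-approximation formula for a two-proportion
    test; illustrative only, not any specific platform's method.
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided critical value
    z_beta = NormalDist().inv_cdf(power)           # power requirement
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
                 + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return numerator / (p2 - p1) ** 2

# e.g. detecting a lift from a 2% to a 2.5% conversion rate
visitors_needed = sample_size_per_variant(0.02, 0.025)
```

Even a modest lift like this demands roughly fourteen thousand visitors per variant, which is why the traffic question matters so much.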
Both versions of the page are served in equal measure to site visitors. The testing platform will inform you if and when that lift in conversion is reached. You sometimes won’t hit your conversion goal, but this is an equally valid result: it tells you the change didn’t help. A number of factors will contribute to the format of A/B testing you’re able to implement, so let’s have a look at the two core types of testing and when it’s best to undertake each one.
What are iterative tests?
When it comes to implementing a conversion rate optimization strategy for your site, one of the very first questions you need to ask yourself is: do I have enough traffic and conversions to be able to make small site changes and see statistically significant results? Can I simply change the button colour, headline or image and watch as conversion increases dramatically? For many small businesses, the answer to that question, unfortunately, is no.
Like so many things in life we don’t want to worry about, the problem boils down to simple math. The CRO experts at ConversionXL suggest you need at least 1,000 conversions per month to be able to see real results using iterative tests. This likely means top-line visitor traffic in the tens to hundreds of thousands of visitors per month – not something every marketer has access to.
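The back-of-envelope arithmetic makes the point clear. The conversion rates below are illustrative assumptions, not benchmarks:

```python
# How many monthly visitors does it take to hit ~1,000 conversions,
# at a few plausible site-wide conversion rates?
conversions_needed = 1000
for rate in (0.01, 0.02, 0.05):  # 1%, 2%, 5% conversion rates
    visitors = conversions_needed / rate
    print(f"{rate:.0%} conversion rate -> {visitors:,.0f} visitors/month")
```

At a typical 1–2% conversion rate, that works out to 50,000–100,000 visitors every month.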
These minor changes could be as simple as any of the following:
- Modify the button colour
- Change the location of the CTA
- Write a different headline
- Change an image or its scale
- Remove or insert a block of copy
So, when should you implement iterative tests? As noted above, these tests are most likely to generate real results when you have a serious volume of conversions on an asset every month. Otherwise, it will simply take too long to reach a statistically significant result. If it takes you six months or longer to generate that many conversions, other factors around your traffic may change enough to influence the test.
Iterative tests are best left for those quick wins. They’re easy to implement, and with the right volume of traffic, you can be testing numerous conversion points across the entire site platform. Plus, as you see success with these tests, it’s generally straightforward to implement the recommendations across similar page types.
For example, if changing the colour of your Add to Cart button saw a 3% lift in ecommerce sales, the change should be implemented across other site pages that also use that button type. Keep an eye on conversions after the change to ensure that there are no dips in conversion, and move on to the next iteration.
A recent iterative test Kula conducted on a major tire retailer website started with the hypothesis that removing a map on a location finder search results page might make more visitors click through to a dealer. Our visitor recordings showed that some users stumbled on the results page and didn’t realize that they should scroll lower to see all of the results. The variation of the page without the map showed gains of over 3% in clickthroughs to the dealers. We’re presently testing other iterative changes in the search result positioning to see which performs the best.
What about innovative testing?
On the other side of the A/B testing spectrum are innovative tests. Instead of being a simple change to a single element on the page, innovative tests look to modify the entire interface with a wholly different design concept. The purpose of an innovative test is to see if a radical shift can be responsible for major gains in conversion. Each version of the page is then served to 50% of the audience, and conversions are tracked across both page types. In the CRO business, this is called ‘making a big bet’.
Note that designing something completely different doesn’t mean making something different simply for the sake of being different. Any hypothesis you make about your interface design should be grounded in data. At the same time, the difference between the variation and your control needs to be substantial. Landing pages are a great opportunity to test innovative design theories, as they’re a small, very focused example of a page within your site. Big gains here can power a dramatic increase in leads or customers.
Before we rolled out our new site at Kula, we first designed an entirely new landing page that looks closer to what you see here than our previous site. As we noted in our post about launching the new site, the result was a lift in conversion of nearly 10%. Generally, it can be difficult to see gains of this size with a simple iterative A/B test. With an iterative test, the result may only be an increase of 2%, but with a higher volume of traffic, you can complete more tests in a shorter period of time.
The old version is on the left, and the new on the right:
How do you decide what to test?
When you’re just getting started with A/B testing, the possibilities can seem endless. Doing your research and relying on data and qualitative usability testing can help shed some light on where to begin.
Visitor recordings, heatmaps and live usability tests can serve to highlight troublesome areas for your users. They may be struggling to check out of your ecommerce store, or they may start to fill out a form and abandon it. Your main homepage Call to Action (CTA) may be getting glossed over. Watching how people use the site will allow you to form a meaningful hypothesis and generate a test to help prove or disprove it.
Once you have an idea of what you want to test, what you expect the result to be, and why, you can create the variation to test against your control design. Using a Conversion Rate Optimization tool such as VWO will allow you to configure your parameters and run the test until a result is generated. If the result generated is positive, it can then be implemented as the new control. At this point, you can either continue to refine the design or move on to other areas of the site that require improvement.
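Deciding whether a result has been generated ultimately comes down to a significance test on the two conversion rates. Here is a minimal sketch of the standard two-proportion z-test; your platform’s exact statistics may differ, and the function name is our own:

```python
from statistics import NormalDist

def ab_test_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test (normal approximation).

    conv_*: conversion counts; n_*: visitor counts for control (a)
    and variation (b). Returns the p-value; a small value (e.g. < 0.05)
    suggests the difference in rates is unlikely to be chance.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)       # pooled rate under H0
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# e.g. 100/5000 conversions on control vs 150/5000 on the variation
p = ab_test_p_value(100, 5000, 150, 5000)
```

With those example numbers the p-value is well under 0.05, so the variation would be promoted to the new control.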
What kind of results can you expect from iterative vs innovative tests?
Which type of testing you implement will depend on your traffic numbers and your goals. With frequent iterative tests, you could be seeing smaller gains of 1-4% at a time, although we’ve occasionally seen even higher conversion increases. Sometimes there may be no noticeable improvement, at which point returning to your research and data will help you plan additional, better tests.
With innovative testing, the results can be substantially higher than those seen via small iterative tests. However, the effort to get there is larger as well. Since you’ll need to fully redesign an entire page or area of your site, the time to set up each test is greater. It’s not uncommon for innovative tests to generate an increase of 10-25% or sometimes even more, so they can be truly groundbreaking in terms of their effect on lead capture or customer conversion. The biggest issue with innovative A/B testing is that it can be hard to know exactly which change or combination of changes elicited the increase in conversion.
Top-of-funnel traffic growth becomes very important when testing; it will help you to refine your innovative design over time in a more iterative way.
Every day that a website is live without an active A/B test is another day of visitors that could be telling you how to make your site better through their own actions. No matter whether you’re doing minor iterative changes to your interface, or blowing it all up with an innovative change to the layout, the important thing is to always be testing. Site owners with reasonable traffic levels should be implementing both types of A/B tests for the greatest potential for improvement. Growing your traffic to levels that can accommodate iterative testing is an excellent goal, especially since those visitors will be arriving at a site that is heavily optimized for conversion.