
Three Important Elements of a Powerful A/B Testing Strategy

DialogTech

Successful B2B and B2C companies around the world use A/B testing to optimize marketing campaigns and increase conversions. While it may seem like an easy concept to implement, A/B testing takes time, resources, and a well-thought-out strategy to reveal meaningful results.

A proper A/B test has two variations of a web page – the A and the B version. You drive traffic in even proportions to the control (A) and the variation (B). The changes in the B variation can be minor, such as a different colored call-to-action button, or dramatic, such as an entirely different layout or color scheme. Either way, an A/B test can validate whether or not the changes your team has made to web pages are having a positive impact on conversion rates.
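The even split described above can be sketched in a few lines. This is a minimal illustration, not a production traffic splitter: each incoming visitor is randomly assigned to A or B with equal probability, so over enough traffic the two pages receive roughly even proportions.

```python
import random

def assign_variation(rng=random):
    """Assign an incoming visitor to the control (A) or the
    variation (B) with equal probability."""
    return "A" if rng.random() < 0.5 else "B"

# Simulate 10,000 visitors; the split should land close to 50/50.
random.seed(42)
counts = {"A": 0, "B": 0}
for _ in range(10_000):
    counts[assign_variation()] += 1
```

In practice a testing platform handles this assignment for you, but the principle is the same: unbiased, even allocation is what makes the comparison between A and B fair.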

Below are three of the most important elements of a powerful A/B testing strategy.

Variation and Defined Goals

Often, the most challenging part of A/B testing is finding the resources to create variations of landing pages, ads, and other components of your website. Once you have a dedicated design or web development resource, you can begin testing everything on your website. From copy to design, testing everything is the best way to learn what is effective for your business.

Below are screenshots of an A/B test we are currently running. You’ll see that the Subscribe to RSS Feed buttons are what we are testing. The goals we have defined for this test are as follows:

  • The original and variation must each have at least 100 unique visitors before a winner can be determined.
  • There must be a minimum of four conversions (on either variation) before a winner can be determined.
  • A conversion in this test is a click on the Subscribe to RSS Feed button.
  • We are also tracking the number of new signups we receive during this test to our RSS feed.

Original (A):

[Image: ab-testing-original]

Variation (B):

[Image: ab-testing-variation]

Once we have reached the traffic and conversion thresholds defined for this A/B test, we will be able to determine which version of the RSS feed button is better at converting blog readers into blog subscribers on our site.
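The goals defined above translate directly into a winner-determination check. The sketch below uses the same thresholds the test defines (at least 100 unique visitors per variation, at least four conversions in total) and picks the variation with the higher conversion rate once both are met; the function name and data shape are illustrative, not a real platform API.

```python
def determine_winner(stats, min_visitors=100, min_conversions=4):
    """Return the better-converting variation once the test's goals
    are met, or None if more data is still needed.

    stats maps a variation name -> (unique_visitors, conversions).
    """
    total_conversions = sum(c for _, c in stats.values())
    if total_conversions < min_conversions:
        return None  # not enough conversions yet, on either variation
    if any(v < min_visitors for v, _ in stats.values()):
        return None  # some variation is still short on traffic
    # Compare conversion rates (button clicks / unique visitors).
    return max(stats, key=lambda k: stats[k][1] / stats[k][0])

# Hypothetical numbers: A has 120 visitors and 3 clicks,
# B has 115 visitors and 6 clicks -> B wins on rate.
result = determine_winner({"A": (120, 3), "B": (115, 6)})
```

Waiting for minimum traffic and conversion counts before declaring a winner is what keeps an early lucky streak from being mistaken for a real result.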

Accurate Data

Skewed data in an A/B test can mislead the decisions your marketing team makes about design and layout in the future, leading to poor conversion rates. Ensuring that the data you collect from your A/B tests is accurate is one of the most important aspects of producing meaningful results. The unique-visitor counts for each page must not be inflated by visits from your own team – to prevent this, you can exclude specific IP addresses from your analytics data.
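The IP-exclusion idea can be sketched as a simple filter applied before visits are counted. The addresses below are examples drawn from reserved documentation ranges, not real ones:

```python
# Hypothetical internal addresses (reserved TEST-NET range).
INTERNAL_IPS = {"203.0.113.10", "203.0.113.11"}

def should_count_visit(visitor_ip, internal_ips=INTERNAL_IPS):
    """Exclude visits from your own team so they don't inflate
    the unique-visitor counts in an A/B test."""
    return visitor_ip not in internal_ips

# The middle visit comes from an internal IP and is filtered out.
visits = ["198.51.100.7", "203.0.113.10", "198.51.100.9"]
counted = [ip for ip in visits if should_count_visit(ip)]
```

Most analytics tools offer this as a built-in IP filter setting, so you rarely need to implement it yourself, but it must be configured before the test starts collecting data.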

It is also important that any test conversions your own team triggers while verifying the setup are excluded from the results you track.

One often-overlooked conversion metric in A/B testing is the phone calls your web pages generate. For example, if you are testing two versions of a paid search landing page, you may want to count calls, along with form fills, as conversions when determining a winner. A page may appear to perform poorly at generating form fills, but perhaps instead of filling out the form, people are picking up the phone and calling. If you aren’t using call tracking phone numbers on the original and variation pages of your A/B test, you’ll never know which page actually produces more conversions.
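A little arithmetic shows how ignoring calls can flip a test's verdict. The figures below are hypothetical, chosen only to illustrate the point:

```python
def conversion_rate(visitors, forms, calls=0):
    """Conversion rate counting both form fills and phone calls."""
    return (forms + calls) / visitors

# Judged on form fills alone, A looks like the winner...
a_forms_only = conversion_rate(1000, forms=30)        # 3.0%
b_forms_only = conversion_rate(1000, forms=25)        # 2.5%

# ...but once tracked phone calls are included, B converts better.
a_total = conversion_rate(1000, forms=30, calls=10)   # 4.0%
b_total = conversion_rate(1000, forms=25, calls=25)   # 5.0%
```

Here B drives fewer form fills but enough extra calls to win overall, which is exactly the outcome you would miss without call tracking on both pages.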

In the example below, we are testing variations of a paid search landing page. If you look in the upper right-hand corner of each image, you’ll see a different phone number on each variation. Using call tracking and dynamic number insertion, we track the number of phone calls each page generates, in addition to the number of form fills each page receives.

Original (A):

[Image: lp-ab-testing-original]

Variation (B):

[Image: lp-ab-testing-variation]
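The mechanism behind those per-variation phone numbers can be sketched as a two-way mapping: each variation displays its own tracking number, and each inbound call is attributed back to the variation whose number was dialed. The numbers and function names below are hypothetical, not DialogTech's actual API:

```python
# Hypothetical tracking numbers, one per page variation.
TRACKING_NUMBERS = {
    "A": "(800) 555-0111",
    "B": "(800) 555-0122",
}

def number_for_variation(variation):
    """Return the tracking number to display on this page variation."""
    return TRACKING_NUMBERS[variation]

def variation_for_call(dialed_number):
    """Attribute an inbound call back to the page variation whose
    tracking number was dialed, or None if the number is unknown."""
    for variation, number in TRACKING_NUMBERS.items():
        if number == dialed_number:
            return variation
    return None
```

In a real deployment, dynamic number insertion swaps the displayed number in the page markup at load time, and the call tracking platform performs the reverse lookup when a call comes in.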

Accurately tracking visits to each page running in an A/B test, along with the true number of conversions generated by each, is key to understanding and proving what works best in your marketing campaigns.

To learn more about tracking phone leads from A/B tests, download the free guide, Tracking Phone Leads: The Missing Piece of Marketing Automation.

Tools for Implementing Tests and Tracking Results

It can be difficult to implement and track the results of A/B tests without the proper tools. Splitting traffic evenly to the original and variation of an A/B test can be very tricky to do properly, and tracking the number of conversions to each is also challenging. That’s why having the proper tools to implement and track an A/B test is key to success.
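One reason even splitting is tricky is that a returning visitor should keep seeing the same variation. A common approach, sketched here under the assumption that each visitor carries a stable identifier such as a cookie value, is to hash that identifier into a bucket: the split stays roughly even and the assignment is sticky without any stored state.

```python
import hashlib

def sticky_variation(visitor_id, variations=("A", "B")):
    """Deterministically assign a visitor to a variation by hashing
    a stable identifier, so the overall split is roughly even and a
    returning visitor always sees the same page."""
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    return variations[int(digest, 16) % len(variations)]
```

Testing platforms implement this kind of bucketing (plus goal tracking) for you, which is a large part of why the right tool matters.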

At DialogTech we use Optimizely to structure many of our non-paid-search A/B tests. The platform splits traffic evenly between as many variations of a page as you wish to run (A/B/C/D: you get the picture). You then define goals to track on each page, covering as many elements as you like. If you don’t have the design or web development resources we talked about earlier, Optimizely lets you build variations of your web pages within its interface. Optimizely also integrates with DialogTech’s call tracking technology, enabling marketers to include call conversion data within their Optimizely A/B testing platform.

Below is just one of the dashboards Optimizely offers for each A/B test you run.

[Image: Optimizely-Dashboard]

One of the things I find most challenging in A/B testing is coming up with the next test to run. Leave your ideas for A/B tests in the comments section below. Thanks for reading!