The Basics of Split Testing (A/B Testing)
By: Madison Johansen
A/B testing, or split testing of any kind, is a must in advertising – after all, we are always in search of ways to improve and maximize performance. Some may think that A/B testing is only done to compare two or more versions of a webpage, but you can actually use this type of testing on your paid media campaigns as well. If you are looking for a simple way to have a serious impact, I will fill you in on some A/B testing basics.
Below are some initial tips to think about before you implement any A/B test:
- What is the goal of your test? Always define the objective.
- What variables are you testing to reach that goal? For example: running a landing page test, testing a new CTA, etc.
- Where are you implementing the test? If you’re running a campaign-specific test, choose a campaign that gets a decent amount of traffic.
- How long are you running the test? Give the test enough time to run so that it receives enough exposure and traffic to support a real conclusion.
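The last two questions can be made concrete with a quick back-of-the-envelope calculation. The sketch below is a hypothetical helper, not part of any ad platform, and the baseline rate and uplift figures are made up for illustration; it estimates how many visitors each variant needs before a given conversion-rate lift becomes statistically detectable.

```python
import math
from statistics import NormalDist

def sample_size_per_variant(p_base, uplift, alpha=0.05, power=0.8):
    """Visitors needed in EACH variant to detect an absolute
    conversion-rate uplift, using a two-sided two-proportion z-test."""
    p_var = p_base + uplift
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for 95% confidence
    z_b = NormalDist().inv_cdf(power)          # ~0.84 for 80% power
    variance = p_base * (1 - p_base) + p_var * (1 - p_var)
    return math.ceil((z_a + z_b) ** 2 * variance / uplift ** 2)

# Hypothetical numbers: baseline 3% conversion rate, hoping to detect a lift to 4%
print(sample_size_per_variant(0.03, 0.01))  # → roughly 5,300 visitors per variant
```

Dividing that visitor count by the campaign's typical daily traffic gives a rough minimum run time for the test.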
A/B Testing in AdWords
A simple form of A/B testing is a straightforward test run through your paid media platform (AdWords, Bing Ads, etc.) for either search-only or display-only campaigns. Once you have gone through the steps outlined above – determined your goal, chosen the variable you are going to test, decided where you are implementing it, etc. – you simply need to set up the test in your platform and set your ads to rotate evenly. One thing to remember in any test of this sort: do not test too many different things at once. Stick to one variable at a time so that you can accurately see whether or not that specific variable caused any change in performance.
To provide a real-life example, we are currently running an A/B test in AdWords for a travel client of ours. We are testing two similar landing pages against each other to see if one version outperforms the other. We are running this test in search-only campaigns, simply duplicating the ads in the ad groups and changing only the landing page. Users searching the same keywords will therefore see the exact same ad copy; the only difference is the page they are directed to. This way we can monitor bounce rate and conversions to see which landing page we should continue to use in our search campaigns, and we can apply these learnings when creating new landing pages in the future. As mentioned earlier, remember to change your campaign settings to ‘rotate evenly’ so that each ad has the same chance to be shown and there is no auto-optimization.
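Once results come in, you need a way to decide whether the gap between the two pages is real or just noise. Here is a minimal two-proportion z-test sketch; the visit and conversion counts are invented for illustration, not the client's actual numbers.

```python
from statistics import NormalDist

def split_test_p_value(conv_a, visits_a, conv_b, visits_b):
    """Two-sided p-value for the difference in conversion rate
    between two landing-page variants (two-proportion z-test)."""
    p_a, p_b = conv_a / visits_a, conv_b / visits_b
    p_pool = (conv_a + conv_b) / (visits_a + visits_b)
    se = (p_pool * (1 - p_pool) * (1 / visits_a + 1 / visits_b)) ** 0.5
    z = (p_a - p_b) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical numbers: page A converted 120 of 4,000 visits, page B 160 of 4,000
p = split_test_p_value(120, 4000, 160, 4000)
print(f"p-value: {p:.3f}")  # → p ≈ 0.015; below 0.05, so unlikely to be chance
```

A p-value above 0.05 would suggest letting the test keep running rather than declaring a winner.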
Testing Using Experiments
Another way to perform A/B or split testing is through experiments in either AdWords or Analytics. Performing a Campaign Experiment in AdWords is ideal if you have been wanting to test different settings in a top-performing campaign but are nervous they could jeopardize performance. AdWords Campaign Experiments run in Search and Display Network campaigns only, but you can test anything from device settings and bid strategies to display topics, remarketing audiences, and other structural campaign changes. It is not a great avenue for testing changes in ad copy (CTA tests, changes in headline, etc.) because of the reset in Quality Score; if you want to test ad copy variables, use the first option described above.
Once you know the settings you would like to change in a campaign, you create a draft in AdWords, make the desired setting changes, and decide how much traffic you would like to allocate to the test. You want to be sure that the campaign you are using will get enough traffic for the test to be statistically significant. The experiment shares your original campaign’s traffic (and budget) and lets you test changes so you can make more informed decisions about which tactics give you a better return on investment. After your draft is complete, you can set the experiment live! One thing to note: experiments cannot be extended, so it is best to set the experiment to run for a longer period than you think you will need, and pause it if you reach a conclusion early.
The experiments run through Google Analytics are different from these campaign tests, and are used almost exclusively to compare different landing pages or content across all traffic sources. These basic content experiments allow you to easily test different variations of pages and split traffic between them. As with any other test, you must have an objective – GA requires you to define this goal during experiment setup. You are also required to place additional code on your site to implement an Analytics experiment. If you have existing goals set up in GA, have the basic tracking code on all pages, and can easily place additional code on the site, these content experiments are a very simple and effective way to test several different landing pages at a time.
Just Test It
Testing of any sort should be an integral part of your overall strategy. It can increase conversion rates, tell you more about your customers, and keep you competitive. A great aspect of split testing (A/B testing) is that you can go from research to action in just a few steps. Every program, if not every campaign, likely has some variable you should be testing. Now that I’ve got you thinking about what to test in your own programs, go out and get started!
If you’re having trouble determining what to test or need assistance managing your digital programs, get in touch with us and we can help!