Split Testing Google Ads
Feb 27, 2023

Split testing your ad copies in Google Ads can DOUBLE your conversion rates & CTR... when done correctly! This is the process I use to split test my headlines & descriptions for Google Search & Performance Max campaigns.
When I review Google Ads accounts, I am constantly amazed that people are still not actively split-testing their ad copies.
What is split testing or A/B testing?
A/B testing is a method of comparing two versions of an online element (such as an Ad, website, or landing page) to determine which performs better. It involves creating two versions – A and B – of the same element and then testing them against each other to measure their performance.
The goal of A/B testing is to optimise for desired outcomes such as increased click-through rates or higher conversion rates. This type of testing allows you to make informed decisions based on data instead of guesswork when it comes to launching successful campaigns.
Why do A/B tests?
A/B testing is an essential part of a Google Ads campaign. Here are the main reasons why you need to A/B test your campaigns:
- Discover the best-performing variations: By running a split test, you can determine which version of your Ads is more effective at achieving your desired outcomes. This allows you to make informed decisions about which Ad is more likely to succeed in the future.
- Reduce launch time: Split testing reduces the time it takes to launch a successful Ad by quickly providing feedback on which variations are working and which are not. This eliminates wasted time trying out multiple versions of your Ads before finding one that works.
- Improve ROI: By testing different versions of your Ads, you can optimize for better returns on investment by focusing on what’s working instead of wasting time and money on ineffective campaigns.
- Increase conversions: Testing different combinations of Ad elements helps to identify which elements work best with each other, resulting in higher conversion rates.
Google’s machine learning is getting better & better, but in 2023 you will still get faster results and lower your cost per conversion sooner by running scheduled ad copy split tests every 30 days.
The reason you can see better results with manual split testing of your ad copies is that Google needs somewhere between 2,000 and 5,000 impressions in a 30-day period to complete a round of split testing.
And if you are running a small budget that does not reach this number of impressions in a 30-day period, Google will never be able to complete its split testing properly.
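To see why small budgets struggle here, you can do the arithmetic directly. This is a minimal sketch, assuming the 2,000-impression lower bound quoted above (the real threshold varies and is not published precisely by Google):

```python
# Rough sanity check: can Google accumulate enough impressions for a test
# round within a 30-day window? Threshold of 2,000 is the low end of the
# 2,000-5,000 range quoted above; treat it as an assumption, not a spec.
def days_to_threshold(daily_impressions, threshold=2000):
    """Days needed to accumulate `threshold` impressions."""
    return threshold / daily_impressions

for daily in (50, 100, 500):
    days = days_to_threshold(daily)
    verdict = "OK within 30 days" if days <= 30 else "too slow, test may never complete"
    print(f"{daily} impressions/day -> {days:.0f} days ({verdict})")
```

At 50 impressions per day you would need 40 days just to hit the low end of the range, which is why a manual schedule beats waiting on Google's automated testing for small accounts.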
So in this video, I am going to take you through the exact split-testing process I use to deliver results for my clients.
Which elements of a Google ad campaign should I split test?
There are several elements you should consider testing. These include:
- Ad headline: Try different variations of the headline to see which one is most effective at catching the attention of your target audience.
- Ad copy: Test different versions of the ad copy to determine which resonates best with your audience and increases engagement.
- Call to action (CTA): Experiment with different calls to action to find out which one encourages more people to click on your Ads.
- Images/Videos: Test images and videos against each other to see how they impact conversion rates.
- Targeting settings: Try changing up targeting settings such as location, age, gender, and interests to reach the right audience for your Ads.
- Landing pages: This includes any elements on your landing pages, such as social proof, buttons, forms, etc.
How to A/B test correctly?
I will show you the split-testing process that I use for both Search campaigns & Performance Max campaigns.
But for both types of campaigns, you need to follow these core principles for split testing:
- Test one clear thing at a time.
What I mean by this is that when you are setting up your tests, make sure that you can keep track of exactly what you are testing. Otherwise, you won’t know what gave you the better or poorer results.
So if you want to test the best call to action in your ads, run two identical ads, each with a different CTA.
For example, if you want to test whether mentioning your price gives better results, you would run two identical ads, except that one would say:
- Prices start from $54
while the other says:
- Call for a Free Quote Today
- Make sure you are getting enough data.
As a rule, I like to see both ads get at least 1,000 impressions. But if you are running a really small account, you could instead make a rule that you review the results every 30 days.
- Have a clear process for recording these results and the different tests that you have carried out.
- Understand that split testing never stops and that it is a continual process of setting up split tests, reviewing the results and then setting up new split tests.
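One lightweight way to follow the "clear process for recording results" principle above is a structured test log. This is a sketch only; the field names are my own suggestion, not from the article, and a spreadsheet works just as well:

```python
# A minimal split-test log. Field names are illustrative - adapt them to
# whatever you actually track (a spreadsheet row per test is equivalent).
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class SplitTest:
    variable_tested: str            # e.g. "CTA wording" - test ONE thing
    variant_a: str
    variant_b: str
    start: date
    end: Optional[date] = None      # filled in when the test is reviewed
    winner: Optional[str] = None    # "A", "B", or None if inconclusive
    notes: str = ""

log = []
log.append(SplitTest(
    variable_tested="CTA wording",
    variant_a="Prices start from $54",
    variant_b="Call for a Free Quote Today",
    start=date(2023, 2, 1),
))
print(f"{len(log)} test(s) recorded; currently testing: {log[0].variable_tested}")
```

Whatever format you use, the point is that every test records what was varied, when it ran, and what won, so the next test builds on the last.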
To help make sure that you always remember to split test, get a copy of my Search and eCommerce checklist.
Analyzing the results and setting up the next test
Once you have collected and analyzed the data from your split testing experiment, it’s time to prepare for the next one. Setting up your next test can involve choosing different variables to compare or altering existing elements based on insights gained from previous experiments. It’s also important to implement any changes made in order to see them take effect and measure their impact.
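The article does not prescribe a specific statistical test, but a common way to judge whether one ad's CTR is genuinely better, rather than noise, is a two-proportion z-test. Here is a hedged sketch using only the Python standard library; the click and impression numbers are made up, chosen so each ad clears the 1,000-impression minimum mentioned earlier:

```python
import math

def ctr_z_test(clicks_a, imps_a, clicks_b, imps_b):
    """Two-proportion z-test on click-through rates.
    Returns (z, p_value) for the two-sided test."""
    p_a, p_b = clicks_a / imps_a, clicks_b / imps_b
    pooled = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / imps_a + 1 / imps_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the normal CDF: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical results for two CTA variants
z, p = ctr_z_test(clicks_a=60, imps_a=1200, clicks_b=38, imps_b=1150)
print(f"Ad A CTR: {60/1200:.2%}, Ad B CTR: {38/1150:.2%}")
print(f"z = {z:.2f}, p = {p:.3f}")
if p < 0.05:
    print("Difference is statistically significant - keep the winner, test a new variant.")
else:
    print("Not significant yet - keep collecting data.")
```

If the p-value is above your threshold, the honest conclusion is that the test has not finished, not that the ads are equal; either extend the test or accept the ambiguity and move on to a bolder variation.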
FAQ
Is split testing the same as A/B testing?
Yes, split testing and A/B testing are the same thing: both describe creating two versions of the same element and measuring their performance against each other to determine which performs better.
Which tool is the best for split testing?
Some popular tools include Google Analytics, Optimizely, Visual Website Optimizer, and Unbounce. Each of these tools provides different features to help you measure the performance of your split tests. Additionally, many online services offer A/B testing as a feature or have plugins available to facilitate split testing.