Heat Level: Hot: These tips are meant for marketing experts.
Bottom Line: Facebook’s split testing tools teach you which ad variations are more likely to get you leads.
Do This: Create a split test (or A/B test) to compare ad variations such as different photos, ad copy, or calls to action.
Running Facebook ads is like buying a house. There are a lot of options, and you probably want to compare a few before investing in one!
When you set up ads, you want to know which headline, photo, ad type, or link will perform best. In the olden days, that meant creating a whole bunch of different ads, running them all separately, and then crunching the numbers to see which performed best.
Those days are gone! Facebook now offers built-in split testing. This makes it a lot easier to figure out which ads (or combinations) are more likely to accomplish your goals.
A/B testing is only available through Facebook’s Ads Manager. If you haven’t tried this free portal yet, we wrote about why you should stop boosting and start managing.
When you open Ads Manager, you’ll see a green "Create" button in the top left corner. From there, these are the steps to create an A/B test.
Declare a Special Ad Category if your ads fall into one (such as credit, employment, or housing), then select your marketing objective. If you want to drive clicks to a link, use Traffic. If you want likes and comments, try Engagement. If you want form submissions, try Lead Generation. Just make sure this choice lines up with the ultimate goal of the ad(s). But don't stress: you can change it before starting the ad.
Name your campaign, then turn on the "Create Split Test" toggle button.
Select a variable. From the drop-down, choose what you want to test. If you want to test different photos or text, select Creative. The other options are more advanced, but they're great for testing different demographics, budgets, ad types/placements, etc. If you're just starting out, though, Creative is the place to begin. Click "Continue."
Add your audience, delivery settings, placement settings, budget and schedule. If you’re testing Creative, these settings will act as the overall rule for all of your ads. If you’re testing a different variable, you’ll have to set up different settings for each ad set. Click "Continue."
Now, you’ve got a campaign with two different ad sets, and one ad under each. Set up these ads with the different photos, graphics, text, links, etc. that you want to compare. You can navigate between the two ads on the left-hand side panel.
Publish the campaign, and the experiment will begin.
If a winning creative/optimization is found quickly, you’ll get an email notification and an update in Ads Manager. Or, if the two creatives are neck-and-neck, the split test will run until the end date or the budget runs out. It will then tell you which performed better. You can then recreate the winner as its own campaign to keep it going.
Of course, with this great power comes great responsibility. Running an A/B test requires a certain amount of finesse and know-how. Here are the things to consider to get good results.
Focus on one difference between the ads, like the body copy or the photo. If the two ads are completely different, you won't know exactly why one performed better. If you want to test several variables, run one A/B test at a time and carry the winner forward: test the text first, then use the winning text in a test where the photos differ, then use the winning photo in a test where the CTA differs, and so on.
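That "one variable at a time" ladder can be sketched in plain Python. Everything below is invented for illustration (the ad names and lead counts are made up, and `run_split_test` is a stand-in for Facebook actually running the test and reporting a winner):

```python
# Illustrative sketch of sequential A/B testing: test one variable at a time,
# keep the winner, then move on to the next variable. All numbers are invented.
fake_leads = {
    "Short copy": 40, "Story copy": 65,    # round 1: body text
    "Still photo": 50, "Drone video": 80,  # round 2: photo
    "Learn More": 70, "Sign Up": 55,       # round 3: call to action
}

def run_split_test(option_a, option_b):
    """Stand-in for a real Facebook split test: return whichever option 'won'."""
    return max(option_a, option_b, key=fake_leads.get)

winning_ad = {}
for variable, (a, b) in {
    "text":  ("Short copy", "Story copy"),
    "photo": ("Still photo", "Drone video"),
    "cta":   ("Learn More", "Sign Up"),
}.items():
    # Each round tests only one variable; the winner gets reused in the next ad.
    winning_ad[variable] = run_split_test(a, b)

print(winning_ad)
# {'text': 'Story copy', 'photo': 'Drone video', 'cta': 'Learn More'}
```

After three rounds you end up with one ad built entirely from proven winners, and you know exactly which change drove each improvement.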
Changing a few words between the ads won't give you statistically different results. The variations should be noticeable: a completely different image, a rewritten headline, or a different call to action.
Each test ad should be under its own ad set. Otherwise, Facebook will start to auto-optimize and favor a certain variation. That gives the ad an unfair advantage and you won’t see real, significant results.
It can be tempting to call a winner early on. But remember that the Facebook algorithm needs time to work its weird, wonderful magic. Wait at least 72 hours so both ads get a fair shot. Then, wait until the gap between the two ads is large enough to be statistically meaningful. SEO expert Neil Patel has a really cool Statistical Significance calculator to help determine if you really have a winner. (Remember doing this in high school stats and psych?)
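If you're curious what a significance calculator is doing under the hood, here's the standard two-proportion z-test in plain Python. The lead counts and impression counts below are made up for illustration:

```python
# Illustrative only: a two-sided, two-proportion z-test, the same kind of check
# a statistical-significance calculator runs. All numbers below are made up.
from math import sqrt, erf

def significance(conv_a, n_a, conv_b, n_b):
    """Return the two-sided p-value for 'do ads A and B convert differently?'"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error
    z = (p_a - p_b) / se                                    # z statistic
    # Convert |z| to a two-sided p-value using the normal CDF (via math.erf).
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Made-up example: ad A got 40 leads from 1,000 impressions, ad B got 65 from 1,000.
p = significance(40, 1000, 65, 1000)
print(f"p-value: {p:.4f}")  # below 0.05 -> the difference is probably real
```

A p-value under 0.05 is the usual bar: it means a gap that big would be unlikely if the two ads actually performed the same. With tiny lead counts, even a big-looking gap can fail this test, which is exactly why you shouldn't call a winner after a day.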
You might think: if this is just a test, why give it a big budget? But if you test with a small budget, Facebook won't have enough ammo to deliver your ads to the right people, and you won't see a significant difference between the two variations. A/B tests shouldn't be a throw-away, anyway: you're actually running live ads, so make sure they're both ready for prime time!
A/B testing is a fantastic tool for figuring out what your audience prefers. Instead of guessing, you'll know for a fact that a drone video really does get you more leads than a still photo (for example)! Use this knowledge to create better, more successful ads.