Put Your Marketing Campaigns to the A/B Test

May 5, 2020

When it comes to your content, you should never make assumptions. Marketers should always be taking steps to improve elements like design, user experience, and copy to drive conversions and increase revenue.

One very impactful way to do that is to run A/B tests.

What is A/B Testing?

In marketing, an A/B test is when you create two versions of one piece of content, with one version changing a single variable. You then split your audience, show each half one version, and see which performs better.

There are a number of things that marketers can test:

  • Call to Action (CTA)
  • Subject Lines
  • Copy
  • Colors
  • Layouts
  • Pricing Displays
  • Button Placement or Size
  • Images or Videos
  • Form Length or Location

Performing A/B tests can save you time and money. Instead of going back and forth between two options for your content, or wasting time disagreeing internally over which direction to take, you can simply test which one performs better and let the data determine how to proceed. A/B testing also provides valuable data that you can use in future content and campaigns.

Can You A/B Test Multiple Things at the Same Time?

The more you or those in your organization tend to scrutinize every aspect of your content, the more things you will want to test. Will this color be better than that? Will this word resonate more, or will that one? Should my headline be a question or a statement? If the copy in this CTA button is in all caps, will that improve the conversion rate?

There are a number of variables you may want to test, but in order for an A/B test to be effective, you are going to need to show some restraint and only test one variable at a time. Isolating an independent variable can give you definitive insights into what made a difference. If you test too many variables at once, though, you may have no idea which changes actually made a difference and which elements to continue to incorporate in your future content.
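Isolating a single variable also depends on splitting the audience randomly, so that the two groups differ only in which version they see. A minimal sketch of such a split, using Python's standard library (the email addresses are hypothetical placeholders):

```python
import random

def split_audience(audience, seed=42):
    """Randomly split an audience into two equal groups, A and B.

    A fixed seed keeps the split reproducible; shuffling first ensures
    the groups aren't biased by the order contacts were collected in.
    """
    shuffled = list(audience)
    random.Random(seed).shuffle(shuffled)
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]

# Hypothetical contact list
emails = [f"user{i}@example.com" for i in range(10)]
group_a, group_b = split_audience(emails)
print(len(group_a), len(group_b))  # 5 5
```

Each group then receives exactly one version of the content, and any difference in results can be attributed to the one variable you changed.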

So, with all the possible elements to choose from, how do you choose what to test first? This will depend on your goal.

Selecting a Goal First

Before you run your tests, you will need to choose a specific goal. If your goal is too broad, like “gather data,” then your tests will not have any direction and the data that you do collect will not have much actionable meaning.

Let’s say your goal is to determine what wording is more compelling to your audience. We have a client that wanted to run a series of email campaigns around their green solutions. However, there was uncertainty over whether we should use the word “sustainable” or “eco-friendly.” Because we were not sure which would yield a higher response, we decided to A/B test the subject lines of the first email in the campaign.

The only difference between the two subject lines was that single word ("sustainable" vs. "eco-friendly"), so it was clear what caused the difference in performance. We repeated the same test later that month with a different subject line, still isolating the same variable. Once again, "sustainable" was victorious, this time with an 8.6% increase in the open rate.

Because we took the time to test just those two words, the bulk of our audience received emails with a subject line that we had proven to be more compelling. We also learned an important lesson about what terminology to use in similar content. Had we gone with our assumption and used "eco-friendly," we would not have seen as much success with the campaign.
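A lift in open rate is only worth acting on if it's unlikely to be random noise. One common way to check is a two-proportion z-test, sketched below with Python's standard library; the send and open counts are hypothetical, not the client's actual numbers:

```python
import math

def two_proportion_z(opens_a, sent_a, opens_b, sent_b):
    """Test whether two open rates differ significantly.

    Returns the z statistic and a two-tailed p-value computed from
    the standard normal CDF.
    """
    p_a, p_b = opens_a / sent_a, opens_b / sent_b
    p_pool = (opens_a + opens_b) / (sent_a + sent_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts: version A opened 260 of 1,000 sends, version B 200
z, p = two_proportion_z(260, 1000, 200, 1000)
print(round(z, 2), p < 0.05)
```

If the p-value falls below your chosen threshold (0.05 is conventional), you can be reasonably confident the winning subject line really is more compelling, not just lucky.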

A, B, See the Results

Here are some other examples of what you can A/B test and how:

User Experience

Example hypothesis: Prospects are more likely to fill out a form on our landing page if it's located above the fold.

How to prove it: Use your existing landing page as a control, then create a challenger version with the form in a different spot.

Ad Copy

Example hypothesis: Our PPC ads will perform better if we include the name of our product in the copy.

How to prove it: Run an alternate version of your current ad with the copy change incorporated and analyze performance over time.

Subject Lines

Example hypothesis: Open rates will improve if we use personal information, like first names, in the subject line.

How to prove it: Take a small percentage of your total audience and send half of them one subject line and the remaining half the other. After a day, see which email had a better open rate and send the remaining audience the email using the winning subject line.

Email Sender

Example hypothesis: If we use the company name as the sender instead of a person's name, open rates will suffer.

How to prove it: Perform the same test as the one outlined above, but instead of using different subject lines, use different senders. In this scenario, you would still measure open rates to determine success.

Design

Example hypothesis: Changing the color of our CTA button will increase click-through rates.

How to prove it: Perform the same email A/B test, but instead send out two separate versions of your email to your test audience. Measure success by reviewing the click-through rate of each email. If you're testing design variables on a landing page, create a challenger version of your page with only one change incorporated, then measure results.
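The subject-line workflow above (test on a small slice, then send the winner to everyone else) can be sketched in a few lines. The `send` and `open_rate` functions here are hypothetical stand-ins for your email platform's send and reporting calls, and the subject lines are made up:

```python
import random

def run_subject_line_test(audience, subject_a, subject_b, send, open_rate,
                          test_fraction=0.1, seed=7):
    """Send each subject line to half of a small test slice, then send
    the winning subject line to the rest of the audience."""
    shuffled = list(audience)
    random.Random(seed).shuffle(shuffled)
    n_test = max(2, int(len(shuffled) * test_fraction))
    test_group, rest = shuffled[:n_test], shuffled[n_test:]
    half = n_test // 2
    send(subject_a, test_group[:half])   # half the test slice gets version A
    send(subject_b, test_group[half:])   # the other half gets version B
    # After waiting (e.g. a day), compare open rates and pick the winner
    winner = subject_a if open_rate(subject_a) >= open_rate(subject_b) else subject_b
    send(winner, rest)
    return winner

# Hypothetical stubs for an email platform's API
sent_log = []
def send(subject, recipients):
    sent_log.append((subject, len(recipients)))

def open_rate(subject):
    return {"Sustainable Solutions": 0.30, "Eco-Friendly Solutions": 0.22}[subject]

audience = [f"user{i}@example.com" for i in range(100)]
winner = run_subject_line_test(audience, "Sustainable Solutions",
                               "Eco-Friendly Solutions", send, open_rate)
print(winner)
```

With 100 contacts and a 10% test fraction, five people see each subject line, and the remaining 90 receive whichever line had the better open rate.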

Think of A/B testing like search engine optimization (SEO). You don't stop optimizing your site and content for conversions just because you've hit one goal. Both need consistent attention from marketers in order to be as successful as possible. After you've run enough tests to prove or disprove your hypothesis, it's time to move on to the next experiment! Once enough time has passed, or as your audience or brand evolves, you may even want to revisit some of your older tests to see if you get the same results.

About Ridge Marketing

Ridge Marketing is an agency that specializes in crafting creative digital assets and using websites, search, advertising and email to ensure that the right prospects interact with your brand and become loyal customers.