A step-by-step guide to A/B testing your content

7 minute read

Daniel Fries

Freelance Writer

When it comes to creating effective content, intuition isn't enough. You want content that's purposeful, accurate, and meets user needs. One way to determine whether your content is effective is through A/B testing.

Also known as split testing, A/B testing previews your content to real users in a way that yields valuable insight and feedback. Avoiding hit-and-miss publication and generic, one-size-fits-all solutions saves time while allowing you to better serve your users. However, you need to get the testing right for it to be effective.

Every page on your website should provide value to your visitors, but not all pages are created equal in terms of ROI. A/B testing is essential for determining what content is working, what isn't, and why. It does this by measuring how different elements, like headlines, formatting, and type of content, affect traffic, consumer behaviour and conversions.

What is A/B testing?

A pyramid diagram to show the times when you need to test your content and when you can implement changes without testing.

As applied in this context, A/B testing is a means of showing two different versions of content to an audience of actual or potential consumers who reflect your target market. There are two types of A/B testing:

  • User experience (UX) test: Tests features like CTA buttons or forms and their placement, and measures how they improve or degrade the user experience.
  • Design test: Measures how colour schemes, placement of images, and other visual aspects affect consumer behaviour and engagement.

The benefits of A/B split testing content

Split testing provides insight into user behaviour and into which elements of your content drive key metrics like conversion and bounce rates, consumer confidence, and revenue. Of these, revenue increases are the most tangible metric; gauging something subjective like visitor experience is far harder.

The traditional eCommerce approach to A/B testing (and why it doesn't always work)

Traditionally, A/B testing was performed by eCommerce marketers to test metrics like click-through rates or conversions and to monitor paid traffic. While this sort of analysis and reporting is important, it overlooks one thing that both Google and consumers care about: the quality of the content.

Study after study has demonstrated that correctly performed A/B tests, followed by the changes the results indicate, can increase revenue by up to 12%. Shopify, one of the first and still among the best-regarded eCommerce platforms, has published an exhaustive split-testing analysis that drills deep into the topic.

In addition to a close focus on shopping cart abandonment, Shopify circles back to a few main points: know which elements need to be optimised, which should be removed completely, and when to leave well enough alone.

Proper SEO and good formatting help you rank higher in search results, but web crawlers also look for fresh, relevant content, quality user experience (UX), and other elements that are more subjective and difficult to gauge. Further, not all website-based businesses use an eCommerce model. Many are affiliate marketing websites, lead generation sites, software-as-a-service (SaaS) applications, or good old-fashioned brick-and-mortar stores.

A/B testing your high-value pages tells you which content elicits a favourable response from your target market. The wording, layout, and design of your content draw visitors in and keep them on your page longer, which is itself an important metric. Testing also helps you figure out the root cause of problems like shopping cart abandonment.

How do I know which are my "value pages"?

The simple answer is: the pages that make the most money for your business are the most valuable. Traditionally, four pages draw more traffic than any other part of your website:

  • Home page
  • About page
  • Blog
  • Contact page

Depending on what type of website you have, other pages like services or products might draw more traffic and/or convert more visitors. You can measure page visits and other metrics for individual pages to discover which are your most valuable pages, and then focus your testing on those.

To assign page values yourself in Google Analytics, follow the steps outlined in the Google Analytics support forum. Alternatively, if you're investing in SEO, you likely already have some idea of which keywords represent the most opportunity.

Website analytics tools like Ahrefs will show you the top pages on your site, calculated as a function of the traffic to those pages and your ranking. Basic keyword tracking tools like Accuranker will track your keywords and let you know where you stand in the search results, giving you a good idea of how valuable a keyword is based on its current rank.
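If your analytics export already gives you visits and revenue per page, a quick way to shortlist candidates is to rank pages by revenue per visit. A minimal Python sketch, using made-up page data purely for illustration:

```python
# Hypothetical page metrics, as exported from an analytics tool.
pages = [
    {"path": "/", "visits": 12000, "revenue": 4200.0},
    {"path": "/about", "visits": 3000, "revenue": 150.0},
    {"path": "/blog", "visits": 8000, "revenue": 900.0},
    {"path": "/contact", "visits": 1500, "revenue": 600.0},
]

def top_value_pages(pages, n=2):
    """Rank pages by revenue per visit, a rough proxy for page value."""
    return sorted(pages, key=lambda p: p["revenue"] / p["visits"], reverse=True)[:n]

for page in top_value_pages(pages):
    print(page["path"])
```

Note that a low-traffic page (here, the contact page) can outrank the home page on value per visit, which is exactly the kind of page worth testing first.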

A step-by-step guide to conducting an A/B split test

Testing is straightforward as long as you go into it with clear goals and a plan of action. Choosing the best A/B testing strategy for your clients takes several things into account. Here's an 11-step guide to get you through it with a minimum of stress and hassle.

1. Pick one test variable

You may be looking at different components, like titles, layouts, or images, trying to determine which will improve your website's performance. It may be tempting to tweak multiple parts of your page to test them, but that makes it more difficult to determine which had an effect, and to what extent.

2. Set your testing goal

Before you can put any plan into action, you need a clear idea of what goal you're trying to achieve. Write out your hypothesis, predict the outcome, and then use your chosen test element to confirm or refute it. In the case of Brookdale Living (which we'll get to in a minute), the company speculated that adding images or video would increase landing page conversions. The results can be found below.

3. Create separate 'Control' and 'Challenger' pages

To test your hypothesis, your test pages should consist of one unaltered page (the control) and one page with your chosen change (the challenger). The results will show whether the alteration improved page performance.

4. Create equal, random testing groups

For the test results to be relevant, your testing groups should be chosen randomly and be of the same size and general demographic. If your sample audience is 1,000 people, 500 should be shown the control page and 500 the challenger page.

Random sampling helps reduce unintentional biases. Don't let your test audience know that you're conducting an A/B test, as this could subconsciously affect their responses.
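In practice, random assignment is often implemented by hashing a visitor ID rather than rolling a random number each time, so a returning visitor always sees the same version. A minimal Python sketch; the experiment name and visitor IDs are illustrative:

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "headline-test") -> str:
    """Deterministically assign a visitor to 'control' or 'challenger'.

    Hashing (rather than random()) keeps assignment stable across visits,
    so one person never sees both versions of the page.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "control" if int(digest, 16) % 2 == 0 else "challenger"

# The same visitor always lands in the same group:
assert assign_variant("visitor-42") == assign_variant("visitor-42")

# Over many visitors the split approaches 50/50:
counts = {"control": 0, "challenger": 0}
for i in range(10_000):
    counts[assign_variant(f"visitor-{i}")] += 1
print(counts)
```

Keying the hash on the experiment name as well as the user ID means a visitor's group in one test doesn't predetermine their group in the next.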

5. Choose an adequate sample size

Your sample size should be large enough to get an accurate, statistically significant measure of how a wide audience responds to your content. Checking the behaviour of 20 random people who visit your website isn't going to tell you much.
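If you want a number rather than a guess, the standard two-proportion power calculation estimates how many visitors each variant needs. A simplified Python sketch, with z-values hard-coded for the common 5% significance / 80% power case (the baseline rate and lift below are examples, not recommendations):

```python
from math import ceil

def sample_size_per_group(p_base, lift, z_alpha=1.96, z_beta=0.84):
    """Approximate visitors needed per variant to detect an absolute lift
    in conversion rate, via the two-proportion z-test approximation.

    Defaults correspond to a two-sided 5% significance level (z=1.96)
    and 80% statistical power (z=0.84).
    """
    p_test = p_base + lift
    variance = p_base * (1 - p_base) + p_test * (1 - p_test)
    return ceil(((z_alpha + z_beta) ** 2) * variance / lift ** 2)

# Detecting a lift from 5% to 6% conversion takes thousands of visitors per group:
print(sample_size_per_group(0.05, 0.01))
```

The formula also shows why the 20-visitor test tells you nothing: the smaller the lift you hope to detect, the larger the sample you need, and it grows with the square of that shrinking difference.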

6. Determine the significance of your results

Are you happy with a 50% increase, or do you need an 80% improvement to make a decision? The answer will help determine the scope and purpose of your testing. One figure to pay special attention to is the confidence level: the higher it is, the more confident you can be that the difference you measured is real rather than random noise. You can evaluate the significance of your results with an A/B testing calculator.
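The confidence level an A/B testing calculator reports can be approximated with a two-proportion z-test. A hedged Python sketch (one-sided test, normal approximation; the visitor and conversion counts are invented for illustration):

```python
from math import sqrt, erf

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: returns the one-sided confidence level
    that variant B truly converts better than variant A."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)        # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # One-sided normal CDF expressed via the error function:
    return 0.5 * (1 + erf(z / sqrt(2)))

# 500 visitors each; control converted 25 times, challenger 40 times:
confidence = ab_significance(25, 500, 40, 500)
print(f"{confidence:.1%}")
```

A result in the mid-90s, as here, would still fall short of the 98-99% levels reported in the case studies later in this article, which is one more argument for running tests long enough to collect a large sample.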

7. Run only one test per campaign

Testing multiple components of your web page at once can muddy your results. Run one campaign at a time, analyse the results, and then test another element if you feel the need.

8. Use a tool specifically for A/B testing

The most reliable way to measure your test results is to use an A/B testing tool. A tool like Google Analytics' Experiments will allow you to test up to 10 full versions of your web pages and provide a comparative analysis. It's much easier and more scientifically relevant than conducting a poll.

9. Test both versions simultaneously

To gain clear insight, both versions should be tested at the same time, with the same visitor sample size. Running them one after another won't tell you whether the results are due to the content changes or to simple fluctuations in interest, seasonal or otherwise.

10. Give the test enough time to bear fruit

It's impossible to gather deep insight if you test your page content for only a week or two. Consumer behaviour waxes and wanes over the short term for a variety of reasons.

How long is long enough? That depends on your required sample size and the amount of traffic you generate. The larger the sample you need, and the lower your traffic, the longer your testing should run.

11. Ask for feedback from real users

The best way to do this is to ask visitors to take a short survey before leaving your website. Questions should be short, few, and focused on the goal of the testing. Did they like the new headline? Were the images you added a factor in their opinion or decision-making process? What did they like or dislike about the changes?

Pro tips and a few words of caution for testing content

You may be anxious to get your testing over with and take your content live, but remember: slow and steady wins the race. Give your testing enough time to really show results, and make sure that your sample size is large enough. As a rule of thumb, use a sample of at least 1,000 visitors and let the test run for at least two months.

You also need to follow up your testing with action. Finding out that your autoplaying video is driving visitors away means nothing if you don't remove it or set it to play only when the user clicks.

Three case studies that demonstrate the effectiveness of targeted A/B testing

It's easy to read an article about something, but that won't really demonstrate how the concepts translate into the real world. The following A/B testing case studies show how testing helped three businesses refine their website content.

1: Bionic Gloves

This case study involved testing elements of the checkout page for Bionic Gloves, a sports accessory company. The A/B test was conducted on two groups from a total of 1,400 visitors over a period of 48 days.

One group was directed to a checkout page that included a promo code/gift card box, and the other was provided a basic checkout interface with no promo code box. It was determined that the presence of a promo code/gift card box resulted in a higher rate of shopping cart and page abandonment because customers would leave the website to find discount codes.

Removing this box increased page revenue by 24.7%, with a per visitor increase of 17.1%.

The main takeaway: Keep your checkout page free from clutter, distractions, and any links that lead customers away from your website.

2: Brookdale Living

Brookdale Living, an assisted living solution provider for seniors, wanted to upgrade their text-only landing page. They created one version that included video and another that incorporated images, working on the theory that richer landing page content would result in higher conversions.

The test was conducted on a pool of 3,000 website visitors over a period of two months. The company increased its revenue by $106,000 per month and conversions by 3.93%, with a confidence level of 99.99%. Although both versions were well received, the page with images won out over the video version because many potential customers had low-speed internet connections, so the video didn't play well.

The main takeaway: Adding richer content resulted in a better quality user experience (UX), but make sure that you know your target market.

3: 160 Driving Academy

The 160 Driving Academy wanted to increase enrolment and inquiries into their truck driving school. Their A/B test compared two truck driving class pages: one with a stock photo, and one with a photo of an actual student posed in front of one of the academy's own trucks. The challenger page also mentioned the school's name twice more than the control page.

The new page resulted in a 161% rise in conversions and a 38.4% increase in class registrations; both netted a 98% confidence level.

Main takeaway: Generic doesn't cut it. People respond better to natural, realistic photos of people they can relate to, so use real photos on your eCommerce website.

If you're looking for more examples of how A/B testing can boost your conversion rates, revenues, and confidence levels, check out this Shopify case study.

The bottom line for testing content

How you organise your content, and the elements you add or remove, can be a game-changer for how effective that content is. But doing it right involves more than guesswork. Now that you've seen how meaningful A/B testing can boost traffic, engagement, and revenue, what are you waiting for? Those tests won't run themselves.

Worksheet

User Journey Map

A tool to help you plan better content for your audience and map what users are thinking, feeling, and doing.


About the author

Daniel Fries

Dan Fries is a freelance writer and full stack Rust developer. He looks for convergence in technology trends, with specific interests in cyber security and micromobility. Dan enjoys snowboarding and is based in Hong Kong with his pet beagle, Teddy. Reach him online at his personal blog.
