A/B testing is a familiar practice to many marketers and businesses. But, if you’re just getting started, here’s a quick-and-dirty definition of A/B testing – it’s basically exactly what it sounds like.
You take two different variants of a test subject and categorize them as the control variant and the challenger variant. Then, you conduct a performance test against a set of specific success metrics to determine which variant is better.
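To make that concrete, here’s a minimal sketch in Python of scoring a control variant against a challenger on a single success metric (conversion rate). The visitor and conversion counts below are made up purely for illustration:

```python
# A minimal sketch of comparing a control and a challenger on conversion rate.
# All counts are hypothetical, for illustration only.

def conversion_rate(conversions, visitors):
    """Fraction of visitors who took the desired action."""
    return conversions / visitors

control = {"visitors": 5000, "conversions": 150}      # variant A (control)
challenger = {"visitors": 5000, "conversions": 185}   # variant B (challenger)

rate_a = conversion_rate(control["conversions"], control["visitors"])
rate_b = conversion_rate(challenger["conversions"], challenger["visitors"])

print(f"Control:    {rate_a:.2%}")
print(f"Challenger: {rate_b:.2%}")
print(f"Relative lift: {(rate_b - rate_a) / rate_a:.1%}")
```

A real test also needs a statistical significance check before you declare a winner, which is exactly where the “inconclusive test” idea in Myth #2 comes in.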
Quiz any industry expert on the best way to optimize conversion rates, and we bet the answer you’ll get is A/B testing. In fact, TruConversion.com explains that correctly run A/B tests can boost conversions by up to 300 percent.
But, a fair share of marketers and businesses still choose not to run A/B test experiments. Don’t take our word for it. Check out these stats:
- For most businesses, conversion rates range from just 1 to 3%.
- 25% of marketers don’t know their conversion rates.
- Only 22% of businesses are satisfied with their conversion rates.
- 67% of companies use A/B testing when trying to optimize conversion rates.
- 85% of marketers plan to focus more on conversion rate optimization this year.
Given these stats, it’s not difficult to understand why so many businesses and marketers steer clear of A/B testing for optimizing conversions. One of the major reasons marketers don’t A/B test is the string of myths associated with it.
So, before you dismiss running A/B tests, there are a few things you need to know. Here are 7 A/B testing myths we’ve busted to help you stay ahead of the curve in 2016:
Myth #1: A/B Testing and Conversion Optimization Are the Same Thing
A/B testing started way back in 1908 when William Sealy Gosset altered the Z-test to create Student’s t-test. However, it took a really long time for A/B testing to hit the mainstream. Until recently, it wasn’t a popular technique for boosting conversions.
By testing a feature or an element of your website, landing page, or even your emails, you can maximize conversions and minimize the impact of the ever-changing market dynamics. Take the example of Nature Air, a popular Costa Rican airline. Nature Air conducted an A/B test experiment on its landing pages to understand why its conversions were low. They found out that their CTA wasn’t prominently placed. So, they put a contextual CTA in the content area, and their conversions improved by 591%!
Given this example, it isn’t too difficult to understand why A/B testing has become a familiar practice for a large majority of marketers and businesses. In fact, Unbounce reports that 48% of businesses plan to spend more on A/B testing in 2016.
But, its effectiveness at conversion optimization and its ever-increasing popularity have also caused confusion. A large number of businesses and marketers believe that A/B testing and conversion optimization are the same thing. However, this is not true.
Let’s understand the difference between the two:
Conversion rate optimization is a structured and systematic process of increasing the percentage of website visitors who convert, meaning they take the action the website wants them to take. For example, conversion for an e-commerce website could be a product purchase. But, conversion isn’t necessarily always a purchase. Conversion for a media site could be a survey completion.
Commonly referred to as CRO, conversion rate optimization is a collective term for all the tools and methodologies marketers use to optimize their sites and campaigns. Pretty much every marketing tactic can fit under the CRO umbrella, including personalization, web data analysis, retargeting, user surveys, and A/B testing.
Remember, CRO is about using tools and methods to squeeze out the most conversions from your website. A/B testing is one of the many conversion optimization tools that help marketers gather data and understand what’s working and what’s not in order to help them make necessary changes and boost conversions.
Myth Busted: A/B testing is actually a small part of the conversion optimization process. There are a host of other activities included in conversion optimization strategies. In fact, ConversionXL has found that A/B testing is just 20% of the overall conversion process. The site describes conversion optimization as an elaborate process where:
- 60% of the total conversion optimization project time goes to managing the project and navigating client internal politics.
- 20% of the time is focused on understanding the problems on a page and getting insights about visitor behavior.
- 20% of the time goes to designing, developing, testing, and reporting. This is the part that involves A/B testing and the actual test setup.
Myth #2: Inconclusive Tests Are Total Failures
The success rate for individual A/B tests is low. And, many people believe that tests that show no real statistical difference are just a waste of time and resources. Indeed, given the following stats, it’s no wonder most entrepreneurs and marketers tend to steer clear of tests that produce inconclusive results:
- Approximately 61% of companies run fewer than five tests every month.
- 5 out of every 7 tests fail because the results are inconclusive.
If you’re reading this, chances are you’re already aware of what an inconclusive test is. But, in case you’ve been living under a rock, here’s a quick-and-dirty definition of an inconclusive test – it’s a test that doesn’t show a solid, statistical difference in your conversions.
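To make “no solid, statistical difference” concrete, here’s a minimal sketch of the pooled two-proportion z-test that many A/B testing tools use under the hood. The counts are hypothetical:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return the z statistic and two-sided p-value for the difference
    between two conversion rates (pooled two-proportion z-test)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts: control vs. challenger.
z, p = two_proportion_z_test(conv_a=150, n_a=5000, conv_b=165, n_b=5000)
print(f"z = {z:.2f}, p = {p:.3f}")
print("Conclusive at 95% confidence" if p < 0.05 else "Inconclusive")
```

With numbers like these, the p-value comes out well above 0.05, so the test is inconclusive: the observed difference could easily be noise.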
As an example of an inconclusive test, here’s a case study from Groove HQ. In a bid to lure in more customers and increase its credibility, the company decided to test customer logo examples from a variety of industries to determine which logo designs were most relatable to visitors.
But, the results of the test were inconclusive. The customer logos shown in the challenger version made no measurable difference in prospects’ buying decisions.
Now, you may be thinking it’s better to steer clear of inconclusive tests in the first place in order to save time, money, and resources. You’re not alone. Most people think that inconclusive tests are a curse.
But, contrary to popular belief, inconclusive tests can actually point you in the right direction. They can direct your attention toward things you’ve missed and things that can make a radical difference.
Let’s take another example of an A/B test on Ruby Lane conducted by Grigoriy Kogan. Kogan tested the original checkout page against a simple checkout page that included the most essential information and prominent checkout buttons.
The hypothesis for the test was that prominent checkout buttons and fewer distractions would increase the checkout rate.
But, the test proved to be inconclusive. Taking a cue from the test, Kogan scrutinized the targeted audience data more closely and found that due to a minor difference in checkout flows between existing users and guests, some guests never saw the new variation but were included in the results anyway. After learning this, he conducted a retest with a more precise activation method. In the retest, the test variation improved checkout rates by 5%.
Myth Busted: Inconclusive tests aren’t failures. Instead, they give you an opportunity to rethink your interpretations. Inconclusive test results could mean either your customers did not like the changes you made, only a few of them liked the changes, or your test was measuring the wrong things.
If you devote a little time to mulling over inconclusive test results, you may find where your interpretations are wrong and how you can fix them. The success of your A/B tests is essentially underpinned by what you learned from your previous tests. So, analyze the results and iterate. A day without a test should be considered a wasted day.
Myth #3: It Is Important to Test Everything and Anything
This one is perhaps the biggest myth of them all. But, why is that? Let’s look at some stats to understand this point a little better:
- Dell increased conversion rates by 300% by testing landing pages against website pages.
- Nature Air was able to improve its conversions by 591% by A/B testing its call-to-action.
- Sony was able to increase its CTR by 6% by A/B testing banner ads.
- CloudSponge was able to achieve a 33% conversion increase by A/B testing its website design.
- AdEspresso was able to increase Facebook likes from 0 to 70 by A/B testing Facebook ads.
Given these stats, it’s not difficult to understand that A/B testing different elements can lead to a substantial increase in conversions. Unfortunately, it’s the lucrative benefits associated with A/B testing that tempt impatient marketers to run simultaneous tests and test everything and anything possible under the sun. As a result, they aren’t able to get the desired results. In fact, most marketers who test everything fail miserably.
Let’s take the example of Timothy Sykes as featured on Quick Sprout. He made the mistake of changing his video, headline, copy, and form field design all at the same time. It resulted in a dramatic decrease in conversions. Worse, because everything changed at once, he couldn’t tell which of the changes were worth implementing.
Now, the big question here is what should you test? The answer to this question is a little tricky. So, let us simplify it for you.
Though there isn’t really a silver bullet as to what you can test, it makes sense to test things that affect conversion. Testing and changing elements that do not affect your customer’s decision-making ability doesn’t make any sense.
Prioritize elements with high potential for improvement. Check analytics data to find the problem areas of your website. ConversionXL.com suggests you select test subjects based on value and cost. You must start with high-value, low-cost testing ideas.
In addition, testing and changing elements that aren’t backed by solid, actionable data won’t bring you any positive results. The major objective of A/B tests is to find out what your user wants and demands from your brand, and then make the changes accordingly.
Myth Busted: As a marketer, you may become infatuated with metrics, but remember that A/B test experiments are all about persuading your customers to take a desired action on your site. Testing everything, every time won’t be of any use in leading your clients or customers toward conversion.
Your test subjects must depend on your business objectives and goals. Though there are a lot of things you may test, Daniel Louis from TruConversion.com explains that the pages listed below are tested in almost all cases:
- Main landing pages
- Conversion (SEM) pages
- Page templates
Myth #4: A/B Testing Is for Everyone
A large percentage of marketers believe that everyone should run A/B test experiments to optimize conversion rates. Sadly, this is just a myth.
Though conversion is the highest leverage point for any business, it doesn’t mean that everyone should run A/B test experiments. Remember, A/B testing is just 20% of the overall conversion optimization process.
So, who shouldn’t run A/B tests? Well, if you don’t care about sample size, A/B tests aren’t the thing for you. An insufficient sample size may tilt your test results toward figures that wouldn’t be anywhere close to real life.
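As a rough illustration, here’s a back-of-the-envelope sample-size sketch; the exact formula varies by tool and statistical approach, and the baseline rate and target lift below are assumptions:

```python
import math

def sample_size_per_variant(baseline_rate, minimum_lift,
                            z_alpha=1.96, z_beta=0.84):
    """Approximate visitors needed per variant to detect an absolute
    lift of `minimum_lift` over `baseline_rate`, at roughly 95%
    confidence and 80% power (standard two-sided approximation)."""
    p1 = baseline_rate
    p2 = baseline_rate + minimum_lift
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = ((z_alpha + z_beta) ** 2) * variance / (minimum_lift ** 2)
    return math.ceil(n)

# Example: a 2% baseline conversion rate, hoping to detect a 0.5-point lift.
print(sample_size_per_variant(0.02, 0.005))  # roughly 14,000 visitors per variant
```

If your traffic can’t realistically reach numbers like these during the test period, the result will be noise dressed up as data.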
In addition, if you have solid data indicating there are some elements that aren’t working well, it makes sense to go ahead and change them without waiting to A/B test. You will optimize your conversion rate much quicker without diluting your information with a test. You can find elements that need changing in a host of other ways, including:
- Qualitative Insights – Gather insights from non-analytical sources such as heatmaps, user surveys, or personas in order to understand what’s putting visitors off or stopping them from converting once they reach your site.
- Buyer Persona – Build buyer personas and develop your strategies accordingly.
- Personalization – Monetate.com reveals that marketers witness an average 20% increase in sales when using personalized web experiences. Remember, personalization is fundamental to your conversion optimization strategies.
- Marketing Automation – Auto Pilot HQ says brands that use marketing automation software witness 53% higher conversion rates. So, you can use marketing automation to convert better.
- Mobile Optimization – If traffic is your concern, you don’t need to A/B test. You probably need to ensure mobile optimization. Remember, visitors who have good experiences are the easiest to convert. So, you must aim to provide the best possible experiences for your visitors, no matter what device they’re using.
- Retargeting – Make sure visitors who visit your site return to your site. Hammad Akbar from TruConversion says that retargeting deeply impacts conversion rates.
Myth Busted: Contrary to popular belief, A/B testing isn’t for everyone. For small fixes and issues, such as broken pages and driving traffic, you do not necessarily need to run A/B test experiments. You can simply use one of the many other CRO strategies to make improvements effectively. In addition, testing with an insufficient sample size will produce inaccurate data anyway.
Myth #5: Once Your Test Is Over, Results Will Remain Constant
This is yet another myth that surrounds the phenomenon of A/B testing. A sizable segment of marketers believes that test results remain constant after you complete your test.
However, test results are never constant; they keep changing. No matter how well your website or landing pages are doing, they may always experience changes. Remember, test results are nothing but a snapshot of your A/B test’s performance over a specific time period. And, there are a host of factors that can make your conversion data change dramatically in the future.
It wouldn’t be an exaggeration to say that time impacts conversion rates in a big way, such that test results obtained in the month of October may vary dramatically from results of the same tests carried out six months earlier in April.
But, why is that? There may be a host of reasons. Maybe your visitors’ intentions about visiting your site or spending on your site are different during those two months. Therefore, it makes sense to periodically monitor the conversion rate of your A/B test’s winning variation.
It will help you track any imaginary lifts that materialize later. Imaginary lifts commonly occur due to insufficient sample sizes or unrealistic metrics. Besides helping you catch false positives, monitoring will also help you identify more optimization opportunities.
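Here’s a minimal sketch of what that post-test monitoring might look like; the weekly numbers and the 15% drift threshold are made up for illustration:

```python
# A minimal sketch of monitoring a winning variation after the test ends.
# Weekly numbers are hypothetical; in practice they'd come from your analytics tool.

test_period_rate = 0.034   # conversion rate the "winner" showed during the test

weekly_data = [            # (visitors, conversions) for each week after launch
    (4800, 158),
    (5100, 166),
    (4950, 138),
    (5200, 121),
]

for week, (visitors, conversions) in enumerate(weekly_data, start=1):
    rate = conversions / visitors
    drift = (rate - test_period_rate) / test_period_rate
    flag = "  <-- investigate" if abs(drift) > 0.15 else ""
    print(f"Week {week}: {rate:.2%} ({drift:+.1%} vs. test period){flag}")
```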
Consider a common example: in one test, the joke version of an ad garnered more clicks than the carefully crafted one. Though the challenger version generated more clicks, it doesn’t mean the second ad “worked” better for the business. This is a false positive.
Other common factors that may change your conversion rate include:
- Change in source of traffic
- Impact of discounts and promotions
- Impact of design changes
It makes sense to monitor all these factors in order to get an idea of how your conversion rate changes.
Myth Busted: False positives may impact your test data. As a result, your data may gradually change over the course of time. So, don’t forget to monitor your data for a considerable period of time before making any projections. Remember, the only way to stay ahead in the game is to keep testing. And, if your tests are successful, don’t get complacent. Start a new test experiment and look for other optimization opportunities in order to minimize the impact of false positives.
Myth #6: A/B Testing Is Guaranteed to Boost Conversions
- For the last two years, A/B testing has been the most-used method for increasing conversions.
- The Obama campaign raised $60 million by running A/B test experiments.
- The popularity of A/B testing has grown: 38% of companies now use it, up from the 27% recorded earlier.
Given these stats, we’re sure that almost every marketer with even a hint of experience in the industry is plagued by this myth. A large fraction of marketers believe that A/B testing can fix their conversion rate. Sadly, this isn’t true.
As discussed earlier, A/B testing is a part of the bigger conversion optimization process. It helps you gather data that facilitates decision making. A/B testing can help you find out what’s working and what’s not working on your site, but it can’t turn your site into a conversion machine.
Based on the insights you get by running A/B test experiments, you can devise a strategy or plan a solution to increase your conversions. But, you do this with the statistical data you get from the test, not the actual test itself.
Remember, you’re just one click away from oblivion. And, A/B testing won’t prove to be helpful if your website design, offers, and content fail to capture your target audience’s interest right from the get-go.
Myth Busted: A/B testing can’t fix your conversion optimization. It can only offer you insight into what’s not working on your site. You’ll need to devise strategies to fix your conversion based on the data or insights you gather from your A/B test experiments. In addition, your A/B tests won’t bear fruitful results if your site is broken.
In order to boost conversions, you’ll need to ensure that everything on your website – from the content to the overall voice of the design – comes together in a way that makes your customers and prospects want to know more about your brand and want to stay.
Myth #7: A/B Testing is Only for Tech-Savvy Marketers
This is another popular myth that constantly haunts marketers and has plagued the A/B testing industry. Many marketers believe they need to gain some technical knowledge before they run A/B test experiments. Some even believe that A/B testing is only for the tech-savvy marketers. Here’s a disturbing fact: A large number of marketers believe that A/B testing involves adding complex scripts and updating algorithms.
On the contrary, A/B testing isn’t rocket science; it’s easy to understand and put into practice. You don’t need to depend on your tech team or learn complex technologies to run A/B test experiments. A/B testing is what it sounds like: you take two different variations of a test subject and test them against defined success metrics. Then, depending on the results, you choose the winning variant. Simple, isn’t it?
Remember, testing technologies just reveal numbers. You are the one who is responsible for applying them to increase your conversions.
Myth Busted: A/B test experiments aren’t just for tech-savvy marketers. All you need is to know how to run A/B tests and to have the right tools in your arsenal. Here’s a list of easy steps that will help you run A/B tests (a minimal traffic-splitting sketch follows the list):
- Define your goals.
- Choose an A/B testing tool.
- Select what you want to test.
- Pick one variable or feature from your test subject that you wish to test.
- Segment your traffic.
- Start your A/B test.
- Analyze your results and retest.
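Most testing tools handle the traffic-segmentation step for you, but as an illustration, here’s a minimal sketch of a deterministic 50/50 split keyed on a hypothetical visitor ID:

```python
import hashlib

def assign_variant(visitor_id, experiment="homepage_cta", split=0.5):
    """Deterministically assign a visitor to 'control' or 'challenger'.

    Hashing the visitor ID together with the experiment name keeps each
    visitor in the same bucket on every visit, so the split stays stable."""
    digest = hashlib.md5(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) / 16 ** len(digest)   # map the hash to [0, 1)
    return "control" if bucket < split else "challenger"

# Example: route a few hypothetical visitors.
for visitor in ["user-1001", "user-1002", "user-1003"]:
    print(visitor, "->", assign_variant(visitor))
```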
There are plenty of tools that will help you run A/B tests without having to attend tech classes.
Now that you know the steps and resources for successfully running A/B test experiments, you are all set to take the plunge. Here’s a piece of advice from TruConversion’s Hammad Akbar: refrain from content cloaking, use 302 redirects, and most importantly, don’t be impatient.
Conclusion
So, there you have it: 7 of the deadliest myths that have long been keeping marketers away from A/B testing! Fortunately, the world seems to be awakening to the power of A/B testing. It’s slowly catching on among marketers who were earlier apprehensive about it. We’re sure these debunked A/B testing myths will help you recognize right from wrong and truth from lies!
So, if you’re really looking forward to converting strangers to your site into paying customers, we suggest you get over these 7 A/B testing myths and start fresh.
As always, please share your thoughts in the comments. Don’t hesitate to hit us up with any questions, critiques, or feedback.
About the Author: Shruti Gupta works as a Digital Marketing Manager and ghostwriter at Designhill.com, the world’s fastest-growing crowdsourcing platform for logo design contests, website design, and a host of other graphic design needs. She lives and breathes digital marketing. With her professional, technical, and people management skills, she has built 8 years of extensive experience in SEO, affiliate marketing, digital marketing, blogging, and content marketing.
from The Kissmetrics Marketing Blog http://ift.tt/1S6OQE4