By: Deborah O'Malley, M.Sc | Last updated June 2024
It's funny to think that, in A/B testing, there are actually many different types of tests you can run beyond the standard A/B test.
In fact, there are four main types of tests you can choose to run.
To help you accurately run the right test for your project needs, this article breaks down the four test types, defining each term with an explanation, examples, and images.
The four different test type options are: the A/B (or A/B/n) test, the multivariate (MVT) test, the redirect (split URL) test, and the personalization test.
In an A/B test, you're testing the control, which can be thought of as the original or current version, against one or more new versions.
The goal of an A/B test is to find a new version that outperforms the control.
When running an A/B test, you can test the control against more than one variant. In this case, you have what's called an A/B/n test.
The "n" stands for any number of other variants.
If running an A/B/n test, you just need to make sure you have enough traffic to reach a significant result in a reasonable amount of time. To ensure statistically significant results, check out this article.
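To put "enough traffic" in concrete terms, here's a minimal sketch of a per-variant sample size estimate using a standard two-proportion z-test approximation. The baseline conversion rate and lift below are made-up numbers purely for illustration; in practice, you'd plug your own figures into a dedicated sample size calculator.

```python
# Rough per-variant sample size for comparing two conversion rates
# (two-proportion z-test approximation). All inputs are hypothetical.
from math import ceil, sqrt
from statistics import NormalDist

def sample_size_per_variant(p1, p2, alpha=0.05, power=0.80):
    """Visitors needed in EACH variant to detect a lift from p1 to p2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided test
    z_beta = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

# e.g., baseline 3% conversion, hoping to detect a lift to 3.6% (20% relative)
print(sample_size_per_variant(0.03, 0.036))  # roughly 13,900 visitors per variant
```

Remember that with an A/B/n test, each additional variant needs its own sample of roughly that size, so the more variants you add, the more total traffic and run time you need.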
Here’s a visualization of a very simple A/B test.
As you can see, in version A, there's a green button. In version B, the button color has been changed to red.
If we were to run this A/B test, our goal would be to see if one button color outperforms the other.
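And once the results are in, a quick way to check whether the difference between the two button colors is statistically significant is a two-proportion z-test. Here's a minimal sketch; the visitor and conversion counts are hypothetical.

```python
# Minimal two-proportion z-test: does the red button's conversion rate differ
# significantly from the green button's? Counts below are made up.
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))      # two-sided p-value
    return z, p_value

# e.g., green: 300 conversions / 10,000 visitors; red: 360 / 10,000
z, p = two_proportion_z_test(300, 10_000, 360, 10_000)
print(f"z = {z:.2f}, p = {p:.3f}")  # roughly z = 2.37, p = 0.018 here
```

A p-value below your chosen significance threshold (commonly 0.05) would suggest the difference is unlikely to be random noise.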
While button color tests are often regarded as the simplest form of studies, they can still yield powerful results and valuable testing lessons, as this GuessTheTest case study shows.
But beyond button color tests, the changes you make between the control and variants don't have to be big or drastic to yield a strong uplift.
As this powerful study shows, simply changing the orientation of a form, from the right to left side of a page, for example, can create a huge conversion gain. Which version do you think won? Go ahead and take your best guess on this formidable form test.
In a multivariate (MVT) test, you're testing multiple elements within a page to see which combination of elements performs best.
For example, referring back to the A/B button color test example (above), you might decide you'd also like to add a picture and test not only the button color, but also the shape of the picture, either square or circle.
With the addition of the picture shape, you're now modifying more than one element on the page.
An MVT test allows you to accurately compare the performance of all the combinations of changed elements.
This point is particularly important because in regular A/B testing, you're only able to assess the change of one version against another.
In traditional A/B testing, while you can indeed change multiple elements, the drawback is, you can't accurately account for which element change created the conversion effect.
However, with an MVT test, if set up and run properly, you can change, test, and accurately assess any number of different combinations of elements to ascertain which combination performs best.
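To make the idea of "all the combinations" concrete, here's a minimal sketch using Python's standard library to enumerate a full-factorial MVT design and randomly assign a visitor to one combination. The elements and options mirror the button color and picture shape example above; a real testing tool would handle the traffic splitting and tracking for you.

```python
# Enumerate every combination of the tested elements (full-factorial design)
# and randomly assign a visitor to one of them. Purely illustrative.
import itertools
import random

button_colors = ["green", "red"]
picture_shapes = ["square", "circle"]

combinations = list(itertools.product(button_colors, picture_shapes))
print(combinations)
# [('green', 'square'), ('green', 'circle'), ('red', 'square'), ('red', 'circle')]

# Each visitor sees one randomly assigned combination.
color, shape = random.choice(combinations)
print(f"Show this visitor: {color} button, {shape} picture")
```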
Here’s a visualization of an MVT test looking at two different elements, button color and picture shape, across different combinations:
A word of caution: since there are more variations, MVT tests require higher traffic and take longer to run. Their results can also be more complex to decode accurately.
So, it’s recommended you don’t run MVT tests unless you have adequate traffic and/or you’re a highly skilled tester with robust data analysis capabilities.
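To get a rough feel for those traffic demands, here's a back-of-the-envelope sketch; the per-combination sample size and daily traffic figures are assumptions you'd replace with your own numbers.

```python
# Back-of-the-envelope MVT run-time estimate. All figures are hypothetical.
combinations = 4             # e.g., 2 button colors x 2 picture shapes
visitors_per_combo = 14_000  # per-combination sample size (see calculator above)
daily_traffic = 2_500        # visitors per day reaching the tested page

total_needed = combinations * visitors_per_combo
print(f"Total visitors needed: {total_needed:,}")                      # 56,000
print(f"Estimated run time: {total_needed / daily_traffic:.0f} days")  # ~22 days
```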
A redirect test is also called a split URL test.
That's because you're putting one test variant on one page, with its own URL, and testing it against another variant on a separate page, with its own URL.
While a redirect test is, in essence, an A/B test, the difference is that in an A/B test, versions A and B sit on the same URL and traffic is directed to one variant or the other.
In contrast, in a redirect test, versions A and B sit on different URLs.
A redirect test is typically used when you have an alternative page, or completely different design you’d like to test against the control.
For example, you might have a single-page form and want to test that form against a multi-step form.
To do so, you’d set up a redirect test and send traffic either to the original form (version A, the control) or the multi-step form (version B, the variant) to see which performs best.
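Conceptually, the mechanics look something like this minimal server-side sketch, written here with Flask. The URLs, cookie name, and 50/50 split are illustrative assumptions, not any particular testing platform's setup, and most testing tools handle the redirect logic for you.

```python
# Minimal server-side split URL (redirect) test sketch using Flask.
import random

from flask import Flask, make_response, redirect, request

app = Flask(__name__)

@app.route("/signup")
def signup():
    # Keep returning visitors in the variant they were first assigned to.
    variant = request.cookies.get("form_variant")
    if variant not in ("a", "b"):
        variant = random.choice(["a", "b"])  # 50/50 random assignment

    # Version A (control): single-page form. Version B (variant): multi-step form.
    target = "/signup-single" if variant == "a" else "/signup-multistep"
    response = make_response(redirect(target))
    response.set_cookie("form_variant", variant, max_age=30 * 24 * 3600)
    return response
```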
Here’s a visualization of a redirect test:
Here’s a good case study example of a multi-step form test, set up as a split URL test. This test looked at whether a multi-step or single-page registration form converted better. Which version do you think won? Take your best guess.
As a general rule of thumb, it’s best to run redirect tests when:
A personalization test is a bit different than a standard A/B test.
A personalization experiment allows you to specify changes for a specific group of visitors.
Unlike traditional experiments, it can run indefinitely and doesn’t have variants. So, it’s more of a change implemented on a website than an actual “test”.
An example of a personalization would be testing the effect of displaying a different hero image to desktop viewers and mobile viewers. Here’s a visualization of that concept:
In this example, you can see the desktop and mobile experiences are different. The mobile version shows a different shape, or image, than the desktop version.
The view is personalized depending on the device type the user is on.
Personalization experiments can be as simple as showing an image on desktop, but not mobile. Or they can be very complex.
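As a simple illustration of the device-based concept above, here's a minimal sketch that picks a hero image based on the visitor's User-Agent string. The image paths and the crude "Mobi" check are assumptions for illustration only; a production setup would rely on a proper device-detection library, responsive images, or your personalization tool.

```python
# Minimal device-based personalization sketch using Flask: serve a different
# hero image to mobile and desktop visitors.
from flask import Flask, render_template_string, request

app = Flask(__name__)

HERO_HTML = '<img src="{{ hero }}" alt="Hero image">'

@app.route("/")
def home():
    user_agent = request.headers.get("User-Agent", "")
    is_mobile = "Mobi" in user_agent  # rough heuristic, not exhaustive
    hero = "/static/hero-mobile.jpg" if is_mobile else "/static/hero-desktop.jpg"
    return render_template_string(HERO_HTML, hero=hero)
```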
Here’s a great real-life personalization case study showing a complex personalization test based on different user cohorts. Can you guess the right test?
With this primer, you're now ready to set up your test and begin testing!
Hope you found this article helpful.
Do you have any thoughts, comments, questions? Share them in the Comments section below.
Join the Best in Test awards ceremony. Submit your best tests and see who wins the testing awards.
One of the most debated testing topics is: how large does my sample size need to be to get trustworthy test results? Some argue samples of more than 120,000 visitors per variant are needed to begin to see trustworthy test results. Ishan Goel of VWO disagrees. What does he think is needed to get trustworthy test results? Listen to this webinar recording to find out.
To get users clicking your content, which format works best: buttons or links? A series of 8 real-life A/B tests suggests one format consistently outperforms the other. Can you guess which version wins? Check out the mini meta-analysis to find out.