By: Deborah O'Malley, M.Sc | Last updated: January 2023
It’s official!
Google Optimize, the free A/B testing platform, is shutting down.
According to an email statement released by Google, as of September 30, 2023, Google Optimize and Optimize 360 will no longer be available:
If you're an experimenter running tests on Google Optimize, this news could spell trouble -- especially if you're heavily invested in the platform.
What should you do now?
Here are the top 3 next steps I recommend you take. Now.
If you're a testing agency, or an organization that has run loads of tests on Google Optimize, you may have a large testing repository housed within Optimize.
Now is the time to take those test results and start documenting them in a place where you'll be able to access them in the future.
You may be thinking, why would I need old tests?
The reason is you're going to want to be able to look back on tests you've run, see what worked, what you've already tried, and show clients or management already-vetted ideas.
Documenting your old tests may take a lot of time, but it will be one of the most valuable things you can do, long-term.
To begin the documentation process, Google recommends you export your Optimize data as a CSV file. They provide more info here.
But I recommend you take it a step further and capture screenshots of the test images, plus all your test data.
Yes, it'll take some time, but it'll be worth it in the end!
Because, trust me, there's nothing worse than having all the tests you want access to just gone!
To make this job easiest for you, I've created a free test repository template you can use. Simply download and fill in the data relevant to your tests.
You can access the free template here. It looks like this:
Note: if you don't have time to fill in the testing template yourself, Fiverr and Upwork are great places to hire someone to do this task cheaply.
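If you'd rather script part of the documentation step, here's a minimal sketch of how an exported Optimize CSV could be summarized into repository rows. The column names (`experiment_name`, `variant`, `sessions`, `conversions`) are assumptions for illustration -- your real export will almost certainly use different headers, so adjust accordingly.

```python
import csv
import io

# Hypothetical export format -- the real Optimize CSV columns will differ,
# so rename these fields to match your actual export before using this.
SAMPLE_EXPORT = """\
experiment_name,variant,sessions,conversions
Homepage hero test,Original,10000,210
Homepage hero test,Variant 1,10000,245
Checkout CTA test,Original,8000,400
Checkout CTA test,Variant 1,8000,388
"""

def build_repository(csv_text):
    """Summarize each experiment into one repository row:
    baseline rate, best variant rate, and relative lift."""
    experiments = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        name = row["experiment_name"]
        rate = int(row["conversions"]) / int(row["sessions"])
        experiments.setdefault(name, {})[row["variant"]] = rate

    repository = []
    for name, variants in experiments.items():
        baseline = variants.pop("Original")
        best_variant, best_rate = max(variants.items(), key=lambda kv: kv[1])
        lift = (best_rate - baseline) / baseline * 100
        repository.append({
            "test": name,
            "baseline_rate_pct": round(baseline * 100, 2),
            "best_variant": best_variant,
            "best_rate_pct": round(best_rate * 100, 2),
            "relative_lift_pct": round(lift, 2),
        })
    return repository

if __name__ == "__main__":
    for row in build_repository(SAMPLE_EXPORT):
        print(row)
```

Pair each generated row with your screenshots and notes in the template, and you have a searchable record long after Optimize itself disappears.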
Right now, it remains in question if and when Google will provide a viable alternative to Optimize.
Officially, Google has said they're "committed" to bringing effective solutions to customers:
But what that *really* means is unclear.
As Johann Van Tonder eloquently put it in this LinkedIn thread, that statement is light on detail and heavy on vague BS:
Analytics experimentation experts like Shawn David say any new Optimize alternative will have to be structured radically differently to work. The current Optimize format is cookie-based, but anything new will be session-based, so it won't work the same way:
However, Data Analytics Architect Juliana Jackson of Media.Monks suggests Google's statement means you'll be able to make Google Analytics 4 (GA4) your hub to measure and analyze test results, but you'll need to find a new testing platform altogether. See this LinkedIn thread:
At this point, nobody really knows for certain.
We can only speculate and wait till the almighty Google tells all.
But waiting too long is a dangerous move. You risk getting left far behind.
So, it’s highly recommended you start exploring alternative testing platform options now.
As optimizer Rich Page outlined in this helpful LinkedIn post, both Convert and VWO are currently offering new plans to help you migrate from and replace Google Optimize:
And, if you're willing to go paid, there are certainly plenty of options out there for you.
Although it's not an exhaustive list, in this post, Speero CEO Ben Labay suggests several A/B testing vendors to consider, including Optimizely, Kameleoon, AB Tasty, Adobe Target, and SiteSpect, to name just a few:
The best testing platform for you depends on your unique testing needs.
Most platforms are priced on traffic, so the more traffic you have, the more you’re likely to pay.
But, beyond cost, there are factors to consider like ease of migration, features and functionality, and trustworthiness of the data collection methods.
In this post, Ben suggests you should make your platform choice by evaluating how it assigns variables and metrics:
Some experimenters are delighting at the downfall of Google Optimize.
They've claimed that only amateurs used the free testing platform, often incorrectly, and that, according to stats guru Georgi Georgiev, they didn't get accurate results anyway:
While some have argued that Google Optimize's market share skews amateur, with only junior testing teams reliant on the free platform, Convert has compelling evidence showing that 61% of experimenters have at the very least signed up for an Optimize account -- even if it's not active:
And, according to this post, Optimize users include big-name brands like Swarovski and Gartner:
That said, not every experimentation team using Optimize is a big-name brand. Many are, most certainly, smaller.
If you’re one of them, now is the time to be evaluating if A/B testing is truly for you.
As I've shared before, truly trustworthy test results are hard to obtain. Properly powered A/B tests need huge sample sizes, typically 120,000+ visitors per variant.
If you’re in the 88% of organizations that don’t have anywhere near this type of traffic, you might want to consider other experimentation methods, including user experience testing, consumer surveys, exit polls or customer interviews.
None of these methods requires any A/B testing software whatsoever.
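To see where figures like 120,000+ visitors per variant come from, here's a sketch of the standard normal-approximation sample-size formula for a two-proportion test. The 2% baseline and 8% relative lift below are illustrative assumptions, not figures from any particular study:

```python
from math import ceil, sqrt
from statistics import NormalDist

def sample_size_per_variant(baseline, relative_lift, alpha=0.05, power=0.8):
    """Visitors needed per variant for a two-sided two-proportion z-test,
    using the standard normal-approximation formula."""
    p1 = baseline
    p2 = baseline * (1 + relative_lift)
    p_bar = (p1 + p2) / 2
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided critical value
    z_beta = NormalDist().inv_cdf(power)
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p2 - p1) ** 2)

# A 2% baseline conversion rate with an 8% relative lift already demands
# roughly 125,000 visitors per variant at 95% confidence / 80% power.
print(sample_size_per_variant(0.02, 0.08))
```

The smaller the lift you want to detect, the more dramatically the required sample grows, which is exactly why low-traffic sites struggle to run trustworthy A/B tests.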
I also recommend engaging in what I’ve coined “B/A” testing.
B/A testing is the opposite of A/B testing.
In B/A testing, you take the Baseline conversion rate, form data-driven hypotheses, implement changes, and Assess the conversion rate over a reasonable period of time, based on your sales cycle.
If the conversion rate is flat or down, your assessment shows that tweaks need to be made.
You then go back to the drawing board, form data-driven hypotheses on what you think is lagging, and change those aspects.
Then, again, assess, comparing the Before to the After.
Baseline/assess. Before/after. Baseline/assess. Before/after. B/A. B/A.
You continue this iterative process until you see conversion rates increase.
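The assess step of this loop can be sketched in a few lines. The function name and the 5% decision threshold below are my own illustrative choices -- in practice you'd set the threshold from your sales cycle and seasonality:

```python
def assess(baseline_rate, new_rate, min_lift_pct=5.0):
    """Compare the Before (baseline) conversion rate to the After
    (assessed) rate and decide whether to keep iterating.
    min_lift_pct is an arbitrary threshold -- set it from your own
    sales cycle and traffic patterns, not from this sketch."""
    lift_pct = (new_rate - baseline_rate) / baseline_rate * 100
    if lift_pct >= min_lift_pct:
        verdict = "keep the change"
    else:
        verdict = "back to the drawing board"
    return round(lift_pct, 2), verdict

# e.g. a hypothetical 2.0% baseline assessed at 2.5% after the change:
print(assess(0.020, 0.025))
```

Each pass through the loop either locks in a change or sends you back to hypothesize again, which is the Baseline/Assess rhythm described above.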
This solution is particularly good for lower traffic sites that don’t have the volume to run trustworthy tests, but still want to experiment and optimize conversions.
The downside is that all traffic is exposed to the “variant”, the change(s) you’re implementing on the site. But the upside is that, by monitoring the conversion rate directly, you can see whether sales are increasing. Or not.
There’s no more guesswork or relying on data you can’t fully trust. Just results.
Here’s a real-life client example of B/A testing in action:
As you can see, over the period measured, the eCommerce conversion rate went from 2.32% to 3.14%, a solid 35.48% relative increase for the company -- which translated into many thousands of additional dollars over the sales period. And counting.
And this example isn't isolated. I've used this method for many happy clients and have seen conversion rates and sales figures increase 30%+ in just a couple of months.
So, B/A testing may be a viable alternative when A/B testing just doesn’t make sense.
If you’re interested in learning more about how B/A testing could help you, get in touch for a free conversion audit.
These results show A/B testing isn't the only way to optimize and increase conversion rates.
The testing landscape is quickly changing. Experimenters need to be on their toes, and ready to spring into action.
In the immediate term, experimenters should document their existing Optimize test results, keep an eye out for Google's official alternative, explore replacement testing platforms, and evaluate whether A/B testing still fits their traffic levels.
While the sun may be setting on Google Optimize, the sun also rises, providing innovative experimenters the chance to experiment with new optimization opportunities.
Hope you found this article helpful!
Do you have any thoughts, comments, questions?
Share them in the Comments section below.