2. About Me
● About 13 years of experience
across digital marketing, analytics
and experimentation
● I presented here on “Building a
Sustainable Experimentation
Strategy” in 2019
● Parent, pet lover, nature lover,
traveler
Melanie Bowles
Head of Industry, Infotrust
3. What We’ll Cover:
1. General Trends in Experimentation
2. Experimentation Platforms
3. AI Enablement
4. Closing Thoughts
5. Fewer Free Tools
● Google Optimize was sunset in
September of 2023
● Other platforms that previously
had a free/freemium tier moved
to paid only or more restrictive
pricing
● This made A/B testing less accessible
for smaller orgs, pushing them
toward paid solutions or requiring
more technical resources to build
and maintain in-house
experimentation frameworks
6. Impact of Fewer Free Tools
● Could the lack of approachable free tools be
affecting the prevalence of A/B testing and
experimentation?
● To Jason’s point, no clear winner is emerging
when it comes to a replacement for Google
Optimize
● Yes, companies could be using server-side
tools, but we’re actually seeing this trend in
industry as well…
7. All-In-One Platforms
● Companies are reducing budgets and
seeking to reduce redundancy in
technology platforms
● Many martech platforms in other
categories are adding experimentation
to their features and roadmaps
● With more all-in-one platforms being
used, analysts are expected to be
all-in-one experts
Adjacent categories adding experimentation: CDP, product analytics, CRM
8. Privacy Regulation Impacts
● More users are declining cookies and/or
declining to be tracked altogether
● Cookies stored for shorter periods of time
mean returning users may be bucketed
incorrectly into different test variations, skewing
test data
● Experimentation platforms can be difficult to
integrate with consent management platforms (CMPs)
● Some organizations are declining any type of
experimentation due to compliance risks
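The mis-bucketing problem above can be sketched in a few lines: if variant assignment is random and lives only in a cookie, a returning user whose cookie expired gets re-randomized and may land in the other variant. Deterministically hashing a stable identifier (e.g., a logged-in user ID, where one exists) keeps assignment consistent. All names here are illustrative, not any specific platform’s API.

```python
import hashlib
import random

def cookie_based_variant(cookie_value=None):
    # Cookie gone -> fresh random assignment, possibly a different variant
    # than the one this user saw last week.
    return cookie_value or random.choice(["A", "B"])

def id_based_variant(user_id: str, test_name: str) -> str:
    # Stable hash of (test, user) -> the same user always gets the same
    # variant, with or without a cookie.
    digest = int(hashlib.sha256(f"{test_name}:{user_id}".encode()).hexdigest(), 16)
    return "A" if digest % 2 == 0 else "B"

# Deterministic: repeated calls agree for the same user and test.
print(id_based_variant("user-42", "homepage_test") ==
      id_based_variant("user-42", "homepage_test"))  # True
```

Note that this only mitigates the skew for identifiable users; fully anonymous traffic with expired cookies still re-randomizes.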
9. Incrementality &
Feature Flagging
Incrementality Testing: Helps companies
understand if their marketing actions are
truly making a difference or if customers
would have bought anyway. It does this by
comparing a group that sees the ads to a
group that doesn’t.
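The exposed-vs-holdout comparison described above reduces to simple arithmetic; a minimal sketch, with hypothetical group sizes and conversion counts:

```python
def incremental_lift(exposed_conv, exposed_n, holdout_conv, holdout_n):
    """Return (incremental conversion rate, relative lift vs. holdout)."""
    exposed_rate = exposed_conv / exposed_n   # saw the ads
    holdout_rate = holdout_conv / holdout_n   # withheld from the ads
    incremental = exposed_rate - holdout_rate
    relative = incremental / holdout_rate if holdout_rate else float("inf")
    return incremental, relative

# Hypothetical results: 6.0% conversion with ads vs. 5.0% without.
inc, rel = incremental_lift(exposed_conv=600, exposed_n=10_000,
                            holdout_conv=500, holdout_n=10_000)
print(f"Incremental rate: {inc:.3f} ({rel:.0%} relative lift)")
```

The holdout rate is the “would have bought anyway” baseline; only the difference is credited to the marketing action.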
Feature Flagging: Allows companies to
turn new features on or off for different
users/cohorts without changing the code.
It’s like a light switch for software updates,
helping teams test features safely before
rolling them out to everyone.
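The “light switch” idea can be sketched as a tiny flag store with percentage rollouts; flag names and rollout rules here are hypothetical:

```python
import hashlib

# Percent of users each feature is enabled for (hypothetical flags).
FLAGS = {"new_checkout": 25, "dark_mode": 100}

def is_enabled(flag: str, user_id: str) -> bool:
    rollout = FLAGS.get(flag, 0)  # unknown flags default to off
    # Stable hash -> bucket 0-99; the same user always lands in the
    # same bucket, so the rollout can grow without reshuffling users.
    bucket = int(hashlib.md5(f"{flag}:{user_id}".encode()).hexdigest(), 16) % 100
    return bucket < rollout

print(is_enabled("dark_mode", "user-42"))  # 100% rollout -> True
```

Raising a flag from 25 to 100 (or dropping it to 0 during an incident) changes behavior for users without deploying any code.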
10. Incrementality vs. A/B Testing
Aspect | Incrementality Testing | A/B Testing
Purpose | Measures the true impact of a marketing action | Compares performance between two variations to optimize effectiveness
Key Questions | "What would have happened if we didn't run this campaign?" | "Which variation performs better?"
Use Case | Determining if an ad campaign drives additional revenue beyond organic behavior | Optimizing ad creatives, website layouts, emails, etc.
Methodology | Uses control groups that are completely withheld from exposure | Both groups receive a variation, and the best performer is chosen
11. Feature Flagging vs. A/B Testing
Aspect | Feature Flagging | A/B Testing
Purpose | Controls feature releases and enables gradual rollouts | Compares performance between two variations to optimize effectiveness
Key Questions | "Should this feature be enabled for this user or group?" | "Which variation performs better?"
Use Case | Gradual rollouts, canary releases, and risk mitigation for new features | Optimizing ad creatives, website layouts, emails, etc.
Methodology | Uses flags/toggles in the code to turn features on or off for different users | Splits traffic between different versions and analyzes performance
14. All-In-One Platforms
These platforms integrate experiment design and deployment along with other
forms of data collection, activation, and analysis in a single ecosystem. This can
enable faster insights and data-driven decision-making.
15. Free/Freemium Options
Some platforms offer free trials or limited-access versions of their tool. The
following platforms appear to offer a truly free version.
16. Factors to Consider When Choosing a Platform
● Ease of use – Easy to set up and use, with a user-friendly interface
● Integration – Integrates with the organization’s existing systems and tools
● Support – Technical support, documentation, and tutorials
● Cost – Cost-effective and flexible pricing plans
● Reporting – Analysis and reporting features meet the needs of the organization
● Privacy – Compliance with any regulatory requirements that apply
18. Gaining Efficiency with AI
Automated Test Analysis – AI can quickly
analyze experiment results, identifying
statistically significant trends and reducing
the time spent on manual data crunching.
Intelligent Experiment Prioritization –
Machine learning models can help
prioritize tests by predicting potential
impact, reducing wasted effort on
low-value experiments.
Data-Driven CTA Optimization – AI can
analyze past test results and user
engagement patterns to suggest
high-performing CTA wording, design, and
placement for future experiments.
AI-Powered Copy Generation – AI can
generate and optimize test variations for
headlines, product descriptions, and CTAs
based on engagement data, streamlining
content creation.
Automated Image Generation & Selection
– AI can create or suggest high-performing
images tailored to different audience
segments, improving test efficiency and
conversion rates.
AI-Enhanced Audience Targeting – AI can
analyze historical data to identify key
audience segments and recommend
tailored test variations based on past
behavior and preferences.
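The first item above, automated test analysis, is the most mechanical of these: flagging statistically significant results. One common approach (not specific to any AI product) is a two-proportion z-test; a minimal sketch with hypothetical counts:

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)       # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical A/B result: 4.8% vs. 5.6% conversion on 10k users each.
z, p = two_proportion_z(conv_a=480, n_a=10_000, conv_b=560, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}, significant at 5%: {p < 0.05}")
```

An automated analysis layer would run this (or a Bayesian equivalent) across all live experiments and surface only the significant ones for human review.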
19. Persona Driven - AI Powered
Gap Analysis
1. Collect Your Inputs
a. Persona for the landing page (don’t have one? Use AI)
b. Full-page screenshot of the landing page
2. Enter prompt to understand how your page may perform for your persona
a. Ex: You are a conversion optimization expert skilled in evaluating pages for
their ability to both inform and persuade. Create a list showing the ways
the page copy does and does not meet the information needs of the
persona. Provide a list of suggested changes to the attached landing page
that would make the page more helpful and compelling to the attached
persona.
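Step 2 is mostly prompt assembly; a sketch of composing the prompt from your inputs (the function name and persona text are placeholders, and how you attach the screenshot depends on the AI tool you use):

```python
def build_gap_analysis_prompt(persona_text: str) -> str:
    # Combines the expert framing from the example prompt above with the
    # persona collected in step 1; the screenshot is attached separately.
    return (
        "You are a conversion optimization expert skilled in evaluating "
        "pages for their ability to both inform and persuade. Create a "
        "list showing the ways the page copy does and does not meet the "
        "information needs of the persona below. Then provide a list of "
        "suggested changes to the attached landing page that would make "
        "the page more helpful and compelling to this persona.\n\n"
        f"Persona:\n{persona_text}"
    )

prompt = build_gap_analysis_prompt(
    "Busy marketing director comparing analytics vendors on a tight timeline."
)
print("Persona:" in prompt)  # True
```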
25. How to move forward with
experimentation in 2025
● The experimentation landscape has shifted with fewer free tools available since
Google Optimize's sunset
○ Consider all-in-one platforms that integrate experimentation with your
existing martech stack to maximize efficiency and budget as needed
● Incrementality testing and feature flagging are complementary approaches to
traditional A/B testing that sometimes are easier to get buy-in for
● When selecting a testing platform, consider ease of use, integration capabilities,
support, cost, reporting features, and privacy compliance
● Use AI to enhance experimentation through automated analysis, test
prioritization, content generation, and audience targeting
26. Resources
● 25 of the Best A/B Testing Tools for 2025
● Four A/B Testing Platforms to Consider When Planning for the Sunset
of Google Optimize
● The Value of Web Experimentation: Optimizely and Google Analytics
● Building a Sustainable Experimentation Strategy
27. CREDITS: This presentation template was created by Slidesgo, and
includes icons by Flaticon, and infographics & images by Freepik
Thanks!
Any questions?
melanie@infotrust.com
infotrust.com