opticon2017
What Does a Successful Experimentation
Program Look Like?
Beth Foster
Principal Product Planner,
Experimentation, Microsoft
opticon2017
“Sunsets”
- Deception Pass
opticon2017
“Something new”
- Steamboat Rock
opticon2017
“Hideouts”
- Birch Bay
opticon2017
“Beach combing”
- Bayview
opticon2017
“The view”
- Fort Casey
Opticon 2017 Advanced Program Management
opticon2017
Why do you experiment?
Test my new
icon.
Tell me which
video carousel
works best.
Find out if mobile
customers like our
new menus!
https://flic.kr/p/6rKxaH
opticon2017
1. Improve the customer
experience
2. Optimize conversion
events
3. Validate new features
4. Build an experimentation
culture
Program goals
opticon2017
Framework for Program Success
Assess
opportunities
Prioritize
backlog
Measure
program
Prioritize backlog
• Business strategy
• Customer impact
• Opportunity
• Level of effort to implement
https://flic.kr/p/7VFBK2
opticon2017
Prioritize backlog
Use an objective, consistent model to score experiment ideas from your teams.
Business strategy – Does this align to the core initiatives?
• Key business priority
• Committed engineering work
Customer impact – Will this impact customers broadly?
• Scope of update worldwide
• Potential to impact multiple categories and platforms
• Direct response to customer pain point
Opportunity – Is the impact to the business high?
• Estimated impact to revenue or conversion activity
• Impact to overall category
Effort – How much time does it take to build or plan?
• Amount of estimated development time
opticon2017
Prioritize backlog
1. Test ideas submitted
2. Cross-group team approves concepts
3. Experimentation team scores test
4. Prioritization model score added to request
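To make the scoring concrete, here is a minimal sketch of how a prioritization model like this might be implemented. The four categories mirror the slides above, but the 1–5 scales, the equal weighting, and the example ideas are illustrative assumptions, not Microsoft's actual rubric.

```python
# Illustrative prioritization scorer. The four categories mirror the
# slide above; the 1-5 scales and value/effort formula are assumptions.
from dataclasses import dataclass

@dataclass
class ExperimentIdea:
    name: str
    business_strategy: int  # 1-5: alignment with core initiatives
    customer_impact: int    # 1-5: breadth of customers affected
    opportunity: int        # 1-5: estimated revenue/conversion upside
    effort: int             # 1-5: development time (higher = more work)

def priority_score(idea: ExperimentIdea) -> float:
    # Value divided by cost: high-strategy, high-impact, high-opportunity
    # ideas that are cheap to build float to the top of the backlog.
    value = idea.business_strategy + idea.customer_impact + idea.opportunity
    return value / idea.effort

backlog = [
    ExperimentIdea("New hero icon", 2, 3, 2, 1),
    ExperimentIdea("Video carousel layout", 4, 4, 3, 3),
    ExperimentIdea("Mobile menu redesign", 5, 5, 4, 5),
]
for idea in sorted(backlog, key=priority_score, reverse=True):
    print(f"{idea.name}: {priority_score(idea):.2f}")
```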
Measure progress
# of tests launched
% of traffic tested
% of tests deployed globally
Test win rate
Revenue impact
opticon2017
Measure progress
Track key program metrics to monitor month-over-month business progress.
Metric | Description | Monthly Target | Actual | YTD
# of tests launched | Measures test velocity | 5 | 4 | 12
% of traffic tested | Tracks expected traffic volume tested | 1M MUV | 0.5M | 4M
% of tests globally deployed | % of campaigns tested outside US | 50% | 50% | 40%
% of INTL tests | % of total volume | 60% | 50% | 55%
Test win rate | Specific to optimization tests | 30% | 40% | 35%
Revenue impact | Estimated annual impact to revenue | $5M | $6M | $12M
Example scorecard
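A small sketch of how a scorecard like this could be tracked programmatically. The metric names come from the example above; the data layout and the on-track check are assumptions.

```python
# Hypothetical monthly scorecard: compare each program metric against
# its target and flag misses. Rows are taken from the example above.
scorecard = [
    # (metric, monthly target, actual, ytd)
    ("# of tests launched", 5, 4, 12),
    ("Test win rate (%)", 30, 40, 35),
    ("Revenue impact ($M)", 5, 6, 12),
]

for metric, target, actual, ytd in scorecard:
    status = "on track" if actual >= target else "behind"
    print(f"{metric:24} target={target:<5} actual={actual:<5} ytd={ytd:<5} {status}")
```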
Assess
opportunities
Define potential impact
Set targets
Reevaluate quarterly
opticon2017
Assess Opportunity
[Funnel diagram: Home, Explore (Marketing CLEs), Download (Details pages), Buy (PDP pages), How-to (Help pages)]
1. Identify conversion by page or page groups
2. Calculate estimated growth
3. Measure overall estimated impact
4. Set targets
Use these to define realistic targets that move a program from what's been done to what's possible.
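A hedged sketch of the four steps above. The page groups echo the funnel diagram, but the traffic, conversion rates, achievable lifts, and order value are all made-up numbers for illustration.

```python
# Illustrative opportunity sizing following the four steps above.
pages = {
    # page group: (monthly visitors, conversion rate, assumed achievable lift)
    "Home":       (2_000_000, 0.020, 0.05),
    "PDP pages":  (  800_000, 0.045, 0.10),
    "Help pages": (  300_000, 0.010, 0.03),
}
value_per_conversion = 40.0  # assumed average order value

total = 0.0
for name, (visitors, cvr, lift) in pages.items():
    added = visitors * cvr * lift * value_per_conversion  # step 2: est. growth
    total += added
    print(f"{name}: +${added:,.0f}/month")
print(f"Overall estimated impact: ${total:,.0f}/month")  # step 3
# Step 4: set targets as a fraction of this ceiling, e.g. 25% in year one.
```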
opticon2017
Framework for Program Success
Assess
opportunities
Prioritize
backlog
Measure
program
“Our goals can only be reached
through a vehicle of a plan”
- Pablo Picasso
opticon2017
What Does a Successful Experimentation
Program Look Like?
Claire Vo
Sr. Director of Product,
Optimizely
opticon2017
Building Successful Programs with
Optimizely Program Management
Opticon 2017 Advanced Program Management
opticon2017
Framework for Program Success
Assess
opportunities
Prioritize
backlog
Measure
program
opticon2017
PRIORITIZE BACKLOG
Capture Ideas
Always start with a
hypothesis – give
teams examples of
what a good one
looks like.
Give structure to your testing backlog by categorizing ideas by site / page. Naming conventions can help keep things organized and easy to report on (Beth is an expert!)
opticon2017
PRIORITIZE BACKLOG
Scoring
Potential
of the experiment
to positively
impact your goal
Impact
to the business if the
experiment does win
Level of Effort
to implement the
experiment in
Optimizely
Love
Strategic importance,
executive support,
biased opinions
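One plausible way to fold the four dimensions above into a single backlog rank, shown as a sketch. The exact formula (love as a half-weight bonus, effort as the divisor) is an assumption, not Optimizely's actual scoring.

```python
# Hypothetical combination of the four scoring dimensions above.
def backlog_rank(potential: int, impact: int, effort: int, love: int) -> float:
    # Reward potential and impact, give "love" a smaller say, and
    # penalize high-effort ideas by dividing value by cost.
    return (potential + impact + 0.5 * love) / effort

ideas = {
    "Simplify checkout": (5, 5, 3, 2),
    "New nav menu":      (3, 4, 2, 5),
}
for name, dims in sorted(ideas.items(), key=lambda kv: backlog_rank(*kv[1]),
                         reverse=True):
    print(name, round(backlog_rank(*dims), 2))
```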
opticon2017
PRIORITIZE BACKLOG
Selecting & Accepting Ideas
Active status lets
everyone know
you’ve selected this
for experimentation
and are beginning
work
Backlog status lets
your team know it’s
still under review by
the central group
opticon2017
Democratize
(good) ideation
• Train teams on how to write a
hypothesis
• Create a process (like
Microsoft’s) for submission,
scoring, and acceptance
• Give constructive feedback
and let teammates share in
successes
opticon2017
Framework for Program Success
Assess
opportunities
Prioritize
backlog
Measure
program
opticon2017
MEASURE PROGRAM
Velocity & Win Rate
Testing Velocity lets
you measure your
team's operational
performance
Win Rate helps teams
measure the quality of
their hypotheses and the
impact to the business
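As a sketch, both metrics fall straight out of a simple experiment log; the record format here is an illustrative assumption.

```python
# Computing velocity and win rate from a minimal experiment log.
from datetime import date

experiments = [
    # (launch date, concluded?, won?)
    (date(2017, 1, 10), True, True),
    (date(2017, 2, 3),  True, False),
    (date(2017, 2, 20), True, True),
    (date(2017, 3, 15), False, False),  # still running
]

months = 3
launched = len(experiments)
concluded = [e for e in experiments if e[1]]
wins = sum(1 for e in concluded if e[2])

velocity = launched / months      # tests launched per month
win_rate = wins / len(concluded)  # wins among concluded tests
print(f"velocity={velocity:.1f} tests/month, win rate={win_rate:.0%}")
```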
opticon2017
Using Program
Reporting Right
Don’t just look at these
reports!
• Set monthly, quarterly, and
annual goals for performance
• Measure and report on a
monthly basis
• Reassess on an annual basis
• Integrate the learnings into
the operations of your
experimentation program
opticon2017
ASSESS OPPORTUNITIES
Improving Operational Performance
Filter your program
reporting to uncover
interesting insights
opticon2017
Insights on
your program
Some things you may learn:
• Some teams are more
effective than others in
moving an idea to experiment
quickly
• Win rates may be significantly
lower on certain pages of your
site
• Testing velocity may slow
around holidays
• Certain experiment strategies
work more effectively
opticon2017
You can’t improve if
you don’t measure.
opticon2017
Framework for Program Success
Assess
opportunities
Prioritize
backlog
Measure
program
opticon2017
ASSESS OPPORTUNITIES
Expected Value
$1 per roll of the die
Every time you roll a 3
I pay you $5
If you want to play this game, come see me after this presentation!
Win Rate = 1/6 ≈ 17%
Value of Win = $5
Expected Value of Roll ≈ 83 cents
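The same arithmetic, spelled out:

```python
# The die-roll game from the slide: you pay $1 per roll and win $5 on a 3,
# so each roll is worth less than it costs.
p_win = 1 / 6               # win rate: one face out of six (~17%)
payout = 5.00               # value of a win
ev = p_win * payout         # expected value per roll
print(f"EV per roll: ${ev:.2f}")  # $0.83 -- less than the $1 stake
```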
opticon2017
ASSESS OPPORTUNITIES
Expected Value of an Experiment
Win Rate
x
Average Lift of Winning Test
x
Revenue Value of Test
opticon2017
ASSESS OPPORTUNITIES
Expected Value of an Experiment
Every time you win, you get 5% increase
on your revenue, which is $10,000,000
and you win 10% of the time
IF IT COSTS LESS THAN $50K TO RUN A TEST, RUN IT!
Win Rate = 10%
Value of Win = 5% x $10,000,000 = $500,000
Expected Value of Test = $50,000
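The same calculation in code form:

```python
# The worked example from the slide: expected value of a single test.
win_rate = 0.10                       # you win 10% of the time
avg_lift = 0.05                       # each win lifts revenue 5%
revenue = 10_000_000                  # revenue the test can affect
value_of_win = avg_lift * revenue     # $500,000
ev_of_test = win_rate * value_of_win  # $50,000
print(f"Run the test if it costs less than ${ev_of_test:,.0f}")
```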
opticon2017
ASSESS OPPORTUNITIES
Expected Value of an Experiment
Expected Value of Experiment
x
Annual Testing Velocity
opticon2017
ASSESS OPPORTUNITIES
Annual Expected Value of Program
 | High Complexity, Low Velocity | Low Complexity, High Velocity
Type of tests | Take longer, bigger changes | Easier, smaller changes
Win rate | 20% | 10%
Avg lift | 30% | 10%
Expected value of test | $600,000 | $100,000
Tests / year | 25 | 100
Annual expected value of testing program | $15M | $10M
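A sketch comparing the two portfolios, assuming both act on the same $10M revenue base as the previous slide's example (that assumption is what makes the per-test and annual figures in the table consistent):

```python
# Annual expected value for the two portfolios above, assuming both
# target the $10M revenue base from the previous slide.
revenue = 10_000_000

portfolios = {
    "High complexity / low velocity": (0.20, 0.30, 25),
    "Low complexity / high velocity": (0.10, 0.10, 100),
}
for name, (win_rate, avg_lift, tests_per_year) in portfolios.items():
    ev_test = win_rate * avg_lift * revenue
    annual = ev_test * tests_per_year
    print(f"{name}: ${ev_test:,.0f}/test -> ${annual:,.0f}/year")
# -> $600,000/test -> $15,000,000/year, and $100,000/test -> $10,000,000/year
```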
opticon2017
Assessing
Opportunity
As you begin to understand
your program:
• Track how win rate and
velocity impact your overall
program value
• Think about ROI and where
you can maximize
• Take a “portfolio” approach to
your experimentation program
opticon2017
Framework for Program Success
Assess
opportunities
Prioritize
backlog
Measure
program
opticon2017
Q&A
opticon2017
THANK YOU
opticon2017
Editor's Notes
  • #8: Because I know why my family and friends want to travel, I know what to look for – I know how to measure success. So, let’s begin.
  • #9: Customer experience improvements, conversion optimization, validation of new features? Measuring program success is not about counting how many tests you ran. You first need to start with why your program exists and what function it serves. At Microsoft, we have teams across the company who test. Why they test differs, but at the foundation, all of the teams are working to deliver customer insights and direction to different business groups across the company. So, before we begin, you first have to decide why your program exists.
  • #10: https://www.flickr.com/photos/jsloss/3573190903/ And if you aren't careful to figure out why you are testing, you can quickly find yourself at a music festival – where everyone wants something different from you!
  • #11: All of the program goals tie back to delivering customer insights. This is not
  • #13: https://flic.kr/p/7VFBK2
  • #14: A prioritization model creates transparency and focus within testing programs. Ours is a point-based model where we score each category (which has a few questions/criteria) and then total the points. The points map to priority levels, which we add to our intake forms, creating transparency for the overall score.
  • #15: A prioritization model creates transparency and focus within testing programs. Ours is a point-based model where we score each category (which has a few questions/criteria) and then total the points. The points map to priority levels, which we add to our intake forms, creating transparency for the overall score.
  • #16: https://www.amazon.com/Camping-Campers-Journal-Natural-Brown/dp/1892033011/ref=sr_1_1?ie=UTF8&qid=1507688717&sr=8-1&keywords=camping+log+book
  • #17: When measuring, my advice is to keep it simple. Look at how you are tracking monthly, and set targets. Other ideas to track: bust rate, implementation rate, time to launch, % of inconclusive tests.
  • #18: https://www.amazon.com/Camping-Campers-Journal-Natural-Brown/dp/1892033011/ref=sr_1_1?ie=UTF8&qid=1507688717&sr=8-1&keywords=camping+log+book
  • #21: Thank you – and here's to structuring, building, and reporting on your successful experimentation program.
  • #35: Customer experience improvements, conversion optimization, validation of new features? Measuring program success is not about counting how many tests you ran. You first need to start with why your program exists and what function it serves. At Microsoft, we have teams across the company who test. Why they test differs, but at the foundation, all of the teams are working to deliver customer insights and direction to different business groups across the company. So, before we begin, you first have to decide why your program exists. >>>> Our goals can only be reached through a vehicle of a plan; Pablo Picasso