Metrics 3.0
2017 Mile High Agile - Denver
Presented by Andy Cleff
Co-Author Ralph van Roosmalen
Based on the work of Jurgen Appelo
Metrics 3.0 • Andy Cleff • @JustSitThere
Overview
Who’s Here Today?
Twelve Rules for Measurement
Group Brainstorming
Group Presentations
Keeping in Touch
12 Rules for Measurement
1: Measure for a purpose
2: Shrink the unknown
3: Seek to improve
4: Delight all stakeholders
5: Distrust all numbers
6: Set imprecise targets
7: Own your metrics
8: Don’t connect metrics to rewards
9: Promote values and transparency
10: Visualize and humanize
11: Measure early and often
12: Try something else
MANAGEMENT 3.0 • change and innovation practices
12 Rules for Measurement
When selecting metrics, ask:
Rule 1: Measure for a purpose
You must always understand why you are measuring.
The metric is not a goal in itself. Never forget that it’s
just a means to an end. It all starts with why.
Rule 2: Shrink the unknown
A metric is just a surrogate for what you really want to
know. Don’t jump to conclusions. Always try to reduce
the size of what is still unknown.
Rule 3: Seek to improve
Don’t only measure things that will make you look good.
There is plenty of data around, but you must focus on
what enables you to do better work.
Rule 4: Delight all stakeholders
Your work depends on others, and others depend on
you. Never optimize for just one stakeholder. Instead,
measure your work from multiple perspectives.
Rule 5: Distrust all numbers
Observers usually influence their own metrics, and they
suffer from all kinds of biases. Have a healthy, skeptical
attitude towards any reported numbers.
Rule 6: Set imprecise targets
When people have targets, they have an inclination to
focus on the targets instead of the real purpose. Avoid
this tendency by keeping your targets vague.
Rule 7: Own your metrics
Everyone is responsible for their own work, and metrics
help us improve that work. Therefore, everyone should
be responsible for their own metrics.
Rule 8: Don’t connect metrics to rewards
Rewards often kill intrinsic motivation and lead
to dysfunctional behaviors in organizations. Don’t
incentivize people to do work they should like doing.
Rule 9: Promote values and transparency
Human beings are smart and able to game any system.
To prevent gaming, be transparent about values,
intentions, and the metrics everyone is using.
Rule 10: Visualize and humanize
Numbers tend to dehumanize everything. Replace digits
with colors and pictures, and keep the measurements
close to where the actual work is done.
Rule 11: Measure early and often
Most people don’t measure often enough. Measure
sooner and faster to prevent risks and problems from
growing too big for you to handle.
Rule 12: Try something else
It’s rarely a good idea to do the same things over and
over. The environment changes all the time. The same
should apply to how and what you measure.
› Why “this metric?” – Why does it matter? Who does it matter to?
› What insights might we gain from it?
› What is expected to change? What is the expected variability and
consistency – are we looking for trends or absolute values?
› How might it be gamed, misused (or abused)?
› What are some of the trade-offs / costs of improvement?
Working to improve one thing may temporarily reduce
another (e.g., predictability may increase at the expense
of throughput)
› How often would we like to “take a data point”?
› How long will we run the experiment? (What is the half-life?)
› How will we know when we’re “done” with this metric
(it has served its purpose, and it’s time to retire it and
consider another…)?
› Are we adding to the dashboard or replacing / retiring
something else?
› How will we make our measurements transparent – to
promote knowledge sharing, collaboration with other
teams and trust with our sponsors?
› Is this metric a leading or lagging indicator?
Rule 1 - Measure
for a purpose
You must always understand why
you are measuring. The metric is
not a goal in itself. Never forget that
it’s just a means to an end. It all
starts with why.
“If all we have are opinions, let’s go with mine.”
Jim Barksdale
“…Analysis without numbers is only an opinion.”
Akin’s Law #1
Reasons why we do measure
Track revenues to drive resource & people allocation
Monitor alignment with mission / vision / goal
Observe quality of product / process
Judge customer happiness / employee satisfaction
Make decisions that are not based on gut feelings
Reasons why we don’t measure
Measurements might be used as weapons
Lame metrics that would not be useful or actionable
Implementing measures would cost too much time / effort
Some things might just be immeasurable
Rule 2 - Shrink
the unknown
A metric is just a surrogate for what
you really want to know. Don’t jump
to conclusions. Always try to reduce
the size of what is still unknown.
Cynefin Framework
Cynefin Framework: Ordered Domains
Cynefin Framework: Here Things Get Interesting…
Rule 3 - Seek to
improve
Don’t only measure things that will
make you look and feel good. There
is plenty of data around, but you
must focus on what enables you to
do better work.
Actionable Metrics
“A good metric changes the way you behave. This is by far
the most important criterion for a metric: what will you do
differently based on changes in the metric?”
Lean Analytics, Alistair Croll and Benjamin Yoskovitz
Vanity Metrics
“When we rely on vanity metrics, a funny thing happens.
When the numbers go up, I've personally witnessed everyone
in the company naturally attributing that rise to whatever they
were working on at the time. That's not too bad, except for this
correlate: when the numbers go down, we invariably blame
someone else.”
Eric Ries
Rule 4 - Delight
all stakeholders
Your work depends on others, and
others depend on you. Never
optimize for just one stakeholder.
Instead, measure your work from
multiple perspectives.
It is impossible to please everyone, but you would like to
know who is pleased at certain moments and who is not.
Rule 5 - Distrust
all numbers
Observers usually influence their
own metrics, and they suffer from
all kinds of biases. Have a healthy,
skeptical attitude towards any
reported numbers.
Story: Hawthorne Works, Chicago, ca. 1925.
Rule 6 - Set
imprecise targets
When people have targets, they
have an inclination to focus on the
targets instead of the real purpose.
Avoid this tendency by keeping
your targets vague.
"When a measure becomes a target, it ceases to be a
good measure."
Goodhart's law
Rule 7 - Own
your metrics
Everyone is responsible for their
own work, and metrics help us
improve that work. Therefore,
everyone should be responsible for
their own metrics.
Important Considerations
How many metrics should a team use?
Which ones to use?
How long should they use the ones selected?
Anti-Patterns
Looking at a single metric (Hawthorne)
Striving for ever-increasing values instead of striving for
consistency and stability (Goodhart)
Correlation is not necessarily causation (Milton Friedman’s
Thermostat)
Comparing metrics across teams that are very different
Rule 8 - Don’t
connect metrics
to rewards
Rewards often kill intrinsic
motivation and lead to dysfunctional
behaviors in organizations. Don’t
incentivize people to do work they
should like doing.
DILBERT © 1999 Scott Adams.
Used By permission of ANDREWS MCMEEL SYNDICATION. All rights reserved.
Rule 9 - Promote
values and
transparency
Human beings are smart and able
to game any system. To prevent
gaming, be transparent about
values, intentions, and the metrics
everyone is using.
Values, Intention, Purpose
Do we get paid a sustainable value for what we do?
Are we great at what we do in the eyes of our customers?
Do our employees / team mates love what we do and the
way we do it?
Will what we do make the world a better place for our
grandchildren?
Rule 10 -
Visualize and
humanize
Numbers tend to dehumanize
everything. Replace digits with
colors and pictures, and keep the
measurements close to where the
actual work is done.
Rule 11 -
Measure early
and often
Most people don’t measure often
enough. Measure sooner and faster
to prevent risks and problems from
growing too big for you to handle.
“The only way to win is to learn faster than anyone
else.”
Eric Ries
“What you want to do as a company is maximize the
number of experiments you can do per unit of time.”
Jeff Bezos
Rule 12 - Try
something else
It’s rarely a good idea to do the
same things over and over. The
environment changes all the time.
The same should apply to how and
what you measure.
Limited Lifespan of all Metrics
That which is measured will improve, at a cost.
When a measure becomes a target, it ceases to be a good
measure.
Correlation is not causation, but it sure is a hint.
Use multiple viewpoints - technical as well as human - to get
a holistic perspective.
The Twelve Rules for Metrics
1: Measure for a purpose
2: Shrink the unknown
3: Seek to improve
4: Delight all stakeholders
5: Distrust all numbers
6: Set imprecise targets
7: Own your metrics
8: Don’t connect metrics to rewards
9: Promote values and transparency
10: Visualize and humanize
11: Measure early and often
12: Try something else
Collective
Brainstorming
Five Categories of Metrics
1. Process Health Metrics - assess day-to-day delivery team activities and
evaluate process changes.
2. Release Metrics - focus on identifying impediments to continuous delivery.
3. Product Development Metrics - help measure alignment of product
features to user needs.
4. Technical / Code Metrics - help determine quality of implementation and
architecture.
5. People/Team - reveal issues that impact a team’s sustainable pace and
level of engagement.
Shout out to Jason Tice @theagilefactor
How to Choose?
Review the Options…
1. Why “this metric?” – Why does it
matter? Who does it matter to?
2. What insights might we gain from it?
3. What is expected to change? Are we
looking for variability, consistency,
trends or absolute values?
4. How might it be gamed, misused (or
abused)?
5. What are some trade offs / costs of
improvement?
6. How often would we like to “take a
data point”?
7. How long will we run the experiment?
8. How will we know when we’re “done”
with this metric?
9. How will we make our measurements
transparent?
10. Is this metric a leading or lagging
indicator?
Debrief / Presentations
Let’s keep the conversation going…
Andy Cleff
andycleff@icloud.com
andycleff.com
linkedin.com/in/andycleff
@JustSitThere
coalition.agileuprising.com
Metrics 3.0 - Meaningful Measurements for Agile Software Development
40+ Metrics for Software Teams
People/Team: Human Elements
This group of metrics reveals issues that impact a team’s
sustainable pace and level of engagement.
›› Team Happiness / Morale / Mood
›› Gallup Q12
›› Team / Manager / Organization NPS
›› Percentage of time w/o interruptions
›› Trust between Leadership and Team
›› Learning Log
›› Team Tenure
›› Phone-a-Friend Stats
›› Whole Team Contribution
›› Transparency (access to data, access to customers,
sharing of learning, successes and failures)
›› Comparative Agility: Team mapping against the 12 agile
principles (Geoff Watts’ “Scrum Mastery”)
Process Health Metrics
This category assesses day-to-day delivery team activities
and evaluates process changes.
›› Cumulative Flow Diagrams
›› Control Charts
›› Cycle Time
›› Percent Complete and Accurate
›› Time Blocked per Work Item
›› Story/Epic Lead Time
›› Successful Iteration Completion
›› Escaped Defect Resolution Time
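None of these flow metrics require special tooling. As a rough, hypothetical sketch (the item IDs and dates below are invented, not from the original deck), cycle time can be computed from the start/finish timestamps most trackers already record:

```python
from datetime import datetime
from statistics import median

# Hypothetical work items: (id, started, finished) as ISO dates.
# Real data would come from the team's own tracker.
items = [
    ("PROJ-101", "2017-05-01", "2017-05-04"),
    ("PROJ-102", "2017-05-02", "2017-05-09"),
    ("PROJ-103", "2017-05-03", "2017-05-05"),
    ("PROJ-104", "2017-05-04", "2017-05-12"),
]

def cycle_time_days(start: str, finish: str) -> int:
    """Calendar days from when work on an item started to when it finished."""
    fmt = "%Y-%m-%d"
    return (datetime.strptime(finish, fmt) - datetime.strptime(start, fmt)).days

times = sorted(cycle_time_days(s, f) for _, s, f in items)
print("cycle times (days):", times)         # [2, 3, 7, 8]
print("median cycle time:", median(times))  # 5.0
```

A median (or a high percentile) lends itself to the imprecise, probabilistic targets Rule 6 recommends, where a single point target would invite gaming.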
Release Metrics
This group focuses on identifying impediments to
continuous delivery.
›› Escaped Defects
›› Release Success Rate
›› Release Time
›› Time Since Last Release
›› Cost Per Release
›› Release Net Promoter Score
›› Release Adoption / Install Rate
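Several of these need nothing more than a list of release dates. A minimal sketch, with hypothetical dates standing in for what tag history or a changelog would provide:

```python
from datetime import date

# Hypothetical release dates; real ones would come from tag history.
releases = [date(2017, 2, 3), date(2017, 3, 1), date(2017, 3, 20), date(2017, 4, 18)]
today = date(2017, 5, 26)  # a fixed "now" so the example is reproducible

# Frequency of release: gaps between consecutive releases.
gaps = [(b - a).days for a, b in zip(releases, releases[1:])]
print("days between releases:", gaps)  # [26, 19, 29]

# Time since last release: a growing number here is an early warning (Rule 11).
print("days since last release:", (today - releases[-1]).days)  # 38
```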
Product Development Metrics
These help measure alignment of product features to user
needs.
›› Customer / Business Value Delivered
›› Risk Burndown
›› Value Stream Mapping
›› Sales Velocity
›› Product Forecast
›› Product Net Promoter Score (NPS)
›› User Analytics
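Net Promoter Score, for one, follows a simple published formula: the percentage of promoters (scores 9-10) minus the percentage of detractors (scores 0-6) on the standard 0-10 scale. A small sketch with invented survey responses:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    if not scores:
        raise ValueError("need at least one response")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Invented responses to "how likely are you to recommend us?" (0-10).
responses = [10, 9, 9, 8, 7, 6, 10, 3, 9, 8]
print(nps(responses))  # 30
```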
Technical/Code Metrics
The following help determine quality of implementation
and architecture.
›› Test Coverage
›› Unit/Regression Test Coverage
›› Build Time
›› Defect Density
›› Code Churn
›› Code Ownership
›› Code Complexity
›› Coding Standards Adherence
›› Crash Rate
›› Build Breaks
›› Technical Debt
›› Ratio of Fixing Work vs Feature Work
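The last of these is straightforward to compute if the team tags completed work items by type. A hypothetical sketch (the tags and counts are invented):

```python
from collections import Counter

# Hypothetical completed work items, tagged by type in the team's tracker.
completed = ["feature", "fix", "feature", "fix", "fix",
             "feature", "feature", "fix", "feature", "feature"]

counts = Counter(completed)
fix_ratio = counts["fix"] / counts["feature"]
print(f"fixing vs feature work: {fix_ratio:.2f}")  # 0.67
# Watch the trend rather than the absolute number: a rising ratio
# hints at accumulating technical debt (Rules 2 and 6).
```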
Andy Cleff
Andy is an experienced and pragmatic agile practitioner who
takes teams beyond getting agile to embracing agile. His chief
weapons are well-asked questions, insightful retrospectives and
an ability to withstand awkward silences. And if all else fails, beer.
andycleff@icloud.com
andycleff.com
linkedin.com/in/andycleff
@JustSitThere
agileuprising.com
The following listing is intended as a starting point for conversation and discussion. Choose one or two that make sense for
your team / organization and add them to your current dashboard. Then rinse and repeat over time.
People/Team: Human Elements
This group of metrics reveals issues that impact a team’s sustainable pace and level of engagement.
Good Ideas / Actionable Metrics
›› Team / Manager / Organization NPS
›› Gallup Q12
Bad / Nasty / Vanity Metrics
›› Time to hire
›› Lines of code per individual
Process Health Metrics
This category assesses day-to-day delivery team activities and evaluates process changes.
Good Ideas / Actionable Metrics
›› Cumulative Flow Diagrams
›› Lead & Cycle Time
Bad / Nasty / Vanity Metrics
›› Velocity
›› Story points per developer
Release Metrics
This group focuses on identifying impediments to continuous delivery.
Good Ideas / Actionable Metrics
›› Time to first release
›› Frequency of release
Bad / Nasty / Vanity Metrics
›› Lines of code pushed
›› Story points per release
Product Development Metrics
These help measure alignment of product features to user needs.
Good Ideas / Actionable Metrics
›› Product Net Promoter Score (NPS)
›› Risk Burndown
Bad / Nasty / Vanity Metrics
›› Number of new features
›› Customer satisfaction survey conducted by sales agent
Technical/Code Metrics
These help determine quality of implementation and architecture.
Good Ideas / Actionable Metrics
›› Ratio of Fixing Work vs Feature Work
›› Test coverage
Bad / Nasty / Vanity Metrics
›› Lines of code
›› Causal Analysis

More Related Content

PPTX
Strengths Finder Presentation
PDF
PPT
Agile estimation and planning peter saddington
PPTX
Top 10 Agile Metrics
PPTX
deep work.pptx
PPTX
Management 3.0 - Empower Teams
PPT
Leader as a Coach
PDF
Strengths-Based Leadership Handout
Strengths Finder Presentation
Agile estimation and planning peter saddington
Top 10 Agile Metrics
deep work.pptx
Management 3.0 - Empower Teams
Leader as a Coach
Strengths-Based Leadership Handout

What's hot (20)

PPTX
Pass the pennies - Lean game simulation
PDF
Agile Transformation v1.27
PDF
Beyond Budgeting - Bjarte Bogsnes
PPTX
Scrum Master Interview Questions SlideShare
PDF
Influence without Authority
PDF
Practical Guide to Scrum
PPTX
OKR - Measure What Matters
PDF
Metrics for Agile Teams Forget Velocity: 42 Other Things to Ponder
PPTX
Introduction to Recipes for Agile Governance in the Enterprise (RAGE)
PDF
PDF
Agile Performance Metrics
 
PPTX
Product Backlog Management
PPTX
Agile Leaders and Agile Managers
PDF
Having a Difficult Conversation
PDF
Business Agility
PDF
Atelier Story Map
PPTX
How to estimate in scrum
PDF
Agile metrics
PPTX
The Guide to Objectives and Key Results (OKRs)
PDF
What makes a leader truly great?
Pass the pennies - Lean game simulation
Agile Transformation v1.27
Beyond Budgeting - Bjarte Bogsnes
Scrum Master Interview Questions SlideShare
Influence without Authority
Practical Guide to Scrum
OKR - Measure What Matters
Metrics for Agile Teams Forget Velocity: 42 Other Things to Ponder
Introduction to Recipes for Agile Governance in the Enterprise (RAGE)
Agile Performance Metrics
 
Product Backlog Management
Agile Leaders and Agile Managers
Having a Difficult Conversation
Business Agility
Atelier Story Map
How to estimate in scrum
Agile metrics
The Guide to Objectives and Key Results (OKRs)
What makes a leader truly great?
Ad

Similar to Metrics 3.0 - Meaningful Measurements for Agile Software Development (20)

PPTX
Using Metrics to Define Success
PPTX
Pin the tail on the metric v00 75 min version
PDF
Measuring Up - Agile Team Metrics - DevUp 2022.pdf
PDF
Measuring Up - PMI Agile Conference 2022.pdf
PDF
ANIn Kochi July 2024 | Overcoming Anti-Patterns and Pitfalls of Metrics by Ha...
PDF
VS Liv MSHQ 2022 - Measuring Up! How To Choose Agile Metrics - Dugan.pdf
PPT
Measuring the value of KM
PPTX
Three baseline metrics & what they can tell you about your team.
PDF
AgileCamp Silicon Valley 2015: Unlock Excellence with Agile Metrics
PDF
Supply chain performance measurement trends
PDF
One Metric to Rule Them All: Effectively Measure Your Teams Without Subjugati...
PDF
VS Live 2021 VST09 agile team metrics Fast Focus - angela dugan
PDF
Growing a Culture of Data-Driven Continuous Improvement
PPTX
Mastering Goals & Metrics
PPTX
Performance-Management (KPIs)
PDF
Larry Maccherone: "Probabilistic Decision Making"
PPTX
Pin the tail on the metric v01 2016 oct
PDF
2016 metrics-as-culture
PPTX
Agile Metrics...That Matter
PPTX
Agile Reporting for PM Brain
Using Metrics to Define Success
Pin the tail on the metric v00 75 min version
Measuring Up - Agile Team Metrics - DevUp 2022.pdf
Measuring Up - PMI Agile Conference 2022.pdf
ANIn Kochi July 2024 | Overcoming Anti-Patterns and Pitfalls of Metrics by Ha...
VS Liv MSHQ 2022 - Measuring Up! How To Choose Agile Metrics - Dugan.pdf
Measuring the value of KM
Three baseline metrics & what they can tell you about your team.
AgileCamp Silicon Valley 2015: Unlock Excellence with Agile Metrics
Supply chain performance measurement trends
One Metric to Rule Them All: Effectively Measure Your Teams Without Subjugati...
VS Live 2021 VST09 agile team metrics Fast Focus - angela dugan
Growing a Culture of Data-Driven Continuous Improvement
Mastering Goals & Metrics
Performance-Management (KPIs)
Larry Maccherone: "Probabilistic Decision Making"
Pin the tail on the metric v01 2016 oct
2016 metrics-as-culture
Agile Metrics...That Matter
Agile Reporting for PM Brain
Ad

Recently uploaded (20)

PDF
Odoo Companies in India – Driving Business Transformation.pdf
PPTX
VVF-Customer-Presentation2025-Ver1.9.pptx
PPTX
ai tools demonstartion for schools and inter college
PDF
Wondershare Filmora 15 Crack With Activation Key [2025
PDF
Adobe Illustrator 28.6 Crack My Vision of Vector Design
PPTX
CHAPTER 12 - CYBER SECURITY AND FUTURE SKILLS (1) (1).pptx
PDF
Why TechBuilder is the Future of Pickup and Delivery App Development (1).pdf
PPTX
ManageIQ - Sprint 268 Review - Slide Deck
PDF
System and Network Administration Chapter 2
PDF
Design an Analysis of Algorithms I-SECS-1021-03
PPTX
ISO 45001 Occupational Health and Safety Management System
PDF
Design an Analysis of Algorithms II-SECS-1021-03
PPTX
Transform Your Business with a Software ERP System
PDF
Understanding Forklifts - TECH EHS Solution
PPTX
L1 - Introduction to python Backend.pptx
PDF
SAP S4 Hana Brochure 3 (PTS SYSTEMS AND SOLUTIONS)
PPTX
Lecture 3: Operating Systems Introduction to Computer Hardware Systems
PDF
top salesforce developer skills in 2025.pdf
PPTX
Online Work Permit System for Fast Permit Processing
PDF
medical staffing services at VALiNTRY
Odoo Companies in India – Driving Business Transformation.pdf
VVF-Customer-Presentation2025-Ver1.9.pptx
ai tools demonstartion for schools and inter college
Wondershare Filmora 15 Crack With Activation Key [2025
Adobe Illustrator 28.6 Crack My Vision of Vector Design
CHAPTER 12 - CYBER SECURITY AND FUTURE SKILLS (1) (1).pptx
Why TechBuilder is the Future of Pickup and Delivery App Development (1).pdf
ManageIQ - Sprint 268 Review - Slide Deck
System and Network Administration Chapter 2
Design an Analysis of Algorithms I-SECS-1021-03
ISO 45001 Occupational Health and Safety Management System
Design an Analysis of Algorithms II-SECS-1021-03
Transform Your Business with a Software ERP System
Understanding Forklifts - TECH EHS Solution
L1 - Introduction to python Backend.pptx
SAP S4 Hana Brochure 3 (PTS SYSTEMS AND SOLUTIONS)
Lecture 3: Operating Systems Introduction to Computer Hardware Systems
top salesforce developer skills in 2025.pdf
Online Work Permit System for Fast Permit Processing
medical staffing services at VALiNTRY

Metrics 3.0 - Meaningful Measurements for Agile Software Development

  • 1. Metrics 3.0 2017 Mile High Agile - Denver Presented by Andy Cleff Co-Author Ralph van Rosmalen Based on work of Jurgen Appelo
  • 2. Metrics 3.0 • Andy Cleff • @JustSitThere Overview Who’s Here Today? Twelve Rules 
 for Measurement Group Brainstorming Group Presentations Keeping in Touch
  • 3. Metrics 3.0 • Andy Cleff • @JustSitThere 12 Rules for Measurement 1: Measure for a purpose 2: Shrink the unknown 3. Seek to improve 4: Delight all stakeholders 5: Distrust all numbers 6: Set imprecise targets 7: Own your metrics 8: Don’t connect metrics to rewards 9: Promote values and transparency 10: Visualize and humanize 11: Measure early and often 12: Try something else change and innovation practices MANAGEMENT 3.0 12 Rules for Measurement When selecting metrics, ask: Rule 1: Measure for a purpose You must always understand why you are measuring. The metric is not a goal in itself. Never forget that it’s just a means to an end. It all starts with why. Rule 2: Shrink the unknown A metric is just a surrogate for what you really want to know. Don’t jump to conclusions. Always try to reduce the size of what is still unknown. Rule 3. Seek to improve Don’t only measure things that will make you look good. There is plenty of data around, but you must focus on what enables you to do better work. Rule 4: Delight all stakeholders Your work depends on others, and others depend on you. Never optimize for just one stakeholder. Instead, measure your work from multiple perspectives. Rule 5: Distrust all numbers Observers usually influence their own metrics, and they suffer from all kinds of biases. Have a healthy, skeptical attitude towards any reported numbers. Rule 6: Set imprecise targets When people have targets, they have an inclination to focus on the targets instead of the real purpose. Avoid this tendency by keeping your targets vague. Rule 7: Own your metrics Everyone is responsible for their own work, and metrics help us improve that work. Therefore, everyone should be responsible for their own metrics. Rule 8: Don’t connect metrics to rewards Rewards often kill intrinsic motivation and lead to dysfunctional behaviors in organizations. Don’t incentivize people to do work they should like doing. 
Rule 9: Promote values and transparency Human beings are smart and able to game any system. To prevent gaming, be transparent about values, intentions, and the metrics everyone is using. Rule 10: Visualize and humanize Numbers tend to dehumanize everything. Replace digits with colors and pictures, and keep the measurements close to where the actual work is done. Rule 11: Measure early and often Most people don’t measure often enough. Measure sooner and faster to prevent risks and problems from growing too big for you to handle. Rule 12: Try something else It’s rarely a good idea to do the same things over and over. The environment changes all the time. The same should apply to how and what you measure. › Why “this metric?” – Why does it matter? › What insights might we gain from it? › What is expected to change? What is expected variability, consistency – are we looking for trends or absolute values? › How might it be gamed, misused (or abused)? › What are some for trade offs / costs of improvement - Working to improve one thing may temporarily reduce another (e.g., predictability may increase at the expense of throughput) › How often would we like to “take a data point”? › How long will we run the experiment? (What is the half-life?) › How when we know when we’re “done” with this metric (and it’s served its purpose, and it’s time to retire it and consider another…)? › How will we make our measurements transparent – to promote knowledge sharing, collaboration with other teams and trust with our sponsors? › Is this metric a leading or lagging indicator?
  • 4. Rule 1- Measure for a purpose You must always understand what you are measuring. The metric is not a goal in itself. Never forget that it’s just a means to an end. It all starts with why.
  • 5. “If all we have are opinions, let’s go with mine.” Jim Barksdale “…Analysis without numbers is only an opinion.” Atkins Law #1
  • 6. Metrics 3.0 • Andy Cleff • @JustSitThere Reasons why we do measure To see revenues to drive resource & people allocation Monitor alignment with mission / vision / goal Observe quality of product / process Judge customer happiness / employee satisfaction To make decisions that are not based on gut feelings
  • 7. Metrics 3.0 • Andy Cleff • @JustSitThere Reasons why we don’t measure Measurements might be used as weapons Lame metrics that would not useful or actionable Implementing measures would cost too much time / effort Some things might just not be immeasurable
  • 8. Rule 2 - Shrink the unknown A metric is just a surrogate for what you really want to know. Don’t jump to conclusions. Always try to reduce the size of what is still unknown.
  • 9. Metrics 3.0 • Andy Cleff • @JustSitThere Cynefin Framework
  • 10. Metrics 3.0 • Andy Cleff • @JustSitThere Cynefin Framework Ordered Domains
  • 11. Metrics 3.0 • Andy Cleff • @JustSitThere Cynefin Framework Here Things 
 Get Interesting…
  • 12. Rule 3 - Seek to improve Don’t only measure things that will make you look and feel good. There is plenty of data around, but you must focus on what enables you to do better work.
  • 13. Metrics 3.0 • Andy Cleff • @JustSitThere Actionable Metrics “A good metric changes the way you behave. This is by far the most important criterion for a metric: what will you do differently based on changes in the metric?” Lean Analytics, Alistair Croll and Benjamin Yoskovitz
  • 14. Metrics 3.0 • Andy Cleff • @JustSitThere Vanity Metrics “When we rely on vanity metrics, a funny thing happens. When the numbers go up, I've personally witnessed everyone in the company naturally attributing that rise to whatever they were working on at the time. That's not too bad, except for this correlate: when the numbers go down, we invariably blame someone else” Eric Ries
  • 15. Rule 4 - Delight all stakeholders Your work depends on others, and others depend on you. Never optimize for just one stakeholder. Instead, measure your work from multiple perspectives.
  • 16. Metrics 3.0 • Andy Cleff • @JustSitThere It is impossible to please everyone, but you would like to know who is pleased at certain moments and who is not.
  • 17. Rule 5 - Distrust all numbers Observers usually influence their own metrics, and they suffer from all kinds of biases. Have a healthy, skeptical attitude towards any reported numbers.
  • 18. Metrics 3.0 • Andy Cleff • @JustSitThere
  • 20. Story
  • 21. Story
  • 22. Rule 6 - Set imprecise targets When people have targets, they have an inclination to focus on the targets instead of the real purpose. Avoid this tendency by keeping your targets vague.
  • 23. "When a measure becomes a target, it ceases to be a good measure." Goodhart's law
  • 24. Rule 7 - Own your metrics Everyone is responsible for their own work, and metrics help us improve that work. Therefore, everyone should be responsible for their own metrics.
  • 25. Metrics 3.0 • Andy Cleff • @JustSitThere Important Considerations How many metrics should a team use? Which ones to use? How long should they use the ones selected?
  • 26. Metrics 3.0 • Andy Cleff • @JustSitThere Anti-Patterns Looking at a single metric (Hawthorn) Striving for ever increasing values instead of striving for consistency and stability (Goodhart) Correlation is not necessarily causation (Milton Friedman’s Thermostat) Comparing metrics across teams that are very different
  • 27. Rule 8 - Don’t connect metrics to rewards Rewards often kill intrinsic motivation and lead to dysfunctional behaviors in organizations. Don’t incentivize people to do work they should like doing.
  • 28. Metrics 3.0 • Andy Cleff • @JustSitThere DILBERT © 1999 Scott Adams. Used By permission of ANDREWS MCMEEL SYNDICATION. All rights reserved.
  • 29. Rule 9 - Promote values and transparency Human beings are smart and able to game any system. To prevent gaming, be transparent about values, intentions, and the metrics everyone is using.
  • 30. Metrics 3.0 • Andy Cleff • @JustSitThere Values, Intention, Purpose Do we get paid a sustainable value for what we do? Are we great at what we do in the eyes of our customers? Do our employees / teammates love what we do and the way we do it? Will what we do make the world a better place for our grandchildren?
  • 31. Metrics 3.0 • Andy Cleff • @JustSitThere
  • 32. Rule 10 - Visualize and humanize Numbers tend to dehumanize everything. Replace digits with colors and pictures, and keep the measurements close to where the actual work is done.
  • 33. Story
  • 34. Story
  • 35. Story
  • 36. Story
  • 37. Story
  • 38. Rule 11 - Measure early and often Most people don’t measure often enough. Measure sooner and faster to prevent risks and problems from growing too big for you to handle.
  • 39. “The only way to win is to learn faster than anyone else” Eric Ries “What you want to do as a company is maximize the number of experiments you can do per unit of time.” Jeff Bezos
  • 40. Rule 12 - Try something else It’s rarely a good idea to do the same things over and over. The environment changes all the time. The same should apply to how and what you measure.
  • 41. Metrics 3.0 • Andy Cleff • @JustSitThere Limited Lifespan of all Metrics That which is measured will improve, at a cost. When a measure becomes a target, it ceases to be a good measure. Correlation is not causation, but it sure is a hint. Use multiple viewpoints - technical as well as human - to get a holistic perspective.
  • 42. Metrics 3.0 • Andy Cleff • @JustSitThere The Twelve Rules for Metrics 1: Measure for a purpose 2: Shrink the unknown 3: Seek to improve 4: Delight all stakeholders 5: Distrust all numbers 6: Set imprecise targets 7: Own your metrics 8: Don’t connect metrics to rewards 9: Promote values and transparency 10: Visualize and humanize 11: Measure early and often 12: Try something else
  • 44. Metrics 3.0 • Andy Cleff • @JustSitThere Five Categories of Metrics 1. Process Health Metrics - assess day-to-day delivery team activities and evaluate process changes. 2. Release Metrics - focus on identifying impediments to continuous delivery. 3. Product Development Metrics - help measure alignment of product features to user needs. 4. Technical / Code Metrics - help determine quality of implementation and architecture. 5. People/Team - reveal issues that impact a team’s sustainable pace and level of engagement. Shout out to Jason Tice @theagilefactor
  • 46. Metrics 3.0 • Andy Cleff • @JustSitThere Review the Options…
1. Why “this metric?” – Why does it matter? Who does it matter to?
2. What insights might we gain from it?
3. What is expected to change? Are we looking for variability, consistency, trends or absolute values?
4. How might it be gamed, misused (or abused)?
5. What are some trade-offs / costs of improvement?
6. How often would we like to “take a data point”?
7. How long will we run the experiment?
8. How will we know when we’re “done” with this metric?
9. How will we make our measurements transparent?
10. Is this metric a leading or lagging indicator?
  • 48. Let’s keep the conversation going… Andy Cleff andycleff@icloud.com andycleff.com linkedin.com/in/andycleff @JustSitThere coalition.agileuprising.com
  • 50. 40+ Metrics for Software Teams
The following listing is intended as a starting point for conversation and discussion. Choose one or two that make sense for your team / organization and add them to your current dashboard. Then rinse and repeat over time.
People/Team: Human Elements. This group of metrics reveals issues that impact a team’s sustainable pace and level of engagement. ›› Team Happiness / Morale / Mood ›› Gallup Q12 ›› Team / Manager / Organization NPS ›› Percentage of time w/o interruptions ›› Trust between Leadership and Team ›› Learning Log ›› Team Tenure ›› Phone-a-Friend Stats ›› Whole Team Contribution ›› Transparency (access to data, access to customers, sharing of learning, successes and failures) ›› Comparative Agility: Team mapping against the 12 agile principles (Geoff Watts’s “Scrum Mastery”)
Process Health Metrics. This category assesses day-to-day delivery team activities and evaluates process changes. ›› Cumulative Flow Diagrams ›› Control Charts ›› Cycle Time ›› Percent Complete and Accurate ›› Time Blocked per Work Item ›› Story/Epic Lead Time ›› Successful Iteration Completion ›› Escaped Defect Resolution Time
Release Metrics. This group directs focus on identifying impediments to continuous delivery. ›› Escaped Defects ›› Release Success Rate ›› Release Time ›› Time Since Last Release ›› Cost Per Release ›› Release Net Promoter Score ›› Release Adoption / Install Rate
Product Development Metrics. These help measure alignment of product features to user needs. ›› Customer / Business Value Delivered ›› Risk Burndown ›› Value Stream Mapping ›› Sales Velocity ›› Product Forecast ›› Product Net Promoter Score (NPS) ›› User Analytics
Technical/Code Metrics. The following help determine quality of implementation and architecture. ›› Test Coverage ›› Unit/Regression Test Coverage ›› Build Time ›› Defect Density ›› Code Churn ›› Code Ownership ›› Code Complexity ›› Coding Standards Adherence ›› Crash Rate ›› Build Breaks ›› Technical Debt ›› Ratio of Fixing Work vs Feature Work
Andy Cleff
Andy is an experienced and pragmatic agile practitioner who takes teams beyond getting agile to embracing agile. His chief weapons are well-asked questions, insightful retrospectives and an ability to withstand awkward silences. And if all else fails, beer. andycleff@icloud.com andycleff.com linkedin.com/in/andycleff @JustSitThere agileuprising.com
  • 51. change and innovation practices MANAGEMENT 3.0
12 Rules for Measurement
Rule 1: Measure for a purpose. You must always understand why you are measuring. The metric is not a goal in itself. Never forget that it’s just a means to an end. It all starts with why.
Rule 2: Shrink the unknown. A metric is just a surrogate for what you really want to know. Don’t jump to conclusions. Always try to reduce the size of what is still unknown.
Rule 3: Seek to improve. Don’t only measure things that will make you look good. There is plenty of data around, but you must focus on what enables you to do better work.
Rule 4: Delight all stakeholders. Your work depends on others, and others depend on you. Never optimize for just one stakeholder. Instead, measure your work from multiple perspectives.
Rule 5: Distrust all numbers. Observers usually influence their own metrics, and they suffer from all kinds of biases. Have a healthy, skeptical attitude towards any reported numbers.
Rule 6: Set imprecise targets. When people have targets, they have an inclination to focus on the targets instead of the real purpose. Avoid this tendency by keeping your targets vague.
Rule 7: Own your metrics. Everyone is responsible for their own work, and metrics help us improve that work. Therefore, everyone should be responsible for their own metrics.
Rule 8: Don’t connect metrics to rewards. Rewards often kill intrinsic motivation and lead to dysfunctional behaviors in organizations. Don’t incentivize people to do work they should like doing.
Rule 9: Promote values and transparency. Human beings are smart and able to game any system. To prevent gaming, be transparent about values, intentions, and the metrics everyone is using.
Rule 10: Visualize and humanize. Numbers tend to dehumanize everything. Replace digits with colors and pictures, and keep the measurements close to where the actual work is done.
Rule 11: Measure early and often. Most people don’t measure often enough. Measure sooner and faster to prevent risks and problems from growing too big for you to handle.
Rule 12: Try something else. It’s rarely a good idea to do the same things over and over. The environment changes all the time. The same should apply to how and what you measure.
When selecting metrics, ask:
›› Why “this metric?” – Why does it matter? Who does it matter to?
›› What insights might we gain from it?
›› What is expected to change? What is the expected variability and consistency – are we looking for trends or absolute values?
›› How might it be gamed, misused (or abused)?
›› What are some trade-offs / costs of improvement?
›› How often would we like to “take a data point”?
›› How long will we run the experiment? (What is the half-life?)
›› How will we know when we’re “done” with this metric?
›› Are we adding to the dashboard or replacing/retiring something else?
›› How will we make our measurements transparent – to promote knowledge sharing, collaboration with other teams and trust with our sponsors?
›› Is this metric a leading or lagging indicator?
  • 52. Metrics 3.0 • Andy Cleff • @JustSitThere • AndyCleff.com Good Ideas / Actionable Metrics ›› Team / Manager / Organization NPS ›› Gallup Q12 Bad / Nasty / Vanity Metrics ›› Time to hire ›› Lines of code per individual People/Team: Human Elements This group of metrics reveals issues that impact a team’s sustainable pace and level of engagement.
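To make NPS concrete: it collapses 0-10 survey responses into a single promoters-minus-detractors score. A minimal sketch, assuming standard NPS score bands (9-10 promoters, 0-6 detractors); the function name and sample scores are invented for illustration, not from the deck:

```python
# Net Promoter Score from 0-10 survey responses (illustrative sketch).
# Promoters score 9-10, detractors 0-6; passives (7-8) count only in the total.
def nps(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical team survey: 3 promoters, 2 passives, 2 detractors out of 7.
print(nps([10, 9, 9, 8, 7, 6, 3]))  # 14
```

The same calculation works whether the respondents are customers, team members rating their manager, or employees rating the organization.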
  • 53. Metrics 3.0 • Andy Cleff • @JustSitThere • AndyCleff.com Process Health Metrics Good Ideas / Actionable Metrics ›› Cumulative Flow Diagrams ›› Lead & Cycle Time Bad / Nasty / Vanity Metrics ›› Velocity ›› Story points per developer This category assesses day-to-day delivery team activities and evaluates process changes.
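Cycle time is simply finish minus start per work item, and percentiles are usually more honest than averages because a few outliers can distort the mean. A hedged sketch (the item data and function names are hypothetical, not from the deck):

```python
import math
from datetime import datetime

# Cycle time per completed work item: finished timestamp minus started timestamp.
def cycle_times_days(items):
    return sorted((done - start).days for start, done in items)

# Nearest-rank percentile on an already-sorted list.
def percentile(sorted_values, p):
    idx = max(0, math.ceil(p / 100 * len(sorted_values)) - 1)
    return sorted_values[idx]

# Hypothetical (started, done) pairs for four completed stories.
items = [
    (datetime(2017, 5, 1), datetime(2017, 5, 4)),
    (datetime(2017, 5, 2), datetime(2017, 5, 10)),
    (datetime(2017, 5, 3), datetime(2017, 5, 5)),
    (datetime(2017, 5, 6), datetime(2017, 5, 20)),
]
ct = cycle_times_days(items)   # [2, 3, 8, 14] days
print(percentile(ct, 85))      # 14: "85% of stories finish within 14 days"
```

A statement like "85% of stories finish within 14 days" supports forecasting conversations without turning the number into a target.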
  • 54. Metrics 3.0 • Andy Cleff • @JustSitThere • AndyCleff.com Release Metrics Good Ideas / Actionable Metrics ›› Time to first release ›› Frequency of release Bad / Nasty / Vanity Metrics ›› Lines of code pushed ›› Story points per release This group directs focus on identifying impediments to continuous delivery.
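Both release metrics named above fall out of the same release log: "time since last release" is today minus the newest date, and release frequency is the average gap between consecutive dates. A minimal sketch with hypothetical dates and function names:

```python
from datetime import date

# Two release metrics from one release log: days since the last release,
# and the average gap (in days) between consecutive releases.
def release_stats(release_dates, today):
    releases = sorted(release_dates)
    gaps = [(b - a).days for a, b in zip(releases, releases[1:])]
    return {
        "days_since_last": (today - releases[-1]).days,
        "avg_days_between": sum(gaps) / len(gaps),
    }

# Hypothetical release log for one product.
stats = release_stats(
    [date(2017, 3, 1), date(2017, 3, 15), date(2017, 4, 12)],
    today=date(2017, 5, 1),
)
print(stats)  # {'days_since_last': 19, 'avg_days_between': 21.0}
```

A growing "days since last release" is an early warning that an impediment to continuous delivery is building up.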
  • 55. Metrics 3.0 • Andy Cleff • @JustSitThere • AndyCleff.com Product Development Metrics Good Ideas / Actionable Metrics ›› Product Net Promoter Score (NPS) ›› Risk Burndown Bad / Nasty / Vanity Metrics ›› Number of new features ›› Customer satisfaction survey conducted by sales agent These help measure alignment of product features to user needs.
  • 56. Metrics 3.0 • Andy Cleff • @JustSitThere • AndyCleff.com Technical/Code Metrics Good Ideas / Actionable Metrics ›› Ratio of Fixing Work vs Feature Work ›› Test coverage Bad / Nasty / Vanity Metrics ›› Lines of code ›› Causal Analysis The following help determine quality of implementation and architecture.
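The fixing-vs-feature ratio above only needs work items tagged by type. A sketch under the assumption of a simple tagged log (the tags, log, and function name are hypothetical):

```python
# Ratio of fixing work to feature work from a tagged work log (illustrative).
# A rising ratio hints that quality problems are crowding out feature delivery.
def fix_to_feature_ratio(work_item_types):
    fixes = work_item_types.count("fix")
    features = work_item_types.count("feature")
    return fixes / features if features else float("inf")

# Hypothetical iteration log: 3 fixes against 4 features.
log = ["feature", "fix", "feature", "feature", "fix", "fix", "feature"]
print(fix_to_feature_ratio(log))  # 0.75
```

As with the other metrics, the trend over several iterations matters more than any single value.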