T12 Concurrent Session
4/8/2014  2:00 PM

“How Did I Miss That Bug? Managing Cognitive Bias in Testing”

Presented by:
Gerie Owen, Northeast Utilities, Inc
Peter Varhol, Telerik

Brought to you by:

340 Corporate Way, Suite 300, Orange Park, FL 32073
888‐268‐8770 ∙ 904‐278‐0524 ∙ sqeinfo@sqe.com ∙ www.sqe.com
Gerie Owen
Northeast Utilities, Inc
 
QA Consultant Gerie Owen specializes in developing and managing offshore test
teams. Gerie has implemented the offshore model—developing, training, and
mentoring new teams from their inception. She manages large, complex projects
involving multiple applications; coordinates test teams across multiple time zones;
and delivers high quality products. Gerie’s most successful project team wrote,
automated, and executed more than 80,000 test cases for two suites of web
applications, allowing only one defect to escape into production. In her everyday life,
Gerie enjoys mentoring new test leads and brings a cohesive team approach to
testing.
Peter Varhol
Telerik
 
A testing evangelist at Telerik, Peter Varhol is a writer and speaker on software
development and quality topics. Peter has authored dozens of articles on software
tools and techniques for building applications, and has given conference presentations
and webcasts on a variety of topics—user-centered design, integrating testing into
agile development, and building the right software in an era of changing
requirements. He has held key roles on engineering teams that have produced award-
winning, quality tools such as BoundsChecker and SoftICE. Peter’s past roles include
technology journalist, software product manager, software developer, and university
professor.
How Did I Miss That Bug?
Overcome Cognitive Bias in Testing
Peter Varhol
Peter.@petervarhol.com
Gerie Owen
Gerie.owen@gerieowen.com
Why are we talking about missing bugs?
• Have you ever missed a bug?
• Have you ever been asked how you missed a bug?
• Have you ever wondered how you missed a bug?
Consequences of Missed Bugs
• Possible Consequences of Missed Bugs:
o Negative Publicity
o Lost Sales
o Lost Customers
o Even Loss of Life
MISSED BUGS CAUSE MAYHEM
My Journey
The “HOW” is more important than the “WHY”
And now, I invite you to join me on the journey of
How Did I Miss that Bug?
How Do We Miss Bugs?
• Missed test cases
• Misunderstanding of requirements
• Misjudgment in risk-based testing
• Inattention
• Fatigue
• Burnout
• Multi-tasking
How Do We Test?
• What is Software Testing?
o Software testing is making judgments about the quality of the software under test.
o Involves:
• Objective comparisons of code to specifications, AND
• Subjective assessments regarding usability, functionality, etc.
What IS a Missed Bug?
An Error in Judgment!
To determine how testers miss bugs, we need to understand how humans make judgments, especially in complex situations.
How do we make judgments?
• Thinking, Fast and Slow – Daniel Kahneman
o System 1 thinking – fast, intuitive, and sometimes wrong
o System 2 thinking – slower, more deliberate, more accurate
System 1 vs. System 2 Thinking
• System 1 thinking keeps us functioning
o Fast decisions, usually right enough
o Gullible and biased
• System 2 makes deliberate, thoughtful decisions
o It is in charge of doubt and unbelieving
o But is often lazy
o Difficult to engage
How Do We Apply System 1 and System 2 Thinking?
• System 1 thinking:
o Is applied in our initial reactions to situations
o May employ heuristics or rules of thumb
• System 2 thinking:
o Is applied when we analyze a problem, for example when calculating the answer to a math problem
• System 1 and System 2 can be in conflict:
o This conflict can lead to biases in decision-making
The Lily Pad Question
In a lake, there is a patch of lily pads. Every day, the patch doubles in size.
If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake?
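A worked check of the arithmetic: the quick System 1 answer is usually 24 days, but the correct System 2 answer is 47 days. If the covered fraction on day $n$ is $C(n)$ and the patch doubles daily, then

$$C(48) = 2\,C(47) \;\Longrightarrow\; C(47) = \tfrac{1}{2}\,C(48),$$

so the lake is half covered exactly one day before it is fully covered.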
How do Biases Impact Testing?
• We maintain certain beliefs in testing practice
o Which may or may not be factually true
o Those biases can affect our testing results
o We may be predisposed to believe something that affects our work and our conclusions
• How do bias and error work?
o We may test the wrong things
o Not find errors, or find false errors
The Representative Bias
• Happens when we judge the likelihood of an occurrence in a particular situation by how closely the situation resembles similar situations.
• Testers may be influenced by this bias when designing data matrices, perhaps not testing data in all states or not testing enough types of data (a minimal sketch follows this list).
• Case Study: Completed Order bug
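A minimal, hypothetical sketch of the data-matrix idea (the order states, types, and the discount_applies function are illustrative inventions, not details from the talk): parameterizing over the full matrix makes every combination explicit, rather than only the cases that feel representative.

```python
# Hypothetical data matrix for an order-processing rule; names are illustrative only.
import pytest

ORDER_STATES = ["new", "in_progress", "completed", "cancelled", "returned"]
ORDER_TYPES = ["standard", "gift", "bulk", "international"]

def discount_applies(state, order_type):
    # Stand-in for the system under test: assume only completed orders earn a discount.
    return state == "completed"

@pytest.mark.parametrize("order_type", ORDER_TYPES)
@pytest.mark.parametrize("state", ORDER_STATES)
def test_discount_rule_across_full_matrix(state, order_type):
    # Every (state, type) pair is exercised, including the combinations a
    # representativeness-driven test plan tends to skip.
    expected = (state == "completed")
    assert discount_applies(state, order_type) == expected
```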
The Curse of Knowledge
• Happens when we are so knowledgeable about something that our ability to address it from a less informed, more neutral perspective is diminished.
• Occurs when testers develop so much domain knowledge that they fail to test from the perspective of a new user. Usability bugs are often missed due to this bias.
• Case Study: Date of Death Bug
The Congruence Bias
• The tendency of experimenters to plan and execute tests on just their own hypotheses without considering alternative hypotheses.
• This bias is often the root cause of missed negative test cases. Testers write test cases to validate that the functionality works according to the specifications and neglect to validate that the functionality doesn’t work in ways that it should not (a minimal sketch follows this list).
• Case Study: Your negative test case or boundary miss
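One hedged illustration (an invented example, not the presenters’ case study): a hypothetical age validator where the congruent tests only confirm the specified happy path, while the negative and boundary tests the bias tends to crowd out are written explicitly.

```python
# Hypothetical validator: the assumed spec says ages 18-120 inclusive are valid.
import pytest

def is_valid_age(age):
    # Stand-in for the system under test.
    return isinstance(age, int) and 18 <= age <= 120

# Congruent tests: confirm the functionality works as specified.
@pytest.mark.parametrize("age", [18, 35, 120])
def test_valid_ages_accepted(age):
    assert is_valid_age(age)

# Negative and boundary tests: confirm it does NOT work in ways it should not.
@pytest.mark.parametrize("age", [17, 121, -1, 0, None, "18"])
def test_invalid_ages_rejected(age):
    assert not is_valid_age(age)
```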
The Confirmation Bias
• The tendency to search for and interpret information in a way that confirms one’s initial perceptions.
• Testers’ initial perceptions of the quality of code, the quality of the requirements, and the capabilities of developers can impact the ways in which they test.
• Case Study: Ability to print more than once bug
The Anchoring Effect
• The tendency to become locked on and rely too heavily on one piece of information, and therefore exclude other ideas or evidence that contradicts the initial information.
• Software testers do this often when they validate code to specifications exclusively without considering ambiguities or errors in the requirements.
Inattentional Blindness
• Chabris and Simons conducted experiments on how focusing on one thing makes us blind to others
o Invisible gorilla on the basketball court
o Images on a lung x-ray
Inattentional Blindness
• A psychological lack of attention
• The tendency to miss obvious inconsistencies when focusing specifically on a particular task
• This happens in software testing when testers miss the blatantly obvious bugs
Why Do We Develop Biases?
• The Blind Spot Bias
o We evaluate our own decision-making process differently than we evaluate how others make decisions.
• West, Meserve, and Stanovich
Are you subject to Bias?
A Test Planning Exercise
This application is used to send a series of letters to homeowners and renters to inform them of required inspections and get them to book appointments.
The series differs based on whether the residence is single- or multi-family, and it ends with a certified letter.
The bug is that if the renter to whom the initial letter was sent moved, the new renter would get the next letter in the series, potentially a strongly worded certified letter, when the series should start again from the first letter.
This is a medium-severity defect but a high priority, as it negatively affects customer experience. (A hedged test sketch for this scenario follows.)
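A minimal sketch of a test that would catch this bug, under assumed names (Residence, send_next_letter, and the three-letter series are invented for illustration, not details from the exercise): when the occupant changes, the series should restart from the first letter.

```python
# Hypothetical model of the inspection-letter series; names are illustrative only.
from dataclasses import dataclass, field

LETTER_SERIES = ["initial", "reminder", "final_certified"]

@dataclass
class Residence:
    renter: str
    letters_sent: list = field(default_factory=list)

    def record_renter_change(self, new_renter):
        # Expected behavior: a new occupant restarts the letter series.
        self.renter = new_renter
        self.letters_sent = []

    def send_next_letter(self):
        letter = LETTER_SERIES[len(self.letters_sent)]
        self.letters_sent.append(letter)
        return letter

def test_series_restarts_when_renter_moves():
    home = Residence(renter="original renter")
    home.send_next_letter()                  # "initial" goes to the original renter
    home.record_renter_change("new renter")  # occupant changes before the next letter
    # The new renter must receive the first letter, not a strongly worded certified one.
    assert home.send_next_letter() == "initial"
```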
How Does This Apply To Testing?
• We must manage the way we think throughout the test process.
o As individual testers
o As test managers
o As a professional community
How Can Testers Manage Their Thought Processes?
• Use more System 1 thinking?
OR
• Use more System 2 thinking?
Test Methodology and System 2 Thinking
• Test methodology is the analytical framework of testing; it invokes our System 2 thinking and places the tester under cognitive load.
• The determination of whether the actual results match the expected results becomes an objective assessment.
How Do We Find Bugs?
Focus on System 1 thinking, intuition, and emotion
Focus On System 1 Thinking
• Heuristics used with Oracles (a minimal sketch follows this list)
• Recognize our emotions as indicators of potential bugs
• Exploratory Testing
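A minimal, hypothetical sketch of heuristic checks used as an oracle (the search_results function and the specific rules are invented for illustration, not from the talk): fallible rules of thumb that flag “this looks wrong” even without a complete specification.

```python
# Hypothetical heuristic oracle: rough rules of thumb that raise suspicion
# without claiming to be a complete specification of correct behavior.
def search_results(query):
    # Stand-in for the system under test; assume it returns ranked relevance scores.
    return [0.9, 0.7, 0.7, 0.2]

def heuristic_oracle_checks(query, results):
    problems = []
    if not results:
        problems.append("empty result set for a plausible query")
    if any(not (0.0 <= score <= 1.0) for score in results):
        problems.append("score outside the expected 0..1 range")
    if results != sorted(results, reverse=True):
        problems.append("results are not ranked from best to worst")
    return problems

if __name__ == "__main__":
    query = "inspection letters"
    issues = heuristic_oracle_checks(query, search_results(query))
    # Each finding is a System 1 nudge ("that looks odd") to be followed by
    # deliberate System 2 investigation.
    print(issues or "no heuristic concerns")
```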
How Should We Use Exploratory Testing?
• Unstructured
o Before beginning test case execution
• Minimizes preconceived notions about the application under test
o Oracle based
• Users’ perspectives
• Data flow
How Should We Use Exploratory Testing?
• Structured
o Use to create additional test cases
• May be done earlier, possibly as modules are developed
o Session-based
• Time-boxed charters
• Multiple testers
• Post-test review session
What Can Test Managers Do?
• Foster an environment in which the testers feel comfortable and empowered to use System 1 thinking.
o Plan for exploratory testing in the test schedule
o Encourage testers to take risks
o Reward for quality of bugs rather than quantity of test cases executed
What Can the QA Profession Do? A Paradigm Shift
o Shift our focus from requirements-coverage-based test execution to a more intuitive approach
o Exploratory testing and business process flow testing become the norm rather than the exception
o Develop new testing frameworks where risk-based testing is executed through targeted exploratory testing and is balanced with scripted testing
Question Test Results
• Is there any reason to suspect we are evaluating
our test results based on self-interest,
overconfidence, or attachment to past
experiences?
• Have we fallen in love with our test results?
• Were there any differences of opinion among the
team reviewing the test results?
How Do We Find The Obvious Bugs?
• Focus less
• Use intuition
• Believe what we can’t believe