T1
Test Management
5/8/2014 9:45:00 AM
A Funny Thing Happened on the
Way to User Acceptance Testing
Presented by:
Randy Rice
Rice Consulting Services, Inc.
Brought to you by:
340 Corporate Way, Suite 300, Orange Park, FL 32073
888-268-8770 ∙ 904-278-0524 ∙ sqeinfo@sqe.com ∙ www.sqe.com
Randy Rice
Rice Consulting Services, Inc.
A leading author, speaker, and consultant with more than thirty years of experience in the field
of software testing and software quality, Randy Rice has worked with organizations worldwide to
improve the quality of their information systems and optimize their testing processes. He is
coauthor (with William E. Perry) of Surviving the Top Ten Challenges of Software Testing and
Testing Dirty Systems. Randy is an officer of the American Software Testing Qualifications
Board (ASTQB). Founder, principal consultant, and trainer at Rice Consulting Services, Randy
can be contacted at riceconsulting.com where he publishes articles, newsletters, and other
content about software testing and software quality. Visit Randy’s blog.
4/26/2014
1
A FUNNY THING
HAPPENED ON THE
WAY TO THE
ACCEPTANCE TEST
RANDALL W. RICE, CTAL
RICE CONSULTING SERVICES, INC.
WWW.RICECONSULTING.COM
THIS PRESENTATION
• An account of four acceptance tests in three organizations.
• The names have been withheld and the data generalized to protect privacy.
• One project was developed in-house; the other three were vendor-developed systems, so a more traditional UAT approach was taken.
A COMMON
PERCEPTION OF UAT
• UAT is often seen as that last golden moment or phase of
testing, where
• Users give feedback/acceptance
• Minor problems are identified and fixed
• The project is implemented on time
• High fives, all around
IN REALITY…
• UAT is one of the most risky and
explosive levels of testing.
• UAT is greatly needed, but happens at
the worst time to find major defects –
at the end of the project.
• Users may be unfriendly to the new
system
• They like the current one just fine,
thank you.
• Much of your UAT planning may be
ignored.
• People tend to underestimate how
many cycles of regression testing are
needed.
THERE ARE MANY
QUESTIONS ABOUT UAT
• Who plans it?
• Who performs it?
• Should it only be manual in nature?
• What is the basis for test design and
evaluation?
• When should it be performed?
• Where should it be performed?
• Who leads it?
• How much weight should be given to it?
PROJECT #1
• Medical laboratory testing business
that closely resembles a manufacturing
environment.
• New technology for the company and for the industry.
• The previous project had failed
• The company almost went out of
business because of it!
• Very high growth in terms of both
business and employees.
• Company at risk of failure.
• This project was truly mission-critical.
PROJECT #1 – CONT’D.
• Few functional requirements.
• 8 pages for over 400 modules
• Test team had little knowledge of:
• subject matter,
• test design,
• or testing at all.
• Very little unit or integration testing
being done by developers.
• Some system testing was done.
• UAT was the focus.
DEFECT DISCOVERY
AND BACKLOG
[Chart: defect discovery and backlog across System Test (4 weeks), UAT (3 weeks), and the 1st through 3rd deployments]
PROJECT #1 RESULTS
• Very high defect levels in testing.
• Many were resolved before implementation.
• Severe performance problems.
• Old system could process 8,000 units/day
• New system could process 400 units/day
• Many problems due to the new technology being used
• “bleeding edge” issues
• “Deadline or else” attitude
• The business was under extreme pressure to deploy due to
increased processing volumes.
• System was de-installed/re-installed 3 times before
performance was acceptable to deploy.
WHAT WE LEARNED
• Requirements are important.
• Even if you have to create some form of
them after the software has been
written.
• Early testing is important.
• That would have caught early
performance bottlenecks.
• Teamwork is critical.
• Things got so bad we had to have a “do
over.”
• The deadline is a bad criterion for implementation.
• Always have a “Plan B”.
UAT LESSONS
• Build good relationships with subject
matter experts.
• They often determine acceptance
• Listen to the end-users.
• Understand what’s important
• Don’t rely on UAT for defect detection.
• Interesting factoid: two years later, a similar project using the exact same technology failed for a city water utility due to performance errors. The vendor lost a $1.8 million lawsuit.
PROJECT #2
• Same company as before, but two
years later
• Integration of a vendor-developed
and customized accounting
system
• Lots of defects in the vendor
system
• Implemented two months late with
practically zero defects.
WHAT MADE THE
DIFFERENCE?
• Same people – testers, IT manager, developer
• Different project manager who was a big
supporter of testing
• More experience with the technology
• Better understanding of testing and test design
• A repeatable process
• Less pressure to implement
• Having a contingency plan
• Having the courage to delay deployment in favor
of quality.
• The financials had to be right.
PROJECT #3
• New government-sponsored entity.
• Everything was new – building, people,
systems
• System was a vendor-developed
workers compensation system.
• Some customization
• Little documentation
• Designed all tests based on business
scenarios.
• We had no idea of the UI design.
KEY FACTORS
• No end-users in place at first to help with
any UAT planning.
• In fact, we had to train the end-users in
the system and the business.
• Lots of test planning was involved
• 50% or more of the effort went into planning and optimizing tests.
• This paid off big in test execution and
training
RESULTS
• Tested 700 modules with 250 business
scenario tests.
• We had designed over 300 tests
• After 250 tests, the management and test team felt confident we had covered enough of the system.
• Found many defects in a system that
had been in use in other companies for
years.
• Reused a lot of the testware as training
aids.
• Successful launch of the organization
and system.
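A decision like "stop after 250 of the 300 designed tests" is easier to defend with a cumulative coverage view. The sketch below (with invented scenario and module IDs, not the project's actual tooling) reports what fraction of the system's modules has been touched after each executed business-scenario test:

```python
def coverage_progress(executed, modules_per_test, total_modules):
    """Yield (test_id, cumulative fraction of modules touched)
    after each executed business-scenario test."""
    touched = set()
    for test_id in executed:
        touched |= modules_per_test.get(test_id, set())
        yield test_id, len(touched) / total_modules

# Hypothetical example: 10 modules, 3 executed scenarios
modules_per_test = {
    "S1": {"M1", "M2", "M3"},
    "S2": {"M3", "M4"},          # overlaps S1 on M3
    "S3": {"M5", "M6", "M7", "M8"},
}
for test_id, frac in coverage_progress(["S1", "S2", "S3"], modules_per_test, 10):
    print(f"{test_id}: {frac:.0%} of modules covered")
# S1: 30%, S2: 40%, S3: 80%
```

With real data, a curve like this flattening out is exactly the signal the team relied on when it judged 250 scenarios to be enough.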
HARD LESSONS
LEARNED
• “You don’t know what you don’t know”
AND “You sometimes don’t know what you
think you know.”
• A newly hired SME with over 30 years of workers' comp experience provided information that differed from (and corrected) what we had been told during test design.
• We had to assign two people for two weeks
to create new tests.
• These were complex financial functions –
we couldn’t make it up on the fly.
HARD LESSONS
LEARNED (2)
• Real users are needed for UAT.
• Sometimes the heavy lifting of test
design may be done by other testers,
but users need heavy involvement.
PROJECT #4
• State government, legal application
• Vendor-developed and customized
• Highly complex system purchased to replace two co-
existing systems.
• Half of the counties in the state used one system, the other
half used another.
• Usability factors were low on the new system
• Data conversion correctness was critical
THE GOOD SIDE
• Well-defined project processes
• Highly engaged management and
stakeholders
• Good project planning and tracking
• Incremental implementation strategy
• The entire system was implemented,
only one county at a time.
• Heavy system testing
• Good team attitude
THE CHALLENGES
• The system’s learning curve was very high.
• The key stakeholders set a high bar for acceptance.
• The actual users were few in number and were only able to
perform a few of the planned tests.
• Very high defect levels.
LEADING UP TO
VENDOR SELECTION
• Over 2 years of meeting with users and
stakeholders to determine business
needs.
• Included:
• JAD sessions
• Creation of “as-is” and “to-be” use
cases
• Set of acceptance criteria
(approximately 350 acceptance criteria
items)
THE STRATEGY
• Create test scenarios that described
the trail to follow in testing a task,
but not to the level of keystrokes.
• Based on use cases.
• The problem turned out to be that
even the BAs and trainers had
difficulty in performing the scenarios.
• System complexity was high.
• Training had not been conducted.
• Usability was low
DEFECT DISCOVERY
AND BACKLOG
[Chart: defect discovery and backlog across System Test (10 weeks) and UAT (4 weeks) leading to the 1st deploy; roughly 750 vs. 250 defects]
WHAT WAS
VALIDATED
• The precise “point and click” scripts provided
by the vendor were long and difficult to
perform.
• Each one took days.
• Plus, there were errors in the scripts and
differences between what the script indicated
and what the system did.
THE BIG SURPRISES
• We planned the system test to be a practice run for UAT.
• It turned out to be the most productive phase of testing in
terms of finding defects.
• We planned for a 10-week UAT effort with 10 users.
• It turned out to be a 2-week effort with 4 users.
• First sense of trouble: initial users were exhausted after 3
days of a pre-test effort.
THE BIG SURPRISES (2)
• We used none of the planned tests (around 350 scenarios)
in UAT.
• Instead, it was a guided “happy path” walkthrough, noting
problems along the way.
• Defects were found, but the majority of defects had been
found in system test.
LESSONS LEARNED
• The early system test was invaluable in
finding defects.
• Users must learn a new system before they are able to test it.
• The test documentation is not enough to
provide context of how the system works.
• It took a lot of flexibility on the part of
everyone (client, vendor, testers, users,
stakeholders) to make it to the first
implementation.
• Sometimes actual users just aren’t able to
perform a rigorous test.
WHAT CAN WE LEARN FROM
ALL THESE PROJECTS?
• UAT is a much-needed test, but happens at the worst
possible time – just before implementation.
• You can take some of the late defect impact away with
system testing and reviews.
• You can lessen the risk of deployment by implementing to
a smaller and lower risk user base first.
• Actual end-users are good for performing UAT, but much
depends on what you are testing and the capabilities of
the users.
• The reality is that users are going to have to use the system in real life anyway.
• However, not all users are good testers!
WHAT CAN WE
LEARN? (2)
• Be careful how much time and effort
you invest in planning for UAT before
the capabilities are truly known.
• That is, senior management may want
actual users to test for 8 weeks, but if
the people aren’t available or can’t
handle the load, then it probably isn’t
going to happen.
• Don’t place all the weight of testing on
UAT.
• In project #4 our system testing found
a majority of the defects.
WHAT CAN WE
LEARN? (3)
• UAT test planning isn’t bad, just expect
changes.
• People, software, business, timelines –
they all change.
• Try to optimize and prioritize.
• Example: If you have 500 points of
acceptance criteria, can they be
validated with 200 tests?
• Which of the acceptance criteria are critical, needed, and "nice to have"?
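The "500 criteria with 200 tests" question above can be made concrete: given a mapping from candidate tests to the acceptance criteria each one exercises, a simple greedy set-cover pass shows how few tests are needed to touch every criterion at least once. This is a minimal sketch with made-up test and criterion IDs, not tooling from the talk:

```python
def select_covering_tests(tests):
    """Greedy set cover: pick tests until every acceptance
    criterion exercised by any candidate test is covered once.

    `tests` maps a test ID to the set of criterion IDs it exercises.
    """
    remaining = set().union(*tests.values())  # criteria still uncovered
    selected = []
    while remaining:
        # pick the test covering the most still-uncovered criteria
        best = max(tests, key=lambda t: len(tests[t] & remaining))
        if not tests[best] & remaining:
            break  # defensive: nothing left that any test can cover
        selected.append(best)
        remaining -= tests[best]
    return selected

# Hypothetical example: 6 acceptance criteria, 4 candidate tests
tests = {
    "T1": {"AC1", "AC2", "AC3"},
    "T2": {"AC3", "AC4"},
    "T3": {"AC4", "AC5", "AC6"},
    "T4": {"AC2", "AC6"},
}
print(select_covering_tests(tests))  # → ['T1', 'T3'] covers all six criteria
```

Greedy set cover is not guaranteed optimal, but combined with a critical/needed/nice-to-have ranking of the criteria it gives a defensible way to shrink a large criteria list into a manageable UAT suite.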
BIO - RANDALL W. RICE
• Over 35 years' experience building and testing information systems in a variety of industries and technical environments
• ASTQB Certified Tester – Foundation level,
Advanced level (Full)
• Director, American Software Testing Qualification
Board (ASTQB)
• Chairperson, 1995–2000, QAI's annual software testing conference
• Co-author with William E. Perry, Surviving the
Top Ten Challenges of Software Testing and
Testing Dirty Systems
• Principal Consultant and Trainer, Rice
Consulting Services, Inc.
CONTACT INFORMATION
Randall W. Rice, CTAL
Rice Consulting Services, Inc.
P.O. Box 892003
Oklahoma City, OK 73170
Ph: 405-691-8075
Fax: 405-691-1441
Web site: www.riceconsulting.com
e-mail: rrice@riceconsulting.com
