Test Strategies in Agile Projects
Anders Claesson
Copyright © 2010, Enea AB
Contents 
Agile development and testing 
Test methods, tools and planning 
Definition of Done 
User Stories 
Test ideas and guidelines 
Test charters and exploratory testing 
Logging and reporting 
When to stop testing
Agile Development 
Focus on rapid delivery of business value 
Reducing risks 
Continuous planning and feedback 
Value demonstrated 
Self-organizing teams
Agile Testing 
An iterative process from a customer's perspective 
Testing is performed early and often 
Testers are part of the development team 
“User Stories” are tested 
Close cooperation with developers and customers 
Continuous integration and regression tests 
All test results are logged 
Defects are reported
Scrum Overview 
The slide shows the Scrum cycle as a diagram; its elements: 
Product Backlog 
Sprint Backlog 
Sprint Plan 
Sprint (2–4 weeks) 
Daily Sprint Meeting 
Risk Analysis 
Retrospective 
Release Plan 
Definition of Done 
Task Board 
Burndown Chart 
Release Criteria
Agile Testing Quadrants 
Brian Marick's agile testing matrix: four quadrants spanned by two axes, business-facing vs. technology-facing, and supporting the team vs. evaluating the product. 
Q1 – Technology-facing, supporting the team: Unit Tests, Component Tests (automated) 
Q2 – Business-facing, supporting the team: Functional Tests, Examples, Story Tests, Prototypes, Simulations (automated & manual) 
Q3 – Business-facing, evaluating the product: Exploratory Testing, Scenarios, Usability Testing, UAT, Alpha/Beta (manual) 
Q4 – Technology-facing, evaluating the product: Performance & Load Testing, Security Testing, “ility” Testing (tools)
Find out what the Customer Values the Most 

User/Customer Value Analysis: 

| Quality Attribute | Importance Weight % (IW) | Customer Satisfaction (CS) | Priority (IW / CS) |
| 1. Reliability | 30 | 0.9 | 33 |
| 2. Availability | 15 | 0.8 | 18 |
| 3. Usability | 20 | 1.1 | 18 |
| 4. Performance | 20 | 1.1 | 18 |
| 5. Service Level | 10 | 1.0 | 10 |
| 6. Maintainability | 5 | 0.8 | 6 |
| Summary | 100 % | X = 0.96 | |

X < 1: Less satisfied; X > 1: More satisfied. 

Also analyze the complexity of the system, including all valid and prioritized combinations of features and platforms. 

Risk/Cost of failure – supplier costs vs. customer costs: 

| Supplier costs \ Customer costs | Low | Medium | High |
| High | High | Critical | Critical |
| Medium | Medium | High | Critical |
| Low | Low | Medium | High |
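The slide's two formulas are simple enough to script. A minimal Python sketch (the figures come from the table above; the ranking code itself is illustrative):

```python
# Priority = IW / CS: attributes the customer weights heavily but is least
# satisfied with float to the top of the test plan.
attributes = {  # name: (importance weight %, customer satisfaction)
    "Reliability":     (30, 0.9),
    "Availability":    (15, 0.8),
    "Usability":       (20, 1.1),
    "Performance":     (20, 1.1),
    "Service Level":   (10, 1.0),
    "Maintainability": ( 5, 0.8),
}

for name, (iw, cs) in sorted(attributes.items(),
                             key=lambda kv: -kv[1][0] / kv[1][1]):
    print(f"{name:15s} priority = {iw / cs:5.1f}")

# Weighted satisfaction index X: below 1.0 means the customer is, on balance,
# less satisfied than the importance they attach to the product warrants.
x = sum(iw * cs for iw, cs in attributes.values()) / 100
print(f"X = {x:.2f}")  # ~0.97 here; the slide's 0.96 presumably uses unrounded CS values
```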
Test Methods and Techniques 
Requirements based testing 
Design based testing 
Risk based testing 
Exploratory testing 
Error guessing 
Taxonomy based testing 
Attack based testing 
Model based testing 
Scenario based testing 
Combinatorial testing 
Value based testing 
Prototyping
Three Different Views 
User – What does the user want to do with the system? 
System – What should the system be capable of doing? 
Risks – What problems may occur?
How to Explore and Learn
Test Planning 
What 
Why 
Who 
Where 
When 
How 
Dependencies 
Risks 
Prio 
Time
Useful Tools for Agile Testing 
http://www.opensourcetesting.org/ 
http://www.satisfice.com/tools.shtml 

Allpairs – test case generation tool for combinatorial (pairwise) testing 
PICT – generation of combinatorial tests using orthogonal arrays, http://www.pairwise.org/tools.asp 
Perlclip – testing of text fields or documents with different kinds of stressful inputs 
SpectorPro – logging/recording of all activities on a PC, http://www.spectorsoft.com/ 
TestExplorer – session-based exploratory testing, http://www.testexplorer.com 
Session Tester – an exploratory testing tool for managing and recording session-based testing, http://sessiontester.openqa.org/ 
Resource Viewer – intended for viewing resources in executable files, http://www.glocksoft.com/resource_viewer.htm 
Rasta – keyword-driven test automation, http://rasta.rubyforge.org/index.html 

List of testing tools: http://www.aptest.com/resources.html
Definition of Done 
Feature – story(-ies) or product backlog item(s) 
Sprint – a collection of features developed within a Sprint 
Release – potentially shippable parts 

Example, feature overview: User management – 
Add new, Edit, Clone, Delete, List, Sort, Paging, Search, Tags, Smart search, Regular expressions
Definition of Done – Unit Level 

Structure based testing – criteria for done: 
100% Software Design and Module Specifications covered 
100% Statement Coverage of the source code 
Boundary Value Analysis testing 
Equivalence Classes and Input Partitioning testing 
All test cases passed and no remaining faults to be corrected 
All code reviewed 
Known weaknesses described 
Component testing reported (including obtained test coverage) 

Integration testing – criteria for done: 
Internal and external interfaces in the sub-system covered by verifying protocols and syntax in all messages 
More than 40% of all tests are negative test cases
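To make the boundary-value and equivalence-class criteria concrete, here is a minimal pytest sketch; the age rule and the function under test are invented for illustration:

```python
import pytest

def accepts_age(age: int) -> bool:
    """Toy system under test: a registration rule that accepts ages 18-65."""
    return 18 <= age <= 65

# Equivalence classes: below range, in range, above range.
# Boundary values: each side of both class boundaries.
@pytest.mark.parametrize("age, expected", [
    (17, False), (18, True), (19, True),    # lower boundary
    (64, True), (65, True), (66, False),    # upper boundary
    (0, False), (40, True), (120, False),   # one representative per class
])
def test_age_boundaries(age, expected):
    assert accepts_age(age) == expected
```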
Definition of Done – Functional Level 

Functional testing – criteria for done: 
100% requirements coverage. 
100% coverage of the main flows in the operational scenarios. 
100% of the highest risks covered. 
100% of externally observable system states covered. 
100% of externally observable failure modes covered. 
Operational manuals tested. 
All failures found are reported. 
Boundary Values, Equivalence Classes and Input partitioning testing made for all input data. 
All combinations of input and output parameters and values covered (pair-wise coverage).
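The last criterion, pair-wise coverage, is what tools such as Allpairs and PICT (see the tools slide) automate. As a rough illustration of the idea, a greedy Python sketch with invented parameters that keeps picking full combinations until every pair of parameter values is covered:

```python
from itertools import combinations, product

def pairwise_suite(parameters):
    """Greedily pick test combinations until every value pair of every two
    parameters is covered. Not minimal, but far smaller than the full product."""
    names = list(parameters)
    all_tests = [dict(zip(names, values))
                 for values in product(*(parameters[n] for n in names))]
    uncovered = {((a, va), (b, vb))
                 for a, b in combinations(names, 2)
                 for va in parameters[a] for vb in parameters[b]}
    suite = []
    while uncovered:
        # choose the full combination covering the most still-uncovered pairs
        best = max(all_tests, key=lambda t: sum(
            ((a, t[a]), (b, t[b])) in uncovered
            for a, b in combinations(names, 2)))
        suite.append(best)
        uncovered -= {((a, best[a]), (b, best[b]))
                      for a, b in combinations(names, 2)}
    return suite

params = {"browser": ["IE7", "Opera", "Firefox"],
          "os": ["XP", "Vista"],
          "account": ["free", "subscriber"]}
for test in pairwise_suite(params):
    print(test)  # roughly 6 tests instead of the 12 in the full product
```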
User Stories 1(2) 

Card: 
As a registered user, I want to log in, so I can access subscriber content. 

Conversation – a sketch of the login page with annotations: 
Form elements: Username, Password, “Remember me” checkbox, Login button, [Message] area, “Forgot password?” link. 
Username is the user’s email address; validate the format. 
Authenticate against SRS using the new web service. 
“Remember me”: store a cookie if ticked and login is successful. 
“Forgot password?” goes to the forgotten password page. 
Display a message here if not successful (see confirmation scenarios on the next page).
User Stories 2(2) 

Confirmation: 

Success – valid user logged in and referred to the home page: 
a) Valid user name and password 
b) “Remember me” ticked – store cookie/automatic login next time 
c) “Remember me” not ticked – manual login next time 
d) Password forgotten and a correct one is sent via email 

Failure – display message: 
a) “Email address in wrong format” 
b) “Unrecognized user name, please try again” 
c) “Incorrect password, please try again” 
d) “Service unavailable, please try again” 
e) Account has expired – refer to account renewal sales page
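These confirmation scenarios translate almost directly into automated story tests. A self-contained pytest sketch; the login function here is a toy stand-in so the example runs, not the presentation's actual system:

```python
import re

USERS = {"alice@example.com": "correct-password"}  # illustrative account

class Result:
    def __init__(self, message=None, redirected_to=None, cookies=()):
        self.message = message
        self.redirected_to = redirected_to
        self.cookies = set(cookies)

def login(email, password, remember_me=False):
    """Toy implementation of the story's rules: format check, then auth."""
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email):
        return Result(message="Email address in wrong format")
    if email not in USERS:
        return Result(message="Unrecognized user name, please try again")
    if USERS[email] != password:
        return Result(message="Incorrect password, please try again")
    return Result(redirected_to="/home",
                  cookies={"remember"} if remember_me else ())

def test_valid_login_redirects_home():          # Success a)
    assert login("alice@example.com", "correct-password").redirected_to == "/home"

def test_remember_me_sets_cookie():             # Success b)
    assert "remember" in login("alice@example.com", "correct-password",
                               remember_me=True).cookies

def test_bad_email_format():                    # Failure a)
    assert login("not-an-email", "x").message == "Email address in wrong format"

def test_unknown_user():                        # Failure b)
    assert login("nobody@example.com", "x").message == "Unrecognized user name, please try again"

def test_wrong_password():                      # Failure c)
    assert login("alice@example.com", "wrong").message == "Incorrect password, please try again"
```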
Test Questions 
- Which user/usage goals should be met? 
- What user problems should be solved? 
- Which user benefits should be achieved? 
- Why does the orderer/customer want the system? 
- Who are the customer(s) and the target user group? 
- Which functions and characteristics are included? 
- What are the most common and critical parts of the functionality from the users’ point of view? 
- Are there any performance requirements included? 
- What is an acceptable response time for the users? 
- How tolerant should the system be to faulty input or user actions?
Test Ideas 
What do we need to find out about the system? 
1. What happens if …? 
2. What should happen when …? 
3. Will the system be able to fulfil all its requirements? 
4. What are the expectations and needs from the customer? 
5. In what way may the system fail? 
6. What problems were found in the previous release? 
7. Are the requirements and the input specifications possible to understand and test (sufficient testability)? 
8. Will the system be reliable and resist failure in all situations? 
9. Will the system be safe in all configurations and situations in which it will be used? 
10. How easy is it for real users to use the system? 
11. How fast and responsive is the system? 
12. Is it easy to install (and configure) onto its target platform? 
13. How well does it work with external components and configurations? 
14. How effectively can the system be tested (e.g. can log files be read)?
Guidelines for Agile Testing 1(2) 
1. Test in pairs 
2. Prepare test charters in advance 
3. Use exploratory testing 
4. Build tests incrementally 
5. Use test design patterns 
6. Perform keyword/data driven tests
Guidelines for Agile Testing 2(2) 
7. End-to-end testing 
8. Scenario based testing 
9. Use automation for test data generation and execution 
10. Frequent regression testing 
11. Documentation testing 
12. Log everything you do
Test Charter 1(3) 

Test Charter no # and <title> 
Actor – < Type of user > 
Purpose – < Describe the function, web page, test idea, … to be tested > 
Setup – < Preconditions, i.e. concerning HW, content of database(s), … > 
Priority – < Importance of risk, function, web page, … > 
Reference(s) – < Requirement, risk, test ideas, … > 
Data – < Whatever is needed for the activities, files > 
Activities – < A list of actions and test ideas > 
Oracle notes – < How to evaluate the product for correct results > 
Variations – < Alternative actions and evaluations >
Test Charter 2(3) 

Test Charter 1: Analyze the copy/paste function of pictures 

Actor: Normal user. 
Purpose: To evaluate whether the copy/paste function for pictures works in our word processor together with the most commonly used word processors on the market (Word, PowerPoint, etc.) and other programs that can place pictures in the copy buffer for copy/paste operations on the PC. The purpose is also to see that no information is lost or corrupted. 
Setup: A Dell PC with 2 GB memory, our word processor, the Microsoft Office package (Professional and Home edition 2003, patch level xx), PDF reader version 7.1, Notepad version 4, Internet Explorer version 7, Opera version 5, Mozilla, etc. (The setup might be common to several charters and can then be described once and referred to, instead of repeating the same information in every charter.) 
Priority: High, because this function is used very frequently, both within our own word processor and between other word processors and programs the user may want to copy/paste pictures with. 
Reference: Requirement abc_1 in document Def-07:034 revision R1.1. Risk number 3 from the risk assessment held on April 4, 2007 regarding the copy function, documented and followed up in document Rsk-07:012. 
Data: A variety of pictures with different resolutions, both vector graphics and bitmapped pictures. The pictures could, for example, be photos or figures in web browsers. Complex pictures are also included, as well as pictures that might be copy protected in some way.
Test Charter 3(3) 

Test Charter 1: Analyze the copy/paste function of pictures, continued… 

Activities: 
1. Copy/paste a picture in our word processor from one location to another in the same document. 
2. Copy/paste a picture to and from our word processor into/from Word, PowerPoint and Excel. 
3. Copy a picture into our word processor from a variety of the most used web browsers. 
4. Copy/paste a picture to/from the most commonly used web site building tools, such as Dreamweaver from Adobe. 
5. Copy a picture from a PDF document using Acrobat Reader into our word processor. 
6. Try to copy a write-protected picture from the web or another source into our word processor. 
7. Try to copy/paste a variety of non-readable and some readable ASCII characters, or corrupted pictures. 

Oracle notes: 
Look whether the size of the pasted picture changed on the screen (it should not). 
Check if there is any loss in the resolution of the picture, especially when you copy and paste from other programs to or from ours. 
Check which is the highest-resolution picture that can be copied and how that affects the system. 
Check for memory leaks and how much memory the copy/paste operation takes from the system, and how that affects the use of other programs. Other programs should not slow down or be affected in any way. 
Print pasted pictures to see if there are any differences in color, resolution or any other anomaly. 

Variations: 
Try out (and find) the boundaries of how large pictures can be copied/pasted. 
Perform copy/paste with a large number of items in the copy/paste buffer on your PC. 
Try to fill the copy/paste buffer to its limit, then copy/paste a picture and see what happens. 
Try the longest file name of the picture you can type before making the copy/paste. You may use the tool Perlclip (download from http://www.satisfice.com/tools/perlclip.zip) to generate file names with a million characters or more if you want.
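The Perlclip feature alluded to here is the “counterstring”: a string that spells out its own length at every marker, so a truncated paste shows exactly where it was cut. A small Python equivalent (my sketch of the idea, not Perlclip's actual code):

```python
def counterstring(length: int, marker: str = "*") -> str:
    """Build a self-describing string such as '2*4*6*8*11*': the digits
    before each marker give that marker's 1-based position in the string."""
    out, pos = [], 0
    while pos < length:
        # find n such that a marker written after str(n) lands at position n
        n = pos + 2
        while pos + len(str(n)) + 1 != n:
            n = pos + len(str(n)) + 1
        out.append(str(n) + marker)
        pos = n
    return "".join(out)[:length]  # the final token may be clipped

print(counterstring(30))              # 2*4*6*8*11*14*... (markers at positions 2, 4, 6, ...)
print(len(counterstring(1_000_000)))  # 1000000
```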
Exploratory Test Execution 
1. Observe 
2. State questions 
3. Form a hypothesis 
4. Design the experiment 
5. Test the hypothesis 
6. Draw conclusions 
7. State additional questions 

The scientific approach.
A Program can Fail in many Ways 

The system under test receives more than the intended inputs and produces more than the monitored outputs: 

Inputs to the system under test: 
Intended inputs 
Configuration and system resources 
Input from other cooperating processes, clients or servers 
Program state 
System state 

Outputs from the system under test: 
Monitored outputs 
Program state, including uninspected outputs 
System state 
Impacts on connected devices/system resources 
Output to other cooperating processes, clients or servers
Exploratory Test Execution Logging 
Use a logging tool: 
Log everything you do 
Information is traceable 
Complement with an API logger/sniffer 
Your notes show your reasoning while testing 
Exploratory testing requires good logging
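A logging tool needs little more than an append-only, timestamped record. A minimal sketch in Python (the entry fields are my choice, not any particular tool's format):

```python
import json
import time
from pathlib import Path

class SessionLog:
    """Append-only exploratory-session log: every action, observation and
    bug gets a timestamped line, keeping the tester's reasoning traceable."""

    def __init__(self, charter: str, path: str = "session.log"):
        self.path = Path(path)
        self._write("charter", charter)

    def _write(self, kind: str, text: str) -> None:
        entry = {"time": time.strftime("%Y-%m-%d %H:%M:%S"),
                 "kind": kind, "text": text}
        with self.path.open("a", encoding="utf-8") as f:
            f.write(json.dumps(entry) + "\n")

    def action(self, text: str) -> None:
        self._write("action", text)

    def observation(self, text: str) -> None:
        self._write("observation", text)

    def bug(self, text: str) -> None:
        self._write("bug", text)

log = SessionLog("Charter 1: analyze copy/paste of pictures")
log.action("Pasted a 4000x3000 bitmap from Firefox into our word processor")
log.observation("Picture scaled on screen; resolution preserved when printed")
log.bug("Memory not released after 50 consecutive paste operations")
```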
Test Reporting 

Test effort and perceived quality level per test area, including risks and test coverage. The weekly figures (W 1–W 6) are the quality assessment (“Q ass.”) per area; a “+” marks an improving trend. 

| Test area | W 1 | W 2 | W 3 | W 4 | W 5 | W 6 | Initial risk level | Needed test effort | Current test effort | Current risk level | Comments |
| Area 1 | – | – | – | – | 0 | 0 | Low | Low | None | Low | Feature(s) not yet delivered from design and integration. Definition of Done not fulfilled for functional testing. No testing possible. |
| Area 2 | 1 | 1 | 1+ | 2 | 2 | 2+ | Medium | Medium | High | Low | On track, no faults. |
| Area 3 | 1 | 1 | 1+ | – | – | 1+ | High | High | Blocked | High | Crashes, IR12345. |
| Area 4 | 1 | 1 | 1+ | 1+ | 1+ | 2 | High | High | Pause | Medium | IR1212 under investigation. |
| Area 5 | 1 | 2 | 2+ | 2+ | 3 | 3 | Medium | Medium | High | Medium | Configuration problems. |
When to Stop Testing 

Testing should stop when: 

Coverage – 
All planned/required test charters/sessions and characteristics tests have been run and passed, according to the current risk areas/levels and where faults have been found. The coverage objectives stated in the test goals (e.g. System Requirements, User Stories) have been reached. 

Quality – 
• The probability of remaining faults has been reduced to a level that can be accepted by the customer. 
• No open priority A Incident Reports. 
• The system’s risk level is within acceptable limits (i.e. no critical risks remain unsolved). 
• The Definition of Done for all testing activities has been fulfilled. 
• The product values have been demonstrated and accepted (i.e. implicit and explicit quality attributes are satisfied). 

Time – 
When the agreed ship date has been reached.

  • 29. 29(29) All planned/required Test Charters/sessions and characteristics tests have been run and passed according to the current risk areas/levelsand where faults have been found. The coverage objectives have been reached that were stated in the test goals (e.g. System Requirements, User Stories). CoverageTesting should stop when: •The probability of remaining faults has been reduced to a level that can be accepted by the customer. •No open priority A Incident Reports. •The systems’ risk levelis within acceptable limits (i.e. no critical risksremain unsolved). •The Definition of Done for all testing activities have been fulfilled. •The product values have been demonstrated and accepted (i.e. implicit and explicit quality attributes are satisfied). Quality When the agreed ship date has been reached. Time When to Stop Testing