TEST PLAN FOR (PRODUCT)
_______________________________________________________________________
This is a sample of an outline for a test plan. It has been designed for medium to small test projects, and
thus is fairly lightweight. It is by necessity general, because each enterprise, each development group,
each testing group, and each development project is different. This outline should be used as a set of
guidelines for creating your own standard template; add to it or subtract from it as you find appropriate.
Bear in mind that it is generally better to have an excess of detail in the template—detail which can be
removed when creating a specific test plan—than to have to remember to add something that is not in the
template.
Make sure to fill in the running headers and footers with the product name, draft numbers, revision dates,
and page numbers; this is important in places with lots of test projects on the go. Make sure to include
the author’s name, too, so that errors or questions can be addressed to the right person.


1.     OVERVIEW
     1.1.     PRODUCT NAME
     1.2.     PRODUCT REVISION
     1.3.     PROJECT LEADS
        1.3.1.       Marketing Lead (or other customer representative)
        1.3.2.       Program Manager
        1.3.3.       Development Lead
        1.3.4.       Test Lead
        1.3.5.       Build and Release Control Engineer
        1.3.6.       Legal representative
        Include names, phone numbers, and email addresses for each. Note that this
        table will differ for a particular company or group. The goal is to ensure that
        anyone walking into the company or into the test role can easily identify and
        contact the people he/she needs to reach.
     1.4.     TEST PROJECT STAFF
        1.4.1.       Test requirements designers
        1.4.2.       Test case designers
        1.4.3.       Test personnel
              1.4.3.1.      For manual (i.e. non-automated) tests
              1.4.3.2.      For automated tests
              1.4.3.3.      Test automation programmers
        1.4.4.       Documentation reviewers
        1.4.5.       Legal reviewer
        Include names, phone numbers, and email addresses for each. Note that there
        may be several people in each role, that one person may be obliged to fill
        multiple roles, and that some roles (e.g. legal reviewer) won’t be required for all
        projects.



   1.5.   PRODUCT OVERVIEW
            This could also be “description of the change requirements” for maintenance
            projects.
            1.5.1.    Cut and paste a brief summary description of the product or change
                      from the requirements document or specification, or describe the
                      project as understood by the developers; if the latter, make sure that
                      there is agreement and sign-off from the customer.
     1.6.        TRACKING AND REPORTING SYSTEMS
            1.6.1.    Identify the defect tracking system in use.
            1.6.2.    Identify the manner and schedule by which defect reports are expected
                      to be delivered to developers.
            1.6.3.    Identify parties that may have access to the tracking system.
            1.6.4.    Identify the change control system.
            1.6.5.    Identify the means by which the team is to be notified of changes to the
                      requirements, the product, the test plan, etc.
2.     TESTING SYNOPSIS
     2.1.        Items to be tested
            2.1.1.    Refer to the functional requirements that specify the features and
                      functions to be tested. The description of the change need not be
                      excessively detailed when there is a complete description to refer to in
                      some other document. On the other hand, if there is no reasonable
                      specification available, more detail is called for here.
     2.2.        Items not to be Tested
            2.2.1.    List the features and functions that will not be covered in this test plan.
                      Identify briefly the reasons for leaving them out.
     2.3.        System Requirements
            2.3.1.    This section should be filled out in detail for new projects. For
                      existing maintenance tasks, a simple cross-reference to the document
                      describing existing system requirements is fine. Note any changes to
                      previous system requirements, especially when support for a given
                      product or platform is being dropped.
            2.3.2.    If there is a system requirement that could be unclear, make it specific;
                      for example, for Web-based projects, identify not only the supported
                      browsers but also the minimum versions of the supported browsers.
     2.4.        Standards/Reference material
            2.4.1.    List any standards or other reference material used in the creation of
                      this test plan.
            2.4.2.    Identify standards for acceptance criteria, defect severity, testable
                      specifications, and so on. (These standards may have to be created, or
                      adapted from time to time; the first use of this test plan will require
                      more work than later iterations.)
     2.5.   Glossary
        2.5.1.    In cases where terminology could be unfamiliar or open to
                  interpretation, provide a list defining the unclear terms.
        2.5.2.    Obtain agreement on these terms from all interested parties.
        2.5.3.    Note: If no one is forthcoming with the information you need, make
                  something up; they might not have done their jobs from the outset, but
                  they’ll be happy to correct your work! You will have achieved the
                  goal, which is clarity and agreement.
3.     TYPES OF TESTING
     3.1.   ACCEPTANCE TESTING
        3.1.1.    Detail a set of acceptance criteria—conditions that must be met before
                  testing can begin. A smoke test should represent the bare minimum of
                  acceptance testing.
        3.1.2.    As noted above, the ideal is to create a separate document for
                  acceptance criteria that can be reused and referred to here. If any
                  particular, specialized test cases not listed in that document will be
                  used, refer to them here.
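        To make the smoke-level bar concrete, a minimal automated sketch is shown
        below. It assumes pytest as the runner, and the module and function names
        (myproduct, launch, get_version) are placeholders, not part of this template.

            # Hypothetical smoke test; "myproduct", launch() and get_version() are
            # placeholders. Assumes pytest as the runner; run with "pytest -q".
            import pytest

            myproduct = pytest.importorskip("myproduct")  # skip cleanly if the build is absent

            def test_application_launches():
                app = myproduct.launch()                  # placeholder entry point
                assert app is not None

            def test_version_matches_build_under_test():
                # "2." is an illustrative revision prefix; use the revision from section 1.2.
                assert myproduct.get_version().startswith("2.")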
     3.2.   FEATURE LEVEL TESTING
        This is the real meat of the test plan. Fill in the test categories below,
        itemizing the categories of tests along with references to the test library or
        catalog. Individual test cases should not be listed here, and test requirements
        generally should not be either; the details should exist elsewhere and can be
        cross-referenced.
        3.2.1.    Task-Oriented Functional Tests
            3.2.1.1.    This is a detailed section, listing test requirements for program
                        features against functional specifications, user guides, or other
                        design-related documents. If there are test matrices available
                        listing these features and their interdependence (and there
                        should be), refer to them.
        3.2.2.    Forced-Error Tests
            3.2.2.1.    Provide or refer to a list of all error conditions and messages.
                        Identify the tests that will be run to force the program into error
                        conditions.
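        A hedged sketch of one such forced-error test follows; it assumes pytest and a
        hypothetical save_report() function whose documented error message is known.
        The function name, path, and message text are illustrative only.

            # Hypothetical forced-error test; save_report() and its message are placeholders.
            import pytest
            from myproduct import save_report   # placeholder import

            def test_save_to_readonly_location_reports_documented_error():
                # Deliberately drive the program into the error condition.
                with pytest.raises(PermissionError) as excinfo:
                    save_report("/readonly/report.txt")
                # Compare against the message listed in the error-message catalog.
                assert "Cannot write to the selected location" in str(excinfo.value)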
        3.2.3.    Boundary Tests
            3.2.3.1.    Boundary tests—tests carried out at the lines between valid and
                        invalid input, acceptable and unacceptable system requirements
                        (such as memory, disk space, or timing), and other tests at the
                        limits of performance—are the keys to eliminating duplication of
                         effort. Identify the types of boundary tests that will be carried
                         out. Note that such tests can also fall into the categories
                         outlined below, so this section may be removed, or made a
                         sub-section of those categories.
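        As an illustration only: boundary tests for a field documented to accept 1 to
        100 characters would exercise lengths 0, 1, 100, and 101. The set_title()
        function and the ValueError behaviour below are assumptions, not requirements
        of this template.

            # Hypothetical boundary test for an input documented as accepting 1-100 characters.
            import pytest
            from myproduct import set_title   # placeholder function under test

            @pytest.mark.parametrize("length,should_pass",
                                     [(0, False), (1, True), (100, True), (101, False)])
            def test_title_length_boundaries(length, should_pass):
                value = "x" * length
                if should_pass:
                    set_title(value)                   # must be accepted at the boundary
                else:
                    with pytest.raises(ValueError):    # must be rejected just past it
                        set_title(value)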
       3.2.4.      Integration Tests
            3.2.4.1.    Identify components or modules that can be combined and tested
                        independently to reduce dependence on system testing. Identify
                        any test harnesses or drivers that need to be developed.
       3.2.5.      System-Level Tests
             3.2.5.1.    Specify the tests that will be carried out to fully exercise the program
                        as a whole to ensure that all elements of the integrated system
                        function properly. Note that when unit and integration testing
                        have been properly performed, the dependence upon system
                        testing can be reduced.
       3.2.6.      Real World User-Level Test
            3.2.6.1.    In contrast to types of testing designed to find defects, identify
                        tests that will demonstrate the successful functioning of the
                        program as you expect the customer to use it. What type of
                        workflow tests will be run? What type of “real work” will be
                        carried out using the program?
       3.2.7.      Unstructured Tests
            3.2.7.1.    Specify the amount of ad-hoc or exploratory testing that will be
                        carried out. Identify the scope and the time associated with this
                        form of testing.
       3.2.8.      Volume Tests
             3.2.8.1.    Indicate the types of tests that will be carried out to see how the
                        program deals with very large amounts of data, or with a large
                        demand on timely processing. Note that these tests can rarely be
                        performed without automation; identify the automation tools, test
                        harnesses, or scripts that will be used. Ensure that the programs
                        developed for the test automation effort are accompanied by
                        their own sets of requirements, specifications, and development
                        processes.
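        A minimal sketch of a scripted volume test is shown below; the record count,
        the CSV layout, and the import_records() API are assumptions made for
        illustration.

            # Hypothetical volume test: generates a large synthetic data set and feeds it
            # to a placeholder import_records() API.
            import csv
            import os
            import tempfile

            from myproduct import import_records   # placeholder API under test

            def test_import_one_million_records():
                path = os.path.join(tempfile.mkdtemp(), "bulk.csv")
                with open(path, "w", newline="") as f:
                    writer = csv.writer(f)
                    writer.writerow(["id", "name"])           # assumed record layout
                    for i in range(1_000_000):                # assumed volume target
                        writer.writerow([i, f"customer-{i}"])
                result = import_records(path)
                assert result.imported == 1_000_000           # no records silently dropped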
       3.2.9.      Stress Tests
            3.2.9.1.    Identify the limits under which the program is expected to
                        perform. These may include number of transactions per unit
                        time, timeouts, memory constraints, disk space constraints, and
                        so on. Volume tests and stress tests are closely related; you may
                        consider wrapping both into the same category.
            3.2.9.2.    How will the product be tested to push the upper functional limits
                        of the program? Will specific tools or test suites be used to carry
                        out stress tests? Ensure that these are reusable.
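        One hedged way to script such a push toward the limits is sketched below; the
        process_transaction() call, the concurrency level, and the timeout are all
        assumptions standing in for whatever the requirements specify.

            # Hypothetical stress sketch: hammers a placeholder process_transaction()
            # call with concurrent requests and checks that none fail outright.
            from concurrent.futures import ThreadPoolExecutor

            from myproduct import process_transaction   # placeholder API under test

            def test_sustained_concurrent_transactions():
                with ThreadPoolExecutor(max_workers=50) as pool:        # assumed concurrency level
                    futures = [pool.submit(process_transaction, i) for i in range(10_000)]
                    results = [f.result(timeout=30) for f in futures]   # assumed per-call timeout
                assert all(r.ok for r in results)                       # placeholder success flag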


      3.2.10. Performance Tests
               3.2.10.1.   Refer to the functional requirements that specify acceptable
                           performance. Identify the functions that need to be measured,
                           and the tests needed to show conformance to the requirements.
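        A minimal sketch of such a measurement follows; the search() function and the
        one-second budget are placeholders for whatever the functional requirements
        actually specify.

            # Hypothetical performance check: the 1-second budget and the search() API
            # stand in for the figures given in the functional requirements.
            import time

            from myproduct import search   # placeholder function under test

            def test_search_meets_response_time_requirement():
                start = time.perf_counter()
                search("sample query")
                elapsed = time.perf_counter() - start
                assert elapsed < 1.0, f"search took {elapsed:.2f}s, requirement is < 1.0s"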
   3.3.        REGRESSION TESTING
          3.3.1.     At each stage of new development or maintenance, a subset of the
                     regression test library should be run, focusing on the feature or
                     function that has changed from the previous version. Unit, integration,
                     and system tests are all viable places for regression testing. For small
                     maintenance fixes, identify this subset. A good version control system
                     can allow the building of older versions of the software for
                     comparative purposes.
          3.3.2.     In the final phase of a complete development cycle, a full regression
                     test cycle is run. Identify the test case libraries and suites that will be
                     run.
          3.3.3.     Whether a subset or a full regression test run, existing test scripts,
                     matrices and test cases should be used, whether automation is
                     available or not. Identify the documents that describe the details.
                     Emphasize regression tests for functions that are new or that have
                     changed, for components that have had a history of vulnerability, for
                     high-risk defects, and for previously-fixed severe defects.
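        If pytest happens to be the runner, one common way to keep such a subset
        selectable is to tag the relevant cases with a marker and select them at run
        time; the marker name, feature, and fixture file below are illustrative only.

            # Hypothetical regression tagging: a pytest marker lets a maintenance run
            # select only the cases touching the changed feature, e.g.
            #     pytest -m export_regression
            # The marker name should be registered under "markers =" in pytest.ini.
            import pytest

            from myproduct import export_to_pdf   # placeholder for the changed feature

            @pytest.mark.export_regression
            def test_previously_fixed_defect_stays_fixed():
                # Reproduces the scenario from the original defect report; the fixture
                # file name is illustrative.
                result = export_to_pdf("regression-fixture.doc")
                assert result.page_count > 0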
   3.4.        CONFIGURATION AND COMPATIBILITY TESTING
          3.4.1.     If applicable, identify the types of software and hardware compatibility
                     tests that will be carried out.
          3.4.2.     List operating systems, software applications, device drivers etc. that
                     the product will be tested with or against.
          3.4.3.     List hardware environments required for in-house testing.
   3.5.        DOCUMENTATION TESTING/ONLINE HELP TESTING
          3.5.1.     Documentation and online help testing will be carried out to verify
                     technical accuracy of documented material.
           3.5.2.     If a license agreement is included in or displayed by the product (or by
                      the portion of it to which this test plan refers), ensure that the correct
                      agreement is being used (see the next item below).
   3.6.        COPYRIGHTS AND LICENSE AGREEMENT
          3.6.1.     Identify any copyright notices displayed by the program. Verify that
                     they are accurate and up to date.
          3.6.2.     In cases where an End-User License Agreement (EULA) is displayed
                     by the program, which EULA will be used in this product? Provide a
                      link to the file. Ensure that it is consistent with the one included in
                     the product.


      3.6.3.  Receive sign-off from the legal department that this is the correct
              EULA for this product.
     3.7.    UTILITY, TOOL KIT, AND COLLATERAL TESTS
        3.7.1.     If there are any additional products or components to be included in
                   the final product, or on the distribution media, list the types of tests
                   that will be carried out, and the extent to which they shall be
                   performed.
     3.8.    INSTALL/UNINSTALL TESTS
        3.8.1.     How will deployment and installation be tested?
        3.8.2.     How will the uninstallation or rollback process be tested?
        3.8.3.     Since some form of deployment is required for all software products,
                   what generic installation and uninstallation test catalogs will be used
                   or adapted for these tests?
     3.9.    CODE COVERAGE
        3.9.1.     What tools or processes will be used to assure that each line of code is
                   run at least once during testing?
        3.9.2.     Have the developers performed coverage tests during unit or
                   integration testing? Have they provided the results of these tests?
                   Have they provided source code, test harnesses, or test tools?
        3.9.3.     Are there plans to cover all code during regression testing? If not, why
                   not?
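        As one hedged example, if coverage.py is the chosen tool it can wrap a pytest
        run programmatically; the package name and test directory below are
        placeholders.

            # Hypothetical coverage harness using the coverage.py API.
            import coverage
            import pytest

            cov = coverage.Coverage(source=["myproduct"])   # placeholder package name
            cov.start()
            exit_code = pytest.main(["-q", "tests/"])       # assumed test directory
            cov.stop()
            cov.save()
            cov.report(show_missing=True)                   # lists lines never executed

        Lines reported as missing feed directly into the "if not, why not?" question
        above.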
     3.10.   INTERNATIONALIZATION
        3.10.1.    For products intended for global markets, what tests will be carried
                   out to make sure the product can be easily localized (that is, adapted
                   for a specific local market)? For products intended for Asian markets,
                   what tests will be performed to verify that the program correctly
                   handles multiple-byte character sets?
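        A minimal sketch of a multi-byte round-trip check is shown below; the
        save_document()/load_document() API and the document name are assumptions
        made for illustration.

            # Hypothetical multi-byte character-set check; the save/load API is a placeholder.
            from myproduct import load_document, save_document   # placeholder API under test

            def test_multibyte_text_round_trips_intact():
                samples = ["日本語テキスト", "简体中文", "한국어 문장"]   # Japanese, Simplified Chinese, Korean
                for text in samples:
                    save_document("i18n-sample", text)
                    assert load_document("i18n-sample") == text   # no corruption or truncation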
4.     TEST SCHEDULE AND RESOURCES
     4.1.    Identify the estimated effort required to execute the test plan. Include both
             a range and a confidence level.
     4.2.    Identify the resources available to carry out the test plan.
     4.3.    Identify time or resource constraints that will lead to a risk of the test project
             falling behind schedule, below expected scope, or below expected quality.
             Cross-reference this with the Unresolved Issues and Risks section later in
             this document.
     4.4.    If any testing is to be handled by another entity, such as another department
             or a third party test lab, identify them. List names and contact information
             at the beginning of this document. List the specific tasks they will be
             assigned to carry out. Include references to contracts with these people, and
             ensure that contracts are approved and signed.

5.   TEST PHASES AND COMPLETION CRITERIA
     5.1.   Detail the planned test cycles and phases; these should be linked to the
            development plan for the project. Specify the type of testing being done in
            each phase. Typically unit testing will be done by the developer of the code,
            and need not be covered in detail in the test plan. Integration and system
            testing phases should be detailed here.
     5.2.   Outline the criteria for assessing the severity of found defects. List
            expectations for setting the priorities on resolving them. Collaborate with
            the developer(s), project managers, and the customer representatives on this.
     5.3.   Identify in advance the criteria that must be fulfilled before each stage of
            testing can be considered complete. Make these specific, measurable, and
            decidable; otherwise, expectations will differ and time will be wasted on
            discussion and debate.
     5.4.   If there are to be staged releases of system testing (typically alpha for
            internal releases, beta for limited releases to external test sites, and a final
            release, sometimes called the “gold master”), define them. Define acceptance
            standards for each phase. Ideally these should be in a separate document
            that can be referred to here.

            Bear in mind that the standards set here may be overruled by one authority
            or another; for example, a product may
            ship with a higher than satisfactory number of minor defects, at the behest of
            a marketing department or CFO that wants the product released with time as
            the most important consideration. Be prepared to accept such decisions
            dispassionately, but also be prepared to record them as failures to fulfill the
            standards set and agreed upon in advance. Companies and individuals can
            forget easily and repeat mistakes when there is no record of breached
            agreements and their consequences; people learn and improve more easily
            when records of successes and failures are available.
6.     UNRESOLVED ISSUES AND RISKS
     6.1.   Identify issues that have yet to be decided as of this draft of the plan. Note
            these as risks to the schedule, scope, or quality of the test effort.
     6.2.   Identify other risks that may have an impact on the success of the plan. Use
            the risks outlined in the course book and the attached speaker notes as a
            guideline to identifying common risks. Refer also to the Software Project
            Survival Guide (Steve McConnell), which includes a good list of risks for
            every phase of development. When assessing risk, don’t be optimistic; the
            quality of the test plan and the risk assessment is weakened by failure to
            assess risk realistically.
7.     TEST PLAN REVIEW
     7.1.   Include plans for review of this test plan. Identify the parties to review and
            approve the document, either within the test group or with another set of
             developers or test engineers. Use ideas from these checklists to develop
             your own checklists, appropriate to the size and scope of the product.
             Identify here the checklist(s) that will be used.
   7.2.     Meet with developers and customers or customer representatives to ensure
            that the test plan meets their requirements.



