Task: Implement Tests
Implement one or more test artifacts to enable validation of the software product through actual execution. Combine tests to achieve appropriate breadth and depth of test coverage.
Discipline: Test
Purpose

Ensure that:

  • The prioritized use-case scenarios from your use-case model have been used to generate test cases
  • Each test case has an outline
  • Sources other than use cases have been examined as prospective sources of test cases
  • Test data has been identified and, ideally, typed
  • Sources of test cases have been captured in a test-ideas list

For more information on test case creation verification, see Checklist: Test Case.

Relationships
Roles
  • Primary Performer:
  • Additional Performers:
Inputs
  • Mandatory:
  • Optional:
Outputs
Steps
Select appropriate implementation technique

Select one or several of the following test implementation techniques:

  • manual test scripts
  • programmed test scripts
  • recorded test scripts
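As a minimal sketch of the programmed-test-scripts technique, the example below encodes test steps as executable assertions with Python's unittest. The function under test (add) is a hypothetical stand-in defined inline for illustration; a real system-under-test would be imported instead.

```python
import unittest

# Hypothetical system-under-test, defined inline for illustration only.
def add(a, b):
    return a + b

class AddTest(unittest.TestCase):
    """A programmed test script: each step is an executable assertion."""

    def test_positive_operands(self):
        self.assertEqual(add(2, 3), 5)

    def test_negative_operands(self):
        self.assertEqual(add(-2, -3), -5)

# Run the script programmatically so it can also be driven by a test harness.
result = unittest.TextTestRunner(verbosity=0).run(
    unittest.TestLoader().loadTestsFromTestCase(AddTest))
```

A manual test script would capture the same steps and expected results as numbered prose instructions; a recorded script would be produced by a capture/replay tool rather than written by hand.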
Implement the Test
Using your Test-Ideas List and test cases as inputs, set up your spreadsheet, specifications, or IDE to record the scripts needed to conduct the test.  If you are recording explicit steps for your test, navigate through the system-under-test, identifying steps, groups of related steps, verification points, and control points.
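The navigation described above can be sketched as a scripted walk through a system-under-test, with step groups, verification points, and control points marked explicitly. The App class below is a hypothetical login-workflow stub invented for illustration.

```python
# Hypothetical system-under-test: a minimal login workflow stub.
class App:
    def __init__(self):
        self.logged_in = False
        self.page = "home"

    def login(self, user, password):
        self.logged_in = (user == "alice" and password == "secret")
        return self.logged_in

    def goto(self, page):
        if not self.logged_in:
            raise PermissionError("login required")
        self.page = page

def test_login_then_navigate():
    app = App()

    # Step group: authentication
    ok = app.login("alice", "secret")
    # Verification point: confirm the expected state after the step group.
    assert ok and app.logged_in

    # Control point: stop the script early if a precondition does not hold.
    if not app.logged_in:
        return

    # Step group: navigation
    app.goto("reports")
    # Verification point
    assert app.page == "reports"

test_login_then_navigate()
```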
Establish external data sets
Create containers for your test data sets.  Separate production data from generated data.  Associate your data sets with a given build of the system-under-test. If data sets are associated with a particular part of the system-under-test, mark them accordingly.
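One way to realize these containers is a directory per build, with production and generated data kept apart and each data set tagged with the part of the system it exercises. The layout, build identifier, and "subsystem" tag below are illustrative assumptions, not a prescribed structure.

```python
import json
import tempfile
from pathlib import Path

# Sketch: one directory per build, production and generated data kept apart.
root = Path(tempfile.mkdtemp())
build = "build-042"  # hypothetical build identifier of the system-under-test
for kind in ("production", "generated"):
    (root / build / kind).mkdir(parents=True)

# Tag a data set with the part of the system-under-test it exercises.
record = {"subsystem": "billing", "rows": [{"id": 1, "amount": 9.99}]}
path = root / build / "generated" / "billing.json"
path.write_text(json.dumps(record))

loaded = json.loads(path.read_text())
print(loaded["subsystem"])  # billing
```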
Verify Test implementation

Verify that the test script implements the tests correctly.  In the case of manual testing, conduct a walkthrough of the test script.  For test automation, execute the test script in a controlled setting and monitor the supporting test-tooling configuration to confirm that the script behaves as intended.
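One way to check an automated script, sketched below, is to run it against both a known-good implementation and a stub with a deliberately seeded defect, confirming the script passes on the former and fails on the latter. All names here are illustrative.

```python
def run_test_script(add):
    """The 'test script' under verification: returns True if its checks pass
    against the supplied implementation of add."""
    return add(2, 3) == 5 and add(0, 0) == 0

def good_add(a, b):
    return a + b

def broken_add(a, b):
    return a - b  # deliberately seeded defect

# The script passes on correct behavior and detects the seeded defect.
assert run_test_script(good_add) is True
assert run_test_script(broken_add) is False
```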

Organize tests into Test Suites

Collect tests into related groups.  The grouping mechanism you use depends on your test environment.  You can, for example, organize test cases, test scripts, and test data hierarchically to facilitate navigation within a test as well as within the suite. Another form of test suite organization is based on system functionality and uses the quality attributes of usability, reliability, and performance as categories for groups. You may wish to follow an iteration-based test suite organization. Since the system-under-test is undergoing its own evolution, create your test suites to facilitate regression testing as well as system configuration identification.
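In a programmed-test environment, the grouping above can be realized with nested test suites, where a regression suite aggregates the functional groups. The stubs and suite names below are hypothetical.

```python
import unittest

# Hypothetical system-under-test stubs, defined inline for illustration.
def login(user):
    return user == "alice"

def render_report():
    return "<report/>"

class AuthTests(unittest.TestCase):
    def test_valid_user(self):
        self.assertTrue(login("alice"))

class ReportTests(unittest.TestCase):
    def test_render(self):
        self.assertEqual(render_report(), "<report/>")

# Group related tests into named suites; a regression suite aggregates them.
loader = unittest.TestLoader()
auth_suite = loader.loadTestsFromTestCase(AuthTests)
report_suite = loader.loadTestsFromTestCase(ReportTests)
regression_suite = unittest.TestSuite([auth_suite, report_suite])

result = unittest.TextTestRunner(verbosity=0).run(regression_suite)
print(result.testsRun)  # 2
```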

Evaluate and verify your results

Verify that the appropriate information has been completed and that the resulting artifacts are of sufficient value.

Now that you have completed the work, it is beneficial to verify that the work was of sufficient value, and that you did not simply consume vast quantities of paper.  You should evaluate whether your work is of appropriate quality, and that it is complete enough to be useful to those team members who will make subsequent use of it as input to their work.  Where possible, use the checklists provided in OpenUp Basic to verify that quality and completeness are "good enough".

Have the people performing the downstream activities that rely on your work as input taken part in reviewing your interim work?  Do this while you still have time available to take action to address their concerns.  You should also evaluate your work against the key input artifacts to make sure you have represented them accurately and sufficiently.  It might be useful to have the author of the input artifact review your work on this basis.

Try to remember that OpenUp Basic is an iterative process and that in many cases artifacts evolve over time.  As such, it is not usually necessary (and is often counterproductive) to fully form an artifact that will only be partially used or will not be used at all in immediately subsequent work.  This is because there is a high probability that the situation surrounding the artifact will change, and the assumptions made when the artifact was created will prove incorrect, before the artifact is used, resulting in wasted effort and costly rework.  Also, avoid the trap of spending too many cycles on presentation to the detriment of content value.  In project environments where presentation has importance and economic value as a project deliverable, you might want to consider using an administrative resource to perform presentation tasks.

More Information