Knowledge is Power!
Testing is not an activity that can be made up as you go along. By definition, success targets must be set in advance. This is typically done by defining a test script: a repeatable set of instructions that, when followed, determine the success or otherwise of an application.
While script writing must be done in advance, this can often be difficult, particularly where the level of system documentation is poor. It is therefore essential to understand what is appropriate to include in a script.
The challenge is to define scripts that meet the following objectives.
Keep agile: Testing effort is often out of proportion to what actually needs to be tested; scripts should be kept lean and focused.
Ensure complete coverage: It is important that there are no significant holes in the testing.
Develop in cooperation with IT: Develop scripts in close cooperation with the developers. Testers often feel that the preparation activity must be done in a vacuum, fearing that involving other parties may compromise their impartiality. This is a mistake: time is typically very constrained at this point in the project, so any way of right-sizing the effort is of benefit.
Ensure separation of testers from IT: The area where you want clear separation of responsibility is the execution of the tests, rather than the formulation of the scripts. If the scripts are clear and concise enough for a third party without any domain or functional experience to run them, you have succeeded in pitching the scripts at the right level.
Define the evidence basis: How is success verified? SOX mandates that testing evidence be recorded in certain cases.
Ensure the tests are right for what is being tested: Unless the application is very large, or has very complex STP or algorithmic processing, the following areas are really all that need to be considered for the typical web-based application:
Quality assurance: Given that the major part of the development is user-facing, the quality-assurance suite of scripts confirms that the user experience operates in a consistent way (for example, ensuring that basic standards in UI design are adhered to, such as formatting, accessibility, error handling, alignment and spelling). Support for different browsers and operating systems is also tested. The functionality of security (availability of functions to those with and without a valid account/password) is also covered. The format of these scripts is a checklist applied to every screen.
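The checklist-per-screen format can be captured as data rather than prose. A minimal sketch, assuming hypothetical screen names and checklist items (the real checklist would reflect the project's own UI standards):

```python
# Hypothetical QA checklist items applied uniformly to every screen.
QA_CHECKS = ["formatting", "accessibility", "error_handling", "alignment", "spelling"]

def screen_passes(results: dict) -> bool:
    """A screen passes only if every checklist item was explicitly verified."""
    return all(results.get(check) is True for check in QA_CHECKS)

# Illustrative results for two hypothetical screens.
login_screen = {check: True for check in QA_CHECKS}
search_screen = {**{check: True for check in QA_CHECKS}, "spelling": False}
```

Recording results per item, rather than a single pass/fail per screen, gives the evidence trail that the SOX point above asks for.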
End-to-end: The majority of screens form part of an end-to-end process, so it is important to demonstrate that users can navigate through the screens to achieve their end goal, without any logical breaks or unexpected behaviour. It is also important to demonstrate that data input at an earlier stage is carried forward in a consistent manner. The end-to-end suite of tests therefore briefly describes scenarios involving a number of screens, to ensure that each stage, and the end result, occurs as expected.
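A sketch of what such a scenario checks, with hypothetical stand-in functions for two "screens" (the real steps would be driven through the UI):

```python
def enter_details(session: dict, name: str) -> dict:
    """Hypothetical first screen: user enters their name."""
    session["name"] = name
    return session

def review_screen(session: dict) -> str:
    """Hypothetical later screen: displays data captured earlier."""
    return f"Please confirm your order, {session['name']}"

# The scenario: data entered at stage one must reappear, unchanged,
# at the review stage -- no logical break between screens.
session = enter_details({}, "Alice")
confirmation = review_screen(session)
```

The script for each scenario only needs to name the screens visited, the data entered, and where that data should reappear.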
Boundary testing: The boundary-test suite covers the behaviour of both inputs and outputs. For inputs, the values of each control are examined in isolation, stressing them with boundary conditions or values inconsistent with their type. For outputs, the boundary values of what the receiving component can accept are examined, and tests are contrived to see whether these can be exceeded (i.e. this drives us to ensure that test coverage is complete, without actually testing the interface directly).
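For the input side, the pattern is to take one control at a time and probe values just inside, on, and just outside its limits, plus a type-inconsistent value. A minimal sketch, assuming a hypothetical quantity field that accepts integers 1 to 99:

```python
def validate_quantity(value) -> bool:
    """Hypothetical input control: accepts whole numbers from 1 to 99."""
    if not isinstance(value, int):
        return False  # type-inconsistent values are rejected
    return 1 <= value <= 99

# Boundary cases for the script: each value paired with the expected outcome.
cases = {
    0: False,     # just below the lower bound
    1: True,      # on the lower bound
    99: True,     # on the upper bound
    100: False,   # just above the upper bound
    "ten": False, # inconsistent with the control's type
}
```

Listing the expected outcome beside each value makes the script runnable by a third party with no domain knowledge, per the separation objective above.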
Interface testing: This is a technical test. It checks that input to and output from each interface is as expected, and that unavailability of an interface does not cause unexpected or unsupported application behaviour, in both synchronous and asynchronous modes.
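The unavailability case is the one most often missed. A sketch of the expected behaviour, with a hypothetical downstream rate service standing in for a real interface:

```python
class InterfaceUnavailable(Exception):
    """Raised when the hypothetical downstream interface cannot be reached."""

def fetch_rate(available: bool = True) -> float:
    """Hypothetical interface call; the flag simulates an outage for the test."""
    if not available:
        raise InterfaceUnavailable("rate service down")
    return 1.25

def price_order(amount: float, available: bool = True):
    """Application behaviour under test: degrade gracefully, never crash."""
    try:
        return amount * fetch_rate(available)
    except InterfaceUnavailable:
        return None  # supported behaviour: no result, no unhandled error
```

The script asserts both the happy path and that the outage produces a defined, supported outcome rather than an unexpected failure.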
Load/SLA testing: For the purposes of agile testing these can often be combined. Load testing attempts to establish the upper number of users the application can support, while SLA testing verifies that agreed response times are met under that load.
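The load script follows a ramp-up shape: increase the simulated user count in steps until the first failure, and report the last level the application supported. A minimal sketch, with a hypothetical capacity model standing in for the real system under test:

```python
def server_handles(concurrent_users: int, capacity: int = 200) -> bool:
    """Hypothetical model of the system: serves requests up to a capacity limit.
    In a real test this would drive actual requests against the application."""
    return concurrent_users <= capacity

def find_upper_limit(step: int = 50, ceiling: int = 1000) -> int:
    """Ramp users upward in fixed steps until the first failure,
    then report the last supported level."""
    users = step
    while users <= ceiling and server_handles(users):
        users += step
    return users - step
```

The same ramp can double as the SLA test by recording response times at each level and checking them against the agreed thresholds.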