Wednesday, July 4, 2012
Requirements review in Agile: Ensuring consistency and spotting defects
Agile methodology adoption applies to requirements reviews and techniques just as it does to the development process as a whole. However, companies switching over to Agile tend to hang on to the full-fledged requirements document for regulatory or documentation reasons. Agile teams need to perform a thorough requirements review before the document is broken down into user stories.
While Agile teams frequently come up with new ideas, testers still need consistency in how they perform their reviews – what details they need, how they recognize gaps in the logic or functionality, and how they ensure all the requirements are testable.
Testing teams can spot more defects by spending time performing requirements reviews in a consistent manner. In this article, we look at requirements review techniques, examining how to determine testability and how to spot missing connections.
What is a testable requirement?
How does the tester determine if a requirement is truly testable? Testable means that, as a tester, you’re able to verify the results. There must be something to verify. Whether it’s a result, a database value, a calculation, a form – whatever it is, the tester must be able to generate it and see it.
When reviewing requirements, a good rule of thumb is the story must contain the essential elements of who, what, where, why, when and how.
For example, is this requirement testable?
The Pony2000 Client shall perform the same as the Pony1900 client.
No, it’s not – unless the requirements document details the functionality of the Pony2000 client compared to the Pony1900 client, so the tester can create a test for each functional piece. We are given the “what,” but there is no who, where, when or how, so the meaning of “perform the same” is never defined.
A requirement of this type likely covers an extensive amount of functionality. Even so, the requirement above is worthless without some description of how exactly the Pony2000 client “performs.” There is nothing to verify.
Is this example testable?
The Prescriber's dose unit (or converted dose unit if applicable) should be communicated to caregivers (CareProduct A and CareProduct E). When a dose is converted for extremely small or large units, that conversion will also be available for use in other applications.
Let’s break it down.
Who = Prescriber’s dose unit.
What = It’s communicated to caregivers or two specific integrated applications.
When = When the dose is large or small, the conversion is available to other applications.
Why, how, and where are missing. And take the who – what is the “Prescriber’s dose unit”? A better definition is necessary; what this actually means is the physician’s chosen dose unit. Then: communicated on what? Where? How? Why? Finally, what do they mean by “if applicable”? Stating what is applicable is the point of the requirements document.
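One way to keep this who/what/where/when/why/how review consistent from one requirement to the next is to record it as a simple checklist. Here is a minimal sketch in Python – the structure and names are illustrative assumptions, not a tool from any particular team:

```python
# A minimal who/what/where/when/why/how checklist for requirements review.
# The dictionary shape and wording below are assumptions for illustration.

REVIEW_ELEMENTS = ("who", "what", "where", "when", "why", "how")

def missing_elements(requirement_notes):
    """Return the review elements the reviewer could not identify."""
    return [e for e in REVIEW_ELEMENTS if not requirement_notes.get(e)]

# The dose-unit requirement above, as a reviewer might record it:
dose_unit_requirement = {
    "who": "Prescriber's dose unit (needs a better definition)",
    "what": "Communicated to caregivers and two integrated applications",
    "when": "When a dose is converted for extremely small or large units",
    # where, why and how could not be identified
}

print(missing_elements(dose_unit_requirement))  # ['where', 'why', 'how']
```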
What about this example:
When a medication has the non-pharmacy prompt set to either "CareProduct A" or "CareProduct A and CareProduct B", Pony2012 will automatically append "(NonPharm)" to the end of the display of that medication’s name displayed in the lower left order window. This display will persist during the ordering process, but will NOT be visible as a part of the medication’s name once the order is confirmed.
Yes, it is. Why?
We have:
Who = A medication with the non-pharmacy prompt set to “CareProduct A” or “CareProduct A and CareProduct B.”
What = configuration that appends NonPharm to the end of the medication name in the order display window.
How = automatic display that persists during the session and prior to order completion.
Where = in the lower left order window, during standard medication entry and before the order is completed.
Why = display a visual indicator when a non-pharmacy medication is ordered.
When = when a medication is defined as non-pharmacy.
Your test team can verify several results with this one requirement. It’s lengthy and includes multiple components, but it gives the team the information it needs to test.
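To make that concrete, here is how a tester might turn those elements into automated checks. This is only a sketch – the OrderWindowStandIn class stands in for the real Pony2012 order window, and every name in it is an assumption rather than the actual client API:

```python
# Sketch only: a stand-in for the Pony2012 lower left order window, so the
# behavior in the requirement can be expressed as pytest-style tests.

class OrderWindowStandIn:
    """Simulates how the order window displays a medication name."""

    NON_PHARM_PROMPTS = {"CareProduct A", "CareProduct A and CareProduct B"}

    def __init__(self, medication_name, non_pharmacy_prompt):
        self.medication_name = medication_name
        self.non_pharmacy_prompt = non_pharmacy_prompt
        self.confirmed = False

    def display_name(self):
        # During ordering, append the "(NonPharm)" suffix when the prompt
        # matches; once the order is confirmed, show the plain name.
        if not self.confirmed and self.non_pharmacy_prompt in self.NON_PHARM_PROMPTS:
            return f"{self.medication_name} (NonPharm)"
        return self.medication_name

    def confirm_order(self):
        self.confirmed = True


def test_nonpharm_suffix_shown_during_ordering():
    window = OrderWindowStandIn("Saline Flush", "CareProduct A")
    assert window.display_name() == "Saline Flush (NonPharm)"


def test_nonpharm_suffix_not_shown_after_confirmation():
    window = OrderWindowStandIn("Saline Flush", "CareProduct A and CareProduct B")
    window.confirm_order()
    assert window.display_name() == "Saline Flush"


def test_no_suffix_for_regular_medication():
    window = OrderWindowStandIn("Amoxicillin", None)
    assert window.display_name() == "Amoxicillin"
```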
As a test team, your first and most important action is verifying that the requirements are testable. Untestable requirements leave a wide opening for defects.
Missing connections – looking for the Grand Canyon
Once your test team has testable requirements, the next task is to review for missing connections. The test team reviews for logic or functional workflows where gaps exist. Is there a spot where an integration point exists with another application? Is the data available for transmission? If a customer places an order, do they get a confirmation number? A receipt? Are the numbers unique and does it actually print to paper? If it prints to paper, is it legible or in a format that makes it easily readable? Testing teams know these small details are often overlooked by development, but not by customers.
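Those order-confirmation questions translate almost directly into checks. A rough sketch follows, using pytest-style test functions and a place_order stub that stands in for whatever ordering service the application actually exposes – all of the names and fields are assumptions:

```python
# Sketch only: place_order is a stand-in for the real ordering service,
# and the field names are assumed for illustration.
import uuid

def place_order(item):
    """Stand-in for the application's ordering call."""
    return {
        "confirmation_number": uuid.uuid4().hex[:10].upper(),
        "receipt_text": f"Receipt\nItem: {item}\nThank you for your order.",
    }

def test_every_order_returns_a_unique_confirmation_number():
    numbers = [place_order("widget")["confirmation_number"] for _ in range(100)]
    assert all(numbers), "every order should return a confirmation number"
    assert len(set(numbers)) == len(numbers), "confirmation numbers should be unique"

def test_receipt_is_present_and_readable():
    receipt = place_order("widget")["receipt_text"]
    assert receipt.strip(), "receipt should not be empty"
    assert "Item:" in receipt and "Thank you" in receipt
```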
Consider your communication points. In healthcare, messaging systems pass information between related systems in a form readable by other health systems. Similarly, a financial application may rely on data being present to confirm an account identity. Is the receiving application aware of that data? Did anything change that would alter how the data is passed?
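A small automated check can guard those hand-off points. The sketch below assumes a message shaped as a simple dictionary with a few required fields – the field names are invented for illustration, not taken from any real interface:

```python
# Sketch: verify an outbound message still carries the fields the receiving
# system depends on. REQUIRED_FIELDS and the message shape are assumptions.

REQUIRED_FIELDS = {"account_id", "account_name", "timestamp"}

def missing_fields(message):
    """Fields the downstream application expects but the message lacks."""
    return REQUIRED_FIELDS - message.keys()

outbound = {"account_id": "A-1001", "timestamp": "2012-07-04T09:30:00"}
print(missing_fields(outbound))  # {'account_name'}
```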
Confirming that the communication and integration points are covered is critical. Once those are complete, scan for places where data should be written to a database table but isn’t. For example, in electronic health record software, every single action on a patient record is recorded, every time. Those records are retained for auditing, historical and reporting purposes. The same goes for most financial transactions, large or small – they all must be recorded in a database table. A good check: exactly which columns is the data written to? Review the database table – is anything missing? Perhaps a date field has incorrect formatting or doesn’t accept a valid number of characters. Database records are generally ripe harvesting locations for defects not covered in the requirements.
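Checks like these can also be scripted against the table itself. The sketch below uses an in-memory SQLite table as a stand-in for a patient audit table – the schema, column names and expected date format are assumptions made for the example:

```python
# Sketch: review an audit table for missing columns and badly formatted dates.
# The patient_audit schema here is an assumed stand-in, not a real product schema.
import re
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE patient_audit (
    record_id INTEGER PRIMARY KEY,
    action TEXT,
    performed_by TEXT,
    performed_on TEXT  -- expected format: YYYY-MM-DD HH:MM
)""")
conn.execute("INSERT INTO patient_audit VALUES (1, 'dose entered', 'jsmith', '2012-07-04 09:30')")
conn.execute("INSERT INTO patient_audit VALUES (2, 'dose changed', 'jsmith', '07/04/2012')")

# 1. Are all the expected columns present?
expected_columns = {"record_id", "action", "performed_by", "performed_on"}
actual_columns = {row[1] for row in conn.execute("PRAGMA table_info(patient_audit)")}
print("missing columns:", expected_columns - actual_columns)

# 2. Do the stored dates match the documented format?
date_format = re.compile(r"^\d{4}-\d{2}-\d{2} \d{2}:\d{2}$")
bad_rows = [row for row in conn.execute("SELECT record_id, performed_on FROM patient_audit")
            if not date_format.match(row[1])]
print("rows with badly formatted dates:", bad_rows)
```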
Conclusion
Your test team will enhance the value it provides when it reviews requirements documents consistently. Reviewing requirements for testability and for gaps in processing are techniques that provide that consistency. By finding defects before they are coded in, your team will consistently add value and improve your application’s quality and customer acceptance.