Jun 15 2007

Remember to define your pre-release Caveats

Recently I released a product to one of my customers for a preview (i.e. an Alpha Release), and not too long after I started getting calls with complaints about incomplete functionality... As you can imagine, I pointed out that it was an Alpha, but I might as well have been speaking Elvish because they just didn't understand. Lucky for me, the term was spelled out pretty clearly in the Software Requirements Specification (SRS) I wrote at the beginning of the project, which we all agreed to. This kind of thing is crucial: you really need to define the thresholds for your releases in writing and make the definitions as comprehensive as possible. Over the years I have found a lot of different definitions for caveats such as Alpha, Beta, Candidate Release, etc. However you define them, write them down and make sure everyone agrees. Here is what I wrote in my SRS...

PRODUCT TESTING AND SOFTWARE PRERELEASE PROVISIONS

Testing shall occur using a test team coordinated by the customer as well as internal testing coordinated by the software development team. The team shall test the product(s) and evaluate their functionality and capabilities as they apply to the current state of the development life cycle. The pre-release test group(s) shall provide information concerning software changes (enhancements and defects) through the Configuration Control Board (CCB) and test event hot-wash meetings as applicable. For details on project test procedures and methods, refer to the project Test Plan. The following pre-release testing phases are defined for this project:

  • Alpha Release testing is the first phase of testing in a software development process. This phase includes unit testing, component testing, and limited system testing. This is generally an internal test cycle and is accomplished using an Alpha Release of the product.
  • Beta Release testing is the phase of software testing in which a sampling of the intended audience uses the product(s). This phase of testing may also include a number of specialized tests designed to identify areas of the product needing improvement (Ex. Security, Performance, Scalability, etc…). Beta testing is considered "pre-release testing" and is accomplished using a Beta Release of the product.
  • Candidate Release testing (A.K.A. Production/Acceptance Test Release testing) is performed on software releases made prior to the final production release of the software. In this phase of testing, all stated requirements should have been met. The customer’s acceptance testing is often conducted in this phase and is accomplished using a Candidate Release of the product.

The aforementioned tests are designed, among other things, to verify and/or validate the product’s development. For the purposes of this document, verification is meant to answer two primary questions with respect to the product being tested.

  • "Are we building what we were asked to build?"
  • "Are we building it right?"

Verification is accomplished as part of unit testing, integration testing, systems testing, Function Qualification Test (FQT), and more. For the purposes of this document, validation is meant to answer two primary questions with respect to the product being tested.

  • "Are we building what the operator needs?"
  • "Are we building the right thing?"

Validation is primarily accomplished as part of acceptance testing although valuable insight can be gained by including a diverse set of end-users in Beta testing.

The provisions below identify the scope of each pre-release product and the test phases they support.

Alpha Release
Alpha releases are software releases made prior to any Beta release or Beta test phase. This release of the product is not yet stable and will lack many features. Alpha software is released only to support targeted analysis, design reviews, prototyping, or early product demonstrations. Alpha releases shall not be used to fully assess any level of compliance with stated requirements or goals; however, this release may be used to validate the approaches selected. Alpha releases are internal releases and should never be distributed to third parties. After some testing and revision, the product will assume Beta status.

  • Alpha Test Phase
    Alpha releases are versions of a new product still in development and represent the first phase of testing in the software development process. This phase includes unit testing, component testing, and limited system testing. This is an internal pre-release test cycle. The first Alpha release marks the beginning of the Alpha Test Phase.

Beta Release
Beta releases are versions of a new product, still in development, which are provided to external operators for testing. Beta releases are not produced until sometime after bug convergence has been observed. Bug convergence is the point at which the development team has made measurable progress against the active bug count, i.e., the point at which the rate of bugs resolved exceeds the rate of bugs reported. Many bugs are usually eliminated at the earlier Alpha stage of development and testing; however, bugs will still be present in Beta releases. Beta testers use the software to do real work and report any bugs or badly implemented features they find. Beta releases may be used to assess compliance with the stated requirements and/or goals; however, such an assessment is often limited to the subset of requirements for which the Beta was released. While Beta releases are, by definition, releasable to the “public”, they should only be distributed to a limited number of expert operators in the earliest phases. Distribution of Beta versions allows operator testing and feedback to the developer so that any necessary modifications can be made before the final product release. The term Beta release is given to a product that is not ready for public consumption but is good enough for a wider testing scope. By the end of a Beta test, all major bugs should have been discovered and repaired. Generally, Beta testing is considered the final pre-release stage of testing and includes experienced testers external to the developing organization.
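
As an illustration (not part of the SRS language), bug convergence can be spotted directly from defect-tracker counts. The sketch below is a minimal example in Python with made-up numbers; the data and function names are my own:

    # Hypothetical weekly counts pulled from a defect tracker.
    weekly_counts = [
        # (week, bugs_reported, bugs_resolved)
        (1, 40, 10),
        (2, 35, 22),
        (3, 30, 33),
        (4, 18, 31),
    ]

    def bug_convergence_week(counts):
        """Return the first week in which the resolve rate exceeds the report rate."""
        for week, reported, resolved in counts:
            if resolved > reported:
                return week
        return None  # convergence not yet observed

    print(bug_convergence_week(weekly_counts))  # -> 3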

  • Beta Test Phase
    The Beta test release marks the beginning of the Beta Test Phase. Beta Testing is the second phase of testing in which a sampling of the intended audience tries the product out. Beta testing is considered an external pre-release test cycle.

Candidate Release
The Candidate Release (A.K.A. Production/Acceptance Test Release) is a software release made prior to the final production release of the software. In this phase of development, all stated requirements should have been met or mitigated to a final status. Candidate Releases are not produced until the project has observed a zero-bug bounce. Zero-bug bounce is the point at which development has resolved all bugs raised by testing and no active bugs remain. The customer’s acceptance testing is often conducted with this release and may include a wide range of testing models (e.g. usability, security, performance, etc…) needed to fully validate the product. This is still considered a pre-release product pending final release approval from the customer.
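
Along the same lines, a zero-bug bounce is easy to detect from the active bug count. A minimal sketch, again with hypothetical data and names of my own choosing:

    # Hypothetical active (open) bug counts per build.
    active_bug_counts = [
        ("beta-3", 12),
        ("beta-4", 5),
        ("rc-0", 0),   # zero-bug bounce: no active bugs remain
        ("rc-1", 2),   # new bugs may still arrive afterwards
    ]

    def zero_bug_bounce(history):
        """Return the first build whose active bug count reaches zero, if any."""
        for build, active in history:
            if active == 0:
                return build
        return None

    print(zero_bug_bounce(active_bug_counts))  # -> 'rc-0'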

  • Production Test Phase
    The Production Test Phase is the final phase of testing prior to the full release of the product, in which the customer performs final acceptance testing. Particular attention is placed on media (re)production methodologies, software installation routines, and documentation.


Nov 22 2006

Authoring software requirements with a common vernacular

So it was that time again... Time to write requirements for a new project. Depending on your environment and the complexity of the anticipated solution, this can be a really tedious piece of work. As with most efforts like this, you need to try to create a common vernacular. This is crucial to ensuring you have requirements that are objective and agreeable. Here are a couple of examples.

Software Requirements Validation Criteria 

The requirements defined in the Software Requirements Specification (SRS) are evaluated with respect to the following criteria. These criteria are not intended to compete with more formal definitions, but are provided to clarify language usage among project stakeholders:

  • Correct. The SRS is correct if and only if every requirement stated therein represents something required of the system to be built.
  • Unambiguous. The SRS is unambiguous if and only if every requirement stated therein has only one interpretation.
  • Complete. The SRS is complete when it encompasses the following four qualities: 
    • Everything that the software is supposed to do is included in the SRS.
    • Definitions of the responses of the software to all classes of input data in all classes of situations are included.
    • All artifacts related to the requirement are properly identified and referenced.
    • No sections are marked as To Be Determined (TBD).
  • Verifiable. The SRS is verifiable if each requirement can be implicitly or explicitly tested. Every attempt should be made to minimize the number of implicitly tested requirements.
  • Consistent. The SRS is consistent if it uses the same terminology throughout.
  • Modifiable. The SRS should be written in such a way that requirements can be easily added, changed, and deleted. Establish criteria for how this will be accomplished in the Overview section of the SRS.
  • Externally Consistent. The SRS is externally consistent if and only if no documented requirement conflicts with the Statement of Work, System Requirements, or other project-specific controlling documents.
  • Internally Consistent. The SRS is internally consistent if and only if no documented requirement conflicts with another documented requirement.
  • Traceable. The SRS should be traceable to the system/software specification which is provided by the customer. Each requirement shall identify its source.
  • Design-independent. The SRS shall not contain any design-dependent requirements.
  • Concise. The SRS should contain only the necessary verbiage. Elimination of excess wording is critical for conciseness. Use of compound expressions (e.g. X and Y) shall also be avoided.
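
A few of these criteria lend themselves to a quick mechanical pass before a human review. The sketch below is purely illustrative (Python; the function and checks are my own, not part of the SRS) and flags TBD markers and likely compound expressions:

    import re

    def quick_checks(requirement_text):
        """Flag two mechanically detectable issues: TBD markers and compound expressions."""
        issues = []
        if re.search(r"\bTBD\b|\bto be determined\b", requirement_text, re.IGNORECASE):
            issues.append("incomplete: contains a TBD marker")
        if re.search(r"\bshall\b[^.]*\band\b", requirement_text, re.IGNORECASE):
            issues.append("not concise: possible compound expression")
        return issues

    print(quick_checks("The report module shall export PDF and CSV output."))
    # -> ['not concise: possible compound expression']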

Best Practice Guidelines for Authoring Requirements

This project used a natural-language specification approach for authoring requirements. The natural-language approach requires only moderate training for those specifying the requirements and little or no training for those who use the specification (e.g., software designers, programmers, testers, etc…). In general, each requirement statement shall address the following elements:

  • localization
  • entity
  • action
  • target
  • constraint

The following example requirement statement exhibits all five of the elements:

"When the user selects the 'Check-Out' document function on the Document Control Page, the Document Control Page shall launch the most recent document revision in its native application unless the Document Control Page is empty (i.e., has no document revisions)."

  • localization - When the user selects the 'Check-Out' document function on the Document Control Page
  • entity - the Document Control Page
  • action - launch
  • target - the most recent document revision in its native application
  • constraint - unless the Document Control Page is empty (i.e., has no document revisions)
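
If you keep requirements in a structured form, the five elements map naturally onto a simple record. The following sketch is illustrative only (Python; the class and field names are mine, not part of the SRS) and captures the example statement above:

    from dataclasses import dataclass

    @dataclass
    class Requirement:
        """One natural-language requirement broken into the five elements."""
        localization: str  # triggering condition or context
        entity: str        # the thing that must act
        action: str        # the imperative verb phrase
        target: str        # what the action operates on
        constraint: str    # qualifying condition, if any

    # The example statement, captured element by element.
    check_out = Requirement(
        localization="When the user selects the 'Check-Out' document function "
                     "on the Document Control Page",
        entity="the Document Control Page",
        action="launch",
        target="the most recent document revision in its native application",
        constraint="unless the Document Control Page is empty "
                   "(i.e., has no document revisions)",
    )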

The following bullets provide some additional guidance for developing sound requirements statements:

  • Avoid weak phrases, such as:
    • as appropriate
    • may
    • if required
    • if practical
    • at a minimum
    • be able to
    • capable of
    • not limited to
    • as much as possible
  • Avoid imprecise words, such as:
    • simply
    • quickly
    • efficiently
    • friendly
    • timely
    • easy
    • normal
    • adequate
    • effective
  • Avoid generalities, such as:
    • large
    • many
    • most
    • near
  • Use the correct imperative verb:
    • shall - prescribes
    • will - describes
    • must - constrains
    • should - recommends
    • may - permits
    • can - indicates capacity/ability/possibility
  • Avoid compound statements such as ‘Shall do X and Y’ or ‘Shall include X, Y, Z’. This type of requirement should be split into separate requirements (Ex. Shall do X, Shall do Y, Shall include X, etc…) unless it is an all-or-nothing requirement.
  • Do not use different words to refer to the same thing, for example:
    • computer screen and computer monitor
    • window frame and window pane 

Note: The use of the terms SHOULD and SHALL is guided by RFC 2119. If you find yourself writing requirements, formally or otherwise, this is worth a read.
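
Much of this guidance can also be checked mechanically before a review. The sketch below is illustrative only (Python; the word lists are trimmed-down copies of the bullets above and the review function is my own, not part of any formal tool):

    import re

    WEAK_PHRASES = ["as appropriate", "if required", "if practical", "at a minimum",
                    "be able to", "capable of", "not limited to", "as much as possible"]
    IMPRECISE_WORDS = ["simply", "quickly", "efficiently", "friendly", "timely",
                       "easy", "normal", "adequate", "effective"]
    IMPERATIVES = ["shall", "will", "must", "should", "may", "can"]

    def review(statement):
        """Return advisory findings for one requirement statement."""
        findings = []
        text = statement.lower()
        for phrase in WEAK_PHRASES + IMPRECISE_WORDS:
            if phrase in text:
                findings.append(f"avoid '{phrase}'")
        if not any(re.search(rf"\b{verb}\b", text) for verb in IMPERATIVES):
            findings.append("missing an imperative verb (shall/will/must/should/may/can)")
        return findings

    print(review("The operator should be able to generate reports quickly."))
    # -> ["avoid 'be able to'", "avoid 'quickly'"]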

Each statement should be accompanied by the source of the requirement and the date it was entered. If the source is a person (e.g., from a meeting, an informal conversation with the customer or customer representative, or a requirements elicitation session), the source should review the requirements statement once documented.
