HL7 DSTU testing toolkit
Introduction
This is a project hosted by the ICTC as part of its Implementation workstream.

- Project Team
The HL7 development framework (HDF) provides guidance on how to develop a standard. However, the HDF does not provide guidance on how best to evaluate the standard. Without such guidance, the burden falls on each technical committee (TC) to determine its own methods of proving that a standard is fit for purpose. In addition, implementers have no clear way to claim that a system conforms to an HL7 V3 standard.
The goal of this project is to provide guidance to implementers on when an HL7 V3 standard is stable enough to implement, and a mechanism to ensure that a system conforms to an HL7 V3 standard. During the course of this project we may need to distinguish between when an HL7 V3 standard is ready for early adoption and when it is ready for mainstream adoption.
This project does not directly address the testing of locally produced implementation guides, but it may inform such activity. This may become the subject of a future project.
Rationale
There is increasing pressure for HL7 standards, like other standards, to be of high quality before they become normative. In fact, several governments require by law that normative standards be adopted.
To meet this need, the Implementation and Conformance Technical Committee (ICTC), in conjunction with the project lifecycle task force, is determining the best procedures for testing standards. The ICTC has approved a project that focuses on procedures and artifacts for testing. The following document is a straw man for what tests need to occur before each ballot type, and it also provides an example of a test plan. At this time, this document only covers testing a standard. It does not cover how to ensure that an implementation guide meets the standard or how to ensure that a system is conformant to the standard.
Please note that no process will ensure that an exchange standard is perfect. No matter how much effort is put into the testing process, it is probable that deficiencies will be uncovered once the exchange standard is used in the mainstream.
Approach
A vote for DSTU is a vote to allow some implementers to implement the standard at moderate risk.
The plan below uses the HDF and extends it with artifacts that map every requirement described in the storyboards to evidence that the exchange standard can satisfy it. Furthermore, the plan gives implementers a structured way (a template) to inform the sponsoring group which requirements they have implemented.
Pre-DSTU period
Before committing resources to implement a standard, the implementer should have a reasonable expectation that the standard will not radically change. The HL7 development framework (HDF) and the committee/informative ballot ensure that the stated requirements are firm. In the DSTU ballot package, the working group should provide a stability statement, which states which parts of the standard are ready to be implemented and which parts are likely to change.
If a standard passes DSTU ballot, the HL7 community (sponsoring group) confirms that:
- Requirements are properly documented as defined by the HDF and agreed to by the recipients and senders (based upon the activity diagram).
- A test plan (a new artifact) has been produced. Test plans help to gather requirements; often, creating a test plan surfaces new requirements.
- Implementers have performed a walkthrough (review) of the RMIM to ensure that the specification is implementable and meets the stated requirements.
- The standard is ready to be implemented at moderate risk.
DSTU Testing Plan
More experience is needed to finalize recommendations on creating a DSTU test plan.
There are many reasons why a test plan should be created. A test plan should determine the effort needed to validate the acceptability of a standard and the solutions needed to use it. The test plan should also provide visibility into how the standard is validated, including, but not limited to, the objective, scope, and approach to testing. The test plan should be independent of the system that will be used to test the standard, although it is not feasible to test a standard without a system.
The HL7 development framework (HDF) requires that storyboards, interaction diagrams, and state diagrams be created. The storyboards are refined during the requirements gathering process, and there should be a storyboard for all supported states and interactions. In short, the storyboards should contain all of the business requirements. Accordingly, each storyboard should be tested.
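As a minimal sketch of that traceability, a sponsoring group could record which storyboards each test case claims to cover and flag any storyboard that no test case exercises. The storyboard and test case identifiers below are invented placeholders, not drawn from any ballot.

    # Sketch: trace storyboards to the test cases that exercise them.
    # Storyboard and test case identifiers are invented placeholders.
    storyboards = {"ST-01 Request lab result", "ST-02 Amend lab result"}

    test_cases = {
        "TC-001": {"ST-01 Request lab result"},
        "TC-002": {"ST-01 Request lab result"},
    }

    covered = set()
    for case_id, covers in test_cases.items():
        covered |= covers

    untested = storyboards - covered
    if untested:
        print("Storyboards with no test case:", sorted(untested))
    else:
        print("Every storyboard is covered by at least one test case.")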
The test scripts should list the objectives of the test and which storyboards, interactions, and states the test covers. The objective explains in natural language (English) what the content of the message is. One test case may be predicated on another test case; in that case, the prerequisites should be listed. In addition, each test case should be verified for accuracy by a sender, a recipient, and an implementer. See test plan example.
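One possible shape for such a test case record is sketched below. The field names are illustrative assumptions of this sketch, not fields mandated by the HDF or by this project.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class TestCase:
        """Illustrative test case record; the field names are not normative."""
        identifier: str
        objective: str          # natural-language statement of the message content
        storyboards: List[str]  # storyboards the test covers
        interactions: List[str] # interactions the test covers
        states: List[str]       # states the test covers
        prerequisites: List[str] = field(default_factory=list)  # test cases this one is predicated on
        xml_snippet: str = ""   # evidence that the standard can carry the requirement
        verified_by: List[str] = field(default_factory=list)    # e.g. sender, recipient, implementer

    example = TestCase(
        identifier="TC-001",
        objective="The sender requests a laboratory result for a named patient.",
        storyboards=["ST-01 Request lab result"],
        interactions=["IN-001"],
        states=["new"],
        verified_by=["sender", "recipient", "implementer"],
    )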
The business users (both sender and receiver) will ensure that all requirements are documented (in storyboards) and that the test plan reflects real-world processes. The XML snippet will prove that the requirement can be met by the standard. The implementer will ensure that the XML snippet provided meets the test case objective as written. It is not expected that all business users will be able to identify how the requirements are met in the standard, nor that all implementers will know why certain requirements are needed.
Implementers will also need to describe which storyboards, interactions, and states they have implemented. There should be at least two forms of implementation per storyboard, interaction, and state. Although creating and consuming an exchange standard are different activities, if one system creates an exchange standard instance and a different system consumes it, that would be considered two forms of implementation.
If XML snippets exist for each function of the exchange standard, and at least two implementations of every storyboard have occurred, then the standard is ready to go to a normative ballot.
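A sponsoring group could tally the reported implementations against that threshold along the lines of the sketch below; the system names and storyboard identifiers are hypothetical.

    from collections import defaultdict

    # Sketch: each implementer reports which storyboards its system covers.
    # System names and storyboard identifiers are placeholders.
    reports = [
        {"system": "System A (creates)", "storyboards": ["ST-01", "ST-02"]},
        {"system": "System B (consumes)", "storyboards": ["ST-01"]},
    ]

    forms_per_storyboard = defaultdict(set)
    for report in reports:
        for storyboard in report["storyboards"]:
            forms_per_storyboard[storyboard].add(report["system"])

    for storyboard, systems in sorted(forms_per_storyboard.items()):
        status = "ready" if len(systems) >= 2 else "needs another implementation form"
        print(f"{storyboard}: {len(systems)} form(s) - {status}")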
At this time, this document does not describe how to ensure that a system either consumes or creates the HL7 exchange standard correctly. Another set of user acceptance test scripts will have to be created. These user acceptance test scripts will give implementers assurance that their system conforms to the standard and give the business assurance that their requirements are implemented correctly. However, it is expected that the same test cases used to prove that the standard is fit for purpose will be leveraged for the implementation guide as well as for system conformance/user acceptance testing.
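Although system conformance testing is out of scope here, one basic structural check that such user acceptance scripts might build on is schema validation of a message instance, for example with lxml in Python. The file names below are placeholders for whatever schema and instance a project publishes.

    # Sketch: validate a message instance against the published XML schema.
    # "message_schema.xsd" and "test_instance.xml" are placeholder file names.
    from lxml import etree

    schema = etree.XMLSchema(etree.parse("message_schema.xsd"))
    instance = etree.parse("test_instance.xml")

    if schema.validate(instance):
        print("Instance is schema-valid (necessary, but not sufficient, for conformance).")
    else:
        for error in schema.error_log:
            print(f"line {error.line}: {error.message}")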
Before a standard is balloted as normative, the following should occur:

- Two forms of implementation of the standard exist (they need not be commercial or fully implemented)
  - For example: an XForm, a style sheet, or commercial off-the-shelf software (a style-sheet sketch follows this list)
  - See implementation coverage
- The test plan is executed at least once (see test script example)
- The standard is updated based on experience. The standard should only change if it is difficult to implement or is not implementable; in rare cases, new requirements can be added.
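As one illustration of a lightweight implementation form, a style sheet can act as a consumer of the exchange standard. The sketch below applies a trivial XSLT transform to an instance using lxml; the element names and transform are invented for illustration and do not correspond to an HL7 V3 schema.

    # Sketch: a style sheet as a minimal "consuming" form of implementation.
    # The XML instance and XSLT below are invented placeholders.
    from lxml import etree

    instance = etree.XML("<observation><value>7.2</value></observation>")

    transform = etree.XSLT(etree.XML("""
    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:output method="text"/>
      <xsl:template match="/observation">Observed value: <xsl:value-of select="value"/></xsl:template>
    </xsl:stylesheet>
    """))

    print(str(transform(instance)))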
Related / existing work

- co-chairs handbook
- DSTU comments page - http://www.hl7.org/dstucomments/index.cfm
- Project Lifecycle Taskforce
Early Adopters of project outputs
- RCRIM projects
- ITS SIG projects
- Wrappers R2 project
Deliverables
- Changes to co-chairs handbook
- Guidelines for running DSTU
- Templates for reporting test results
- Examples of best practice
- Provide guidance and documentation to prove that an HL7 V3 standard is fit for purpose and stable enough for implementers
- Document a testing process that can be followed to establish whether a standard meets these requirements
- Create artefacts and templates to record how an HL7 V3 standard meets the balloted business requirements
- Provide guidance and documentation to prove that a system, either a consumer or a producer, conforms or partially conforms to an HL7 V3 standard
Project Plan
- Produce project scope statement
Conference calls
This project is discussed on the regular implementation calls: http://www.hl7.org/concalls/index.cfm?action=home.calldetail&wg_concall_id=3289&workingcalendardate=06/04/2007&listofwgids=