HTC Tools Roadmap

Introduction

The HL7 Tooling Collaborative (HTC) has been established with resource contributions from key HL7 stakeholders, to deliver significant benefits within 1 year of the project's initiation. This collaboration will enhance and integrate current initiatives by various stakeholders. A business model and project organisation, as well as a high-level technical architecture for tools, are described in separate documents.

At the request of the HTC, the HL7 Tooling Committee has undertaken to revise and extend this roadmap document to reflect a more complete picture of HL7 tools requirements. The extension is reflected here on the Wiki.

Tools 'vision'

In general, the HL7 / NHS Collaborative Tools Project envisions tooling that:

  1. produces machine-processable artefacts spanning all stages of the message design cycle (requirements, design, implementation and testing)
  2. supports end-to-end automated testing of interoperability solutions
  3. standardises the type and quality of the information conveyed between each stage and between communicating organisations
  4. produces coherent, traceable and versioned concepts from analysis to implementation
  5. facilitates direct involvement / feedback in international standards and tools development, ensuring ongoing alignment of implementation specifications with industry standards, including HL7 V3
  6. reduces message development time, allowing the automatic translation of message designs to supplier-specific formats
  7. facilitates consistent workflows and project management / oversight
  8. provides a framework for publishing documentation about the artefacts generated throughout the process.

Tooling 'Roadmap'

A potential summary view of tools required to accomplish the vision is provided in Figure 1.

Editors Please Note: If you wish to help by editing this diagram, there are two source forms available as Visio files. The first is the unmodified source file from HTC. The second is the file that generated the GIF below. It was taken from the HTC Word document and then cobbled together in Visio.

[Image: HTC-Roadmap-03.gif]

Figure 1. Summary of potential tools with flows representing general progression through message development and implementation stages. Feedback flows are not represented in this diagram. The gray arrows represent the areas documented with the documentation editing and management tools.

Description

A brief description of the function of each of the boxes in the diagram follows:

  • Grey Box - Those components whose sources and outputs are primarily represented in the Model Interchange Format (MIF) and managed by the Shared Artefact Repository.
  • Business Requirements Modeling Tools - Enable the use of UML modeling concepts to develop the following components of a domain analysis model (DAM): (1) A business process model that diagrams a business process as an ordered collection of activities with a beginning triggered by an event that uses defined inputs (information and resources) to produce a specific output; (2) A class model that captures the business process information that is exchanged; (3) An activity diagram of the information exchange actors and interactions; and (4) A glossary that defines business terms, which may be used in the development of vocabulary domains and code sets. The resulting DAM artifacts must be precise, stringent specifications of each information exchange of interest in a technology-neutral notation understandable to domain experts, and must be capable of being exported as XMI to ensure traceability between business requirements and the development of HL7 Static and Dynamic models, and terminology.
  • Terminology Management Tools - Terminology management tools are required to support the run-time exchange and design-time development of interoperable information models. For these tools to be effective, they need to integrate with the larger framework of message design and implementation tooling, and are required to support: (1) the creation and maintenance of coded concepts, value sets, and domains in a controlled and repeatable format, which is necessary to ensure consistency and reduce ambiguity in the terminology source; (2) the ability to specify both the static and dynamic binding of vocabulary elements to a static model attribute at design time; and (3) the ability to resolve value set contents at run time when a value set is bound to a static model attribute. Because value set elements are preferably drawn from standard vocabularies, terminology management tools must provide consistent and standardized access across these disparate terminology sources through a common set of vocabulary APIs, and support access across a federated vocabulary distribution model. An illustrative sketch of such a vocabulary API appears after this list.
  • Static Model Designer - Version 3 is predicated on supporting communication between systems where the contents of the communication are based on a common Reference Information Model (RIM). The HL7 Development Framework (HDF) defines the rules whereby static information models (a UML term) are constrained to define the information structures that support communication. This process of refinement by constraint has been automated in HL7 using a static model designer (previously, an "RMIM designer"). This component supports the definition of static information models derived from the HL7 RIM and enforces the derivation rules embedded in the HL7 methodology.
  • Dynamic Model Designer - The "dynamic model" in Version 3 defines the communication requirements placed on individual application components in order to implement an interoperable set of systems using the HL7 V3 standards. The dynamic model is the standard specification that implements the business requirements for each subject domain, and is ultimately the foundation for conformance testing of V3 specifications. This component will support the definition and diagramming of the dynamic model content following the rules of the HDF.
  • Schema Generator - This includes creation of schemas for static models (wrappers, CMETs, payloads, etc.) as well as interactions based on the published XML ITS. It also potentially includes generation of additional ITSs.
  • Example & Test Message Generator - Based on MIF structures and repositories of sample data and message fragments, this component produces XML instances compliant with one of the XML ITSs. This may include both the creation of larger fragments (e.g. example payloads) as well as full-blown interactions. The process will include integrated validation of the messages so generated. The process should also allow the construction of deliberately invalid instances for use in testing applications' validation capabilities.
  • Message Tester - This component has two aspects. The first provides the capability to validate XML instances against static and dynamic model definitions, including templates and conformance profiles. The second provides a testing framework using various transport protocols, able to receive test messages and return canned responses based on customizable configurations. A minimal sketch of both aspects appears after this list.
  • Code Generator Frameworks - To aid implementers, object-level interfaces can be automatically generated to provide parsing, serialization and validation capabilities for defined models. Code can be generated in a variety of languages, with initial focus likely on Java and .NET technologies. A hypothetical sketch of such a generated interface appears after this list.
  • Documentation Editing & Management Tools - In order to allow domain experts to vote on the technical content of an HL7 specification, it is necessary to document the context and requirements that led to these specifications, the relationship of the "business model" to the technical model, and so forth. These publication "packages" are then assembled into an overall framework that allows the automated publication of the content. This component will support the committee facilitators in assembling and previewing their ballot content, and expressing it (using MIF schemas) in XML. One key feature of this component is the inclusion of "WYSIWYG" editors, and ballot previews in HTML. In current tooling, this has been partially fulfilled with the HL7 Publication Database (PubDb).
  • Documentation Publisher - Once individual committees have assembled their "ballot packages" for the content they wish to submit for voting, the staff at HL7 Headquarters, under the supervision of the Publishing Committee, must convert this content into a ballot web-site of several thousand pages. This process currently takes a variety of XML files (in several idiosyncratic formats) and assembles these into the ballot site. The current process is heavily automated, but not well integrated. Development of this component of the HTC will replace the existing functionality with a process based upon source MIF files.
  • Shared Artefact Repository - The SAR serves as the persistent store for all artefacts created through use of the HTC tool set. It exists in two parts: the centralised part (CSAR) and the local part (LSAR). The CSAR resides on a shared computing platform; it (1) maintains the revision history of artefacts, (2) reports differences between two versions of the same artefact, and (3) is organised into different project areas. The LSAR resides on a user's local computer and is accessed through the HTC tool set; it (1) supports search and retrieval of artefacts, and (2) enforces the repository structure.
  • Design Analysis Tools - This component will support various quality assurance methods, including comparing models to validate derivation history, reporting on model differences, tracing model element use across models, and verifying constraint definition rules as they are formalized. Reporting on the results of analysis and management of derivation artifacts are also anticipated.
  • Artefact Configuration Management tool - Managing the status and version naming of artifacts in the Shared Artefact Repository as new or updated components are added. Configuration of tool components for deployment is one function. Configuration of HL7 specification related artefacts into specific editions is another function.
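
The design-time binding and run-time resolution behaviour described for the Terminology Management Tools can be illustrated with a small sketch. The Java interface below is purely illustrative: the names (ValueSetResolver, Concept, resolve, isMemberOf) are assumptions made for this roadmap discussion, not part of any published HL7 vocabulary API.

    // Illustrative sketch of a common vocabulary API as described above; the
    // interface, record and method names are assumptions, not an HL7 specification.
    import java.util.Set;

    public interface ValueSetResolver {

        /** A coded concept drawn from some code system (e.g. SNOMED CT, LOINC). */
        record Concept(String codeSystem, String code, String displayName) {}

        /**
         * Resolve the membership of a value set at run time, wherever the
         * underlying terminology source lives in the federated distribution model.
         */
        Set<Concept> resolve(String valueSetId, String versionLabel);

        /** Test a single code against a value set bound to a static model attribute. */
        boolean isMemberOf(String valueSetId, Concept concept);
    }

In such a sketch, a static binding would pin versionLabel at design time, while a dynamic binding would leave it open so that the current value set contents are resolved at run time.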
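
The two aspects of the Message Tester can be sketched in the same spirit. The minimal Java fragment below, illustrative only, validates an instance against a generated ITS schema and then listens for test messages over HTTP, returning a canned response; the schema and instance file names, the port, the path and the response body are all assumptions rather than HTC deliverables.

    // Minimal, illustrative sketch of the Message Tester's two aspects.
    import com.sun.net.httpserver.HttpServer;
    import java.io.File;
    import java.io.OutputStream;
    import java.net.InetSocketAddress;
    import java.nio.charset.StandardCharsets;
    import javax.xml.XMLConstants;
    import javax.xml.transform.stream.StreamSource;
    import javax.xml.validation.SchemaFactory;
    import javax.xml.validation.Validator;

    public class MessageTesterSketch {
        public static void main(String[] args) throws Exception {
            // Aspect 1: validate an XML instance against a schema generated from the static model.
            SchemaFactory factory = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
            Validator validator = factory.newSchema(new File("interaction.xsd")).newValidator();
            validator.validate(new StreamSource(new File("example-instance.xml")));  // throws on invalid content

            // Aspect 2: receive test messages over a transport (plain HTTP here) and
            // return a canned response; real configurations would make this customisable.
            byte[] canned = "<acknowledgement>AA</acknowledgement>".getBytes(StandardCharsets.UTF_8);
            HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
            server.createContext("/test", exchange -> {
                exchange.sendResponseHeaders(200, canned.length);
                try (OutputStream body = exchange.getResponseBody()) {
                    body.write(canned);
                }
            });
            server.start();
        }
    }

Validation against the dynamic model, templates and conformance profiles would require rules beyond plain schema validation; the sketch covers only the schema-level check.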
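
Finally, the kind of object-level interface a Code Generator Framework might emit for a single static model class can be sketched as follows. Everything here, including the class name PatientRegistrationMessage and its methods, is hypothetical; it only illustrates the parse/serialise/validate surface described above, not the output of any existing generator.

    // Hypothetical example of a generated object-level interface; names and
    // methods are illustrative assumptions only.
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.util.List;

    public interface PatientRegistrationMessage {

        /** Typed accessors generated from the static model's attributes. */
        String getPatientId();
        void setPatientId(String id);

        /** Parse an XML instance (per the XML ITS) into an object graph. */
        static PatientRegistrationMessage parse(InputStream xml) {
            throw new UnsupportedOperationException("generated implementation goes here");
        }

        /** Serialise the object graph back to an ITS-conformant XML instance. */
        void serialize(OutputStream out);

        /** Validate against the model's constraints, returning human-readable issues. */
        List<String> validate();
    }

An equivalent surface could be generated for .NET languages; only the target-language templates in the framework would change.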

Dependencies

Some of the tools listed above have dependencies either on other tools (e.g. represented by input flows in Figure 1), or on design decisions required for other tools.

  1. All tools must be able to easily interoperate and should be orthogonal. The interoperability layer that underpins this toolset is a prerequisite for all design.
  2. The Static Model Designer must be able to accept input from the Business Requirements Analysis tool [XMI], as well as feedback from the Message Tester, and the Example & Test Message Generator.
  3. The Dynamic Model Designer must be able to accept input from the Business Requirements Analysis tool.
  4. The Schema Generator requires output from both the Static and Dynamic Model Designers.
  5. The Code Generator requires output from the Static Model Designer and Schema Generator.
  6. The Example & Test Message Generator requires the Static Model Designer and Schema Generator.
  7. The Message Tester requires output from the Static Model Designer, Schema Generator and Shared Artefact Repository (with stored Test Messages).
  8. The Documentation Editing and Managing Tools provide support for collecting textual documentation about the Business Requirements and Model Designs.
  9. The Documentation Publisher draws primary data from the Shared Artefact Repository and the Documentation Management Tools.
  10. The Shared Artefact Repository needs clear guidelines on the types and defining attributes of artefacts.

Some of these prerequisites require greater definition than others.
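
For illustration, the forward flows in the dependency list can be captured as a simple adjacency map that a build or orchestration script could use to order generation steps. The shorthand tool names below are assumptions for this sketch only, and feedback flows are omitted, as in Figure 1.

    // Illustrative only: dependencies 2-7 and 9 above, encoded as a map from a
    // tool to the tools whose output it consumes.
    import java.util.List;
    import java.util.Map;

    public class ToolDependencies {
        static final Map<String, List<String>> CONSUMES = Map.of(
            "StaticModelDesigner",            List.of("BusinessRequirementsModeling"),
            "DynamicModelDesigner",           List.of("BusinessRequirementsModeling"),
            "SchemaGenerator",                List.of("StaticModelDesigner", "DynamicModelDesigner"),
            "CodeGenerator",                  List.of("StaticModelDesigner", "SchemaGenerator"),
            "ExampleAndTestMessageGenerator", List.of("StaticModelDesigner", "SchemaGenerator"),
            "MessageTester",                  List.of("StaticModelDesigner", "SchemaGenerator", "SharedArtefactRepository"),
            "DocumentationPublisher",         List.of("SharedArtefactRepository", "DocumentationEditingAndManagement"));
    }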

Achieving the vision

In order to achieve the full suite of tools in the vision, the following high-level schedule is anticipated (presuming resources become available to support all listed sub-projects).

Tools completed by end of Year 1:

  • Infrastructure components
  • Shared artefact repository
  • Business requirements modeller
  • HL7 Model Designer
  • Schema generator
  • Example instance generator

Tools produced for use within Year 1 and completed by end of Year 2:

  • Terminology tools
  • Design analysis tools
  • Message testers
  • Code generator frameworks

Commercial models & licensing

Stakeholders will engage with the project in different ways, depending on their particular requirements and commercial interests. The following are proposed models.

NOTE: The following model elements are copied from the HTC Draft MOU, replacing the equivalent elements in the roadmap. As the Roadmap advances, this note paragraph should be dropped.

  • Open Model. This model allows full sharing with no commercial obligations. All artifacts contributed in this way will be made available through Eclipse. Artifacts delivered in the Open Model may be freely used in packaged solutions developed under other models, where there may be commercial interests.
  • Consortium model. This model will allow for full sharing with a group of stakeholders. Artifacts may be licensed to other parties outside the consortium, on commercial terms agreed by the consortium.
  • Closed model. Artifacts developed and shared according to the closed model will remain private to the Collaborative Member, who may then license the product to other parties.
  • Tooling Vendor model. Artifacts developed under the Tooling Vendor model remain private to the Collaborative Member, who may then license them and optionally act as a broker and service provider for other stakeholders who wish to market their products developed under a non-open model.

Tools and Proposed Commercial Models

A summary of tools and proposed commercial models:
Tool                                      Proposed commercial model(s)
Business requirements modeller            Tool Vendor / OTS
Terminology management*                   Open, Consortium, Tool Vendor / OTS
Static model designer                     Open
Dynamic model designer                    Open
Schema generator                          Open
Code generator frameworks                 Open, Consortium, Closed
Example instance generator                Open
Message & interactions tester*            Open, Consortium, Closed
Documentation Editing & Management Tools  Open
Documentation Publisher                   Open
Shared artefact repository                Open
Design analysis                           Open
Artefact configuration management         Open
Infrastructure components                 Open

* Different parts of these tools will have different modes of involvement. The common underlying infrastructure will be open, and extensions may be either open or developed under the consortium or closed models.