FHIR QA Guidelines
Revision as of 20:08, 2 January 2013

This page identifies the guidelines used as part of the QA process. Not all of them will necessarily be evaluated, and we still need to define who is responsible for verifying each one. Ideally, most QA would be performed by the authoring committees, with a smaller set of criteria, plus spot-checks of the other criteria, performed by the FMG. We may also want to differentiate what gets done for different ballot levels, and QA can be focused on the areas that have changed.


Precepts

  • Only content that we believe is completely ready to be implemented should be submitted to DSTU ballot
    • The bold content below identifies the QA steps that validate this precept

QA Steps

Automated

These processes are currently handled by the build process (though someone needs to view the build log and ensure it is clean)

We need to confirm that the build process actually does all of these. Rough sketches of the XML/JSON validity checks and the link-resolution check follow the list below.

  • All XML examples & fragments are schema-valid and schematron valid
  • All JSON is valid - need to figure out how this works
  • All resource definitions and profiles are valid against their schemas & schematrons + additional rules
  • All links resolve in the HTML
  • All coded datatypes have bindings
  • Fixed values only exist for simple types
  • All OCL constraints compile
  • Definitions, etc. only end with periods when they ought to
  • UML views of everything (including data types) agree with definitions
  • sid values are legal. All non-sid/guid/oid ids are flagged as warnings unless part of a pre-defined example space
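
A minimal sketch of the first two checks (schema-valid XML and parseable JSON), assuming the generated examples sit in a local publish/ directory and that a combined schema file fhir-all.xsd is available; both paths are hypothetical, and the real checks live in the build tooling. Schematron validation and the remaining items in the list are not covered here.

```python
# Sketch only: schema-validate XML examples and confirm JSON examples parse.
# The directory layout and schema file name are assumptions, not the actual
# build layout.
import json
from pathlib import Path

from lxml import etree

PUBLISH_DIR = Path("publish")               # hypothetical output directory
SCHEMA_FILE = PUBLISH_DIR / "fhir-all.xsd"  # hypothetical combined schema


def check_xml_examples() -> list[str]:
    """Return error messages for XML files that fail schema validation."""
    schema = etree.XMLSchema(etree.parse(str(SCHEMA_FILE)))
    errors = []
    for xml_file in PUBLISH_DIR.glob("*.xml"):
        doc = etree.parse(str(xml_file))
        if not schema.validate(doc):
            errors.append(f"{xml_file.name}: {schema.error_log.last_error}")
    return errors


def check_json_examples() -> list[str]:
    """Return error messages for JSON files that do not parse."""
    errors = []
    for json_file in PUBLISH_DIR.glob("*.json"):
        try:
            json.loads(json_file.read_text(encoding="utf-8"))
        except ValueError as exc:  # json.JSONDecodeError is a ValueError
            errors.append(f"{json_file.name}: {exc}")
    return errors


if __name__ == "__main__":
    for message in check_xml_examples() + check_json_examples():
        print("QA FAILURE:", message)
```

Anything beyond "does the JSON parse" would need JSON equivalents of the schemas, which is the open question noted in the list.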

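For the link-resolution check, one possible shape is sketched below: it scans the generated HTML for relative hrefs and reports any that do not point to an existing local file. External http(s) links and in-page anchors are ignored, and publish/ is again a hypothetical path.

```python
# Sketch only: verify that relative links in generated HTML resolve to files
# that actually exist next to the page being checked.
from html.parser import HTMLParser
from pathlib import Path
from urllib.parse import urlparse

PUBLISH_DIR = Path("publish")  # hypothetical output directory


class HrefCollector(HTMLParser):
    """Collect href attribute values from <a> tags."""

    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)


def broken_links(html_file: Path) -> list[str]:
    collector = HrefCollector()
    collector.feed(html_file.read_text(encoding="utf-8", errors="replace"))
    broken = []
    for href in collector.hrefs:
        parsed = urlparse(href)
        if parsed.scheme or href.startswith("#") or not parsed.path:
            continue  # skip external links and in-page anchors
        if not (html_file.parent / parsed.path).exists():
            broken.append(f"{html_file.name} -> {href}")
    return broken


if __name__ == "__main__":
    for html_file in PUBLISH_DIR.glob("*.html"):
        for problem in broken_links(html_file):
            print("BROKEN LINK:", problem)
```
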
Automatable

These are processes that must be reviewed manually at the moment but could be handled in an automated fashion with appropriate enhancements to the build process. A sketch of one possible overlap check follows the list below.

  • Ensure all RIM mappings are "legal"
  • Ensure RIM mappings don't collide/overlap
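
As an illustration of what an automated overlap check might look like, here is a sketch that assumes the mappings have already been extracted into (element path, RIM mapping expression) pairs for a single resource; the extraction step, and the precise definition of a "collision", are left open.

```python
# Sketch only: flag RIM mapping expressions claimed by more than one element
# within the same resource.  The input format is an assumption.
from collections import defaultdict


def find_overlaps(mappings: list[tuple[str, str]]) -> dict[str, list[str]]:
    """mappings: (element path, RIM mapping expression) pairs for one resource."""
    by_target = defaultdict(list)
    for element_path, rim_expression in mappings:
        by_target[rim_expression.strip()].append(element_path)
    # Any RIM expression with more than one claimant is reported for review.
    return {rim: paths for rim, paths in by_target.items() if len(paths) > 1}


# Hypothetical input, purely for illustration
example = [
    ("Patient.name", "./name"),
    ("Patient.telecom", "./telecom"),
    ("Patient.contact.name", "./name"),  # flagged for a human to review
]
print(find_overlaps(example))
```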

Manual

(Some of these can be focused only on those resources & sections that have changed from the prior release)

  • Tooling validation (validates that the build tooling is working correctly - only needs checking when build process changes)
    • Ensure all content that's supposed to make it into the book form actually does
    • Ensure all content from the website that doesn't appear in the book form appears in a secondary form for review
  • Content validation
    • Ensure the build runs successfully with no warnings
    • Test that the XPath assertions for the Schematrons are valid using Saxon SA (a rough syntax-only sketch appears at the end of this page)
    • Ensure a wiki page with the default content exists for each page
    • Formal process
      • Do we have a PSS and resource request in place for all resources?
      • Do we have mappings for the "source" specifications used to determine/validate 80%?
    • Technical review
      • Place both forms into MS Word and run grammar & spelling checks (U.S. English)
      • Ensure style guide is followed for use of formatting
        • To be defined. Includes: when to use bold, italics, capitalization, hyperlinks, color, ordered vs. unordered lists, sections
        • Can we steal from w3c or someone?
    • Text content review
      • Ensure all definitions for code sets are mutually exclusive (and comprehensive)
      • Ensure statuses on resources & profiles are accurate for ballot
      • Ensure definitions are non-tautological and clear
      • Ensure definition, rationale & notes are properly split
      • Ensure definitions include examples when appropriate
      • Ensure text is clear and reads well, with references to other topics when appropriate
    • Ensure that constraints (cardinality, vocabulary, invariants, etc.) do not constrain extensibility more than necessary to allow safe base interoperability
      • Check where Conformance = Required, minimum cardinality = 1
    • Are mappings valid against source specification
    • Is content "complete"
      • Are there any known issues declared that would prevent implementers from using the spec "as is"?
    • Are there any dependencies on content that doesn't exist? (DSTU resources must not have dependencies on content that is not also DSTU)
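
The Schematron item above calls for checking the XPath assertions with Saxon SA. As a rough stand-in, the sketch below pulls the test expressions out of a .sch file and asks lxml's XPath 1.0 compiler whether they at least parse; this is weaker than a Saxon SA check (no XPath 2.0 support and no schema awareness), so treat it only as a starting point.

```python
# Sketch only: syntax-check the XPath expressions in Schematron assertions.
# lxml compiles XPath 1.0 only, so this under-approximates what Saxon SA does.
from pathlib import Path

from lxml import etree

ASSERT_TAGS = (
    "{http://purl.oclc.org/dsdl/schematron}assert",
    "{http://purl.oclc.org/dsdl/schematron}report",
)


def bad_assertions(schematron_file: Path) -> list[str]:
    doc = etree.parse(str(schematron_file))
    problems = []
    for element in doc.iter(*ASSERT_TAGS):
        test = element.get("test")
        if not test:
            continue
        try:
            etree.XPath(test)  # compile only; evaluation is not attempted
        except etree.XPathSyntaxError as exc:
            problems.append(f"{schematron_file.name}: {test!r}: {exc}")
    return problems


if __name__ == "__main__":
    for sch_file in Path("publish").glob("*.sch"):  # hypothetical location
        for problem in bad_assertions(sch_file):
            print("BAD XPATH:", problem)
```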