{{INM Finalized Workitem}}
  
{{INM Motion|The Committee intends to support the concept of batch chunking and accepts a work item to do this. (Rene Spronk/John Arnett 14-0-0 Jan06 WGM THU Q4)}}
  
{{INM Motion|To drop the work item because we changed (at the May 2006 WGM) the nature of the Batch Transmission wrapper not to be transactional (but for it to be a simple grouper of transmissions), Rene/Sandy, 11-0-2. WGM 2007-01-09}}
==Discussion==
===Dynamic Model===
 
One consequence: if batch chunking is allowed, the HL7 [[Dynamic Model]] no longer works, because one Transmission/Interaction may result in multiple (chunk) interactions.
  
Note that the newly proposed dynamic model ([[CPM]]) does allow for multiple chunk interactions in response to a batch.
 
===Sequencing of Batch chunks===
 
(Action item 978, Sept05) A mechanism is needed (used by the sender) to sequence multiple batch response messages that are derived from a single batch request (as might result from imposing limits on batch size). This allows identification of missing or out-of-sequence batch response messages. Solved, see [[Harmonization: add priorTransmission to support Transmission Sequencing]]. This may not cover the entire use case; if it does not, a description of what else is needed will have to be added.
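
As an illustration only (not part of the harmonization proposal itself), the sketch below shows how a receiver could use a priorTransmission-style reference to put batch chunks back in order and to detect a missing chunk. The class and field names (BatchChunk, transmission_id, prior_transmission_id) are hypothetical stand-ins, not the normative wrapper attributes.

<pre>
# Illustrative sketch only: names are hypothetical, not the normative HL7 v3 attributes.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class BatchChunk:
    """One batch chunk: its own transmission id plus a reference to the
    transmission that preceded it (None for the first chunk)."""
    transmission_id: str
    prior_transmission_id: Optional[str]
    payload: List[str]


def order_chunks(chunks: List[BatchChunk]) -> List[BatchChunk]:
    """Reorder received chunks by following the priorTransmission chain.

    Raises ValueError if the chain has no starting point or is broken,
    i.e. a chunk in the middle of the sequence is missing."""
    by_prior = {c.prior_transmission_id: c for c in chunks}
    if None not in by_prior:
        raise ValueError("first chunk (no priorTransmission reference) is missing")

    ordered: List[BatchChunk] = []
    current: Optional[BatchChunk] = by_prior[None]
    while current is not None:
        ordered.append(current)
        current = by_prior.get(current.transmission_id)  # next chunk in the chain

    if len(ordered) != len(chunks):
        raise ValueError(f"chain broken after transmission {ordered[-1].transmission_id}")
    return ordered


if __name__ == "__main__":
    received = [  # chunks arriving out of order
        BatchChunk("T3", "T2", ["msg5"]),
        BatchChunk("T1", None, ["msg1", "msg2"]),
        BatchChunk("T2", "T1", ["msg3", "msg4"]),
    ]
    print([c.transmission_id for c in order_chunks(received)])  # ['T1', 'T2', 'T3']
</pre>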
  
===Chunking of query responses===
  
Batch chunking in response to a query works just fine, because the [[Query Transmission Pattern]] supports a throttling mechanism. If the [[bolus|Bolus query mechanism]] is used, we have the same dynamic-model issues as in a batch(-chunk) notification scenario.
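
As a minimal sketch of why throttling keeps the dynamic model intact (each chunk is the direct response to its own continuation request), the Python fragment below simulates a throttled query. The function and field names are hypothetical stand-ins, not the HL7 v3 query API.

<pre>
# Minimal sketch, not the HL7 v3 query API: function and field names below are
# hypothetical stand-ins used only to illustrate throttled (pull-style) retrieval.

def make_responder(all_results, chunk_size):
    """Simulate a query responder that hands out results in chunks of chunk_size."""
    state = {"pos": 0}

    def next_chunk(query_id="Q1"):
        start = state["pos"]
        chunk = all_results[start:start + chunk_size]
        state["pos"] = start + len(chunk)
        return {
            "query_id": query_id,
            "payload": chunk,
            "result_remaining_quantity": len(all_results) - state["pos"],
        }

    return next_chunk


def fetch_all_throttled(next_chunk):
    """Throttled retrieval: every chunk is the direct response to its own
    (continuation) request, so the one-request/one-response dynamic model holds."""
    results = []
    response = next_chunk()                          # initial query response
    results.extend(response["payload"])
    while response["result_remaining_quantity"] > 0:
        response = next_chunk(response["query_id"])  # explicit continuation request
        results.extend(response["payload"])
    return results


if __name__ == "__main__":
    responder = make_responder(list(range(13)), chunk_size=5)
    print(fetch_all_throttled(responder))  # all 13 results, pulled in 3 chunks
</pre>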
  
:(Rene) If we examine the query use case, then there are various relevant existing features:
:*A series of query response interactions will all have a queryId (which links the responses at the business-conversation level) and a resultRemainingQuantity, which "decreases" to 0 with each response message. If M(9) is a message with resultRemainingQuantity 9, then a series of query responses could be '''M(12), M(11), M(8), M(4), M(0)''' (a message may contain multiple response payloads).
:*If one uses a batch (1 batch), all of the above messages will be contained in one batch wrapper '''B(M(12), M(11), M(8), M(4), M(0))'''. If one uses multiple batches (chunks), it could be (e.g.) '''B(M(12), M(11))''' followed by '''B(M(8), M(4), M(0))'''. The receiver has the queryId to link all response messages together, and knows that it has received all batch chunks once the last message has resultRemainingQuantity equal to 0 (see the sketch after this list).
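
A minimal sketch of that receiver-side behaviour, using plain Python dictionaries with hypothetical key names in place of the real transmission/batch wrappers: responses are grouped by queryId across any number of batch chunks, and the query is considered complete once a message with resultRemainingQuantity equal to 0 has been seen.

<pre>
# Illustrative sketch: dictionaries stand in for the batch/transmission wrappers,
# and the key names are hypothetical, not the normative HL7 v3 attribute names.

def collect_query_responses(batches, query_id):
    """Gather all response messages for one query from a series of batch chunks.

    A batch is modelled as a list of messages; each message carries the queryId
    it answers and the resultRemainingQuantity reported at that point.
    Returns (messages, complete): complete is True once a message with
    resultRemainingQuantity == 0 has been seen."""
    messages = []
    complete = False
    for batch in batches:                      # e.g. B(M(12), M(11)) then B(M(8), M(4), M(0))
        for msg in batch:
            if msg["query_id"] != query_id:    # other conversations may share the batch
                continue
            messages.append(msg)
            if msg["result_remaining_quantity"] == 0:
                complete = True
    return messages, complete


if __name__ == "__main__":
    chunk1 = [{"query_id": "Q1", "result_remaining_quantity": n} for n in (12, 11)]
    chunk2 = [{"query_id": "Q1", "result_remaining_quantity": n} for n in (8, 4, 0)]
    msgs, done = collect_query_responses([chunk1, chunk2], "Q1")
    print(len(msgs), done)  # 5 True: all chunks received for query Q1
</pre>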
