Question

I have a flow in Mule that executes one query to retrieve 100 records in a batch. It then passes the records into a foreach loop, and within that loop each record is processed in its own transaction. If one record's transaction fails, the other records in the foreach loop are unaffected; they simply continue their own execution.

This works great, except that the records' transactions execute serially. I'd like to have multiple transactions running at the same time (within a limit, say 4 transactions at once) to speed up the process.

<foreach doc:name="For Each">
  <ee:multi-transactional action="ALWAYS_BEGIN" doc:name="Transactional">
    <!-- EXECUTE A NUMBER OF DIFFERENT QUERIES WITHIN THE TRANSACTION -->
  </ee:multi-transactional>
</foreach>

I thought I could use an async scope, but I need to correlate everything at the end of the process.

EDIT: Solution I used

<foreach doc:name="For Each" batchSize="50">
  <all enableCorrelation="IF_NOT_SET" doc:name="All">
    <async doc:name="Async">
      <!-- each async message processes one batch of 50 records -->
      <foreach doc:name="BatchedForEach">
        <ee:multi-transactional action="ALWAYS_BEGIN" doc:name="Transactional">
          <!-- EXECUTE A NUMBER OF DIFFERENT QUERIES WITHIN THE TRANSACTION -->
        </ee:multi-transactional>
      </foreach>
      <flow-ref name="completeBatch" />
    </async>
  </all>
</foreach>

<sub-flow name="completeBatch">
  <collection-aggregator doc:name="Collection Aggregator"/>
  <combine-collections-transformer />
</sub-flow>
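To enforce the "4 transactions at once" limit from the question, the async scope can reference a capped processing strategy in Mule 3. This is a rough sketch, not a verified configuration: the strategy name "fourWorkers" is illustrative, and it assumes Mule 3's <queued-asynchronous-processing-strategy> element and the async scope's processingStrategy attribute.

<!-- Global strategy capping the async worker pool at 4 threads
     (the name "fourWorkers" is illustrative) -->
<queued-asynchronous-processing-strategy name="fourWorkers" maxThreads="4"/>

<foreach doc:name="For Each" batchSize="50">
  <all enableCorrelation="IF_NOT_SET" doc:name="All">
    <!-- batches are dispatched onto the capped pool, so at most
         4 batches (and hence 4 transactions) run concurrently -->
    <async doc:name="Async" processingStrategy="fourWorkers">
      <foreach doc:name="BatchedForEach">
        <ee:multi-transactional action="ALWAYS_BEGIN" doc:name="Transactional">
          <!-- EXECUTE A NUMBER OF DIFFERENT QUERIES WITHIN THE TRANSACTION -->
        </ee:multi-transactional>
      </foreach>
      <flow-ref name="completeBatch" />
    </async>
  </all>
</foreach>

Without an explicit strategy, the async scope falls back to its default threading, so the degree of parallelism would not be bounded at 4.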

Solution

Use foreach with batchSize="n", then apply the Fork and Join pattern to each batch payload[0..n-1].

http://blogs.mulesoft.org/aggregation-with-mule-fork-and-join-pattern/

Licensed under: CC-BY-SA with attribution