Question

I need some guidance from anyone who has deployed a real-world, in-production application that uses the SQL Server Service Broker external activation mechanism (via the Service Broker External Activator from the Feature Pack).

Current mindset:

My specs are rather simple (or at least I think so), so I'm thinking of the following basic flow:

  1. order-like entity gets inserted into a Table_Orders with state "confirmed"

  2. SP_BeginOrder gets executed and does the following:

    • begins a TRANSACTION
    • starts a DIALOG from Service_HandleOrderState to Service_PreprocessOrder
    • stores the conversation handle (from now on PreprocessingHandle) in a specific column of the Orders table
    • sends a MESSAGE of type Message_PreprocessOrder containing the order id using PreprocessingHandle
    • ends the TRANSACTION

    Note that I'm not ending the conversation; I don't want "fire-and-forget".

  3. event notification on Queue_PreprocessOrder activates an instance of PreprocessOrder.exe (max concurrent of 1) which does the following:

    • begins a SqlTransaction
    • receives top 1 MESSAGE from Queue_PreprocessOrder
    • if message type is Message_PreprocessOrder (format XML):
      • sets the order state to "preprocessing" in Table_Orders using the order id in the message body
      • loads n collections of data from which it computes an n-ary Cartesian product (via LINQ; AFAIK this is not possible in T-SQL) to determine the order items collection
      • inserts the order items rows into a Table_OrderItems
      • sends a MESSAGE of type Message_PreprocessingDone, containing the same order id, using PreprocessingHandle
      • ends the conversation pertaining to PreprocessingHandle
    • commits the SqlTransaction
    • exits with Environment.Exit(0)
  4. internal activation on Queue_HandleOrderState executes a SP (max concurrent of 1) that:
    • begins a TRANSACTION
    • receives top 1 MESSAGE from Queue_HandleOrderState
    • if message type is Message_PreprocessingDone:
      • sets the order state to "processing" in Table_Orders using the order id in the message body
      • starts a DIALOG from Service_HandleOrderState to Service_ProcessOrderItem
      • stores the conversation handle (from now on ProcessOrderItemsHandle) in a specific column of Table_Orders
      • creates a cursor for rows in Table_OrderItems for current order id and for each row:
        • sends a MESSAGE of type Message_ProcessOrderItem, containing the order item id, using ProcessOrderItemsHandle
    • if message type is Message_ProcessingDone:
      • sets the order state to "processed" in Table_Orders using the order id in the message body
    • if message type is http://schemas.microsoft.com/SQL/ServiceBroker/EndDialog (END DIALOG):
      • ends the conversation pertaining to conversation handle of the message
    • ends the TRANSACTION
  5. event notification on Queue_ProcessOrderItem activates an instance of ProcessOrderItem.exe (max concurrent of 1) which does the following:
    • begins a SqlTransaction
    • receives top 1 MESSAGE from Queue_ProcessOrderItem
    • if message type is Message_ProcessOrderItem (format XML):
      • sets the order item state to "processing" in Table_OrderItems using the order item id in the message body, then:
        • loads a collection of order item parameters
        • makes a HttpRequest to a URL using the parameters
        • stores the HttpResponse as a PDF on filesystem
      • if any errors occurred in above substeps, sets the order item state to "error", otherwise "ok"
      • performs a lookup in Table_OrderItems to determine if all order items are processed (state is "ok" or "error")
      • if all order items are processed:
        • sends a MESSAGE of type Message_ProcessingDone, containing the order id, using ProcessOrderItemsHandle
        • ends the conversation pertaining to ProcessOrderItemsHandle
    • commits the SqlTransaction
    • exits with Environment.Exit(0)
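Step 2 above can be sketched in T-SQL roughly as follows. The service, queue, table, and message type names are taken from the description; the contract name, column names, and message body shape are assumptions for illustration only:

```sql
-- Sketch of SP_BeginOrder (step 2). Contract name, column names and
-- XML payload shape are assumptions, not part of the original spec.
CREATE PROCEDURE SP_BeginOrder
    @OrderId INT
AS
BEGIN
    DECLARE @Handle UNIQUEIDENTIFIER;

    BEGIN TRANSACTION;

    BEGIN DIALOG CONVERSATION @Handle
        FROM SERVICE [Service_HandleOrderState]
        TO SERVICE 'Service_PreprocessOrder'
        ON CONTRACT [Contract_PreprocessOrder]   -- assumed contract name
        WITH ENCRYPTION = OFF;
    -- deliberately no END CONVERSATION here: not fire-and-forget

    UPDATE Table_Orders
    SET PreprocessingHandle = @Handle            -- assumed column name
    WHERE OrderId = @OrderId;

    SEND ON CONVERSATION @Handle
        MESSAGE TYPE [Message_PreprocessOrder]
        (CAST(N'<Order id="' + CAST(@OrderId AS NVARCHAR(20)) + N'"/>' AS NVARCHAR(MAX)));

    COMMIT TRANSACTION;
END
```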

Notes:

  • specs specify MSSQL compatibility 2005 through 2012, so:
    • no CONVERSATION GROUPS
    • no CONVERSATION PRIORITY
    • no POISON_MESSAGE_HANDLING ( STATUS = OFF )
  • I am striving to achieve overall flow integrity and continuity, not speed
  • given that tables and SPs reside in DB1 whilst Service Broker objects (messages, contracts, queues, services) reside in DB2, DB2 is SET TRUSTWORTHY
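The cross-database setup implied by the last note boils down to something like the following (a sketch; the reason given in the comment is the usual one for needing TRUSTWORTHY with EXECUTE AS across databases):

```sql
-- DB2 hosts the broker objects (messages, contracts, queues, services).
-- TRUSTWORTHY is needed so that an impersonation context established in
-- DB2 (e.g. via EXECUTE AS in activated code) is trusted when it reaches
-- into DB1. ENABLE_BROKER requires exclusive access to the database.
ALTER DATABASE DB2 SET ENABLE_BROKER;
ALTER DATABASE DB2 SET TRUSTWORTHY ON;
```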

Questions:

  1. Are there any major design flaws in the described architecture?
  2. Order completion state tracking doesn't seem right. Is there a better method? Maybe using QUEUE RETENTION?
  3. My intuition tells me that in no case whatsoever should the activated external exe terminate with an exit code other than 0, so there should be try{..}catch(Exception e){..} finally{ Environment.Exit(0) } in Main. Is this assumption correct?
  4. How would you organize error handling in DB code? Is an error log table enough?
  5. How would you organize error handling in external exe C# code? Same error logging table?
  6. I've seen the SQL Server Service Broker Product Samples, but the Service Broker Interface seems overkill for my seemingly simpler case. Any alternatives for a simpler Service Broker object model?
  7. Any cross-version "portable" admin tool for Service Broker capable of at least draining poison messages?
  8. Have you any decent code samples for any of the above?

Solution

Q: Are there any major design flaws in the described architecture?
A: A couple of minor quibbles:
- Waiting for an HTTP request to complete while holding a transaction open is bad. You can't achieve transactional consistency between a database and HTTP anyway, so don't risk having a transaction stretch for minutes when the HTTP call is slow. The typical pattern is to {begin tran / receive / begin conversation timer / commit}, then issue the HTTP call without any DB transaction. If the HTTP call succeeds, then {begin tran / send response / end conversation / commit}. If the HTTP call fails (or the client crashes), let the conversation timer activate you again. You'll get a timer message (no body), so you need to pick up the item id associated with the handle from your table(s).

Q: Order completion state tracking doesn't seem right. Is there a better method? Maybe using QUEUE RETENTION?
A: My one critique of your state tracking is the dependency on scanning the order items to determine that the currently processed one is the last one (5.3.4). For example, you could add the information that this is the 'last' item to be processed to the item state, so that when processing it you know you need to report the completion. RETENTION is only useful in debugging, or when you have logic that requires running a 'logical rollback' and compensating actions on conversation error.
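One way to implement the 'last item' suggestion is to flag the final row at insert time. This sketch assumes an IsLastItem column on Table_OrderItems and relies on items being delivered in order on a single conversation with one reader, so the highest id is processed last:

```sql
-- Mark the last order item when the items are inserted (step 3), so the
-- processor knows, without scanning the table, that completing this item
-- completes the order. IsLastItem is an assumed column.
UPDATE oi
SET IsLastItem = 1
FROM Table_OrderItems AS oi
WHERE oi.OrderId = @OrderId
  AND oi.OrderItemId = (
      SELECT MAX(OrderItemId)
      FROM Table_OrderItems
      WHERE OrderId = @OrderId);
```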

Q: My intuition tells me that in no case whatsoever should the activated external exe terminate with an exit code other than 0, so there should be try{..}catch(Exception e){..} finally{ Environment.Exit(0) } in Main. Is this assumption correct?
A: The most important thing is for the activated process to issue a RECEIVE statement on the queue. If it fails to do so, the queue monitor may enter the notified state forever. The exit code is, if I remember correctly, irrelevant. As with any background process, it is important to catch and log exceptions; otherwise you'll never even know it has a problem when it starts failing. In addition to disciplined try/catch blocks, hook up Application.ThreadException for UI apps and AppDomain.UnhandledException for both UI and non-UI apps.

Q: How would you organize error handling in DB code? Is an error log table enough?
A: I will follow up later on this. An error log table is sufficient, imho.

Q: How would you organize error handling in external exe C# code? Same error logging table?
A: I created bugcollect.com exactly because I had to handle such problems with my own apps. The problem is more than logging: you also want some aggregation and analysis (at the very least, detection of duplicate reports), and to suppress floods of errors from some deployment config mishap in the field. Truth be told, nowadays there are more options, e.g. exceptron.com. And of course, I think FogBugz also has logging capabilities.

Q: I've seen the SQL Server Service Broker Product Samples, but the Service Broker Interface seems overkill for my seemingly simpler case. Any alternatives for a simpler Service Broker object model?
A: Finally, an easy question: yes, it is overkill. There is no simpler model.

Q: Any cross-version "portable" admin tool for Service Broker capable of at least draining poison messages?
A: The problem with poison messages is that the definition of a poison message changes with your code: the poison message is whatever message breaks the guards you currently have in place to detect it.
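That said, manually draining a suspect message is just a RECEIVE plus END CONVERSATION in a transaction. A sketch, where the queue name is from the question and the log table is an assumption:

```sql
-- Drain one message from a queue into a log table and end its
-- conversation with an error. PoisonMessageLog is an assumed table.
DECLARE @Handle UNIQUEIDENTIFIER, @Type SYSNAME, @Body VARBINARY(MAX);

BEGIN TRANSACTION;

RECEIVE TOP (1)
    @Handle = conversation_handle,
    @Type   = message_type_name,
    @Body   = message_body
FROM Queue_PreprocessOrder;

IF @Handle IS NOT NULL
BEGIN
    INSERT INTO PoisonMessageLog (ConversationHandle, MessageType, Body, DrainedAt)
    VALUES (@Handle, @Type, @Body, GETUTCDATE());

    END CONVERSATION @Handle WITH ERROR = 1
        DESCRIPTION = 'Drained as poison message';
END

COMMIT TRANSACTION;
```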

Q: Have you any decent code samples for any of the above?
A: No.

One more point: try to avoid any reference from DB1 to DB2 (e.g. 4.3.4 is activated in DB1 and reads the items table from DB2). This creates cross-DB dependencies, which break when a) one DB is offline (e.g. for maintenance) or overloaded, or b) you add database mirroring for HA/DR and one DB fails over. Try to make the code work even if DB1 and DB2 are on different machines (and without linked servers). If necessary, add more info to the message payloads. Architect it so that DB2 can be on a different machine, and even so that multiple DB2 machines can exist to scale out the HTTP/PDF writing work.

And finally: this design will be very slow. I'm talking low tens of messages per second, with so many dialogs/messages involved and everything running with MAX_QUEUE_READERS = 1. This may or may not be acceptable for you.

Licensed under: CC-BY-SA with attribution