Question

I think I might have my Unit of Work set up wrong in my architecture. Here is what I currently have (indented to show order):

HttpRequest.Begin()
  UnitOfWork.Begin()
    Session.BeginTransaction(System.Data.IsolationLevel.ReadCommitted);

Here, I call various services to perform CRUD using NHibernate. When I want to make a change to the database (update/save), I call this code:

        using (var transaction = unitOfWork.Session.BeginTransaction())
        {
            try
            {
                // Key is a generic type parameter
                ret = (Key)unitOfWork.Session.Save(entity);
                transaction.Commit();
                unitOfWork.Session.Clear();
            }
            catch
            {
                transaction.Rollback();
                unitOfWork.Session.Clear();
                unitOfWork.DiscardSession();
                throw;
            }
        }

When the HttpRequest is over, I perform these steps:

      UnitOfWork.Commit()
    Transaction.Commit() // This is my sessions transaction from the begin above

I am running into issues with rolling back large batch processes. Because I commit my transactions in my CRUD layer, as seen above, the transaction is no longer active, and when I try to roll back in my UnitOfWork, it does nothing because the transaction has already been committed. The reason I commit in my CRUD layer is so I can persist my data as quickly as possible without locking the database for too long.

What is the best course of action to take in a situation like the one above? Do I just make special CRUD operations that don't commit for batch jobs, and handle the commit at the end of the job, or is my logic just flawed with my UnitOfWork and session-per-request? Any suggestions?


The solution

You've discovered why the session-per-request pattern is so popular, and the problems that can stem from micro-managing your unit of work.

Typically, everything that needs to be done within a single web request can be thought of as one unit of work, so it stands to reason that you should have only one unit of work and one NHibernate session open during that request.

Also, I think you may be a bit confused about how NHibernate works due to this sentence in your question: "The reason I'm committing my code in my CRUD layer is so I can persist my data as quickly as possible without locking the database for too long."

NHibernate is not going to cause any locking in your database. Every time you call ISession.Save(entity), as long as you do not call ISession.Flush() or ITransaction.Commit(), nothing is written to the database; instead, the change is added to a queue of inserts and updates that are executed when the current transaction is committed at the end of the web request.
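To illustrate that deferral, here is a minimal sketch; `sessionFactory` and the `Customer` entity are assumptions standing in for your own factory and mappings:

```csharp
using System.Data;

// Hypothetical sketch: Save() calls are only registered with the session;
// the SQL is issued when the transaction commits.
using (var session = sessionFactory.OpenSession())
using (var tx = session.BeginTransaction(IsolationLevel.ReadCommitted))
{
    for (int i = 0; i < 1000; i++)
    {
        // No INSERT is sent here (assuming a non-identity id generator);
        // the entity just joins the session's queue of pending changes.
        session.Save(new Customer { Name = "Customer " + i });
    }

    // All queued INSERTs are flushed and committed atomically here.
    // If anything throws before this line, nothing has reached the
    // database permanently, so disposing the transaction rolls it all back.
    tx.Commit();
}
```

One caveat worth knowing: with an `identity`-style id generator, Save() must hit the database immediately to obtain the key, which is one reason generators like `hilo` or `guid.comb` pair better with this pattern.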

So your session per request should be setup like so:

void Application_BeginRequest()
{
    // Start your unit of work, open a session and begin a transaction
}

// Do all of your work ( Read, insert, update, delete )

void Application_EndRequest()
{
    try
    {
        // UnitOfWork.Current.Transaction.Commit();
    }
    catch (Exception)
    {
        // UnitOfWork.Current.Transaction.Rollback();
    }
}

Of course there are many ways to accomplish this, but that is the essence of the session-per-request pattern: only one session for the entire web request.
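For your batch-job concern specifically, you do not need per-item commits to keep the session manageable. A common approach, sketched below with an assumed `sessionFactory` and `items` collection, is to keep one transaction for the whole job and periodically Flush() and Clear() the session. Flushed statements are still uncommitted, so a single Rollback() at the end undoes the entire batch:

```csharp
using (var session = sessionFactory.OpenSession())
using (var tx = session.BeginTransaction())
{
    try
    {
        for (int i = 0; i < items.Count; i++)
        {
            session.Save(items[i]);

            if (i % 100 == 0)
            {
                // Push queued statements to the database (still uncommitted)
                // and evict the entities so the session doesn't grow unbounded.
                session.Flush();
                session.Clear();
            }
        }

        tx.Commit();   // everything becomes permanent only here
    }
    catch
    {
        tx.Rollback(); // undoes the whole batch, flushed rows included
        throw;
    }
}
```

Setting `adonet.batch_size` in your NHibernate configuration also lets the flushed statements be sent in batches, which addresses the performance worry behind your early commits.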

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow