Question

I have an ASP.NET MVC site that uses NHibernate and SQL Server. A few pages are quite slow because they require views whose queries join about 25 different tables. If I do one large join it takes a while, and if I use a multi-query it still seems to take a while.

It's a pretty read-heavy (light-write) DB, so I wanted to see if there is a good way to basically load the entire object graph of my database (my server has plenty of memory) into the 2nd-level cache, so that I can be confident it rarely hits the DB. I am using

  NHibernate.Caches.SysCache.SysCacheProvider

as the second-level cache (not a distributed cache). Is there any flaw in this idea, and is there a recommended way of doing this?

Solution

You are caching your query results, but not your entities (those are separate caches):

- Caching a query's results stores only the IDs; if you are not caching your entities too, a query is issued to load each returned entity (this is usually bad).
- The default table name for the MyDTO class is MyDTO, so that's where it's looking.
- This looks like a query by ID, for which you shouldn't be using a loose named query, but a proper loader (see 17.4. Custom SQL for loading).

Once you set up the loader and entity caching, you'll be able to retrieve your objects using just session.Get(id), which will use the second-level cache, as long as you do all of your work inside transactions, which is a recommended practice.
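To illustrate that last step, here is a minimal sketch of loading by ID inside a transaction. The names sessionFactory and id are assumptions for the example, and MyDTO is the class from the question; this is not the asker's actual code:

```csharp
// Sketch: retrieving a cached entity by ID inside a transaction.
// 'sessionFactory' and 'id' are assumed names, not from the original post.
using (ISession session = sessionFactory.OpenSession())
using (ITransaction tx = session.BeginTransaction())
{
    // With entity caching configured, this consults the second-level
    // cache first and only hits the database on a cache miss.
    MyDTO dto = session.Get<MyDTO>(id);

    // Queries can also be cached, but each query must be marked
    // cacheable and the query cache must be enabled in configuration.
    IList<MyDTO> all = session.CreateQuery("from MyDTO")
                              .SetCacheable(true)
                              .List<MyDTO>();

    tx.Commit();
}
```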

OTHER TIPS

The second-level cache is always associated with the SessionFactory object. While running transactions, it loads objects at the SessionFactory level, so those objects are available to the entire application rather than being bound to a single session. Since the objects are already loaded into the cache, whenever a query returns such an object there is no need to go to the database.

The second-level cache has to be not only enabled but also configured for every single entity class that you want cached. So enable caching for all 15 classes that are mapped to the 15 tables.

In XML, this is done inside the class element of the mapping:

<cache usage="read-write"/>

In Fluent NHibernate (non-automap), it's done in the ClassMap constructor or wherever you put the rest of your mapping code:

Cache.ReadWrite().Region("Configuration");
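The cache provider also has to be switched on at the session-factory level. A sketch of the relevant configuration properties for the SysCache provider from the question (the property names are standard NHibernate settings; verify the assembly-qualified provider name against your NHibernate.Caches version):

```xml
<!-- Sketch: enabling the second-level cache and query cache in hibernate.cfg.xml -->
<property name="cache.provider_class">
  NHibernate.Caches.SysCache.SysCacheProvider, NHibernate.Caches.SysCache
</property>
<property name="cache.use_second_level_cache">true</property>
<property name="cache.use_query_cache">true</property>
```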

From here it depends on the size and load of the DB, and on how fresh the data must be.

If the DB is relatively small and there are few writes, you can update the cache on every write/update:

ISession.Clear();
ReloadCache();
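ReloadCache here is not a built-in NHibernate method. One possible sketch, assuming a hypothetical entity class MyEntity and using the real ISessionFactory eviction methods:

```csharp
// Sketch of a hypothetical ReloadCache helper (not a built-in API).
// 'MyEntity' is an assumed class name for illustration.
void ReloadCache(ISessionFactory sessionFactory)
{
    sessionFactory.Evict(typeof(MyEntity)); // drop cached MyEntity instances
    sessionFactory.EvictQueries();          // drop cached query results

    using (ISession session = sessionFactory.OpenSession())
    using (ITransaction tx = session.BeginTransaction())
    {
        // Re-read the entities so they are placed back into the
        // second-level cache, warming it for subsequent requests.
        session.CreateQuery("from MyEntity")
               .SetCacheable(true)
               .List<MyEntity>();
        tx.Commit();
    }
}
```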

If the DB is massive: if you have the luxury of updating the database once per day, say at 12am, and then keeping the 'new' data in the cache for a day, then you're OK too. A few users will see a lag spike while the cache is reloading.

Here's an example: http://www.codeproject.com/Articles/529016/NHibernate-Second-Level-Caching-Implementation

If your DB is massive and users must get up-to-date data, you will have to update the cached data manually:

Database db = new Database();
Transaction tx = db.BeginTransaction();
try
{
    // Try to read the entity from the cache first
    MyEntity1 entity1 = cache.Get<MyEntity1>("pk of entity1");

    // Cache miss: read from the database instead
    if (entity1 == null)
        entity1 = db.Get<MyEntity1>("pk of entity1");

    // ... process entity1 ...

    bool updated = db.Update(entity1); // save the entity1 update to the database
    if (updated)
        cache.Put(entity1);            // database updated successfully, so update the cache

    // ... transaction processing ...

    tx.Commit();
}
catch
{
    tx.Rollback();
    throw;
}

more on that here: http://www.databaseskill.com/3093355/

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow