Question

Problem:

There are many different databases, each populated directly by many different applications (without any common application layer). By policy, data can be accessed only through stored procedures.

Task:

The application needs to track changes in these databases and react with minimal delay.

Possible solutions:

1) Create a trigger for each table in each database that writes events to a single events table. The application watches this table through SqlDependency.

2) Watch each table in each database through SqlDependency.

3) Create a trigger for each table in each database that notifies the application through a managed (SQLCLR) extension.
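For illustration, the trigger in option 1 might look roughly like this (the `ChangeEvents` table, its columns, and `dbo.Orders` are hypothetical names, not part of the actual schema):

```sql
-- Hypothetical trigger for one monitored table (dbo.Orders).
-- Each DML statement appends a row to a shared events table,
-- which the application then watches via SqlDependency.
CREATE TRIGGER trg_Orders_Changed
ON dbo.Orders
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.ChangeEvents (TableName, ChangedAt)
    VALUES ('dbo.Orders', GETUTCDATE());
END;
```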

Which is the best way?


Solution

This can be an extensive topic. First of all: which SQL Server version are you using?

If you are using SQL Server 2008, Change Data Capture (CDC) is the tool of choice. This feature enables you to monitor EVERY change made to databases within SQL Server 2008, including DDL changes as well as changes to the data. Check an introduction here.
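For reference, enabling CDC only takes a couple of system procedure calls; the database and table names below are placeholders:

```sql
-- Enable CDC at the database level, then for each table to monitor.
USE MyDatabase;
EXEC sys.sp_cdc_enable_db;

EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'Orders',
    @role_name     = NULL;  -- no gating role required to read changes

-- Changes can then be read with the generated table-valued function, e.g.:
-- SELECT * FROM cdc.fn_cdc_get_all_changes_dbo_Orders(@from_lsn, @to_lsn, N'all');
```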

If you are using a version older than SQL Server 2008 and you are allowed to modify the DDL of the databases, option 3 would be the one to choose (of those you described). I would not recommend it, though, since there are other things to consider: what happens when a transaction rolls back, or when triggers are deactivated (during bulk inserts, for example)?

It will be quite a challenge to make your solution work properly in all of these cases.

Another way you could go is to watch the transaction log file. This is by far the most powerful, but also the most complex, way of doing it, since there is almost no documentation on the proprietary log format, and it is bound to a specific version of SQL Server. In return, it results in zero-impact monitoring of the chosen databases.

Yet another approach is to create a copy of the data to be monitored and periodically check it for differences. This has the benefit that NO changes to the source databases are required, and it also sidesteps the transaction and bulk-insert issues: at the latest, the next monitoring run will discover the changes.

The performance impact is rather minimal, since it only requires a sequential read of the primary index of the monitored tables, which is by far the most efficient way of interacting with a database. This approach will, however, require quite a development effort. I should know, since this has been my main focus for the last few years. Check here ;)
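A minimal sketch of the compare approach, assuming a snapshot table with the same schema as the source (`dbo.Orders` and `dbo.Orders_Snapshot` are hypothetical names):

```sql
-- Rows that are new or changed since the last snapshot.
SELECT * FROM dbo.Orders
EXCEPT
SELECT * FROM dbo.Orders_Snapshot;

-- Rows that were deleted (or are the old versions of changed rows).
SELECT * FROM dbo.Orders_Snapshot
EXCEPT
SELECT * FROM dbo.Orders;

-- Refresh the snapshot for the next monitoring run.
TRUNCATE TABLE dbo.Orders_Snapshot;
INSERT INTO dbo.Orders_Snapshot SELECT * FROM dbo.Orders;
```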

(I hope linking is OK in this case since it's on topic; otherwise I'll remove it.)

OTHER TIPS

Have you considered using SQL Profiler? Using a filter, you can capture only update operations and then write them to a log.

You can use an open-source implementation of the SqlDependency class - SqlDependencyEx. It uses a database trigger and native Service Broker notifications to receive events about table changes. Here is a usage example:

int changesCount = 10; // the number of changes we are about to make
int changesReceived = 0;
using (SqlDependencyEx sqlDependency = new SqlDependencyEx(
          TEST_CONNECTION_STRING, TEST_DATABASE_NAME, TEST_TABLE_NAME))
{
    sqlDependency.TableChanged += (o, e) => changesReceived++;
    sqlDependency.Start();

    // Make table changes.
    MakeTableInsertDeleteChanges(changesCount);

    // Wait a little bit to receive all changes.
    Thread.Sleep(1000);
}

Assert.AreEqual(changesCount, changesReceived);

With SqlDependencyEx you are able to monitor just INSERTs or DELETEs while ignoring UPDATEs. Hope this helps.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow