Question

I use SQL Server 2008 R2 with SSAS and SSIS.

I have a staging database that is filled every day from our OLTP databases.

After data cleansing and integration (ETL), I have to transfer the data to the data warehouse.

What I need now is a way to track changes in my OLTP databases.

I know about the trigger-based solution (I could put a trigger on every table in the OLTP database and log inserts, updates and deletes from the inserted and deleted tables inside the trigger),

but my OLTP database is very big (about 80,000,000 records) and adding triggers would slow the business down.

I found a query on this site like this:

SELECT *
FROM sys.fn_dblog(NULL, NULL);

which shows all the records from the LDF file.

I also found some third-party software, such as ApexSQL, that can read the LDF file and extract the INSERT/UPDATE/DELETE commands,

and I found a question on this site: How to view transaction logs in SQL Server 2008.

Finally, I think: if a third-party tool can extract those commands from the LDF file, why can't we?

On the other hand, I also need to catch DDL commands such as ALTER TABLE and column changes in the OLTP databases so that I can change my staging database accordingly.
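One idea I have for the DDL side is a database-level DDL trigger that writes the EVENTDATA() XML into a log table. This is only a rough sketch (the dbo.DdlLog table and the list of events are placeholders I made up), and I am not sure it is the right approach either:

-- Rough sketch: log schema changes with a database-level DDL trigger.
-- dbo.DdlLog is a placeholder table that would have to be created first.
CREATE TABLE dbo.DdlLog
(
    LogId     INT IDENTITY(1,1) PRIMARY KEY,
    EventTime DATETIME NOT NULL DEFAULT (GETDATE()),
    LoginName SYSNAME  NOT NULL DEFAULT (SUSER_SNAME()),
    EventData XML      NOT NULL
);
go

CREATE TRIGGER trg_LogDdl
ON DATABASE
FOR ALTER_TABLE, CREATE_TABLE, DROP_TABLE
AS
BEGIN
    SET NOCOUNT ON;

    -- EVENTDATA() returns XML that includes the full DDL statement text.
    INSERT INTO dbo.DdlLog (EventData)
    VALUES (EVENTDATA());
END;
go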

And I found this link, which can recover deleted rows from the LDF file.

Solution

While the answer above will tell you how to look at the transaction log, I agree with JNK that parsing a transaction log is not a good audit trail for table changes.

It all depends on how much audit data you want to keep and how much speed you are willing to lose to the auditing. Do not forget about the data retention period.

1 - Trigger-based auditing at both the table and database level works fine. See my blog for a presentation on this. A minimal sketch of a table-level audit trigger is shown after these points.

However, auditing on a table that has a large number of changes might not be practical and you might never use the data. A retention period is key.

2 - If you want to use something that is built into SQL Server, take a look at the Change Data Capture feature. It is based upon reading the log file with a periodic SQL Agent job, so you do not have a trigger fire for each event. There is a lag between when a change happens and when it shows up in the change tables. A sketch of enabling CDC is also shown below.

3 - If you just want to know whether a record was inserted or updated, by whom, and when, this can be done by the custom ETL code that modifies the data warehouse. Just add those fields to the table and have the code update them; see the last sketch below.
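
A minimal sketch of point 1, a table-level audit trigger. The dbo.Customer and dbo.Customer_Audit names are placeholders, not tables from the question:

-- Placeholder audit table for a hypothetical dbo.Customer source table
CREATE TABLE dbo.Customer_Audit
(
    AuditId    INT IDENTITY(1,1) PRIMARY KEY,
    CustomerId INT      NOT NULL,
    Action     CHAR(1)  NOT NULL,                         -- 'I', 'U' or 'D'
    AuditDate  DATETIME NOT NULL DEFAULT (GETDATE()),
    AuditUser  SYSNAME  NOT NULL DEFAULT (SUSER_SNAME())
);
go

CREATE TRIGGER dbo.trg_Customer_Audit
ON dbo.Customer
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    -- Rows in both inserted and deleted are updates; rows only in inserted are inserts.
    INSERT INTO dbo.Customer_Audit (CustomerId, Action)
    SELECT i.CustomerId,
           CASE WHEN d.CustomerId IS NULL THEN 'I' ELSE 'U' END
    FROM inserted AS i
    LEFT JOIN deleted AS d ON d.CustomerId = i.CustomerId;

    -- Rows only in deleted are deletes.
    INSERT INTO dbo.Customer_Audit (CustomerId, Action)
    SELECT d.CustomerId, 'D'
    FROM deleted AS d
    LEFT JOIN inserted AS i ON i.CustomerId = d.CustomerId
    WHERE i.CustomerId IS NULL;
END;
go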
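
For point 2, enabling Change Data Capture looks roughly like this (Enterprise Edition is required on 2008 R2; the database and table names below are placeholders):

USE MyOltpDb;   -- placeholder database name
go

-- Enable CDC for the database, then for each table of interest
EXEC sys.sp_cdc_enable_db;
go

EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'Customer',   -- placeholder table name
    @role_name     = NULL;          -- no gating role
go

-- The ETL can then pull the changes between two LSNs from the change table
DECLARE @from_lsn BINARY(10) = sys.fn_cdc_get_min_lsn('dbo_Customer');
DECLARE @to_lsn   BINARY(10) = sys.fn_cdc_get_max_lsn();

SELECT *
FROM cdc.fn_cdc_get_all_changes_dbo_Customer(@from_lsn, @to_lsn, N'all');
go

The SQL Agent capture job reads the log asynchronously, which is where the lag mentioned above comes from.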
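
And for point 3, a sketch of audit columns maintained by the ETL itself (again, the table and column names are only illustrative):

-- Add audit columns to a placeholder warehouse table
ALTER TABLE dw.DimCustomer
    ADD InsertedDate DATETIME NULL,
        InsertedBy   SYSNAME  NULL,
        UpdatedDate  DATETIME NULL,
        UpdatedBy    SYSNAME  NULL;
go

-- Inside the ETL, stamp the columns while loading from the staging table
MERGE dw.DimCustomer AS tgt
USING stage.Customer AS src
    ON tgt.CustomerId = src.CustomerId
WHEN MATCHED THEN
    UPDATE SET tgt.CustomerName = src.CustomerName,
               tgt.UpdatedDate  = GETDATE(),
               tgt.UpdatedBy    = SUSER_SNAME()
WHEN NOT MATCHED BY TARGET THEN
    INSERT (CustomerId, CustomerName, InsertedDate, InsertedBy)
    VALUES (src.CustomerId, src.CustomerName, GETDATE(), SUSER_SNAME());
go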

I hope these suggestions move you away from reading the transaction log, which can be very tedious indeed.

John

OTHER TIPS

You can use the sys.fn_dblog() function, described in Kalen Delaney's SQL Server xxx Internals book, to dump out log entries:

-- Use adventure works
USE AdventureWorks2012;
go

-- Dump (ins, del, upd) rows
SELECT *
FROM sys.fn_dblog(NULL,NULL)
WHERE Operation IN ('LOP_DELETE_ROWS', 'LOP_INSERT_ROWS','LOP_MODIFY_ROW');
go

However, there are binary fields that contain the inserted and deleted data. Also, updates might only contain partial data.

I did not do much research on this ...

However, as the internals book explains, fixed-length data is stored first in a row, followed by variable-length data. You will have to parse out each part of the binary blob yourself, and that only covers simple data, not special page types. A query to pull out those binary columns follows.
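
For example, a query like this pulls the raw row images for one table so you can see what you would have to parse (dbo.Customer is a placeholder; for a table with a clustered index the allocation unit name also includes the index name):

SELECT [Current LSN],
       Operation,
       [Transaction ID],
       AllocUnitName,
       [RowLog Contents 0],   -- row image (or the "before" fragment of an update)
       [RowLog Contents 1]    -- the "after" fragment for some updates
FROM sys.fn_dblog(NULL, NULL)
WHERE Operation IN ('LOP_INSERT_ROWS', 'LOP_DELETE_ROWS', 'LOP_MODIFY_ROW')
  AND AllocUnitName LIKE 'dbo.Customer%';
go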

Check out this blog article, which takes a stab at doing exactly that for a fixed example.

If you want to do it dynamically for any table, buy a Commercial Off The Shelf (COTS) product from a company that has spent the time doing the research.

I found this:

SELECT * FROM ::fn_dblog(NULL, NULL);

or see this link: How can I view SQL Server 2005 Transaction log file

or this link: How Do You Decode A Simple Entry in the Transaction Log?

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow