Question

We recently implemented CDC in SQL Server 2008. Execution was fine until last week, but the day before yesterday we got an out-of-memory issue from the server: the temp files created by the CDC process filled the server's hard disk. The temp files are removed as soon as the CDC package finishes, but because the package creates such huge temp files, the out-of-memory issue arises. Do we have a solution for this?

Is it possible to optimize the CDC package so that it consumes less memory for huge updates?


Solution

Your package is using more memory than the server can allocate. When this happens, SSIS creates temporary files on disk to cope with the memory pressure. By default, these files are written to the location given by the BufferTempStoragePath property of your Data Flow Task. If you have not set that property, the files default to the OS's temporary file location, which is typically on the C: drive; filling that drive can destabilize the whole server. That's bad (tm).
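To see where that default spill location is and how much headroom it has, a quick check like the following can help (a minimal sketch; the temp directory it reports is the process's own, which on a default Windows service account is usually on C:):

```python
import shutil
import tempfile

def temp_location_report():
    """Report the OS default temp directory (where SSIS buffer spills land
    when BufferTempStoragePath is unset) and the free space on that volume."""
    temp_dir = tempfile.gettempdir()
    usage = shutil.disk_usage(temp_dir)
    free_gb = usage.free / (1024 ** 3)
    return temp_dir, free_gb

path, free_gb = temp_location_report()
print(f"Default temp location: {path} ({free_gb:.1f} GB free)")
```

If the reported volume is the system drive with little free space, that alone is a strong argument for pointing BufferTempStoragePath at a dedicated data drive.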

When this occurs, you have four options available to you.

  1. Set the BufferTempStoragePath property explicitly on each of your Data Flow Tasks to point at a drive with enough space. This is usually the least impactful change to your process, but it does not address the root cause of paging to disk.
  2. Rework your overall package so that no parallel operations are running while this Data Flow executes.
  3. Lower the DefaultBufferMaxRows property to allow fewer rows into the pipeline at a time, which should result in less memory per buffer and therefore less paging.
  4. Rework your data flow to use fewer asynchronous transformations. This article is a good starting point to identify the memory-intensive transformations.

Reference on buffer locations: What is the actual use of buffer temp and blob temp in SSIS?

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow