Question

I have the following batch insert query running on MSSQL 2012:

WHILE 1<2 --busy loop until user stops the query
BEGIN

    DECLARE @batch int = 200000

    BEGIN TRANSACTION

    WHILE @batch > 0
    BEGIN

        DECLARE @hourRand int = CONVERT(int, 60*RAND())
        DECLARE @minRand int = CONVERT(int, 60*RAND())
        --...more DECLAREs... --

        INSERT INTO dbo.details (COLUMN1, COLUMN2, ...) VALUES (@hourRand, @minRand, ...)

        SET @batch = @batch - 1
    END

    COMMIT

END

When I leave this query running, SQL Server's memory usage grows continuously. Why would this loop cause memory growth? Are the inserted rows being kept in some kind of cache or buffer that's taking up memory? If so, how can I free the memory that's being used?

I'm aware that SQL grows its memory pool as needed, but my queries begin to hang when the server's memory usage approaches 98%, so I do not think it's simply the memory pool being large. SQL appears to be actually "using" most of the memory it's holding onto.

Restarting the server frees the memory as expected, but I can't afford to let the server run out of memory regularly.

Thank you for the help!


Solution

There is no issue here. All database servers make extensive use of caching and do not release memory unless required, e.g. when the available free RAM falls below some threshold. The server will not run out of RAM. Trying to "free" the memory will only degrade performance, as the server will have to read the data back from storage.
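If other processes on the machine genuinely need that RAM, the usual approach is to cap the server's memory rather than try to free it by hand. A minimal sketch, assuming a cap of 8192 MB suits your hardware (the number is only an example):

-- Cap how much memory SQL Server will keep for its buffer pool
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 8192;  -- example value; leave headroom for the OS and other processes
RECONFIGURE;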

As for the hanging queries, the more likely cause is that you are issuing 200K INSERT statements inside a single transaction, so you are probably experiencing blocking or deadlock issues.
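To confirm whether blocking is what makes the other queries hang, one quick check (a generic sketch, not specific to this workload) is to look at the active requests and what they are waiting on:

-- Active requests, their wait state, and the session blocking them (if any)
SELECT session_id,
       status,
       command,
       wait_type,
       wait_time,
       blocking_session_id
FROM sys.dm_exec_requests
WHERE session_id > 50;  -- skip most system sessions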

This is an artificial example that doesn't reflect how the server behaves with real batch operations; batches are never 200K statements long. If you want to import 200K rows, use BULK INSERT on the server side or SqlBulkCopy on the client.

SQL Server is used in data warehouse applications where a typical import is millions of rows. The trick is to use the proper tools for the job, e.g. BULK INSERT, SSIS, and proper use of staging databases.
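As an illustration, a server-side bulk load from a flat file could look like the sketch below; the file path and the WITH options are placeholders, only dbo.details comes from the question:

-- Load a delimited file straight into the target table, committing in batches
BULK INSERT dbo.details
FROM 'C:\import\details.csv'        -- hypothetical path on the server
WITH (
    FIELDTERMINATOR = ',',          -- column delimiter in the file
    ROWTERMINATOR   = '\n',         -- row delimiter
    BATCHSIZE       = 50000,        -- commit every 50,000 rows
    TABLOCK                         -- table lock; can allow minimal logging depending on recovery model
);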

Other tips

Because you are doing all this work inside a single transaction.

Either COMMIT after each INSERT or, better yet, remove your BEGIN TRANSACTION and COMMIT entirely so each INSERT autocommits.

You are trying to insert 200,000 records.

Initially you were committing after every record and the insert was very slow. Then you moved the commit to after the end of all the inserts.

Try keeping a counter of the number of records inserted; commit once this counter reaches 10,000 and reset it to 0.

DO NOT FORGET TO COMMIT AT THE END. This should ease your memory problems and the insert should also be faster. Keep adjusting the figure of 10,000 until you are comfortable with the speed of the insert and the memory usage.
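A sketch of the question's loop reworked along these lines (only the two columns shown in the question are kept, the extra DECLAREs and columns are omitted, and 10,000 is just a starting point):

WHILE 1 < 2  -- busy loop until the user stops the query
BEGIN
    DECLARE @batch int = 200000
    DECLARE @sinceCommit int = 0

    BEGIN TRANSACTION

    WHILE @batch > 0
    BEGIN
        DECLARE @hourRand int = CONVERT(int, 60*RAND())
        DECLARE @minRand int = CONVERT(int, 60*RAND())

        INSERT INTO dbo.details (COLUMN1, COLUMN2) VALUES (@hourRand, @minRand)

        SET @batch = @batch - 1
        SET @sinceCommit = @sinceCommit + 1

        -- commit every 10,000 rows and open a fresh transaction
        IF @sinceCommit >= 10000
        BEGIN
            COMMIT
            BEGIN TRANSACTION
            SET @sinceCommit = 0
        END
    END

    COMMIT  -- final commit for any remaining rows
END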
