Question

I am trying to execute a large SQL script which contains about 1,000,000 simple UPDATE queries.

The total size of this script file is about 100 MB.

When I run this script, I get an out-of-memory exception.

When I split the file into chunks of 10 MB, I can run each of them.

However, for convenience I would like to have only one script that I can run in a single pass. Is there any statement I can introduce so that SQL Server releases allocated memory after running each query, so I can execute this large script all at once?


Solution

If you have not done so already, insert a GO every thousand statements or so. Otherwise the whole file is one large batch. SQL Server calculates a single execution plan per batch, which may be pushing you up against resource limits.
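Since editing a 100 MB file by hand is impractical, the GO separators can be inserted mechanically. A minimal sketch, assuming each UPDATE statement ends with a semicolon on its own line (adjust the predicate to your file's actual layout):

```python
# Sketch: insert a GO batch separator after every N statements in a large
# one-statement-per-line script. Assumes each statement ends with ";" on
# its own line; tweak the end-of-statement check for other layouts.

def insert_go(lines, batch_size=1000):
    """Yield the input lines, adding a 'GO' line after every batch_size statements."""
    statements = 0
    for line in lines:
        yield line
        if line.rstrip().endswith(";"):
            statements += 1
            if statements % batch_size == 0:
                yield "GO"

if __name__ == "__main__":
    # Hypothetical input: 2,500 simple UPDATEs -> GO after the 1000th and 2000th.
    script = [f"UPDATE t SET col = {i} WHERE id = {i};" for i in range(2500)]
    batched = list(insert_go(script, batch_size=1000))
    print(batched.count("GO"))  # -> 2
```

In practice you would stream the file (`for line in open(...)`) rather than hold it in a list, so the splitter itself stays within memory limits.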

You might run into another kind of resource problem if you run the whole file in a single transaction: the larger the transaction, the more disk space your transaction log file will need to complete the processing of your file.

Other tips

UPDATED: Open a Windows command prompt (Run → cmd) and run the following:

sqlcmd -S YOURSQLSERVER\INSTANCENAME -i c:\your_script_file.sql -o c:\your_log_file.log

I've noticed what appears to be a memory leak in SSMS 2012 that seems to be related to the results window. Try putting `SET NOCOUNT ON` at the top of your script to keep the results pane from filling up with "1 row(s) affected" messages. Alternatively, is there a way to write your UPDATE statements as one set-based statement, so you're not doing a constant back-and-forth between the server and the client?
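On that last point: when each UPDATE sets one column by key, the per-row statements can be collapsed into a single set-based UPDATE that joins against a VALUES list. A sketch of generating such a statement; the table and column names (`dbo.t`, `id`, `col`) are placeholders, and real use would need proper quoting or parameterization for non-numeric values:

```python
# Sketch: collapse many single-row UPDATEs into one set-based T-SQL UPDATE
# by joining the target table against an inline VALUES list.
# Table/column names are hypothetical placeholders.

def build_batched_update(pairs):
    """Return one UPDATE applying all (id, new_value) pairs in a single statement."""
    values = ",\n    ".join(f"({i}, {v})" for i, v in pairs)
    return (
        "UPDATE t\n"
        "SET t.col = v.new_value\n"
        "FROM dbo.t AS t\n"
        "JOIN (VALUES\n"
        f"    {values}\n"
        ") AS v(id, new_value) ON v.id = t.id;"
    )

if __name__ == "__main__":
    print(build_batched_update([(1, 10), (2, 20), (3, 30)]))
```

One statement per few thousand rows keeps both the round trips and the per-statement overhead down, while still letting you split the work into manageable transactions.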

License: CC-BY-SA with attribution