Question

I have a PowerShell script that exports data from DBF files, saves it as .SQL files, and then imports those files into MS SQL Server 2016 using the sqlcmd command.

There is one SQL file per table in my SQL database. Each file drops the table, recreates it, and inserts the data into it.
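For reference, each generated file presumably has a drop/recreate/insert shape like the following (the column definitions here are purely illustrative, not taken from the post):

```sql
-- Illustrative skeleton of one generated .SQL file
-- (column names and types are hypothetical)
IF OBJECT_ID('dbo.RITORDER', 'U') IS NOT NULL
    DROP TABLE dbo.RITORDER;

CREATE TABLE dbo.RITORDER (
    OrderId  int         NOT NULL,
    ItemCode varchar(15) NULL
    -- ... remaining columns exported from the DBF file
);

-- followed by one insert statement per DBF row
```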

Most of the SQL files are small; however, three of them are 14MB, 26MB and 30MB. Importing these files fails with a memory error:

Msg 701, Level 17, State 123, Server SQL\INSTANCE02, Line -22204 There is insufficient system memory in resource pool 'internal' to run this query.

All the SQL files consist of rather short insert statements. For example, the 30MB file contains 150,610 insert statements like this one:

insert into [RITORDER] values(1668556,'   ',7452,6811,0,'F8429E4K4W',63267840,63269279,0,' ',0,0,'010203              ','',63267840,63269279,0,0,0,'      ',0,'               ',0,0,'','               ');
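One detail worth noting (not stated in the post): if the generated file contains no `GO` lines, sqlcmd submits the entire 30MB file to the server as a single batch, which is a common trigger for Msg 701. A sketch of one workaround is to emit a `GO` batch separator every so many statements when generating the file, so each batch stays small (the interval of 1000 below is an arbitrary choice):

```sql
-- Same insert statements as before, but with a batch separator
-- inserted every 1000 statements:
insert into [RITORDER] values(1668556, /* ... */);  -- statements 1..1000
GO
insert into [RITORDER] values(1668557, /* ... */);  -- statements 1001..2000
GO
-- ... and so on for the remaining rows
```

`GO` is not T-SQL; it is a batch separator recognized by sqlcmd (and SSMS), so adding it changes how the file is sent to the server without changing the data that is inserted.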

On SQL\INSTANCE02 I had Maximum Server Memory set to 5000MB, so I increased it to 8000MB.

After raising Maximum Server Memory by ~3GB, the 14MB file imports successfully, while the other two larger files still fail with the 701 error.

Since SQL\INSTANCE02 hosts a couple of small databases, I would rather not raise the Maximum Server Memory setting for this instance any further.

Is there a way I can tweak the command so that the three larger files also import?

The command I use is:

sqlcmd -S 'SQL\INSTANCE02' -U 'user' -P 'thisisnotmypassword' -d 'dbname' -i $ExportDir\$($file[1])

Or should I use a whole different method instead?


Solution
