Question

I'm importing CSV data into SQL Server via an nvarchar(max) parameter in a stored procedure and then splitting it using a function. I can't use BULK INSERT for various reasons.

It's likely that these strings could get very large indeed.

Is there an upper limit to what I can pass through an SP call before things get hairy? I can split the string into batches easily but I'd like to know how small to make my batches.


Solution

From the nvarchar documentation (http://technet.microsoft.com/en-us/library/ms186939.aspx):

Variable-length Unicode string data. n defines the string length and can be a value from 1 through 4,000.

max indicates that the maximum storage size is 2^31-1 bytes (2 GB).

The storage size, in bytes, is two times the actual length of data entered + 2 bytes. The ISO synonyms for nvarchar are national char varying and national character varying.

Simply put, an nvarchar(max) parameter can carry up to 2 GB of data. Since each character costs 2 bytes (plus 2 bytes of overhead), that works out to roughly one billion characters, so a parameter-size limit is very unlikely to be what forces your batch size.
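As a sketch of the pattern the question describes, here is a hypothetical procedure that accepts the whole CSV payload in one nvarchar(max) parameter and splits it. The table name `dbo.ImportedValues` and procedure name are made up for illustration; `STRING_SPLIT` is a built-in function but requires SQL Server 2016 or later, so on older versions you would substitute your own splitting function as the question assumes:

```sql
-- Hypothetical import procedure: the @csv parameter is nvarchar(max),
-- so it can accept up to 2 GB of data (~1 billion characters).
CREATE PROCEDURE dbo.ImportCsvValues
    @csv NVARCHAR(MAX)
AS
BEGIN
    SET NOCOUNT ON;

    -- STRING_SPLIT (SQL Server 2016+) returns one row per delimited item
    -- in a single column named "value".
    INSERT INTO dbo.ImportedValues (ItemValue)
    SELECT LTRIM(RTRIM(value))
    FROM STRING_SPLIT(@csv, ',');
END;
```

In practice, long before you hit the 2 GB ceiling you are more likely to be constrained by memory and by how efficiently your splitting function handles very long strings, which is a reason to batch anyway.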

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow