Question

Here is my predicament.

Basically, I need a column in a table to hold an unknown number of characters. I was curious whether performance problems could arise in SQL Server from using a VARCHAR(MAX) or NVARCHAR(MAX) column. For example: this time I only need to store 3 characters, and most of the time I only need to store 10, but there is a small chance it could be up to a couple of thousand characters in that column, or possibly even a million; it is unpredictable. I can, however, guarantee that it will not go over the 2 GB limit.

I was just curious whether there are any performance issues, or whether there are better ways of solving this problem.


Solution

Sounds to me like you plan to use the varchar(MAX) data type for its intended purpose.

When data in a MAX data type exceeds 8 KB, an overflow page is used. SQL Server 2005 automatically assigns an overflow indicator to the page and knows how to manipulate data rows the same way it manipulates other data types.
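To make that concrete, here is a minimal sketch (the table and column names are made up for illustration): the column is declared once, and both a tiny value and a very large one are inserted the same way, with SQL Server deciding where each value is physically stored.

```sql
-- Hypothetical table: one varchar(max) column holds values of any size.
CREATE TABLE dbo.Comments
(
    CommentId int IDENTITY(1,1) PRIMARY KEY,
    Body      varchar(max) NOT NULL
);

-- A 3-character value stays in-row; a value well past 8 KB spills to
-- overflow/LOB pages automatically. Queries look identical either way.
INSERT INTO dbo.Comments (Body) VALUES ('abc');
INSERT INTO dbo.Comments (Body)
VALUES (REPLICATE(CAST('x' AS varchar(max)), 100000));
```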

For further reading, check out Books Online: char and varchar

OTHER TIPS

You cannot create indexes on varchar(max) (or nvarchar(max)) columns, although they can be included in an index as non-key columns. (But who would include a column that could grow to 2 GB in an index?!) So if you want to search on this value, you will do a scan each time unless you use full-text indexes; a quick sketch of the index restriction follows below.

Also, remember that any report designer or presentation designer (web or otherwise) must assume that someone might put the encyclopedia into that column and design around it. Nothing is worse than hearing "the users probably won't do X". If a user can do it, they will do it. If a user can put a tome into a column, at some point they will. If they never should, then IMO it makes more sense to cap the column size at some reasonable level; if a user then tries to stuff more into that column than is allowed, it elicits a discussion of whether they should be entering that value into that column in the first place.
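For illustration, a quick sketch of that index restriction, using a hypothetical table: varchar(max) is rejected as an index key column, but it can appear in the INCLUDE list of a nonclustered index.

```sql
CREATE TABLE dbo.Notes
(
    NoteId int IDENTITY(1,1) PRIMARY KEY,
    Title  varchar(200) NOT NULL,
    Body   varchar(max) NULL
);

-- Not allowed: varchar(max) is invalid as an index key column.
-- CREATE INDEX IX_Notes_Body ON dbo.Notes (Body);

-- Allowed: Body as an included (non-key) column of a nonclustered index.
CREATE INDEX IX_Notes_Title ON dbo.Notes (Title) INCLUDE (Body);
```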

I just saw this article the other day. It documents a fairly minor performance lag for varchar(max) compared to a varchar(n) column; probably not enough to make a difference for you. But if it does, perhaps you can use a separate table to store those few large text blocks. Your small text could stay in the main table, and you could add a flag field to tell you to look in the new table for the big ones (see the sketch below).
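A rough sketch of that two-table idea, with hypothetical table and column names; the flag tells the application where the full text lives.

```sql
CREATE TABLE dbo.Item
(
    ItemId       int IDENTITY(1,1) PRIMARY KEY,
    ShortText    varchar(100) NULL,       -- covers the common 3-10 character values
    HasLargeText bit NOT NULL DEFAULT 0   -- 1 = full text is in dbo.ItemLargeText
);

-- Side table used only for the rare, very large values.
CREATE TABLE dbo.ItemLargeText
(
    ItemId    int NOT NULL PRIMARY KEY REFERENCES dbo.Item (ItemId),
    LargeText varchar(max) NOT NULL
);
```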

I've seen some problems - particularly with scalar functions (but these are generally horrible anyway) that return varchar(MAX) and then aren't re-cast. For instance, say you have a scalar function CleanString(somevarcharmax) that returns varchar(max) and you call it on a varchar(10) column but don't CAST(CleanString(varchar10col) AS varchar(10)) - nasty performance issues. A sketch of the pattern follows below.
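A sketch of that pattern, with a trivial placeholder body for CleanString and hypothetical table/column names, showing the explicit CAST back down to the column's declared size.

```sql
CREATE FUNCTION dbo.CleanString (@input varchar(max))
RETURNS varchar(max)
AS
BEGIN
    -- Placeholder body: the real cleanup logic is whatever you need.
    RETURN LTRIM(RTRIM(@input));
END;
GO

-- Without the CAST, the expression is typed varchar(max) even though the
-- underlying column is only varchar(10); casting narrows it back down.
SELECT CAST(dbo.CleanString(t.varchar10col) AS varchar(10)) AS CleanedValue
FROM dbo.SomeTable AS t;
```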

But typically, when you have varchar(max) columns in a table, you shouldn't be performing those kinds of operations en masse, so I'd say if you are using it properly for your data needs in the table, then it's fine.

Crystal Reports 12 (and other versions, as far as I know) doesn't handle varchar(max) properly and interprets it as varchar(255), which leads to truncated data in reports.

So if you're using Crystal Reports, that's a disadvantage of varchar(max). Or a disadvantage of using Crystal, to be precise.

See:
http://www.crystalreportsbook.com/Forum/forum_posts.asp?TID=5843&PID=17503
http://michaeltbeeitprof.blogspot.com/2010/05/crystal-xi-and-varcharmax-aka-memo.html

No, varchar(max) adjusts its storage to the size of each entry, so it is the most efficient choice if your inputs vary widely in size.
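A quick way to see that (just a sketch): DATALENGTH reports the bytes actually stored for a value, which tracks the value's real size rather than the declared maximum.

```sql
DECLARE @v varchar(max) = 'abc';
SELECT DATALENGTH(@v) AS BytesStored;  -- 3, not anywhere near the 2 GB ceiling
```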

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow