Question

When I'm creating a database table (in my case it's usually ArcGIS geodatabases, but I expect the issues are the same across all databases) I often find it difficult to accurately judge the required length of a text field, so I overestimate it (50, or 100). Will this significantly affect the performance of, and the disk space used by, the database? And are there any lengths that are particularly efficient (like 8, 16, 32, 64)?

Or should I continue to make my databases flexible?

Solution

Strings are not really memory intensive: each character usually takes only 2 bytes (enough to support every language). The main reason to prefer shorter strings over longer ones is when you perform certain operations on them, because a string is essentially just an array of characters, and bigger strings mean bigger operations.

So in general, reserving 100 characters for a string results in 200 bytes, which is 0.2 KB, and with 1000 such strings this is less than 0.2 MB. This shouldn't really be a problem unless you are storing everything in RAM on a small machine with 256 MB of RAM.
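As a rough sanity check, here is a minimal Python sketch of that arithmetic. The 2-bytes-per-character figure is the assumption from above (e.g. a UTF-16-style encoding); real databases often use variable-width encodings and variable-length storage, so treat this as a ceiling, not an exact figure:

```python
# Back-of-the-envelope storage estimate for a text column.
# Assumes a fixed 2 bytes per character; actual on-disk formats vary.
BYTES_PER_CHAR = 2

def estimated_size_mb(field_length: int, row_count: int) -> float:
    """Estimated size in megabytes for one text column across all rows."""
    total_bytes = field_length * BYTES_PER_CHAR * row_count
    return total_bytes / (1024 * 1024)

print(estimated_size_mb(100, 1))     # one 100-char field: ~0.0002 MB
print(estimated_size_mb(100, 1000))  # 1000 rows: ~0.19 MB
```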

Answer: not a very big effect on size (realistically speaking), and a modest effect on performance (measured in nanoseconds), because iterating over each character, for example to compare one long string with another, can be relatively expensive. But it's nothing to worry about until you are comparing thousands of strings with thousands of other strings.
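To get a feel for how comparison cost scales with string length, here is a minimal timing sketch in Python. The exact numbers will vary by machine and interpreter; it only illustrates the general trend that comparing longer equal strings takes longer, since every character must be checked:

```python
import timeit

# Compare equal strings of different lengths; an equality check must
# scan every character when the strings match, so cost grows with length.
for length in (8, 100, 10_000):
    a = "x" * length
    b = "x" * length
    t = timeit.timeit(lambda: a == b, number=1_000_000)
    print(f"length {length:>6}: {t:.3f} s per million comparisons")
```

Even at 10,000 characters the per-comparison cost stays in the nanosecond-to-microsecond range, which is why this only matters once you multiply it by many thousands of comparisons.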

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow