Question

What is the underlying data structure of datetime values stored in SQL Server (2000 and 2005, if different)? That is, what is the representation down to the byte level?

Presumably the default representation you get when you select a datetime column is culture-specific and subject to change. That is, some underlying structure that we don't see is being formatted as YYYY-MM-DD HH:MM:SS.mmm.

The reason I ask is that there's a generally held view in my department that it's stored in memory literally as the string YYYY-MM-DD HH:MM:SS.mmm, but I'm sure this isn't the case.


Solution

It's stored as an 8-byte field, capable of representing a range from 1753-01-01 through 9999-12-31, accurate to 0.00333 seconds.

The details are supposedly opaque, but most resources (1), (2) that I've found on the web state the following:

The first 4 bytes store the number of days since SQL Server's epoch (1 January 1900), and the second 4 bytes store the number of ticks after midnight, where a "tick" is 1/300 of a second (approximately 3.33 milliseconds).
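
A quick way to check this from T-SQL is to cast a datetime to binary(8) and split the two 4-byte halves back out as integers. This is only a sanity check against an undocumented layout, and the example date below is arbitrary:

    -- The epoch itself is all zero bytes: day 0, tick 0
    SELECT CAST(CAST('19000101' AS datetime) AS binary(8));
    -- 0x0000000000000000

    -- Split an arbitrary datetime into its day and tick halves
    DECLARE @d datetime;
    SET @d = '20050615 12:00:00.000';

    SELECT
        CAST(@d AS binary(8))                               AS raw_bytes,
        CAST(SUBSTRING(CAST(@d AS binary(8)), 1, 4) AS int) AS days_since_1900,
        CAST(SUBSTRING(CAST(@d AS binary(8)), 5, 4) AS int) AS ticks_after_midnight;
    -- Expected: days_since_1900 = 38516, ticks_after_midnight = 12960000
    -- (noon = 43,200 seconds x 300 ticks per second)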

The first four bytes are signed (can be positive or negative), which explains why dates earlier than the epoch can be represented.
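
The same cast shows the sign in action: a date one day before the epoch comes back with -1 (0xFFFFFFFF) in the first four bytes, and the minimum date works out to day -53690:

    -- 1899-12-31, one day before the epoch: day = -1, ticks = 0
    SELECT CAST(CAST('18991231' AS datetime) AS binary(8));
    -- 0xFFFFFFFF00000000

    -- 1753-01-01, the minimum datetime: day = -53690
    SELECT CAST(CAST('17530101' AS datetime) AS binary(8));
    -- 0xFFFF2E4600000000 (0xFFFF2E46 is -53690 as a signed 32-bit integer)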
