Question

I have a variable that I want to return with a maximum of 5 digits to the left of the decimal point and always 3 to the right:

CONVERT(VARCHAR(14),CONVERT(DECIMAL(8,3),
        DATEDIFF(MILLISECOND,@ProcessTime,GETDATE()))/@OneThousand)

The variables are defined as:

DECLARE @ProcessTime DATETIME 
DECLARE @OneThousand DECIMAL(8,3)
SET @OneThousand = 1000.000

This always returns something like

0.003000000000

I guess I can solve my problem with a LEFT function, but my question is why this happens at all. If the scale of the decimal variable @OneThousand is defined as 3, shouldn't the return value have 3 decimals?


Solution

Dividing a DECIMAL(8,3) by a DECIMAL(8,3) gives a result of type DECIMAL(20,12). To see this:

DECLARE @ProcessTime DATETIME = GETDATE()
DECLARE @OneThousand DECIMAL(8,3)
SET @OneThousand = 1000.000

DECLARE @v SQL_VARIANT

-- Capture the result of the division so its runtime type can be inspected
SET @v = CONVERT(DECIMAL(8,3),
        DATEDIFF(MILLISECOND, @ProcessTime, GETDATE()))/@OneThousand

SELECT
    CAST(SQL_VARIANT_PROPERTY(@v, 'BaseType') AS VARCHAR(30)) AS BaseType,
    CAST(SQL_VARIANT_PROPERTY(@v, 'Precision') AS INT) AS [Precision],
    CAST(SQL_VARIANT_PROPERTY(@v, 'Scale') AS INT) AS Scale
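
This should report a BaseType of decimal with a Precision of 20 and a Scale of 12.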

The BOL section on precision, scale, and length explains why you get decimal(20,12):

Operation:        e1 / e2
Result precision: p1 - s1 + s2 + max(6, s1 + p2 + 1)
Result scale:     max(6, s1 + p2 + 1)

So in your case

Result precision: 8 - 3 + 3 + max(6, 3 + 8 + 1) = 20
Result scale:     max(6, 3 + 8 + 1)             = 12
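
If you want the displayed value to have exactly 3 decimals, one option (a sketch reusing the variables from your question) is to convert the result of the division back to DECIMAL(8,3) before the VARCHAR conversion:

-- Round the DECIMAL(20,12) division result back to DECIMAL(8,3)
-- before converting to VARCHAR
SELECT CONVERT(VARCHAR(14),
       CONVERT(DECIMAL(8,3),
           CONVERT(DECIMAL(8,3),
               DATEDIFF(MILLISECOND, @ProcessTime, GETDATE()))/@OneThousand))  -- now '0.003' instead of '0.003000000000'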

OTHER TIPS

Try...

SELECT CAST(DATEDIFF(ms, @processDate, GETDATE()) / 1000.000 AS decimal(8,3))
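
Run end to end (a small sketch; the WAITFOR is only there so there is a non-zero elapsed time to measure):

DECLARE @processDate DATETIME = GETDATE()
WAITFOR DELAY '00:00:00.100'   -- pause roughly 100 ms
SELECT CAST(DATEDIFF(ms, @processDate, GETDATE()) / 1000.000 AS decimal(8,3))   -- roughly 0.100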
Licensed under: CC-BY-SA with attribution