Question

Found an interesting problem on SQL Server yesterday; try this:

DECLARE @a REAL = 0.0000
DECLARE @c DECIMAL (18,10) = 20754/3.45 -- this is 6015.6521730000
DECLARE @b DECIMAL (18,10) = 
CASE 
    WHEN 1 > 2 
        THEN @a / 100
    WHEN 1 = 2
        THEN 56
    ELSE @c
END

SELECT @b

It seems to be a problem with precision and with how the CASE expression is compiled. It can easily be remedied by casting the REAL @a to a decimal inside the CASE, but since it's @c we're returning and the other branches can never be hit, it's a strange problem to come across. Does anyone know enough about SQL compilation to explain this?
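The symptom looks exactly like a round trip through REAL: REAL is 4-byte single precision and carries only about 7 significant digits, while @c needs 10 decimal places. A minimal sketch of that suspected mechanism (@d is just an illustrative variable):

DECLARE @d DECIMAL(18,10) = 6015.6521730000

-- Round-tripping through REAL (~7 significant digits) perturbs the
-- low-order digits, matching the precision loss seen above.
SELECT @d AS original,
       CAST(CAST(@d AS REAL) AS DECIMAL(18,10)) AS after_real_roundtrip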


Solution

It's down to Data Type Precedence, as detailed in the documentation:

[CASE] Returns the highest precedence type from the set of types in result_expressions and the optional else_result_expression.

http://msdn.microsoft.com/en-us/library/ms181765(v=sql.100).aspx
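You can see the inferred type directly by asking SQL Server for it. A small diagnostic sketch using SQL_VARIANT_PROPERTY (the literals stand in for @a and @c from the question):

-- REAL sits above DECIMAL in the precedence list, so the CASE is typed
-- REAL even though only the ELSE branch can ever be reached.
SELECT SQL_VARIANT_PROPERTY(
           CASE WHEN 1 > 2 THEN CAST(0 AS REAL)
                ELSE CAST(6015.6521730000 AS DECIMAL(18,10))
           END,
           'BaseType') AS case_result_type;   -- returns 'real'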

You can break it entirely by adding the line

WHEN 1 = 3 THEN GETDATE()
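That works because datetime sits higher still in the precedence list, so the whole CASE is then typed datetime. The same SQL_VARIANT_PROPERTY sketch shows the flip (again with stand-in literals):

SELECT SQL_VARIANT_PROPERTY(
           CASE WHEN 1 > 2 THEN CAST(0 AS REAL)
                WHEN 1 = 3 THEN GETDATE()   -- datetime now wins the precedence contest
                ELSE CAST(6015.6521730000 AS DECIMAL(18,10))
           END,
           'BaseType') AS case_result_type;   -- returns 'datetime'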

There's a more detailed explanation here.

I imagine the compiler assumes that all cases may be possible, so it has to fix the result type up front (your example is of course deliberately perverse :) )

Other tips

This is a wrong answer; see podiluska's answer for the right one.

The type of the result of the CASE statement is the type of the first THEN clause. So:

CASE 
    WHEN 1 > 2 
        THEN @a / 100

This makes the CASE result take the type of @a, a REAL. Change it to:

CASE 
    WHEN 1 > 2 
        THEN cast(@a as decimal(18,10)) / 100

To get the result as decimal(18,10).
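Putting that fix back into the original script, as a sketch: with the cast in place, every branch is decimal or int, so nothing is squeezed through REAL any more and @c survives intact.

DECLARE @a REAL = 0.0000
DECLARE @c DECIMAL (18,10) = 20754/3.45
DECLARE @b DECIMAL (18,10) =
CASE
    WHEN 1 > 2
        THEN cast(@a as decimal(18,10)) / 100   -- the only REAL branch, now decimal
    WHEN 1 = 2
        THEN 56
    ELSE @c
END

SELECT @b   -- returns @c at full decimal precision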

License: CC-BY-SA with attribution
Not affiliated with Stack Overflow