Question

I had assumed the first call below was more efficient, because it only makes this check once to see if @myInputParameter is less than 5000.

If the check fails, I avoid a query altogether. However, I've seen other people code like the second example, saying it's just as efficient, if not more.

Can anyone tell me which is quicker? It seems like the second one would be much slower, especially if the call is combing through a large data set.

First call:

IF (@myInputParameter < 5000)
    BEGIN
        SELECT 
            @myCount = COUNT(1) 
        FROM myTable 
        WHERE someColumn=@someInputParameter
            AND someOtherColumn='Hello'

        --and so on
    END

Second call:

SELECT 
    @myCount = COUNT(1) 
FROM myTable 
WHERE someColumn=@someInputParameter
    AND someOtherColumn='Hello'
    AND @myInputParameter < 5000

--and so on

Edit: I'm using SQL Server 2008 R2, but I'm really asking to get a feel for which query is "best practice" for SQL. I'm sure the difference in query-time between these two statements is a thousandth of a second, so it's not THAT critical. I'm just interested in writing better SQL code in general. Thanks


Solution 2

SQL Server is well designed. It will evaluate literal predicates without actually scanning the table/indexes if it doesn't need to, so I would expect the two to perform essentially identically.

From a practice standpoint, I think one should use the IF statement, especially if it wraps multiple statements. But this really is a matter of preference to me. For me, code that is never executed is logically faster than code that "should" execute without actually touching the data.
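For example, when several statements depend on the same guard, a single IF keeps all of them from running when the check fails. A minimal sketch using the question's hypothetical table and columns (the UPDATE is an invented follow-up statement, just to show the guard covering more than one statement):

```sql
IF (@myInputParameter < 5000)
BEGIN
    -- Neither statement below runs when the guard fails
    SELECT @myCount = COUNT(1)
    FROM myTable
    WHERE someColumn = @someInputParameter
        AND someOtherColumn = 'Hello';

    -- Hypothetical second statement protected by the same guard
    UPDATE myTable
    SET someOtherColumn = 'Processed'
    WHERE someColumn = @someInputParameter;
END
```

With the predicate-in-WHERE style, each of those statements would need its own `AND @myInputParameter < 5000` clause instead.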

Also, there is the possibility that SQL Server could create a bad plan and actually hit the data. I've never seen this specific scenario with literals, but I have had bad execution plans created.

OTHER TIPS

Sometimes, SQL Server is clever enough to transform the latter into the former. This manifests itself as a "startup predicate" on some plan operator, such as a filter or a loop join, which is evaluated before any rows are read, so the query completes in small, constant time. I just tested this, btw.

You can't rely on this for all queries, but once you've verified through testing that it works for a particular query, I'd rely on it.

If you use OPTION (RECOMPILE), this becomes even more reliable, because that hint inlines the parameter values into the query plan, causing the entire query to collapse into a constant scan when the predicate is false.
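A sketch of the predicate form with that hint applied, using the same hypothetical table and columns from the question:

```sql
SELECT @myCount = COUNT(1)
FROM myTable
WHERE someColumn = @someInputParameter
    AND someOtherColumn = 'Hello'
    AND @myInputParameter < 5000
OPTION (RECOMPILE); -- parameter values are embedded at compile time,
                    -- so when @myInputParameter >= 5000 the plan can
                    -- reduce to a constant scan and skip the table entirely
```

The trade-off is that OPTION (RECOMPILE) forces a fresh compilation on every execution, so it's best reserved for queries where the plan quality matters more than the compile cost.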

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow