Question

I created the following table and inserted 20,000 records by executing the following statements:

create table Testing
(
col1 int,
col2 varchar(50),
col3 bit,
col4 int,
col5 varchar(50),
col6 bit);

declare 
@flag bit = 1,
@count int

while @flag = 1
    begin
        set @count = (Select count(*) from Testing);
        if (@count = 20000)
            begin
                set @flag = 0;
            end
        else
            begin
                insert into Testing values(100, 'Testing', 1, 100, 'Testing', 1)
            end
    end
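
As an aside, this loop inserts one row per iteration and recounts the table every time; a single set-based insert fills the table in one statement. A sketch (using sys.all_objects purely as a row source, which is not part of the original script):

insert into Testing (col1, col2, col3, col4, col5, col6)
select top (20000) 100, 'Testing', 1, 100, 'Testing', 1
from sys.all_objects as a
cross join sys.all_objects as b;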

And then executed the following query 6 times:

select * from testing
where col2 = 'Testing'

Profiler shows 276 reads for the first execution and 135 for each of the remaining five.


I don't understand why the first execution requires so many more reads, or how to reduce the reads for that first execution.

Note: I executed DBCC DROPCLEANBUFFERS; DBCC FREEPROCCACHE; before each execution.
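
If you want to reproduce the measurement without Profiler, SET STATISTICS IO reports the reads per statement directly in the session. A minimal sketch of the harness described above:

DBCC DROPCLEANBUFFERS;  -- flush data pages from the buffer pool
DBCC FREEPROCCACHE;     -- flush cached query plans
SET STATISTICS IO ON;   -- report logical/physical reads per statement

select * from Testing
where col2 = 'Testing';

SET STATISTICS IO OFF;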


Solution

I see a similar pattern here. In my case the first execution shows 226 reads and each subsequent execution 105.

The first time you execute the query after creating and populating the table, SQL Server automatically creates statistics on col2 in order to get an estimated number of rows for the predicate.

If you add the SP:StmtCompleted event to the trace, you can see that this statistics creation is responsible for the extra reads in the first batch.

[Profiler trace screenshot]
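
You can also confirm the auto-created statistic from the catalog views; after the first execution a query like the following (a sketch; the auto-generated _WA_Sys_... statistic name will differ per database) returns one row for col2:

select s.name as stats_name, c.name as column_name, s.auto_created
from sys.stats as s
join sys.stats_columns as sc
    on sc.object_id = s.object_id and sc.stats_id = s.stats_id
join sys.columns as c
    on c.object_id = sc.object_id and c.column_id = sc.column_id
where s.object_id = object_id('dbo.Testing')
  and s.auto_created = 1;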

To answer the question "how to decrease the reads for the first time?": technically you could disable automatic statistics creation, but you shouldn't. The overhead of automatic statistics maintenance is generally negligible compared to the benefit of more accurate cardinality estimates and better query plans.
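
For completeness, the switch is a database-level option (shown only to name it; [YourDatabase] is a placeholder, and leaving the option ON is the right default):

-- Not recommended: disables automatic statistics creation database-wide
ALTER DATABASE [YourDatabase] SET AUTO_CREATE_STATISTICS OFF;

-- Restore the default
ALTER DATABASE [YourDatabase] SET AUTO_CREATE_STATISTICS ON;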

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow