Question

I'm trying to select more than 80,000 records in SQL Server from a table that has millions of records. The issue is that I have the correct index, but it takes more than 15 minutes to return the recordset.

I'm using MS SQL Server 2000. I found a pagination method using stored procedures, but it uses a temporary table into which I have to insert the entire result set before selecting the number of records to show per page. This method takes too long.

Any help with a faster way I can implement?


Solution

You'll have to edit this to implement your input parameters for the user's filtering and sort options, but the general principle applies. I used this technique with SQL Server 2000 in the 2000/2001 timeframe against a 90M-row table to deliver fast paging for 150-200k row result sets. Since only the key is in the temp table, it is a very narrow, very small temp table, and performance is fast (and this step only has to read the main table's index, not the table itself). Then, when actually generating data from the main table for the (smaller) return result set (only @PageSize rows), the query only has to read a very few records.

Create Procedure GetPagedData
    @Page     Integer = 1,
    @PageSize Integer = 100
    -- add parameters here for the user's filtering and sort options
As
Set NoCount On

Declare @Start   Integer
Declare @End     Integer
Declare @NumRecs Integer

   -- Declare a temp table variable to hold all the pk values...
   Declare @Keys Table (rowNum Integer Identity(1,1) Not Null Primary Key,
                        keyVal Integer Not Null)

   -- Insert all the primary keys into the temp table variable,
   -- already in the user's requested sort order...
   Insert @Keys (keyVal)
   Select PrimaryKey From MyMillionRowTable
   Where {UsersFilterConditionsAreTrue}
   Order By {UsersSortOptions}

   -- Then, select from your big table only the data
   -- from the rows for the page the user wants

   -- total matching rows (useful for reporting the page count to the caller)
   Select @NumRecs = Count(*) From @Keys
   Set @End = @Page * @PageSize
   Set @Start = @End + 1 - @PageSize

   Select {Insert ColumnListHere}
   From MyMillionRowTable T
       Join @Keys K On K.keyVal = T.PrimaryKey
   Where K.rowNum Between @Start And @End
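
Calling the procedure then looks something like this (the page values below are just illustrative; any filter/sort parameters you add would be passed here as well):

-- Example: fetch the third page of 100 rows
Exec GetPagedData @Page = 3, @PageSize = 100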

OTHER TIPS

There are a few options. The usual DB tuning tricks apply (on which I'm not an expert). Is it possible to extract the subset of data you care about into a smaller table behind the scenes?
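
If extracting a subset is an option, a minimal sketch (table and column names below are hypothetical) is to copy the filtered rows into a work table and index it:

-- Copy only the rows of interest into a smaller work table (hypothetical names)
Select PrimaryKey, Col1, Col2
Into   MySubsetTable
From   MyMillionRowTable
Where  SomeFilterColumn = 'SomeValue'

-- Index the work table so the paging queries against it stay fast
Create Clustered Index IX_MySubsetTable_PK On MySubsetTable (PrimaryKey)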

Is it definitely using the index? Is it a clustered index? Check the execution plan. You may also want to rebuild the index.
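
A quick way to check both on SQL Server 2000 (table and index names below are placeholders for your own):

-- Show the textual execution plan without actually running the query
Set ShowPlan_Text On
Go
-- run the slow query here, then look for Index Seek vs Index Scan in the output
Set ShowPlan_Text Off
Go

-- Rebuild a specific index (SQL Server 2000 syntax; names are placeholders)
DBCC DBREINDEX ('MyMillionRowTable', 'IX_MyIndex')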

Failing that, post the query and the table definition and indexes.

It could also be a limitation of your SQL Server configuration (amount of RAM being used, etc.) or your hardware. Is it a single disk or a RAID? If it's a RAID, has one of the drives failed, causing a background rebuild?

@jmpena: sorry, the execution plan shows 2% for the bookmark lookup and 98% for scanning the index

I think you answered your own question here: "98% scanning the index". Scanning an index just means that SQL Server found an index that may help performance, chosen based on statistics. Depending on the index, the operation can be as bad as a table scan itself.

What you should be seeing, if the index and query were tuned properly, is an index seek.
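
As an illustration only (the filter column name is hypothetical), an index whose leading column matches the WHERE clause is what typically lets the optimizer seek instead of scan:

-- Hypothetical covering index: the filter column leads the key,
-- so the optimizer can seek directly to the matching range
Create Index IX_MyMillionRowTable_Filter
    On MyMillionRowTable (SomeFilterColumn, PrimaryKey)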

Can you share what indexes are on the table? (sp_help {tablename})
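
For reference, the two system procedures that show this information (replace the table name with your own):

-- Full table definition, including indexes and constraints
Exec sp_help 'MyMillionRowTable'

-- Just the indexes and their key columns
Exec sp_helpindex 'MyMillionRowTable'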

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow