Question

I am working on a project for a client using a classic ASP application I am very familiar with, but in his environment it is performing more slowly than I have ever seen in a wide variety of other environments. I'm attacking the slowness on several fronts; however, it has also led me to look at something I've never had to examine before -- this part is more of an "academic" question.

I am curious to understand why a category page with, say, 1800 product records takes roughly 3 times as long to load as a category page with, say, 54 records, when both are set to display 50 products per page. That is, when the number of items to loop through per page is the same, why does the total number of records matched by the query make a difference, given that the number of products displayed is a constant?

Here are the methods used, abstracted to the essential aspects:

query = "SELECT {tableA.fields} FROM tableA, tableB " & _
        "WHERE tableA.key = tableB.key AND {other refining criteria}"

Set rs = Server.CreateObject("ADODB.Recordset")
rs.CacheSize = iPageSize
rs.PageSize = iPageSize
pcv_strPageSize = iPageSize
rs.Open query, connObj, adOpenStatic, adLockReadOnly, adCmdText

Dim iPageCount, pcv_intProductCount
iPageCount = rs.PageCount
If CInt(iPageCurrent) > CInt(iPageCount) Then iPageCurrent = CInt(iPageCount)
If CInt(iPageCurrent) < 1 Then iPageCurrent = 1

If Not rs.EOF Then
    rs.AbsolutePage = CInt(iPageCurrent)
    pcArray_Products = rs.GetRows()   ' copies rows from the current position through EOF into a 2-D array
    pcv_intProductCount = UBound(pcArray_Products, 2) + 1
End If

Set rs = Nothing

tCnt = 0

Do While (tCnt < pcv_intProductCount) And (tCnt < pcv_strPageSize)
    {display stuff}
    tCnt = tCnt + 1
Loop

The recordset is converted to an array via GetRows() and then destroyed; the number of records displayed will always be iPageSize or fewer.

Here's the big question: why, on the initial page load, does it take significantly longer to loop through one page's worth of records (say 50) for the larger record set (~1800 records) than for the smaller one (~54 records)? The loop runs from 0 to 49 either way, but it takes much longer the larger the initial record set/GetRows() array is. Why would looping through the first 50 rows take longer when the loop exits after the same number of iterations?

Running MS SQL Server 2008 R2 Web edition


Solution

You are not actually limiting the number of records returned. rs.PageSize and rs.AbsolutePage only position the cursor; GetRows() called with no arguments copies every remaining record, from the current position through EOF, into the array. The large category therefore pulls roughly 1800 rows across the wire and into memory where the small one pulls about 54 -- roughly 33 times as many -- and that transfer is what you are paying for before your display loop even starts. Change the query to limit the records directly rather than retrieving all of them and terminating your loop after the first 50.
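As a stopgap that does not touch the SQL, GetRows accepts an optional row count as its first argument, so you can at least cap the copy at one page. A minimal sketch against the code from the question (the query still returns everything; only the array is limited):

If Not rs.EOF Then
    rs.AbsolutePage = CInt(iPageCurrent)
    ' Copy at most one page of rows instead of everything through EOF
    pcArray_Products = rs.GetRows(iPageSize)
    pcv_intProductCount = UBound(pcArray_Products, 2) + 1
End If

The real fix is still to page in the database.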

Try this:

SELECT *
FROM (
    SELECT tableA.*, ROW_NUMBER() OVER (ORDER BY tableA.key) AS RowNum
    FROM tableA
    INNER JOIN tableB ON tableA.key = tableB.key
    WHERE {other refining criteria}
) AS ResultSet
WHERE RowNum BETWEEN 1 AND 50

Note that the inner select uses tableA.* rather than a bare *: selecting key from both tables would give the derived table duplicate column names, which SQL Server rejects. For page n, the window is BETWEEN (n - 1) * 50 + 1 AND n * 50.
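For completeness, a sketch of driving that query from the ASP side with a parameterized ADODB.Command. The iPageCurrent, iPageSize, and connObj names are carried over from the question; the parameter names and the EOF guard are illustrative, and {other refining criteria} remains your placeholder:

Dim cmd, iFirstRow, iLastRow
iFirstRow = (CInt(iPageCurrent) - 1) * iPageSize + 1
iLastRow = CInt(iPageCurrent) * iPageSize

Set cmd = Server.CreateObject("ADODB.Command")
Set cmd.ActiveConnection = connObj
cmd.CommandType = adCmdText
cmd.CommandText = "SELECT * FROM (" & _
    "SELECT tableA.*, ROW_NUMBER() OVER (ORDER BY tableA.key) AS RowNum " & _
    "FROM tableA INNER JOIN tableB ON tableA.key = tableB.key " & _
    "WHERE {other refining criteria}) AS ResultSet " & _
    "WHERE RowNum BETWEEN ? AND ?"
cmd.Parameters.Append cmd.CreateParameter("FirstRow", adInteger, adParamInput, , iFirstRow)
cmd.Parameters.Append cmd.CreateParameter("LastRow", adInteger, adParamInput, , iLastRow)

Set rs = cmd.Execute
If Not rs.EOF Then
    pcArray_Products = rs.GetRows()   ' at most iPageSize rows now
    pcv_intProductCount = UBound(pcArray_Products, 2) + 1
End If
Set rs = Nothing

One consequence: rs.PageCount is no longer meaningful once the query returns a single page, so compute the page count from a separate SELECT COUNT(*) with the same criteria (or add COUNT(*) OVER () AS TotalRows to the inner select).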

Also make sure the columns you are joining on (tableA.key and tableB.key) are indexed.
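For instance (hypothetical index names -- adjust to your actual schema):

CREATE INDEX IX_tableA_key ON tableA ([key]);
CREATE INDEX IX_tableB_key ON tableB ([key]);

If key is already the primary key of a table, it is covered by that table's clustered index and needs nothing extra.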

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow