Question

Database administration is not a strong point of mine, so I have a few questions.

This all relates to the performance of a website I'm looking at, and have spent a fair bit of time researching/diagnosing.

  • There are two sites, a test site and a live site.
  • I have noticed page load times are much longer on the live site as opposed to the test site.

  • The specs of the test site are really low: a VPS with 2 virtual processors and 14GB RAM, with SQL Server and the website running on the same VPS.

  • The live website runs on 4 dedicated servers, each with 8 virtual processors and 16GB RAM.

  • The live site uses a separate dedicated server for SQL Server with 16 virtual processors and 112GB RAM.

  • The website averages around 120 concurrent users at any one time, but this can fluctuate from 70 to 150, and when we send out an email campaign it can hit 400. At 400 users the CPUs can, as you would expect, hit 100% depending on what actions users are performing on the website, but they hold up fairly well, as does the RAM.

I would expect the live website to actually perform faster than the test website, but this is not the case. I have looked through the code but I strongly believe it's due to the database. The database size of the live website is over 50GB when backed up, and the test site is 4GB.

Both databases are quite fragmented, and I'm wondering how likely it is that the fragmentation will affect performance.

Also, because the live database is many times larger than the test database, will high fragmentation affect the performance of the live site much more than the test site?

Here is a capture of one of the indexes in the products table. (Screenshot not reproduced.)

Because I have never dealt with this before, what are some tips around rebuilding/reorganizing indexes?

Should I put the website into maintenance mode so no one can interact with the tables while I'm rebuilding/reorganizing indexes?

How long would it take on a table with 12 indexes and 40,000 products?

Thanks


Solution

Based on the data that you have provided, the index has only 67 pages in total. Index fragmentation on such a small table will not affect performance, so I would not worry about index fragmentation for the table you have mentioned.
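You can read page counts and fragmentation for yourself from the DMVs before deciding whether maintenance is worth it. A minimal sketch, assuming the table is named dbo.Products (adjust the database and table names for your schema):

```sql
-- Fragmentation and page count for every index on one table.
-- 'LIMITED' scans only the b-tree parent levels, so it is cheap to run.
SELECT i.name AS index_name,
       ips.avg_fragmentation_in_percent,
       ips.page_count
FROM sys.dm_db_index_physical_stats(
         DB_ID(), OBJECT_ID('dbo.Products'), NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id
 AND i.index_id  = ips.index_id;
```

If page_count comes back in the tens (as here, 67), the fragmentation percentage is effectively noise.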

You should update your table statistics so that SQL Server can generate better query plans.
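Updating statistics is quick on a table this size. A sketch, again assuming a table named dbo.Products:

```sql
-- Refresh statistics for one table, reading every row for accuracy.
UPDATE STATISTICS dbo.Products WITH FULLSCAN;

-- Or refresh across the whole database; this only touches
-- statistics that SQL Server considers out of date.
EXEC sp_updatestats;
```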

I would start my troubleshooting by reading this post:

Stop Worrying About SQL Server Fragmentation

OTHER TIPS

Indexes are bound to fragment over the course of normal usage. What matters is whether the fragmentation is actually affecting performance.

Usually we (as in, my team) do not worry about indexes with page_count < 500. However, there is no hard and fast rule for a particular page_count; in my previous shop we set the limit in our Ola Hallengren maintenance script at a page count of 1,000. That said, for an index with page_count = 67, the reported fragmentation will always tend to be on the high side.

You should not be worried about fragmentation at the current page_count.
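If you do reach a point where an index is large enough to bother with, the commonly cited Books Online guidance is to REORGANIZE between roughly 5% and 30% fragmentation and REBUILD above 30%. A sketch, with a hypothetical index name IX_Products_Name:

```sql
-- Light-weight defragmentation: always online and interruptible.
ALTER INDEX IX_Products_Name ON dbo.Products REORGANIZE;

-- Full rebuild: creates a fresh copy of the index.
-- ONLINE = ON avoids long blocking but requires Enterprise Edition;
-- without it, the rebuild takes locks that block writers.
ALTER INDEX IX_Products_Name ON dbo.Products
REBUILD WITH (ONLINE = ON);
```

On a 40,000-row table either operation should finish in seconds, so a maintenance window is unlikely to be necessary.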

Please have a go at below blogpost from Paul Randal :

Where do the Books Online index fragmentation thresholds come from?

The main difference between your test server and your live server is concurrency.

You haven't run a concurrency test on your test server.

Concurrency is one of the main reasons the live site is slow.
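One quick way to see whether concurrency is hurting you on the live server is to look for blocked sessions while the site is busy:

```sql
-- Sessions currently waiting on another session, with the blocker's id,
-- the wait type, and the SQL text of the blocked request.
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_time,
       t.text AS running_sql
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.blocking_session_id <> 0;
```

If this returns rows regularly under load, blocking (not fragmentation) is the place to dig.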

Look at how the front-end application is written: what is in your connection string (for example, the connection-pool settings), and whether each connection is properly closed after every transaction. This is often the main reason behind slowness.

Then look at the rest of your front-end code.

What queries are in your stored procedures? Are there deadlock issues caused by concurrent users?

Are there situations where concurrent users try to update the same set of rows? How have you handled race conditions?
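As an illustration of handling that race condition on the database side, one common pattern is to take an update lock at read time so two sessions cannot both read the old value and overwrite each other's change (a lost update). A sketch with hypothetical columns ProductId and StockQty:

```sql
BEGIN TRANSACTION;

DECLARE @qty int;

-- UPDLOCK makes the read take an update lock, so a second session
-- running the same code waits here instead of reading a stale value.
SELECT @qty = StockQty
FROM dbo.Products WITH (UPDLOCK)
WHERE ProductId = 42;

UPDATE dbo.Products
SET StockQty = @qty - 1
WHERE ProductId = 42;

COMMIT;
```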

Why are there 12 indexes on a single table with only 40,000 records?

Please post the table structure along with all of its indexes.
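You can generate that index list straight from the catalog views; a sketch, again assuming the table is dbo.Products:

```sql
-- Every index on the table with its type and key columns, in key order.
SELECT i.name       AS index_name,
       i.type_desc,
       c.name       AS column_name,
       ic.key_ordinal
FROM sys.indexes AS i
JOIN sys.index_columns AS ic
  ON ic.object_id = i.object_id
 AND ic.index_id  = i.index_id
JOIN sys.columns AS c
  ON c.object_id  = ic.object_id
 AND c.column_id  = ic.column_id
WHERE i.object_id = OBJECT_ID('dbo.Products')
ORDER BY i.name, ic.key_ordinal;
```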

Licensed under: CC-BY-SA with attribution
Not affiliated with dba.stackexchange