Question

I run an online photography community, and the site slows to a crawl on database access, sometimes hitting timeouts.

I consider myself fairly competent at writing SQL queries and designing tables, but I am by no means a DBA... hence the problem.

Some background:

  1. My site and SQL server are running on a remote host. I update the ASP.NET code from Visual Studio and the SQL via SQL Server Mgmt. Studio Express. I do not have physical access to the server.

  2. All my stored procs (I think I got them all) are wrapped in transactions.

  3. The main table is only 9400 records at this time. I add 12 new records to this table nightly.

  4. There is a view on this main table that brings together data from several other tables into a single view.

  5. The secondary tables have smaller records, but more of them: 70,000 in one, 115,000 in another. These are the comment and rating records for the items in #3.

  6. Indexes are on the most-needed fields, and I have set Auto Recompute Statistics on the big tables.

When the site grinds to a halt, the speed returns if I run code to clear the transaction log, update statistics, rebuild the main view, and rebuild the stored procedure that fetches the comments. I have to do this manually, however.

Sadly, my users get frustrated by these issues and their participation dwindles.

So my question is... in a remote environment, what is the best way to set up and schedule a maintenance plan to keep my SQL database running at its peak?

Thanks so much! KLK


Solution

You do not need to set up your maintenance tasks as a maintenance plan.

Simply create a stored procedure that carries out the maintenance tasks you wish to perform: index rebuilds, statistics updates, and so on.
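A minimal sketch of what such a procedure might look like; the procedure and table names here (dbo.usp_NightlyMaintenance, dbo.Photos, dbo.Comments, dbo.Ratings) are placeholders for your own objects:

    -- Minimal maintenance procedure sketch; table names are placeholders
    -- for your own main, comments and ratings tables.
    CREATE PROCEDURE dbo.usp_NightlyMaintenance
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Rebuild every index on the tables that see the most activity
        ALTER INDEX ALL ON dbo.Photos   REBUILD;
        ALTER INDEX ALL ON dbo.Comments REBUILD;
        ALTER INDEX ALL ON dbo.Ratings  REBUILD;

        -- Refresh out-of-date statistics across the whole database
        EXEC sp_updatestats;
    END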

Then create a job that calls your stored procedure/s. The job can be configured to run on your desired schedule.

To create a job, use the procedure sp_add_job.

To create a schedule, use the procedure sp_add_schedule.
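Putting that together, a sketch of the job setup in msdb might look like the following; the job, step, schedule, and database names are only examples, and it assumes SQL Server Agent is available on your host (it is not included in Express editions):

    USE msdb;

    -- Create the job itself
    EXEC dbo.sp_add_job
        @job_name = N'Nightly DB Maintenance';

    -- Add a step that runs the maintenance procedure
    EXEC dbo.sp_add_jobstep
        @job_name      = N'Nightly DB Maintenance',
        @step_name     = N'Run maintenance proc',
        @subsystem     = N'TSQL',
        @database_name = N'YourDatabase',
        @command       = N'EXEC dbo.usp_NightlyMaintenance;';

    -- Define a daily schedule (start time is HHMMSS)
    EXEC dbo.sp_add_schedule
        @schedule_name     = N'Nightly at 3am',
        @freq_type         = 4,   -- daily
        @freq_interval     = 1,
        @active_start_time = 030000;

    -- Attach the schedule to the job and target the local server
    EXEC dbo.sp_attach_schedule
        @job_name      = N'Nightly DB Maintenance',
        @schedule_name = N'Nightly at 3am';

    EXEC dbo.sp_add_jobserver
        @job_name = N'Nightly DB Maintenance';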

I hope what I have detailed is clear and understandable, but feel free to drop me a line if you need further assistance.

Cheers, John

OTHER TIPS

My gut says you are doing something wrong. It sounds a bit like those stories you hear where some system cannot stay up unless you reboot the server nightly :-)

Something is wrong with your queries; the number of rows you have is almost always irrelevant to performance, and your database is very small anyway. I'm not too familiar with SQL Server, but I imagine it has some pretty sweet query analysis tools. I also imagine it has a way of logging slow queries.

It really sounds like you have a missing index. Sure, you might think you've added the right indexes, but until you verify they are being used, it doesn't matter. Maybe you think you have the right ones, but your queries suggest otherwise.
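If you are on SQL Server 2005 or later, one low-effort way to check is to ask the server itself which indexes it thinks are missing. This is only a sketch using the built-in missing-index DMVs; treat the suggestions as hints to verify, not indexes to create blindly:

    -- Indexes the optimizer wished it had, weighted by estimated impact.
    -- The data resets whenever the server restarts.
    SELECT TOP 20
        d.statement          AS table_name,
        d.equality_columns,
        d.inequality_columns,
        d.included_columns,
        s.user_seeks,
        s.avg_total_user_cost,
        s.avg_user_impact
    FROM sys.dm_db_missing_index_details d
    JOIN sys.dm_db_missing_index_groups g
        ON g.index_handle = d.index_handle
    JOIN sys.dm_db_missing_index_group_stats s
        ON s.group_handle = g.index_group_handle
    ORDER BY s.avg_total_user_cost * s.avg_user_impact * s.user_seeks DESC;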

First, figure out how to log your queries. Odds are very good you've got a killer in there doing some sequential scan that an index would fix.
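You do not necessarily need a separate slow-query log to find it. As a rough starting point, SQL Server keeps execution statistics for cached plans in a DMV, and a query along these lines (a sketch, not tuned for any particular version) surfaces the most expensive statements since the last restart:

    -- Most expensive cached statements by logical reads since the last restart
    SELECT TOP 20
        qs.execution_count,
        qs.total_logical_reads,
        qs.total_elapsed_time / 1000 AS total_elapsed_ms,  -- microseconds to ms
        SUBSTRING(st.text,
                  qs.statement_start_offset / 2 + 1,
                  (CASE WHEN qs.statement_end_offset = -1
                        THEN DATALENGTH(st.text)
                        ELSE qs.statement_end_offset END
                   - qs.statement_start_offset) / 2 + 1) AS statement_text
    FROM sys.dm_exec_query_stats qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
    ORDER BY qs.total_logical_reads DESC;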

Second, you might have a bunch of small queries that are killing it instead. For example, you might have some "User" object that hits the database every time you look up a username from a user_id. Look for spots where you are querying the database a hundred times and replace it with a cache, even if that "cache" is nothing more than a private variable that gets wiped at the end of a request.

Bottom line is, I really doubt it is something misconfigured in SQL Server. I mean, if you had to reboot your server every night because the system ground to a halt, would you blame the system or your code? Same deal here... learn the tools provided by SQL Server; I bet they are pretty slick :-)

That all said, once you accept you are doing something wrong, enjoy the process. Nothing, to me, is more fun than optimizing slow database queries. It is simply amazing that you can take a query with a 10-second runtime and turn it into one with a 50 ms runtime with a single, well-placed index.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow