Question

I have a website (a .org) for a project of mine, running on LAMP on a shared hosting plan.

It started very small, but I have since extended the community to other US states and it's growing fast.

About 4 months ago I had 30,000 or so visits per day and the site was doing fine; today I reached 100,000 visits.

I want to make sure the site loads fast for everyone, and since it's not making any money I can't really move it to a private server (it's volunteer work).

Here's my setup:

- Apache 2
- PHP 5.1.6
- MySQL 5.5

I have 10 pages per state, and on each page people can contribute, write articles, like, share, etc. A few pages can hit 10,000 visits per hour during lunch time; the rest of the day it's quiet.

All databases are set up properly (I personally paid a DBA expert to build them), and I'm pretty sure the code is good too. I could make pages faster with memcached, but the problem is I can't use it since I am on shared hosting.

Will MySQL be able to support that many people, with lots of requests per minute? Or should I start a fund to move to a private server and install all the tools I need to make it fast?

Thanks


The solution

To be honest, there's not much you can do on shared hosting. There's a reason it's cheap ... the host restricts exactly the kind of thing you want to do.

Either you move to a VPS that allows memcache (VPS plans can be cheap) and put up some Google ads to cover it, OR you stay on your shared hosting and use a pre-generated content system.

A VPS can be very cheap (look for coupons), and you can install whatever you want since you are root.

For example, hostmysite.com with the coupon 50OffForLife costs $20 per month for life ... versus $5 for shared hosting ...
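To make the memcache point concrete: on a VPS, a read-through cache in PHP (using the pecl memcache extension) looks roughly like the sketch below. The build_page() helper, the key name, and the five-minute TTL are placeholders of mine, not details from the question.

    $mc = new Memcache();
    $mc->connect('127.0.0.1', 11211);        // local memcached on the default port

    $html = $mc->get('page:1');
    if ($html === false) {                   // cache miss: build and store
        $html = build_page(1);               // hypothetical: runs the SQL, renders HTML
        $mc->set('page:1', $html, 0, 300);   // flag 0, expires in 300 seconds
    }
    echo $html;

On a cache hit the request never touches MySQL at all, which is exactly what takes the lunch-hour pressure off.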

If you want to keep the current hosting, then what you can do is this:

Pages are generated by a process (a cron job, or on the fly) every time someone writes a comment or makes an update. The process fetches all the data for the page and saves it as a static web page.

So let's say you have a page with comments: grab the page content (meta, h1, p, etc.) and the comments, and save both into one file.

Example (using .htaccess, which your answer suggests you are already familiar with):

/topic/1/
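A sketch of what the rewrite rules could look like (the generate.php fallback script is a name I made up for illustration):

    RewriteEngine On

    # If the pre-generated file exists, serve it directly.
    RewriteCond %{DOCUMENT_ROOT}/topic/$1/index.html -f
    RewriteRule ^topic/([0-9]+)/?$ /topic/$1/index.html [L]

    # Otherwise hand the request to a PHP script that builds and saves it.
    RewriteRule ^topic/([0-9]+)/?$ /generate.php?page_id=$1 [L,QSA]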

If the file exists, Apache simply serves it ... if not, the script does something like:

$page     = $pdo->query('SELECT * FROM pages WHERE page_id = 1')->fetch();      // $pdo: your DB handle
$comments = $pdo->query('SELECT * FROM comments WHERE page_id = 1')->fetchAll();
// ... render $page and $comments into $content, then:
file_put_contents('/my/public_html/topic/1/index.html', $content);

Or something along these lines.
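Fleshed out, the whole regeneration step might look like the function below. This is only a sketch: the column names (title, body, created_at) and the regenerate_topic() name are my assumptions, not details from the post.

    <?php
    // Sketch of the page regenerator. Call it right after any INSERT/UPDATE
    // that touches the page, or from a cron job. PHP 5.1-compatible.
    function regenerate_topic(PDO $pdo, $pageId)
    {
        $stmt = $pdo->prepare('SELECT * FROM pages WHERE page_id = ?');
        $stmt->execute(array($pageId));
        $page = $stmt->fetch(PDO::FETCH_ASSOC);

        $stmt = $pdo->prepare('SELECT * FROM comments WHERE page_id = ? ORDER BY created_at');
        $stmt->execute(array($pageId));
        $comments = $stmt->fetchAll(PDO::FETCH_ASSOC);

        // Render the page: meta, h1, p, then the comments.
        $html  = '<html><head><title>' . htmlspecialchars($page['title']) . '</title></head><body>';
        $html .= '<h1>' . htmlspecialchars($page['title']) . '</h1>';
        $html .= '<p>'  . htmlspecialchars($page['body'])  . '</p>';
        foreach ($comments as $c) {
            $html .= '<div class="comment">' . htmlspecialchars($c['body']) . '</div>';
        }
        $html .= '</body></html>';

        // Write to a temp file and rename, so readers never see a half-written page.
        $dir = '/my/public_html/topic/' . $pageId;
        if (!is_dir($dir)) {
            mkdir($dir, 0755, true);
        }
        file_put_contents($dir . '/index.html.tmp', $html);
        rename($dir . '/index.html.tmp', $dir . '/index.html');
    }

Hook it into the same code path that inserts the comment, and the static copy is always at most one write behind the database.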

Serving static HTML this way is very fast since you never have to hit the DB; Apache just sends the file once it has been generated.

Other tips

I know I'm treading on shaky ground by answering this question, but I think it's a very instructive one.

Pat R Ellery didn't provide enough details for any real assessment, but the good news is that there can never be enough details. The explanation is quite simple: anybody can build as many mental models as he wants, but a real system will always behave a bit differently.

So Pat, do test your system all the time, as much as you can. What you are trying to do is capacity planning for your solution.

You need the following:

  • Capacity test - To determine how many users and/or transactions a given system will support and still meet performance goals.
  • Stress test - To determine or validate an application’s behavior when it is pushed beyond normal or peak load conditions.
  • Load test - To verify application behavior under normal and peak load conditions.
  • Performance test - To determine or validate speed, scalability, and/or stability.


In other words (putting it a bit crudely): if you want to know whether your system can handle N requests per time period, simulate N requests per time period and look at the results.
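ApacheBench (ab), which ships with Apache, can run exactly that kind of simulation; the URL and the numbers below are placeholders of mine:

    # 10,000 requests total, 100 at a time
    ab -n 10000 -c 100 http://www.example.org/topic/1/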


There are a lot of load-testing tools available; Apache JMeter, ApacheBench (ab), and Siege are common free options.
