Question

We're developing a web application that must tolerate a noticeable amount of load. I'm running my tests on an HP ProLiant DL380 server (two 3.6 GHz Xeon CPUs, 16 GB of RAM, ...). I'm using Apache JMeter and Pylot to run load tests (they show roughly similar results).

In one scenario, I configured the load-testing program to hit my index page, using just one thread, as fast as it can. The index page is about 60 KB and consists of about 10 Ajax calls, a lot of JavaScript and jQuery code, the required CSS, and so on. The results I got were, well, disappointing.

Full index.jsp page:

  • Throughput (req/sec): 3.567
  • Response Time (secs): 0.278

So I removed every Ajax call and got rid of the charts and the CSS (but not the JS):

  • Throughput (req/sec): 6.082
  • Response Time (secs): 0.161

Still very low! So I built a static index page in plain HTML containing all the same data at the same size (without any server-side or client-side computation):

  • Throughput (req/sec): 20.787
  • Response Time (secs): 0.046

Wow, that was a breakthrough! Then I added some of the JavaScript code back to the index.html page:

  • Throughput (req/sec): 9.617
  • Response Time (secs): 0.103

Well, I think the bottleneck has been found: the JavaScript code. I need to find out how many req/sec "the server" can handle, and since JavaScript runs client-side, I don't think I should include it in this test. So should load-testing tools process JS code? (They seem to be doing so.)

Another key question: given the hardware, content size, and the configuration described above, is this amount of throughput plausible? Shouldn't I expect more? My expectation is 500 req/sec! Is the only solution to add more hardware?

BTW, the web app is built with Java, Struts2, JSP, Hibernate, and MySQL, and it is distributed over multiple servers using HAProxy. However, the tests above were run against a single server.
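For context, the single-threaded "hit it as fast as you can" measurement from the scenario above can be sketched in a few lines of Python. This is a simplified stand-in for what JMeter/Pylot report, not their actual implementation; the URL is a placeholder for your own index page.

```python
import time
import urllib.request

def measure_throughput(url, duration=5.0):
    """Sequentially fetch `url` as fast as possible for `duration` seconds.

    Returns (requests_per_second, mean_response_time_in_seconds),
    roughly the two numbers the load-testing tools report.
    """
    count = 0
    total_response_time = 0.0
    start = time.monotonic()
    while time.monotonic() - start < duration:
        t0 = time.monotonic()
        with urllib.request.urlopen(url) as resp:
            resp.read()  # drain the body, like a real client would
        total_response_time += time.monotonic() - t0
        count += 1
    elapsed = time.monotonic() - start
    return count / elapsed, total_response_time / count

# Example (placeholder URL):
# rps, rt = measure_throughput("http://localhost:8080/index.jsp")
```

Note that this, like a single JMeter thread, only measures one request at a time; it says nothing about how throughput scales with concurrent users.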


Solution 2

Is your JavaScript (and CSS, images) embedded directly in the page, or loaded from a script tag? The latter forces the browser to download each file from your server, instantly halving your pages per second. That's one of the reasons you should load jQuery from a different server (e.g. Google's CDN): it has a minor cost in user-perceived page load time (one additional DNS lookup), but it really takes the load off your server.
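Concretely, the change is just the `src` of the script tag; the version number below is illustrative, use whichever jQuery version your app targets:

```html
<!-- Before: every page view pulls jQuery from your own server -->
<script src="/js/jquery.min.js"></script>

<!-- After: the browser fetches jQuery from Google's CDN instead -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>
```

A side benefit is that visitors may already have the CDN copy in their browser cache from other sites, skipping the download entirely.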

OTHER TIPS

If what you are looking for is a raw number for how fast the server can serve up the content, then I would say yes: ignore the JavaScript and focus on how long it takes to transfer all the various pieces of the page (HTML, images, script files, CSS, etc.).

However, if you are trying to test the user experience, then JS is part of it, so you need to take it into account. From your description, you are not worried about user experience but rather about load on the server.

You might want to consider setting up JMeter to make direct calls to the pieces of your pages, as described in the first sentence above.
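Whether you do it with individual JMeter HTTP samplers or a quick script, the idea is the same: request each piece of the page on its own and time it, so you can see which component is slow. A minimal Python sketch, with hypothetical URLs standing in for your real assets:

```python
import time
import urllib.request

# Hypothetical component URLs; substitute the real pieces of your index page.
ASSETS = [
    "http://localhost:8080/index.jsp",
    "http://localhost:8080/css/main.css",
    "http://localhost:8080/js/app.js",
]

def time_assets(urls):
    """Fetch each URL once and return {url: response_time_in_seconds}."""
    timings = {}
    for url in urls:
        t0 = time.monotonic()
        with urllib.request.urlopen(url) as resp:
            resp.read()
        timings[url] = time.monotonic() - t0
    return timings

# Example (against a running server):
# for url, secs in time_assets(ASSETS).items():
#     print(f"{secs:.3f}s  {url}")
```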

Another key question: given the hardware, content size, and the configuration described above, is this amount of throughput plausible? Shouldn't I expect more? My expectation is 500 req/sec! Is the only solution to add more hardware?

Writing your application so that pages are cacheable, and then putting an HTTP proxy in front of it, is often a good strategy.
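As one possible setup (the question's stack uses HAProxy, which does not cache; nginx is used here purely as an illustration), a caching reverse proxy in front of the app server might look like this. The backend port and cache lifetime are assumptions:

```nginx
# Cache responses on disk; "app" is an arbitrary zone name.
proxy_cache_path /var/cache/nginx keys_zone=app:10m;

server {
    listen 80;

    location / {
        proxy_pass http://127.0.0.1:8080;   # the Java app server
        proxy_cache app;
        proxy_cache_valid 200 60s;          # serve cached copies for 60 seconds
    }
}
```

With a setup like this, repeated hits on the index page are answered by the proxy without touching JSP/Hibernate/MySQL at all, which is usually how static-file-like throughput (hundreds of req/sec) is reached on dynamic pages.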

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow