Question

httperf ... --rate=20 --send-buffer=4096 --recv-buffer=16384 --num-conns=100 --num-calls=10

Gives 1000 requests as expected on nginx.

Total: connections 100 requests 1000 replies 1000 test-duration 5.719 s

Connection rate: 17.5 conn/s (57.2 ms/conn, <=24 concurrent connections)
Connection time [ms]: min 699.0 avg 861.3 max 1157.5 median 840.5 stddev 119.5
Connection time [ms]: connect 56.9
Connection length [replies/conn]: 10.000

Request rate: 174.8 req/s (5.7 ms/req)
Request size [B]: 67.0

Reply rate [replies/s]: min 182.0 avg 182.0 max 182.0 stddev 0.0 (1 samples)
Reply time [ms]: response 80.4 transfer 0.0
Reply size [B]: header 284.0 content 177.0 footer 0.0 (total 461.0)
Reply status: 1xx=0 2xx=1000 3xx=0 4xx=0 5xx=0

CPU time [s]: user 1.42 system 4.30 (user 24.9% system 75.1% total 100.0%)
Net I/O: 90.2 KB/s (0.7*10^6 bps)

Errors: total 0 client-timo 0 socket-timo 0 connrefused 0 connreset 0
Errors: fd-unavail 0 addrunavail 0 ftab-full 0 other 0

On the same hardware, querying uWSGI on port 8000 results in 200 requests, 100 replies, and 100 reset connections. What's wrong? The server is extremely powerful.

Total: connections 100 requests 200 replies 100 test-duration 5.111 s

Connection rate: 19.6 conn/s (51.1 ms/conn, <=5 concurrent connections)
Connection time [ms]: min 69.5 avg 128.4 max 226.8 median 126.5 stddev 27.9
Connection time [ms]: connect 51.4
Connection length [replies/conn]: 1.000

Request rate: 39.1 req/s (25.6 ms/req)
Request size [B]: 67.0

Reply rate [replies/s]: min 19.8 avg 19.8 max 19.8 stddev 0.0 (1 samples)
Reply time [ms]: response 68.8 transfer 8.2
Reply size [B]: header 44.0 content 2053.0 footer 0.0 (total 2097.0)
Reply status: 1xx=0 2xx=100 3xx=0 4xx=0 5xx=0

CPU time [s]: user 1.87 system 3.24 (user 36.6% system 63.4% total 100.0%)
Net I/O: 42.6 KB/s (0.3*10^6 bps)

Errors: total 100 client-timo 0 socket-timo 0 connrefused 0 connreset 100
Errors: fd-unavail 0 addrunavail 0 ftab-full 0 other 0

Solution

This is the most logical answer:

http://projects.unbit.it/uwsgi/wiki#Wherearethebenchmarks

The listen queue size is reported in the uWSGI startup logs.
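As a quick check on a Linux host (a generic sketch, not specific to your setup), you can compare the socket's configured backlog with the kernel cap that may silently clamp it:

# List listening TCP sockets; for a listening socket the Send-Q column
# shows the configured backlog. Look at the line for port 8000.
ss -ltn

# Kernel-wide ceiling on the listen backlog; any larger value passed to
# listen() is silently clamped to this (often 128 by default).
sysctl net.core.somaxconn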

But as you have not posted your uWSGI config, it is impossible to give you the right hint.

OTHER TIPS

Just use the <listen>1024</listen> directive in your uWSGI config.
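If you run uWSGI with an ini-style configuration instead of XML, the equivalent option looks like the sketch below; the socket and module settings are placeholders for illustration, not taken from your setup:

[uwsgi]
# raise the listen backlog from the default (typically 100) to 1024
listen = 1024
# placeholder bind address and application entry point
http = :8000
module = myapp:application

Keep in mind that the kernel clamps the backlog to net.core.somaxconn, so if that sysctl is lower than 1024 you will also need to raise it for the larger listen queue to take effect.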

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow