Recently, due to increased traffic on my site, the server was taking a long time to return pages, often timing out requests. I was considering moving my hosting to another provider, but before doing that I first needed to test how much load the present server could handle. What I needed was a simple load-testing tool, and http_load is one such tool.
http_load is a useful HTTP benchmarking utility that runs multiple HTTP fetches in parallel to test the throughput of your web server. It gives you a rough idea of how many requests and bytes a server can serve in a predetermined amount of time. A compiled binary can be downloaded from the http_load home page.
A note before proceeding: hitting a server rapidly for an extended period of time constitutes a DoS attack, so make sure you load a server only for a short period.
Before you run the tool, save the URLs to test in a text file – one URL per line – which you then pass to the program. In the following example we run the program emulating 3 concurrent users for 10 seconds; the program fetches the URLs as fast as it can. The type of network connection also affects the end result, so make sure you run http_load from a fast broadband connection.
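For example, a minimal urls.txt could be created like this (the example.com pages below are placeholders – substitute the pages of your own site):

```shell
# Create urls.txt with one URL per line; these URLs are placeholders
cat > urls.txt <<'EOF'
http://www.example.com/
http://www.example.com/about.html
http://www.example.com/contact.html
EOF
```

http_load picks URLs from this file at random for each fetch, so listing several representative pages gives a more realistic mix than a single URL.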
c:/> http_load -parallel 3 -seconds 10 ./urls.txt
The statistics returned are shown below.
148 fetches, 3 max parallel, 32708 bytes, in 10 seconds
221 mean bytes/connection
14.8 fetches/sec, 3270.8 bytes/sec
msecs/connect: 33.3615 mean, 812.5 max, 15.625 min
msecs/first-response: 157.2 mean, 390.625 max, 140.625 min
HTTP response codes:
  code 200 -- 148
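The derived figures in this summary are simple ratios of the totals: fetches/sec is total fetches divided by elapsed seconds, bytes/sec is total bytes divided by elapsed seconds, and mean bytes/connection is total bytes divided by fetches. A quick sanity check of the run above:

```shell
# Sanity-check the derived stats: 148 fetches, 32708 bytes, 10 seconds
awk 'BEGIN { printf "%.1f fetches/sec\n", 148 / 10 }'             # 14.8 fetches/sec
awk 'BEGIN { printf "%.1f bytes/sec\n", 32708 / 10 }'             # 3270.8 bytes/sec
awk 'BEGIN { printf "%d mean bytes/connection\n", 32708 / 148 }'  # 221 mean bytes/connection
```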
The following is a list of options for the program:
* One start specifier, either -parallel or -rate
-parallel – number of requests to open at a time
-rate – number of requests to open per second
-jitter – used with -rate, tells http_load to vary the rate randomly by about 10%
* One end specifier, either -fetches or -seconds
-fetches – quit after this many fetches
-seconds – quit after this many seconds
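For instance, instead of a fixed number of concurrent users you can ask for a target request rate. The sketch below (guarded so it is skipped when http_load is not on your PATH) starts about 5 requests per second, varied by roughly 10%, for 10 seconds:

```shell
# -rate starts ~5 new requests per second; -jitter varies that rate by about 10%
if command -v http_load >/dev/null 2>&1; then
    http_load -rate 5 -jitter -seconds 10 ./urls.txt
else
    echo "http_load not found on PATH"
fi
```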
The following emulates 1 user and exits after fetching 50 pages rather than after a predetermined time as above.
c:/> http_load -parallel 1 -fetches 50 ./urls.txt
50 fetches, 1 max parallel, 1.27445e+006 bytes, in 106.156 seconds
25489 mean bytes/connection
0.471004 fetches/sec, 12005.4 bytes/sec
msecs/connect: 286.563 mean, 312.5 max, 281.25 min
msecs/first-response: 741.563 mean, 1125 max, 640.625 min
HTTP response codes:
  code 200 -- 50