
[cobalt-users] What is the limiting factor on simultaneous access to a web page



Hi,

I've been trying to stress test a customer's web site to find out how the
site performs under load. By varying the Apache parameters
MinSpareServers, MaxSpareServers, StartServers, and MaxClients (see the
illustrative snippet below), we have seen varying results.
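For reference, one combination we have tried looks roughly like this in
httpd.conf (the values here are illustrative, not our exact settings or a
recommendation):

    # httpd.conf -- illustrative values only
    MinSpareServers   5
    MaxSpareServers  20
    StartServers     10
    MaxClients      150   # hard cap on simultaneous Apache child processes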

We are using Webstress 3.0 to test the site (it was cheap), running on a PC
connected to the target RaQ 4 on a closed network. Webstress sends a stream
of requests for a page to the server and measures the time taken for each
request to be served, as well as whether an error occurred. The number of
requests to send can be set. What we've been doing is altering the
parameters and then increasing the number of requests to see how
performance changes and at what point the server starts to return page
request errors.
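For anyone who wants to reproduce this without Webstress, ApacheBench (ab),
which ships with Apache, can generate a comparable load. Something like the
following, where the hostname and page are stand-ins for our test setup:

    # 1000 requests total, 50 concurrent; URL is hypothetical
    ab -n 1000 -c 50 http://raq4.example.com/test.html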

This seems to work well up to a point. If we throttle Apache down to only a
few processes, performance is in line with what we expected. However, there
seems to be an upper limit beyond which changing the above parameters has
no effect on the number of completed requests. The interesting thing is that
this figure is well below the maximum number of clients. Another oddity is
that it doesn't seem to be related to page size: we've tried the same test
on two different pages, one about 50 KB and one about 250 KB, and the
number of requests that get serviced before a failure occurs is the same.

Our best guess is that there is a limit on the number of times a single
file may be opened simultaneously under Linux, and that this limit is set
by a kernel parameter.
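If that is what's happening, we would expect it to show up through the
kernel's open-file-handle limits. A quick sketch of where to look, assuming
the stock Linux kernel on the RaQ 4 (the 16384 figure is an arbitrary
example, not a recommendation):

    # system-wide cap on open file handles (kernel tunable)
    cat /proc/sys/fs/file-max
    # current file-handle usage figures and the maximum
    cat /proc/sys/fs/file-nr
    # per-process descriptor limit for the current shell
    ulimit -n
    # raise the system-wide cap (as root)
    echo 16384 > /proc/sys/fs/file-max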

Has anyone come across this issue before, or does anyone have any idea what
the limiting factor might be?

Thanks

Peter Goodwin
peter.goodwin@xxxxxxxxxxxxxxxxxxxx
http://www.dialeforbusiness.com