Re: concurrent connections

From: Henrik Nordstrom <[email protected]>
Date: Fri, 12 May 2000 20:15:57 +0200

There is no known upper limit in Squid, but performance will degrade
somewhat as the number of connections increases.

You are probably hitting other limitations in the OS. For example, how
many TCP connections/second is Squid trying to make to other servers?

Quick items to check:

1. netstat --inet -a -n | wc -l

2. /var/log/messages

3. squid/log/cache.log

4. Settings in /proc/sys/fs/file-nr, inode-nr

5. /proc/sys/net/ipv4/ip_local_port_range
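As a rough sketch (assuming a Linux box with the net-tools netstat
installed; paths are as on 2.2-era kernels), the checks above can be
run together like this:

```shell
#!/bin/sh
# Sketch of the checks above; output locations may differ on other kernels.

# 1. Sockets broken down by TCP state (ESTABLISHED, TIME_WAIT, ...)
netstat --inet -a -n | awk '/^tcp/ {print $6}' | sort | uniq -c

# 4. File descriptors: allocated, free, and the system-wide maximum
cat /proc/sys/fs/file-nr

# 5. Ephemeral port range for outgoing connections; a narrow range here
#    caps the number of simultaneous outgoing sockets regardless of the
#    file descriptor limit
cat /proc/sys/net/ipv4/ip_local_port_range
```

If the TIME_WAIT count or the ephemeral port range looks tight, that
would explain connections being closed well before the 4096 descriptor
limit is reached.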

Zhang, Jenny (jennyz) wrote:
>
> I want to test the limit on number of concurrent connections Squid can
> handle.
>
> I am using RedHat Linux 6.1 (kernel 2.2.12) and Squid 2.3.STABLE2. I
> followed the instructions on Henrik Nordstrom's page
> http://squid.sourceforge.net/hno/ to raise the limit on file descriptors
> to 4096, and I can see in cache.log that 4096 file descriptors are
> available. In squid.conf, I set pconn_timeout to 1200 seconds (keeping
> all the connections persistent). I drive 2000 users sending requests to
> Squid. When the number of users is below 1000, everything is fine. But
> when the number of users exceeds 1000, a lot of connections are closed.
> Using 'netstat', I observed that the number of established connections
> bounced between 1 and about 950. When more users send requests, the
> number of established connections grows from 1 to about 950, but once it
> reaches around 950, it drops back to 1.
>
> Are there any factors other than file descriptors which put a limit on
> the number of connections? Maybe there is some limit in the OS?
>
> Thanks,
> Jenny
Received on Fri May 12 2000 - 13:28:39 MDT

This archive was generated by hypermail pre-2.1.9 : Tue Dec 09 2003 - 16:53:29 MST