Re: [squid-users] eating cpu

From: Chris Kleinschmidt <christopherk@optusnet.com.au>
Date: Thu, 31 Jan 2002 00:33:22 +1000

If you have 500-1000 machines, I suspect that:
a) You'd need more RAM in the machine. Much more. RAM works
   best as cache, so you don't need to do much with it in the
   Squid config... just have it on the machine.
b) I have no idea what CPU is in a Dell "Poweredge 2300", but
   you'd need around a PIII-600 (or better)... and it would NOT
   want to have much else running on it apart from Squid.
c) You'd want a MUCH faster hard disk system than "normal SCSI".
   Possibly also more storage space. I'd go RAID for speed and
   use some 10K RPM drives... but this costs $$.
d) Have at least one 100Mbit network card (and possibly two), if
   the users are local. 1000 users at once is a lot of users.
e) A decent Internet feeder link (e.g., 2Mbit?).

We run a dual PIII 450 with 512M RAM and an IDE 7200RPM UDMA-66
drive, and service 200 PCs. This works OK for us (the PC also
doubles as a file server, since Squid uses only one CPU). Squid
has about 10G on disk. We have several of these systems around
the place and they appear to be OK using normal ufs (though they
have been tweaked just a little with respect to caching
parameters). Our feeder link is 256Kbit, and the system has a
100Mbit full-duplex network card to the switch rack (plus a
10Mbit card for talking to the other, local servers - for
convenience).
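
For what it's worth, the "tweaks" are nothing exotic. A minimal
sketch of the kind of caching parameters involved is below; the
values are only illustrative guesses for a 512M RAM / 10G disk
box like the one described, not our exact settings:

  # squid.conf fragment - illustrative values only
  cache_mem 32 MB              # Squid's in-memory hot-object cache; keep it modest
                               # and let the OS use the rest of RAM as disk cache
  cache_dir ufs /var/spool/squid 10000 16 256  # ~10G on-disk cache, plain ufs store
  maximum_object_size 8192 KB  # don't let a few huge objects crowd out the rest
  cache_swap_low 90            # start trimming the disk cache at 90% full...
  cache_swap_high 95           # ...and trim harder above 95%

The main point is (a) above: cache_mem is only the hot-object
memory, not Squid's total memory budget, so there's no need to
crank it up just because the box has plenty of RAM.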

That's my 2c worth.

-- 
C.M.Kleinschmidt
christopherk@optusnet.com.au
pankaj patel wrote:
> I was not facing the problem of CPU usage due to viruses (Nimda, Code
> Red). With around 20-30 machines it was OK, but with a load of around
> 500-1000 machines the load average was going above 1.9 to 2.0.
> 
> --pankaj
> 
> ----- Original Message -----
> From: "Alceu Rodrigues de Freitas Junior" <alceu.rodrigues@wws.com.br>
> To: "Kancha ." <kancha2np@yahoo.com>
> Cc: "pankaj patel" <pankaj_surat@nettaxi.com>; "Peter Smith"
> <peter.smith@UTSouthwestern.edu>; "Squid" <squid-users@squid-cache.org>
> Sent: Tuesday, January 29, 2002 4:29 AM
> Subject: Re: [squid-users] eating cpu
> 
> >
> > The best solution for you, of course, is to clean up all your client
> > machines. I had a problem with Nimda flooding a Gauntlet Firewall (from
> > NAI) because the virus makes HTTP requests all the time. I got a lot of
> > "bad http header request" entries in the log files, but you can't block
> > these requests with a firewall because your legitimate users make the
> > same kind of requests.
> >
> > It's a mess; maybe you could check (using a sniffer) EXACTLY how Nimda's
> > requests work and try to match them with firewall rules, but this could
> > be a rigmarole. Try to clean up your client machines. It's hard work,
> > but it's worth it.
> >
> > On Tue, 29 Jan 2002, Kancha . wrote:
> >
> > > I'm using a Dell PowerEdge 2300 without RAID. I'm
> > > using a SCSI HDD.
> > >
> > > One of the reasons squid is consuming CPU is Nimda and Code Red.
> > > I've seen lots of Nimda and Code Red requests in the log file.
> > >
> > > So I put in ACLs to block the worms:
> > >
> > > acl nimda1 url_regex -i defaul.ida
> > >
> > > with similar lines for root.exe and cmd.exe, then
> > > "http_access deny nimda1" and similarly for the other
> > > two ACLs.
> > >
> > > Despite this, the requests aren't blocked. Whenever
> > > there is a worm attack the CPU utilization just grows
> > > rapidly.
> > >
> > > If I could only block these worms, I guess CPU
> > > utilization would drop.
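
A couple of guesses on why that deny isn't firing, untested
against this setup: if the pattern in squid.conf really reads
"defaul.ida" (as quoted above) rather than "default.ida", it
will never match a Code Red request - the "." only stands in for
a single character, and the missing "t" breaks the match. The
deny lines also have to appear before whatever http_access allow
line admits your clients, and Squid needs a "squid -k
reconfigure" after the edit. Something along these lines (the
ACL name is mine):

  acl worms url_regex -i default\.ida root\.exe cmd\.exe
  http_access deny worms   # must come BEFORE the http_access allow for your LAN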
> > >
> > > Currently I'm using ipchains to redirect port 80 to
> > > 3128, only for requests coming from my network. My
> > > clients are infected with these worms, and I can't have
> > > all of them clean up Nimda, as it is impossible to keep
> > > track of every client.
> > >
> > > I've seen lots of people, even on this list, mention
> > > the use of iptables, so I guess I'll switch to iptables
> > > as well.
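
For what it's worth, the iptables equivalent of that ipchains
redirect is usually written something like the line below (eth0
and 192.168.1.0/24 are placeholders for the inside interface and
client network, not values from this setup):

  # 2.4 kernel: intercept LAN web traffic and hand it to Squid on 3128
  iptables -t nat -A PREROUTING -i eth0 -s 192.168.1.0/24 \
      -p tcp --dport 80 -j REDIRECT --to-ports 3128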
> > >
> > > What should the value of cache_mem be for a server
> > > with 256M RAM? Currently I'm using 8M; I was using 16M
> > > previously.
> > >
> > > --- pankaj patel <pankaj_surat@nettaxi.com> wrote:
> > > > I was also facing the same problem. I was using a
> > > > Netfinity 5000, and I also tried an assembled PC
> > > > (P3-500). Finally I moved back to RHL 6.2 (2.2.14-5.0)
> > > > with squid-2.3.STABLE1-5 and it's working fine on both
> > > > machines.
> > > >
> > > > ----pp
> > > >
> > > > ----- Original Message -----
> > > > From: "Peter Smith" <peter.smith@UTSouthwestern.edu>
> > > > To: "Kancha ." <kancha2np@yahoo.com>
> > > > Cc: <squid-users@squid-cache.org>
> > > > Sent: Monday, January 28, 2002 10:11 PM
> > > > Subject: Re: [squid-users] eating cpu
> > > >
> > > >
> > > > > Kancha:
> > > > > It is entirely possible that you are using a Dell box that
> > > > > comes with raid hardware which uses the aacraid driver.  If
> > > > > so, most likely you will have better luck downgrading to the
> > > > > 2.2 kernel.  That is what I've had to do as I have 2 Dell
> > > > > Poweredge 2550s (with the aacraid driver).  My theory is the
> > > > > 2.4 series has a buggy aacraid driver.
> > > > >
> > > > > Peter Smith
> > > > > Linux Systems Administrator
> > > > > University of Texas Southwestern Medical Center at Dallas
> > > > > (USA) 214 648 3111
> > > > > peter.smith@utsouthwestern.edu
> > > > >
> > > > >
> > > > > Kancha . wrote:
> > > > >
> > > > > > I'm using squid as a transparent proxy on a RH 7.2
> > > > > > machine. The hardware that I'm using is a Dell PowerEdge
> > > > > > 2300 with 256MB RAM and a 6GB HDD. I've allocated 2G for
> > > > > > cache, I have 8M as cache_mem, and I'm also running named
> > > > > > on the server.
> > > > > >
> > > > > > Average requests/hr through the proxy is around 22000.
> > > > > > After about 2 hours the CPU is utilized more than 90% and
> > > > > > the system gets really slow. The browsing gets really slow;
> > > > > > despite the available bandwidth, the browsing speed
> > > > > > drastically decreases.
> > > > > >
> > > > > > Where have I gone wrong? I'm using ipchains and redirecting
> > > > > > all my web traffic through the router.
> > > > > >
> > > > > > Under this circumstance what would be the ideal
> > > > > > configuration?
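
As a rough sketch only: with Squid 2.x, transparent interception
also needs the httpd_accel_* directives turned on, and on a 256M
box cache_mem can stay small. The values below are illustrative
guesses, not a recommendation for this exact machine:

  http_port 3128
  httpd_accel_host virtual
  httpd_accel_port 80
  httpd_accel_with_proxy on
  httpd_accel_uses_host_header on
  cache_mem 16 MB
  cache_dir ufs /var/spool/squid 2000 16 256   # the 2G cache mentioned above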
> > --
> > Go away or I'll replace you with a very short shell script.
> >
Received on Tue Jan 29 2002 - 07:39:08 MST

This archive was generated by hypermail pre-2.1.9 : Tue Dec 09 2003 - 17:05:59 MST