[SQU] Squid Failure

From: Andrew Blanche <[email protected]>
Date: Mon, 4 Sep 2000 21:53:42 +1000

Hello to all

I have recently developed a rather serious problem with our Squid cache.

It keeps failing at least once a day during peak load times!
On failure, the squid child process has stopped.
The only message I have been able to get on the failure is:
"Duplicate page error" then some type of memory address.
This message is then repeated a couple of times with different addresses.
Then comes the message "vm: stopping process squid", followed by a message about deleting duplicate pages.

We had been running Squid for about two years without any problems. I recently added 30 additional modems to our dial-in pool, making a total of 140, and the problems started.

We were running Squid 2 on Red Hat 6.0.
The machine was a Pentium 233 with 256 MB RAM and 2 x 9.1 GB SCSI drives (40 MB/sec).

I then built up a new box:
Pentium III 733 MHz
384 MB RAM (133 MHz)
1 x 9.1 GB LVD SCSI (160 MB/sec)
Red Hat 6.2
Squid 2

I am still getting the same failure during peak load (about 5 accesses/sec).

If anybody has any ideas, I would love to hear them A.S.A.P.
I am getting really frazzled and don't know what to do!

Regards, Andrew Blanche
