[squid-users] RE: corrupt web caches?

From: Steve <[email protected]>
Date: Sat, 3 Apr 2004 06:30:23 -0500

Ralph,

The problem is that we have very poor performance, and a group of people
taking the shotgun approach to diagnosing its cause. I've ruled out aliens,
sun-spots and Nostradamus, but I can't refute claims of a corrupted cache,
index issues, etc., because I don't know enough about squid. I have been
told that we need to reboot the server every day, delete the acct file
every week, and start squid with the "-z" option every 3-6 months. I don't
know whether this is SOP, or whether I'm supposed to do it because it's the
way things have always been done.

Personally, I think it is a problem with the load balancer that all three
cache servers sit behind, because the performance issues tend to arise
during peak access times. Unfortunately, a different group handles the
networking support, and they claim that their infrastructure is fine.

Can you suggest any "proof-positive" methods I can use to demonstrate that
the problem is NOT with the cache servers? (And I'm afraid you'll need to
spell out how to run the commands, because I am just learning squid.)
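
For what it's worth, the only idea I've come up with on my own is to compare
the caches' service times at peak vs. off-peak hours, since the second field
of squid's native access.log is the time squid spent on each request, in
milliseconds. Something like the quick script below, run on each server,
ought to show whether the caches themselves slow down at peak times or stay
flat while the users complain. (The log path is just a guess for our boxes -
I'd adjust it to wherever squid.conf puts the access log.) Please tell me if
this is the wrong way to go about it.

import time

LOG = "/var/squid/logs/access.log"    # assumed path - adjust to your access log location

buckets = {}                          # hour of day -> list of service times in ms
with open(LOG) as f:
    for line in f:
        fields = line.split()
        if len(fields) < 2:
            continue
        try:
            stamp = float(fields[0])   # first field: UNIX timestamp of the request
            elapsed = int(fields[1])   # second field: time squid spent on it, in ms
        except ValueError:
            continue
        hour = time.localtime(stamp).tm_hour
        buckets.setdefault(hour, []).append(elapsed)

for hour in sorted(buckets):
    times = sorted(buckets[hour])
    median = times[len(times) // 2]
    print("%02d:00  requests=%7d  median=%5d ms  worst=%7d ms"
          % (hour, len(times), median, times[-1]))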

Thanks again!
Steve

-----Original Message-----
From: Raphael Maseko [mailto:ralph@zamnet.zm]
Sent: Saturday, April 03, 2004 3:38 AM
To: stevebayer@cox.net
Cc: squid-users@squid-cache.org
Subject: RE: corrupt web caches?

Steve,

The good news is that the logs show your cache is fine - all the objects
have been validated.

What I wanted to know, however, is where you got the information that your
cache was corrupt. Was this from the logs or from someone else (I think Tim
asked you this)? Is the cache working at all?
The last error message that you sent was complaining about DNS, not about
corruption of your cache. Pardon me, but I am now lost as to what the exact
problem is - you may have to explain it again.

Ralph

-----Original Message-----
From: stevebayer@cox.net [mailto:stevebayer@cox.net]
Sent: Friday, April 02, 2004 8:50 PM
To: ralph@zamnet.zm
Subject: corrupt web caches?

Ralph,

I have been unable to get permission to send a log file, but I've selected
entries which I believe will confirm that the cache is stable.

When we restart squid with the -s option, we can see the startup results in
the syslog. On each server, it comes up saying:

Apr 2 11:45:05 mickeymouse squid[232]: Done reading /var/squid/cache swaplog (556853 entries)
Apr 2 11:45:06 mickeymouse squid[232]: Finished rebuilding storage from disk.
Apr 2 11:45:06 mickeymouse squid[232]: 556853 Entries scanned
Apr 2 11:45:06 mickeymouse squid[232]: 0 Invalid entries.
Apr 2 11:45:06 mickeymouse squid[232]: 0 With invalid flags.
Apr 2 11:45:06 mickeymouse squid[232]: 556853 Objects loaded.
Apr 2 11:45:06 mickeymouse squid[232]: 0 Objects expired.
Apr 2 11:45:06 mickeymouse squid[232]: 0 Objects cancelled.
Apr 2 11:45:06 mickeymouse squid[232]: 0 Duplicate URLs purged.
Apr 2 11:45:06 mickeymouse squid[232]: 0 Swapfile clashes avoided.
Apr 2 11:45:06 mickeymouse squid[232]: Took 17.0 seconds (32691.5 objects/sec).
Apr 2 11:45:06 mickeymouse squid[232]: Beginning Validation Procedure
Apr 2 11:45:06 mickeymouse squid[232]: 262144 Entries Validated so far.
Apr 2 11:45:06 mickeymouse squid[232]: 524288 Entries Validated so far.
Apr 2 11:45:06 mickeymouse squid[232]: Completed Validation Procedure

Further down in the log, we see this message:

Apr 1 14:43:01 mickeymouse squid[237]: Rebuilding storage in /var/squid/cache (CLEAN)

Based upon these startup results in the log file, is it necessary to rebuild
the caches?
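
In case it matters, here is roughly the check I have been doing by eye on
those startup messages, written out as a quick script. It just pulls squid's
rebuild summary lines out of the syslog and flags any non-zero "Invalid
entries" or "With invalid flags" counts. The syslog path is a guess for our
systems, so treat it as a sketch rather than anything authoritative.

import re
import sys

SYSLOG = "/var/log/messages"   # assumed location - could be /var/log/syslog or similar

# Match the rebuild summary lines squid writes at startup, as quoted above.
summary = re.compile(r"squid\[\d+\]: (\d+) (Entries scanned|Invalid entries\.|"
                     r"With invalid flags\.|Objects loaded\.|Objects expired\.|"
                     r"Objects cancelled\.|Duplicate URLs purged\.|"
                     r"Swapfile clashes avoided\.)")

problems = 0
with open(SYSLOG) as f:
    for line in f:
        m = summary.search(line)
        if not m:
            continue
        count, label = int(m.group(1)), m.group(2)
        print("%-30s %d" % (label, count))
        # Non-zero invalid counts are the only thing that would worry me here.
        if label.startswith(("Invalid", "With invalid")) and count > 0:
            problems += 1

print("suspect counts: %d" % problems)
sys.exit(1 if problems else 0)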

Thanks in advance!

Steve
