RE: [squid-users] Fine Tuning - Suggestions If you would..

From: Nguyen, Khanh, INFOT <[email protected]>
Date: Thu, 19 Oct 2006 17:21:59 -0400

Errol,

If you are familiar with a scripting language like Perl or awk, you can
quickly process the access.log (not the cache.log) for MISS entries:
which URLs are missing, how many of them there are, approximately how
long an object stays in cache (a hit followed by a miss on the same
URL), and whether the URLs include query strings. Those data would be
helpful for figuring out a policy to improve the hit rate and bandwidth
savings.
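As a rough sketch of that kind of processing (assuming Squid's default
native access.log format, where field 4 is the result code and field 7
is the URL; the sample log entries below are made up for illustration):

```shell
# Made-up sample entries in Squid's native access.log format:
# timestamp, elapsed, client, result/status, bytes, method, URL, ...
cat > /tmp/access.log <<'EOF'
1161287000.123 250 10.0.0.5 TCP_MISS/200 4512 GET http://example.com/a.gif - DIRECT/1.2.3.4 image/gif
1161287001.456  12 10.0.0.5 TCP_HIT/200 4512 GET http://example.com/a.gif - NONE/- image/gif
1161287002.789 300 10.0.0.6 TCP_MISS/200 9120 GET http://example.com/b.html?id=1 - DIRECT/1.2.3.4 text/html
1161287003.111 310 10.0.0.7 TCP_MISS/200 9120 GET http://example.com/b.html?id=2 - DIRECT/1.2.3.4 text/html
EOF

# Count MISSes per URL, most frequent first. Field 4 is the result code
# (TCP_MISS, TCP_HIT, ...); field 7 is the URL, query string included.
awk '$4 ~ /MISS/ {print $7}' /tmp/access.log | sort | uniq -c | sort -rn
```

If the output shows many misses on the same base URL with different
query strings, that is exactly the kind of pattern worth a policy
change.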

As an alternative to scripting, you can try the Unix grep utility, but
you will have a lot more data to analyze by eye. For example, 'grep MISS
access.log > output.txt' writes all the MISS entries to the file
output.txt.

I'm not sure if there is any package or utility out there to do this
kind of analysis. A nice tool would analyze the log and suggest how to
tune for a higher hit rate.

One thing I found useful when tuning is to go after the objects that are
accessed the most. Often just a handful of objects account for the low
hit rate; if you can correct those, your goal is almost done.
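One way to spot those objects is to compute a per-URL hit rate and sort
by request count (again assuming the native log format with the result
code in field 4 and the URL in field 7; the log lines here are made up):

```shell
# Made-up sample entries in Squid's native access.log format.
cat > /tmp/access_sample.log <<'EOF'
1161287000.1 250 10.0.0.5 TCP_MISS/200 4512 GET http://example.com/logo.gif - DIRECT/1.2.3.4 image/gif
1161287001.2  12 10.0.0.5 TCP_HIT/200 4512 GET http://example.com/logo.gif - NONE/- image/gif
1161287002.3 300 10.0.0.6 TCP_MISS/200 9120 GET http://example.com/page.html - DIRECT/1.2.3.4 text/html
1161287003.4 305 10.0.0.7 TCP_MISS/200 9120 GET http://example.com/page.html - DIRECT/1.2.3.4 text/html
EOF

# For each URL: total requests, hits, and hit rate; most-requested first.
awk '{ tot[$7]++; if ($4 ~ /HIT/) hit[$7]++ }
     END { for (u in tot)
             printf "%d req, %d hit (%.0f%%) %s\n",
                    tot[u], hit[u], 100 * hit[u] / tot[u], u }' \
    /tmp/access_sample.log | sort -rn
```

The URLs at the top with a low hit percentage are the ones worth
chasing first.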

Khanh

-----Original Message-----
From: Errol Neal [mailto:eneal@dfi-intl.com]
Sent: Thursday, October 19, 2006 12:48 PM
To: Nguyen, Khanh, INFOT; squid-users@squid-cache.org
Subject: RE: [squid-users] Fine Tuning - Suggestions If you would..

Khanh wrote :

>> I would suggest you analyze the cache.log to see which objects are
responsible for misses, the size of the objects....

Thanks for the reply.
Do you have any suggestions on how one goes about doing this in an
effective, efficient manner?

Errol.
Received on Thu Oct 19 2006 - 15:22:12 MDT

This archive was generated by hypermail pre-2.1.9 : Wed Nov 01 2006 - 12:00:04 MST