I've been working on using Squid as an image cache for images that
really won't change very often but have dynamic-looking addresses
(large querystrings). I've been able to get Squid to cache these
images in memory, so if I do a fresh pageload from Opera or Safari I
get a TCP_MEM_HIT, which is fantastic; but when I reload the page, I
get a TCP_REFRESH_MISS, which is not so great. I know the browser is
specifying that this is a reload, so Squid goes back and re-caches
the image, but I really don't want this to happen - actually hitting
the images is a very expensive operation, since they're generated. How
can I get Squid to ignore browser refreshes and just serve everything
as if it were a new request? Ideally images will live in the cache
almost forever and always be served from the cache. My squid.conf is
below.
Thanks!
Tom MacWright
http_port 80 accel defaultsite=209.20.72.110
cache_peer 127.0.0.1 parent 8080 0 no-query originserver name=myAccel
acl mapbox_sites dstdomain 209.20.72.110
http_access allow mapbox_sites
cache_peer_access myAccel allow mapbox_sites
# Items without a specific expiry date will be fresh for 10 minutes
# After 20% of the object's stated life (via Expires) the object will
# be refreshed
# The longest non-specified objects can last is 4320 minutes
# refresh_pattern -i (/cgi-bin/|\?) 10 100% 4320
refresh_pattern . 300 20% 4320
# by default, squid ignores requests which have a ? in them
# so, we override this (and comment out the code later on in this doc)
# this also commands squid to ignore lots of things that would prevent
# it from caching stuff with certain headers
refresh_pattern -i \? 3000 990% 30000 override-expire override-lastmod ignore-no-cache ignore-private ignore-reload
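For what it's worth, a gentler variant of that pattern (a sketch, assuming Squid 2.6 or later, where these refresh_pattern options exist) would use reload-into-ims instead of ignore-reload, turning a browser's forced reload into an If-Modified-Since revalidation against the origin rather than ignoring it outright:

```
# Hypothetical alternative: downgrade forced reloads to IMS revalidations
# instead of dropping them entirely. Note that reload-into-ims, like
# ignore-reload, violates HTTP semantics, so it is only appropriate for
# content you control.
refresh_pattern -i \? 3000 990% 30000 override-expire override-lastmod reload-into-ims
```

Since the images here are expensive to generate but essentially immutable, either option trades strict HTTP compliance for far fewer origin hits.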
cache_dir ufs /var/spool/squid 500 16 256
# Cache commonly-used images in memory, since that's nice.
maximum_object_size_in_memory 1000 KB
maximum_object_size 1 GB
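Note that maximum_object_size_in_memory only caps the size of each individual object kept in RAM; the total size of the memory pool is controlled separately by cache_mem. A sketch, assuming roughly 256 MB of RAM can be spared for Squid (the figure is an assumption, not a recommendation):

```
# Total RAM devoted to in-transit and hot cached objects; the default is
# quite small, so raise it if the working set of images should stay in
# memory. 256 MB is just an assumed budget for illustration.
cache_mem 256 MB
```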
Received on Thu Feb 26 2009 - 18:23:36 MST