I have an internal web server that some users access through a proxy. It is an older server that badly needs to be replaced, but until the developers get around to migrating the applications I have to solve a problem. The server is serving some Excel and PDF files, but it responds with a 304 Not Modified even though the files have been updated, causing Squid to serve the cached file instead of the updated one.
I was able to stop Squid from caching the server entirely with an ACL using the dstdomain option and a no_cache deny line. However, as this machine is quite slow, I would still like to cache the HTML and images, since those work correctly. Using url_regex lines to keep just the Excel and PDF files out of the cache, I am still getting some TCP_MEM_HIT entries in the access log for these files. I should probably mention that I have disabled the disk cache on this system while figuring this problem out; all other web requests are forwarded through another proxy that is still caching on disk, and only the internal web applications go direct.
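For reference, the whole-server rule that did work was roughly the following (the ACL name is just a placeholder, and hostname stands in for the internal server's name):

acl INTERNALWEB dstdomain hostname
no_cache deny INTERNALWEB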
Here's what I have; does anyone have an idea where I went wrong?
I am running Squid 3.0 STABLE9 on FreeBSD 6.2.
acl NOCACHEPDF url_regex -i ^http://hostname.*pdf$
acl NOCACHEXLS url_regex -i ^http://hostname.*xls$
no_cache deny NOCACHEPDF NOCACHEXLS
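To make sure those directives at least parse the way Squid expects, the config can be checked with Squid's parse mode (the binary path below assumes the same /usr/local/squid prefix as the log path further down):

/usr/local/squid/sbin/squid -k parse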
I have used cat combined with awk and grep to check the pattern matching against the access log:
cat /usr/local/squid/var/logs/access.log | awk '{print $7}' | grep -e '^http://hostname.*pdf$'
cat /usr/local/squid/var/logs/access.log | awk '{print $7}' | grep -e '^http://hostname.*xls$'
This correctly matches all the entries I want to stop caching and none that I don't.
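A variation of the same check that counts only the hits that should not be happening (this assumes the default native access.log format, with the result code in field 4 and the URL in field 7 as above):

awk '$4 ~ /TCP_MEM_HIT/ {print $7}' /usr/local/squid/var/logs/access.log | grep -c -e '^http://hostname.*pdf$'
awk '$4 ~ /TCP_MEM_HIT/ {print $7}' /usr/local/squid/var/logs/access.log | grep -c -e '^http://hostname.*xls$'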
Thanks,
Dean Weimer
Network Administrator
Orscheln Management Co