url_regex is not bad, but with it you test the complete URL, incl. the path.
try using dstdomain to block .foo.com (this includes www.foo.com, www2.foo.com, etc.):
acl banned dstdomain "/var/squid/banned"
http_access deny banned all
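the file /var/squid/banned then simply lists one domain per line, for example (just a sketch, the entries are only examples taken from your mail):

  .hotpcat.com
  .someothersite.com

with the leading dot squid blocks www.hotpcat.com, ftp.hotpcat.com etc., but it does not touch www.hotcars.com.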
there is also dstdom_regex to block webservers with a regex, like
.foo[0-9].com.
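something like this should work (untested sketch, the acl name and file path are only examples):

  acl banned_regex dstdom_regex "/var/squid/banned_regex"
  http_access deny banned_regex

where /var/squid/banned_regex contains one regex per line, e.g. www\.foo[0-9]\.com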
see squid.conf and look for dstdomain in the ACL section...
Regards,
Markus Rietzler
* <rietzler_software/>
* RZF NRW
* Tel: 0211.4572-130
> -----Original Message-----
> From: stuart.lamble@batepro.co.za [mailto:stuart.lamble@batepro.co.za]
> Sent: Wednesday, October 9, 2002 16:56
> To: squid-users@squid-cache.org
> Subject: [squid-users] Blocking specific web sites - not keywords
>
> I use the following line in the squid.conf file.
> acl banned url_regex "/var/squid/banned"
> In this file I list keywords.
> How do I enter specific web sites to block?
> Example: www.hotpcat.com
> It must match that full string and not just parts of it, like www.
> And it must not block something similar, like www.hotcars.com.
>
> Please advise.
>
>
>
> Stuart
>