29-Apr-02 at 13:19, fefi@ig.com.br (fefi@ig.com.br) wrote :
> However, I still can't make squid block certain sites properly as per the
> instructions I've found. When I block them, some other dynamic pages are
> blocked as well, which are not intended to be blocked.
>
> I am running Squid 2.4 Stable 1 on Linux Red Hat 7.2. Here is an extract
> of my squid.conf file:
>
> hierarchy_stoplist cgi-bin ?
> acl QUERY urlpath_regex cgi-bin \?
> no_cache deny QUERY
> acl rionet src 121.200.200.0/255.0.0.0
> acl nosite url_regex "/usr/local/squid/etc/denied.txt"
> #Deny access to certain sites
> http_access deny nosite
> #Allow access to our local network
> http_access allow rionet
> http_access allow localhost
Can you send a small sample of the denied.txt file, since the actual
regexps are in there? In the config above, dynamic pages are set not to be
cached, but they are not set to be blocked.
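
For illustration only (the actual denied.txt was not posted, so these
entries are assumptions): url_regex patterns are matched against the whole
URL and are unanchored by default, which is usually what catches unrelated
dynamic pages. A hypothetical denied.txt:

    games
    ^http://(www\.)?games\.example\.com/

The first entry would also block something like
http://another.example.com/search.cgi?topic=games, because the regexp can
match in the query string as well; the second, anchored entry only matches
URLs on that one host.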
--
[Simon White. vim/mutt. simon@mtds.com. GIMPS:95.16% see www.mersenne.org]
If the brain was so simple that we could understand it, we would be so
simple that we could not understand it -- Lyall Watson
[Linux user #170823 http://counter.li.org. Home cooked signature rotator.]