Re: [squid-users] Why squidguard or another redirektor???

From: Michael Fuller / Hotmail <[email protected]>
Date: Tue, 19 Nov 2002 09:58:00 +0530

Hi all,

Same requirement here too. We have been using text files with url_regex to
block porn and warez sites. The text files contain over 40,000 entries. Since
using url_regex can affect performance, can I use ACLs based on dstdomain or
dst with text files instead?
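
If I understand the documentation correctly, something like this should work
(just a sketch; the file names are only examples, and I am assuming each file
holds one plain domain per line, e.g. .example.com, rather than regex
patterns):

acl blockeddomains dstdomain "/etc/squid/block_domains.txt"
acl unblockeddomains dstdomain "/etc/squid/unblock_domains.txt"

http_access deny blockeddomains !unblockeddomains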

Thanks and regards,
Michael Fuller.

----- Original Message -----
From: "Henrik Nordstrom" <hno@marasystems.com>
To: <dumpmail@gmx.net>; <squid-users@squid-cache.org>
Sent: Monday, November 11, 2002 4:23 AM
Subject: Re: [squid-users] Why squidguard or another redirektor???

> You should not make overly large regex pattern lists. If you are
> making long lists then you SHOULD investigate using dstdomain or dst
> type ACLs where appropriate. Makes a huge difference in access
> control performance.
>
> A really long url_regex acl really can slow things down, and can get
> quite unpredictable to maintain (a small typo in a regex pattern can
> change the meaning significantly).
>
> A really long dstdomain or dst acl should barely be noticeable in
> terms of performance, and is generally much easier to maintain.
>
> Regards
> Henrik
>
> On Sunday 10 November 2002 18.37, dumpmail@gmx.net wrote:
> > Hi,
> >
> > I want to block URL lists with Squid 2.5.STABLE1.
> >
> > Now I have a small list of URLs and I want to use text files.
> > The entries in squid.conf look like this:
> >
> > acl blockedsites url_regex -i "/etc/squid/block.txt"
> > acl unblockedsites url_regex -i "/etc/squid/unblock.txt"
> >
> > http_access deny blockedsites !unblockedsites
> >
> > Now I want to add the huge squidGuard blacklist to these
> > files. Does this slow down Squid, or can I just use it?
>
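
For reference, my understanding of the difference between the two list
formats is roughly this (the entries below are made-up examples, not taken
from any real blacklist). Entries in block.txt are url_regex patterns,
matched against the whole URL, so dots need escaping:

(^|\.)badsite\.example
badsite\.example/warez

Entries in a dstdomain list are plain domain names, one per line, where a
leading dot also matches all subdomains:

.badsite.example
warez.badsite.example
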
Received on Thu Nov 21 2002 - 10:49:52 MST
