On Wednesday 10 September 2003 10:18 pm, sergio del villar wrote:
> those are exceptions of course. But if i access the
> page
>
> http://www.sex.com
>
> it has access! So the rules are not working
> what can i use instead of url_regex?
My recommendation would be a redirector (which you can write yourself and
have check for words with proper boundaries), or a package such as SquidGuard
or DansGuardian.
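Henrik's point below is easy to demonstrate: url_regex is a plain substring
match over the whole URL, so "sex" fires on every Essex/Sussex site, while a
word-boundary pattern does not. A small illustrative sketch (the URL list is
just the examples from this thread):

```python
import re

# A plain url_regex of "sex" is a substring search, not a whole-word search.
urls = [
    "http://www.sex.com",
    "http://www.essex.org",
    "http://www.sussexcricket.co.uk",
]

plain = re.compile(r"sex")        # what url_regex does with the word "sex"
bounded = re.compile(r"\bsex\b")  # \b = word boundary: whole word only

for url in urls:
    print(url, bool(plain.search(url)), bool(bounded.search(url)))
# plain matches all three; bounded matches only http://www.sex.com
```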
What you are trying to do simply exceeds the capabilities of simple regexes.
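As a rough illustration of the redirector approach, here is a minimal sketch
assuming Squid's classic redirector protocol (one request per line on stdin,
reply with a replacement URL to redirect or an empty line to pass the request
through unchanged). The blocked-word list and block-page URL are made up for
the example:

```python
#!/usr/bin/env python3
"""Minimal Squid redirector sketch -- illustrative only.

Squid feeds one request per line on stdin, roughly:
    URL client-ip/fqdn ident method
Reply with a replacement URL to redirect, or an empty line for no change.
"""
import re
import sys

BLOCKED_WORDS = ["sex", "napster", "kazaa"]           # example list from the thread
BLOCK_PAGE = "http://proxy.example.com/blocked.html"  # hypothetical block page

# \b anchors match whole words only, so essex/sussex URLs pass through.
pattern = re.compile(r"\b(" + "|".join(BLOCKED_WORDS) + r")\b", re.IGNORECASE)

def rewrite(line):
    """Return the block-page URL for a blocked request, else '' (no change)."""
    url = line.split()[0]  # first field is the requested URL
    return BLOCK_PAGE if pattern.search(url) else ""

def main():
    # Deployment loop: one reply per request line, flushed immediately
    # so Squid is never left waiting on buffered output.
    for line in sys.stdin:
        if line.strip():
            print(rewrite(line), flush=True)

if __name__ == "__main__":
    # Demonstration instead of the stdin loop: a match is redirected,
    # a near-miss passes through.
    print(rewrite("http://www.sex.com 10.0.0.1/- - GET"))
    print(rewrite("http://www.essex.org 10.0.0.1/- - GET"))
```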
Antony.
> --- Antony Stone <Antony@Soft-Solutions.co.uk> wrote:
> > On Wednesday 10 September 2003 9:34 pm, Henrik Nordstrom wrote:
> > > On Wednesday 10 September 2003 21.51, sergio del villar wrote:
> > > > sex
> > > > napster
> > > > kazaa
> > >
> > > I do not think this really is what you want. url_regex is a plain
> > > regex match on the whole URL, and "sex" will match any URL having
> > > the letter "s" followed by "e" followed by "x" anywhere, not only
> > > the word "sex".
> >
> > http://www.essexcc.gov.uk
> > http://www.essex.org
> > http://www.essex.edu
> > http://www.sussexenterprise.co.uk
> > http://www.sussexcricket.co.uk
> > http://www.basex-systems.com
> > http://www.wessex-aero.com
> >
> > --
> > Normal people think "if it ain't broke, don't fix it".
> > Engineers think "if it ain't broke, it doesn't have enough features yet".
>
--
I love deadlines. I love the whooshing noise they make as they go by. - Douglas Noel Adams

Received on Thu Sep 11 2003 - 02:00:09 MDT