Re: Web Content Suck

From: Juan Carlos Castro y Castro <[email protected]>
Date: Wed, 28 Jul 1999 19:33:37 -0300

Henrique Abreu wrote:

> Hi!
>
> Does squid have an option to automatically "suck" (prefetch) the content of web
> sites at predefined times, to save time for connected users?
>
> Regards,
> Abreu

My best bet would be to create a cron job that runs wget recursively on the pages
in question and then deletes what it fetched. You can instruct wget to use a proxy
via an environment variable, a startup file, or something like that. Can't remember
exactly which right now.
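A minimal sketch of the idea, assuming squid listens on its default port 3128 on localhost (the site URL and script path are placeholders). wget honors the http_proxy environment variable, so every recursive fetch passes through squid and warms its cache; --delete-after then discards the local copies, which were never the point:

```shell
#!/bin/sh
# prefetch.sh -- warm the squid cache by recursively fetching a site
# through the proxy, then discarding the downloaded files.
# Proxy address and target URL below are assumptions; adjust for your setup.

export http_proxy="http://localhost:3128/"   # squid's default listening port

# -r : recursive download
# -l 2 : limit recursion depth to 2 levels
# --delete-after : remove each file once fetched (cache is already populated)
# -q : quiet
wget -r -l 2 --delete-after -q http://www.example.com/

# Example crontab entry to run this every night at 3 AM:
# 0 3 * * * /usr/local/bin/prefetch.sh
```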

[]'s,

Received on Wed Jul 28 1999 - 16:16:55 MDT

This archive was generated by hypermail pre-2.1.9 : Tue Dec 09 2003 - 16:47:36 MST