Re: Site list..

From: Maciej Kozinski <[email protected]>
Date: Tue, 22 Jun 1999 12:52:40 +0200 (EEST)

Nardone Vittorio:
>
> Hello,
> I'd like to create daily a plain txt file with all sites accessed by Squid
> proxy Server (more than X times). Something like this :
>
> http://www.asus.com.tw
> http://www.spoiledteens.com
> http://www.webyoung.com
> http://www1.zacks.com
> http://www.prexxx.com
> http://www.cdnow.com
> http://www.tetonas.com
> http://206.146.143.66
> [etc...]
>
> Is it possible?
> Thanks a lot. Bye!!!

Once again (I forgot about the "more than X times" part):

#!/usr/local/bin/perl

$log = "/usr/local/squid/logs/access.log";
# report only sites accessed more than this many times
$xtimes = 10;

open (FILE, "$log") or die "Can't open $log!\n";
while (<FILE>) {
    @line = split (/\s+/);
    $url = $line[6];    # the 7th field of access.log is the requested URL
    # keep only the scheme://host part; [a-zA-Z] fixes the original [a-zA-z]
    # typo, and counting only on a successful match avoids reusing a stale $1
    # (the trailing /.* is optional so bare-host URLs are counted too)
    $sites{$1}++ if $url =~ /^(\w+:\/\/[a-zA-Z0-9._-]+)/;
}
close (FILE);
close (FILE);

foreach $key (keys %sites) {
    print $key, "\n" if ($sites{$key} > $xtimes);
}
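The same count can be sketched with standard Unix tools instead of Perl. This is a hedged alternative, not the script above: it assumes the same access.log layout (URL in field 7) and the same log path, which you would adjust for your installation.

```shell
# Count distinct scheme://host prefixes in Squid's access.log and print
# those seen more than $xtimes times. Path and threshold are assumptions.
log=/usr/local/squid/logs/access.log
xtimes=10

awk '{ print $7 }' "$log" \
  | sed -E 's|^([A-Za-z]+://[^/]+).*|\1|' \
  | sort | uniq -c \
  | awk -v x="$xtimes" '$1 > x { print $2 }'
```

To get the daily plain-text file the original question asks for, either version could be run from cron once a day with its output redirected to a file of your choosing.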

-- 
           Maciej Kozinski         http://www.uck.uni.torun.pl/~maciek/ 
           Remember: Un*x _IS_ user friendly... It's just selective about
                     who its friends are.
Received on Tue Jun 22 1999 - 05:14:56 MDT

This archive was generated by hypermail pre-2.1.9 : Tue Dec 09 2003 - 16:46:57 MST