Re: GoZilla vs. Squid

From: Clifton Royston <[email protected]>
Date: Thu, 9 Mar 2000 09:43:59 -1000

On Thu, Mar 09, 2000 at 01:53:47PM +0100, Tilman Schmidt wrote:
> More information on my problem with Squid starting multiple parallel
> transfers for the same URL:
>
> I am running Squid 2.2.STABLE4-hno.19990807 on Linux kernel 2.0.36.
> One of my users started a download of the URL
>
> http://www.geocities.com/EnchantedForest/Pond/8851/6.doc
>
> (a 5 MB MP3 file) with GoZilla. This ran for about 12 hours, at least
> half of that time filling 128 kbps of bandwidth all on its own, and
> still hadn't managed to get the file. During that time, each time I
> looked at the cachemgr.cgi filedescriptors page I saw several
> (between 3 and 5) server-side sockets for that URL open at the same
> time to different IPs.

Does GoZilla use multiple concurrent HTTP range requests? I believe
similar problems were reported about half a year ago with a product
called NetAnts, which did exactly that.
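
For anyone who hasn't watched one of these "accelerators" in action:
they split a single file into byte ranges and pull the pieces over
several parallel connections. A rough sketch of the idea follows; it
is hypothetical -- the URL, the 4-way split, and the Python are my
illustration, not anything GoZilla actually ships:

  #!/usr/bin/env python3
  # Hypothetical sketch of a "download accelerator": split one file
  # into byte ranges and fetch the pieces over parallel connections.
  import concurrent.futures
  import urllib.request

  URL = "http://www.example.com/big-file.mp3"   # placeholder URL
  PARTS = 4                                     # assumed split

  def fetch_range(start, end):
      req = urllib.request.Request(
          URL, headers={"Range": "bytes=%d-%d" % (start, end)})
      with urllib.request.urlopen(req) as resp:
          # A server that honours Range answers 206 Partial Content.
          return start, resp.read()

  def main():
      # Find the total size so we know where to split.
      head = urllib.request.Request(URL, method="HEAD")
      with urllib.request.urlopen(head) as resp:
          size = int(resp.headers["Content-Length"])

      chunk = size // PARTS
      ranges = [(i * chunk,
                 size - 1 if i == PARTS - 1 else (i + 1) * chunk - 1)
                for i in range(PARTS)]

      # Each request is a separate TCP connection -- or, behind a proxy,
      # a separate server-side socket opened by the proxy on our behalf.
      with concurrent.futures.ThreadPoolExecutor(max_workers=PARTS) as pool:
          parts = sorted(pool.map(lambda r: fetch_range(*r), ranges))

      with open("big-file.mp3", "wb") as out:
          for _, data in parts:
              out.write(data)

  if __name__ == "__main__":
      main()

Several simultaneous requests for the same URL is exactly the pattern
that shows up as several server-side sockets in the filedescriptors
page.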

If so, the interaction between those range requests, Squid, and
Geocities' multiple IPs (the hostname resolves to several addresses)
is likely to cause problems that are not easy to resolve.
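
The multiple-IP half is easy to verify for yourself: the hostname
returns several addresses, so parallel fetches of the same URL can
land on different servers, which matches the different IPs Tilman saw.
A quick check (Python again, purely illustrative):

  import socket

  # One hostname, several A records (round-robin DNS).  Each new
  # server-side connection may be handed a different address.
  name, aliases, addresses = socket.gethostbyname_ex("www.geocities.com")
  print(name, addresses)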

> Unfortunately, this seems to happen every time somebody tries to
> download some difficult-to-get URL (typically MP3s) with GoZilla.
> Can anyone confirm such a detrimental interaction between Squid
> and GoZilla, and/or suggest a configuration change to avoid it?

This would have to go to one of the real experts, not me.
  -- Clifton

-- 
 Clifton Royston  --  LavaNet Systems Architect --  cliftonr@lava.net
      The named which can be named is not the Eternal named.