RE: [squid-users] Solutions for transparent + proxy_auth?

From: Chris Robertson <[email protected]>
Date: Tue, 21 Feb 2006 14:54:45 -0900

> -----Original Message-----
> From: Steve Brown [mailto:sbrown25@gmail.com]
> Sent: Tuesday, February 21, 2006 2:20 PM
> To: squid-users@squid-cache.org
> Subject: Re: [squid-users] Solutions for transparent + proxy_auth?
>
>
> > So the plan is to run a Squid server (service?) on every
> > computer that is going to access the internet?
>
> That's the idea we're throwing around, yes.
>
> > While that should certainly work, I wouldn't want to be
> > the one responsible for the maintenance thereof.
> > Every computer's squid.conf is going to need to be hand
> > edited to supply different credentials,
>
> Why do they have to supply different credentials? That's what disc
> images are for. :-)

In other words, you don't need to differentiate access per site/computer/user.

>
> > and somehow locked down so those credentials can't
> > be changed.
>
> That's what root is for. :-)

Ah. I made some bad assumptions*.

>
> > Every computer is going to need to perform interception
> > of its own traffic.
>
> So what's the problem there?
>
> > Additionally, you have all the caveats of interception proxies.
>
> Yup.
>
> > Perhaps if we knew more about the setup and requirements,
> > alternative solutions could be proffered.
>
> My company is a third-party provider of services to automotive
> dealerships. All of our order management systems are web based so
> that we can access them from any dealership (or any computer) in the
> world. We provide computers (Apple G4s to be specific) so that there
> is no cost to the dealership to be on our program.
>
> The problem comes in at corporately-owned dealerships that
> will not allow our computer to access their network. And yes, I've
> tried repeatedly to use their network, but no dice. So we must
> provide our own internet connection at these dealerships. The problem we
> have is that some of our staff are spending more time surfing
> myspace.com (or much, *much* shadier sites) than they are selling
> product.
>

Do these connections involve static IPs? src-based ACLs would work nicely in that case.
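Something along these lines in squid.conf (a sketch only; the IP addresses and ACL name are placeholders for your dealership connections):

```
# Allow the known dealership addresses through without authentication
acl dealerships src 203.0.113.10 203.0.113.11
http_access allow dealerships
http_access deny all
```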

> To resolve this problem, we set up a Squid server with an ACL of
> whitelisted domains. Then the problem we had is that first thing in
> the morning when Firefox asks for a user/pass for the proxy (since
> their last auth expired), the user, who is dumb, attempts to enter
> some other u/p to login, for example email u/p, system u/p or
> something else that they pull out of the air, which obviously makes
> them get denied access to the cache. Then they call us complaining
> that the computer "broke."
>

You could change the authentication-denied page to one that describes what the problem is and hints at which credentials to use...
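For example (an untested sketch; ERR_AUTH_HINT would be a custom error page you drop into Squid's errors directory):

```
acl authed proxy_auth REQUIRED
http_access allow authed
# Serve a friendlier page, with a credential hint, when auth fails
deny_info ERR_AUTH_HINT authed
```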

> So we started kicking around ideas and thinking about some way to meet
> the following criteria:
>
> + Centralized domain whitelist that can be easily managed by
> our IT staff
> + Forcing all user traffic through said proxy w/out prompting
> the user for a u/p

...or not.

> + Doing all this w/out creating an open proxy.
>
> Obviously this would be much easier if all of these machines were on a
> LAN, but they aren't.
>

The approach you have outlined should work just fine. Given that there isn't going to be much (if any) difference between workstations, it shouldn't be difficult to maintain. Another approach would be static IPs and src-based ACLs (as mentioned above). Yet another would be VPN or SSH tunnels, which would remove the need for customer-premises Squids. Both of the last two eliminate the need to pass authentication credentials across the wire.
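For the centralized-whitelist requirement, keeping the domain list in a file makes it easy for your IT staff to manage (sketch only; the path is just an example):

```
# One domain per line in the file, e.g. .yourcompany.com
acl whitelist dstdomain "/etc/squid/whitelist.txt"
http_access allow whitelist
http_access deny all
```

After editing the file, a `squid -k reconfigure` picks up the changes without restarting the server.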

Chris

*I was envisioning something along the lines of a computer lab full of Windows machines, each needing to supply different credentials for tracking which person goes where. Words fail to describe the feelings that image evokes.
Received on Tue Feb 21 2006 - 16:54:52 MST