I've got squid 2.7 setup and running as a transparent HTTP proxy on
pfSense 2.1 snapshot from June 28th.
Now I'd like to set it up as an HTTPS transparent proxy as well.
In the proxy server's custom options box I've added:
https_port 127.0.0.1:3129 transparent
Then I've created a NAT (Port Forward) rule to redirect all HTTPS
(destination port) traffic over to 127.0.0.1:3129, and automatically
added an associated filter rule which allows such connections.
Now when I try to access https://www.gmail.com, for example, I get the
browser warning about the name mismatch with the local certificate
(we're fine with that), but then I get this message in my browser:
(92) Protocol error
Squid's access.log contains:
1343186054.441 256 10.10.10.100 TCP_MISS/502 1481 GET https://www.gmail.com/ - DIRECT/18.104.22.168 text/html
And Squid's cache.log contains:
2012/07/25 14:14:14| SSL unknown certificate error 20 in /C=US/ST=California/L=Mountain View/O=Google Inc/CN=mail.google.com
2012/07/25 14:14:14| fwdNegotiateSSL: Error negotiating SSL connection on FD 37: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed (1/-1/0)
Any idea what I'm doing wrong?
> Any idea what I'm doing wrong?
This is what you're doing wrong:
> Now I'd like to set it up as an HTTPS transparent proxy as well.
HTTPS traffic is encrypted, and squid is lacking the proper
keys/certificates to decrypt it.
In theory, you could set up squid with its own certificates, but that
will turn squid into a man-in-the-middle, i.e. all your clients will
complain that the certificate doesn't match the sites they're trying to
reach.
IOW: Just don't do it.
I'd suggest looking into browser autoconfiguration using auto.pac / WPAD.
I know this is a man-in-the-middle, and I even wrote that we were OK with the browser message, which clearly says something like a man-in-the-middle attack is going on.
Since I've added its own certificate to Squid, it isn't lacking one, so it "*should*" work from what I've read on the net about this subject. But clearly I'm missing something, because instead of having the traffic decrypted by Squid and then re-encrypted by Squid for the local clients, I get a Protocol Error.
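For reference, an https_port line carrying an explicit certificate and key on squid 2.7 would look something like the sketch below; the file paths are placeholders for wherever the local certificate was actually generated on the pfSense box:

```
# squid.conf (squid 2.7) - hypothetical paths for the locally generated cert
https_port 127.0.0.1:3129 transparent cert=/usr/local/etc/squid/proxy-cert.pem key=/usr/local/etc/squid/proxy-key.pem
```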
So my original question was not about it being OK to do it or not, but more about why it didn't work as expected.
Thanks for your feedback anyway, if I can't do otherwise I'll play with autoconfiguration scripts.
> So my original question was not about it being OK to do it or not, but
> more about why it didn't work as expected.
You need to configure Squid to allow for (or ignore) SSL certificate errors.
This can be a threat, because Squid then decides on the validity of a
certificate (say, on a name mismatch) by itself, without the end user being
informed.
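On squid 2.7 the knob for this appears to be sslproxy_flags (the finer-grained sslproxy_cert_error directive only arrived with Squid 3.1), so a sketch of what this answer suggests, with the caveat that it disables upstream certificate verification entirely:

```
# squid.conf (squid 2.7) - do not verify the server certificates squid
# fetches on the client's behalf; this silences "certificate verify failed"
# but removes any protection against spoofed upstream sites.
sslproxy_flags DONT_VERIFY_PEER
```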
> I decided to enable transparent proxy on my school firewall because I
> was getting a million requests a day to configure proxy settings on
> student laptops.
> But now that I turned on transparent proxy, students have discovered
> that they can get to banned sites (like facebook) via https.
> http://www.facebook.com is blocked but https://www.facebook.com still
> gets through.
> Can someone let me know how to block these? I understand I have to deny
> the 'connect method' but don't see where to do this. Can this only be
> done in command line?
You cannot transparently proxy SSL connections. You would have to deny
outbound access to port 443 and if they want SSL, they must configure
the proxy settings into their browser(s) either by hand or automatically
with something like WPAD.
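Once clients use the proxy explicitly, HTTPS requests arrive as CONNECT requests, which can be denied per-domain with ordinary ACLs. A minimal squid.conf sketch (the acl name banned_ssl is made up for illustration):

```
# squid.conf - deny CONNECT (explicit-proxy HTTPS) to selected domains
acl banned_ssl dstdomain .facebook.com
acl CONNECT method CONNECT
http_access deny CONNECT banned_ssl
```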
If you don't want any www.facebook.com connections at all, you can use the DNS Forwarder to change its IP to something else...
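pfSense's DNS Forwarder is dnsmasq underneath, so the equivalent host override is a single line; a sketch, answering with a deliberately dead address:

```
# dnsmasq - resolve www.facebook.com (and its subdomains) to localhost
address=/www.facebook.com/127.0.0.1
```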
> I can't block tcp 443 on a wholesale basis; we need it for lots of stuff. If I can do it for a single domain, I'm there.
The idea is to set up a non-transparent proxy for all traffic and block any traffic not using the proxy.
The whole purpose of https is to prevent a third party (in this case your firewall) from seeing anything beyond the minimum routing information (source and destination IP address).
I think WPAD is the way to go for this one.
(Where I went to high school, they somehow blocked certain https sites, but I think it was by IP and the subscription service they used for the block list actually listed all the IPs for facebook and other blocked sites.)
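A minimal proxy.pac / wpad.dat for the WPAD approach could look like the following; the proxy address 10.10.10.1:3128 is an assumption based on the LAN addressing seen earlier in the thread:

```javascript
// Minimal PAC file: send all HTTP/HTTPS traffic through the squid proxy,
// falling back to a direct connection if the proxy is unreachable.
function FindProxyForURL(url, host) {
    return "PROXY 10.10.10.1:3128; DIRECT";
}
```

Because the browser itself speaks to the proxy, HTTPS arrives as CONNECT requests that squid can filter, instead of opaque port-443 traffic bypassing it.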
Web proxy caching is a way to store requested Internet objects (e.g. data like web pages) available via the HTTP, FTP, and Gopher protocols on a system closer to the requesting site. Web browsers can then use the local Squid cache as a proxy HTTP server, reducing access time as well as bandwidth consumption. This is often useful for Internet service providers to increase speed to their customers, and LANs that share an Internet connection. Because it is also a proxy (i.e. it behaves like a client on behalf of the real client), it can provide some anonymity and security. However, it also can introduce significant privacy concerns as it can log a lot of data including URLs requested, the exact date and time, the name and version of the requester's web browser and operating system, and the referrer.
A client program (e.g. browser) either has to specify explicitly the proxy server it wants to use (typical for ISP customers), or it could be using a proxy without any extra configuration: "transparent caching", in which case all outgoing HTTP requests are intercepted by Squid and all responses are cached. The latter is typically a corporate set-up (all clients are on the same LAN) and often introduces the privacy concerns mentioned above.
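On a pf-based firewall such as pfSense, the interception itself is just a NAT redirect of outbound port 80 to the local squid. A sketch in classic pf.conf syntax (the interface name em0 and squid's port 3128 are assumptions):

```
# pf.conf - redirect all outbound HTTP from the LAN to the local squid
rdr on em0 inet proto tcp from any to any port 80 -> 127.0.0.1 port 3128
```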