Anything in the console? I actually didn't realize iTunes still worked on Mavericks at all. I uninstalled it as I don't like iTunes.
 
Anything in the console? I actually didn't realize iTunes still worked on Mavericks at all. I uninstalled it as I don't like iTunes.
Weirdly enough, it just randomly started working again.
Anything network-related on Mavericks is very weird; it has about a 50/50 chance of working.
Sometimes things will just work, sometimes they won't.
 
Anything in the console? I actually didn't realize iTunes still worked on Mavericks at all. I uninstalled it as I don't like iTunes.
Attachments: Screen Shot 2024-11-27 at 10.13.26.png, Screen Shot 2024-11-27 at 10.13.40.png

nothing in the console, and the error goes away and comes back randomly throughout the day
 
If it's of any interest, this is what happens every time I launch iTunes. On my copy of Mavericks, I stuck to version 11.4.

 
Squid seems like overkill for this, in terms of binary size and resource footprint. Seems like the only part you need is an HTTPS proxy (to be precise, a proxy that supports HTTP CONNECT) that can intercept/MITM traffic (thus splitting the single TLS negotiation into two: one between the client and the proxy, and another between the proxy and the remote server).

Any of

should work; they're all very simple and readable implementations of this (< 500 LOC), thanks to Go's robust networking libraries.
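For reference, the core of the technique is small. Here's a rough, illustrative sketch (not code from any of those projects): it handles CONNECT, terminates TLS toward the client with a single pre-generated certificate, opens a second TLS session to the real server, and copies bytes between the two. A real interceptor would mint a per-host leaf signed by a CA the client trusts instead of reusing one cert; the file names are placeholders.

Code:
// Minimal HTTPS-intercepting proxy sketch: accept CONNECT, terminate TLS
// toward the client with our own certificate, open a second TLS session
// to the real server, and shuffle bytes between the two.
package main

import (
	"crypto/tls"
	"io"
	"log"
	"net/http"
)

func main() {
	// Illustrative paths: a cert/key pair the client machine already trusts.
	cert, err := tls.LoadX509KeyPair("proxy-cert.pem", "proxy-key.pem")
	if err != nil {
		log.Fatal(err)
	}

	handler := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if r.Method != http.MethodConnect {
			http.Error(w, "only CONNECT is supported", http.StatusMethodNotAllowed)
			return
		}
		hj, ok := w.(http.Hijacker)
		if !ok {
			return
		}
		clientConn, _, err := hj.Hijack() // take over the raw client connection
		if err != nil {
			return
		}
		defer clientConn.Close()
		clientConn.Write([]byte("HTTP/1.1 200 Connection Established\r\n\r\n"))

		// TLS leg 1: client <-> proxy, using our own certificate.
		// (A real MITM proxy mints a per-host leaf here.)
		tlsClient := tls.Server(clientConn, &tls.Config{Certificates: []tls.Certificate{cert}})
		defer tlsClient.Close()

		// TLS leg 2: proxy <-> real server, validated normally.
		tlsUpstream, err := tls.Dial("tcp", r.Host, &tls.Config{})
		if err != nil {
			return
		}
		defer tlsUpstream.Close()

		// Copy bytes in both directions until one side closes.
		done := make(chan struct{}, 2)
		go func() { io.Copy(tlsUpstream, tlsClient); done <- struct{}{} }()
		go func() { io.Copy(tlsClient, tlsUpstream); done <- struct{}{} }()
		<-done
	})

	log.Fatal(http.ListenAndServe("127.0.0.1:3128", handler))
}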

And as a bonus, this should work on any machine (< 10.5 or > 10.11) if you pair it with the legacy support dylib.
None of the examples seemed to work, but after a morning of trial and error, I managed to get a combination of LLMs to write something that does work, complete with intermediate certificate fetching. (Yes, this is AI Slop, I'm being up front about it! But it's the kind of boilerplate LLMs tend to be decent at, and I'm too unfamiliar with Go to code this by hand in a reasonable amount of time.)
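(For the curious, the intermediate-certificate fetching boils down to something like this hypothetical helper, which follows the AIA issuer URL embedded in the leaf certificate. This is an illustrative sketch, not the actual generated code; real-world AIA responses are occasionally PKCS#7 bundles rather than raw DER, which it doesn't handle.)

Code:
import (
	"crypto/x509"
	"errors"
	"io/ioutil" // io.ReadAll on newer Go
	"net/http"
)

// Hypothetical AIA-chasing helper: if the server didn't send its
// intermediate certificate, fetch the issuer from the URL the leaf
// advertises in its Authority Information Access extension.
func fetchIssuer(leaf *x509.Certificate) (*x509.Certificate, error) {
	if len(leaf.IssuingCertificateURL) == 0 {
		return nil, errors.New("leaf has no AIA issuer URL")
	}
	resp, err := http.Get(leaf.IssuingCertificateURL[0])
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()
	der, err := ioutil.ReadAll(resp.Body)
	if err != nil {
		return nil, err
	}
	// Most CAs serve raw DER here; some serve PKCS#7, which needs extra handling.
	return x509.ParseCertificate(der)
}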


It seems to work... very well. Advantages compared to Squid:
  1. Starts up in a few seconds.
  2. Reads certificates from the standard Apple certificate store (Keychain Access).
  3. Apple Services such as iMessage appear to work without special-casing to exclude Apple domains.
  4. Python's pip works without doing anything special.
  5. The proxy does not break WebSockets. No legacy software uses these, but if you forget to exclude Chromium Legacy or Firefox Dynasty from the proxy (which you should still do for performance), websites that use WebSockets won't break.
I have absolutely no clue what is going on with #3–5. But the end result seems to be that this proxy works much more seamlessly, I can't find an instance of it randomly breaking something.

However, there is one huge downside! This proxy uses WAY more CPU than Squid! When actively loading content, CPU usage in Activity Monitor can spike as high as 50%! I have basically never seen Squid use more than 1%.

On the other hand, these spikes are very brief, and the proxy consumes no CPU when idle. It may be that the proxy isn't actually consuming more CPU; it just has a different task priority (or something like that), which results in much higher spikes that end much more quickly.

Should I move forward with updating my proxy package to use this Go-based proxy instead of Squid? Please don't say "offer a choice", as that's too complicated! I'm not going to actively purge old versions from the internet of course, but I'm only going to maintain one or the other going forward.
 
Cool. Your call on this. I'm still using your original release of Squid on 10.6. No issues over the last few years, so I'm going to stick with it. If it ain't broke, don't fix it.
 
Cool. Your call on this. I'm still using your original release of Squid on 10.6. No issues over the last few years, so I'm going to stick with it. If it ain't broke, don't fix it.
The problem is that it is broke, you just haven't run into a situation yet where it broke something, or you didn't realize it was the proxy's fault.

The current version works well 98% of the time and is very clearly superior to not using a proxy (in which case lots of things will be broken), but there are edge cases where it breaks stuff.
 
I managed to get the CPU usage spikes down to <15% (by asking LLMs to make the code faster a bunch of times and testing their changes) so I am likely to move forward with updating the proxy package to use this.
 
I'm not a Go expert so I can't review it. CPU usage shouldn't be that high though, I think; I'd expect it to be less than 5%, since all the proxy is doing is shuffling bytes.

Possibly the snippet
Code:
// Copy data from src to dst
func proxy(dst io.Writer, src io.Reader, direction string, done chan<- struct{}) {
    defer func() { done <- struct{}{} }()
    buf := make([]byte, 64*1024)

can be optimized a bit by using a bigger buffer size? Also, Go has very good profiling tools; try using those to generate a flame graph and see where the bottleneck is.
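For example (a sketch, assuming the proxy is a single main package), adding the stock pprof handler to it is enough to profile it:

Code:
import (
	"log"
	"net/http"
	_ "net/http/pprof" // registers the /debug/pprof/ handlers
)

func init() {
	go func() {
		// Local-only profiling endpoint, separate from the proxy port.
		log.Println(http.ListenAndServe("127.0.0.1:6060", nil))
	}()
}

Then, while the proxy is busy loading something, run go tool pprof -http=:8080 http://127.0.0.1:6060/debug/pprof/profile?seconds=30 and switch the web UI to the flame graph view.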

Edit: If I had to hazard a guess, I'd first try increasing the buffer size for the copy. Then perhaps check your stdout logs and verify that you're not AIA chasing unnecessarily. The on-the-fly cert creation seems like it could be one bottleneck due to the need to create a new key each time. Since this is all client side, you could maybe use weaker crypto just for the local cert.
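On that last point, here is a sketch of what faster local cert minting could look like: a hypothetical mintLeaf helper, where caCert/caKey stand in for however the proxy stores its CA. It uses an ECDSA P-256 key, which is far cheaper to generate than a 2048-bit RSA key, but assumes the legacy clients can negotiate ECDSA cipher suites, which the very oldest may not.

Code:
import (
	"crypto"
	"crypto/ecdsa"
	"crypto/elliptic"
	"crypto/rand"
	"crypto/tls"
	"crypto/x509"
	"crypto/x509/pkix"
	"math/big"
	"time"
)

// Mint a short-lived per-host leaf with an ECDSA P-256 key.
func mintLeaf(host string, caCert *x509.Certificate, caKey crypto.Signer) (tls.Certificate, error) {
	key, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
	if err != nil {
		return tls.Certificate{}, err
	}
	tmpl := &x509.Certificate{
		SerialNumber: big.NewInt(time.Now().UnixNano()),
		Subject:      pkix.Name{CommonName: host},
		DNSNames:     []string{host},
		NotBefore:    time.Now().Add(-time.Hour),
		NotAfter:     time.Now().Add(24 * time.Hour),
		KeyUsage:     x509.KeyUsageDigitalSignature,
		ExtKeyUsage:  []x509.ExtKeyUsage{x509.ExtKeyUsageServerAuth},
	}
	der, err := x509.CreateCertificate(rand.Reader, tmpl, caCert, &key.PublicKey, caKey)
	if err != nil {
		return tls.Certificate{}, err
	}
	// Serve the leaf plus the CA so the client can build the chain.
	return tls.Certificate{Certificate: [][]byte{der, caCert.Raw}, PrivateKey: key}, nil
}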

Also, you could even build it with Go as high as 1.18, I think.

Also, what was the issue you saw with https://github.com/kr/mitm/blob/master/mitm.go? At first glance it seems a lot more elegant, in that it reuses Go's existing ReverseProxy library.
 
Also, what was the issue you saw with https://github.com/kr/mitm/blob/master/mitm.go? At first glance it seems a lot more elegant, in that it reuses Go's existing ReverseProxy library.
I can try it again next weekend; I probably gave up on it too quickly. The fact that the code is spread across multiple files was throwing me off. I couldn't figure out where some of the functions were coming from, but now I see they're just in other files in that repo. I would have to figure out how to add the AIA chasing, and make sure it reads certs from the system store (aka Keychain Access).

Also, you could even build it with Go as high as 1.18, I think.
When I tried it this morning, Go 1.18 seemed to have some type of issue building with the Mavericks version of clang, which I probably could have worked around by disabling CGO, but I wasn't sure if there would be any fallout from that. I know I could have used the MacPorts toolchain, but I wanted to avoid that for ease of development.

I tried Go 1.10, because it was the last version to officially support Mavericks and it works without any workarounds (i.e. no Legacy Support), but I realized it doesn't support TLS 1.3, which isn't needed by anything today but might become needed in the future. Besides, I would eventually need Legacy Support anyway for Snow Leopard compatibility.

I subsequently tried Go 1.13 more-or-less arbitrarily, and when it worked I stopped experimenting with other versions.
 
>make sure it reads certs from the system store

Go already does this by default, and it even uses the native keychain library for validation, so long as it is not cross-compiled (this feature might have only been added in Go 1.16 or something).

>Go 1.18 seemed to have some type of issue building with the Mavericks version of clang,

Build it in a VM using a newer version of OS X, then add the legacy support dylib to get it to work on 10.9. 1.18 is the last version for which this trick works. 1.19 actually works too, but you need to polyfill an extra function. 1.20 is where it gets very difficult. If you get it working on 1.18, I'd actually recommend the slight extra effort of adding the extra polyfill to get it building with Go 1.19, since that will use not just the system certs but the actual system framework for validation, which presumably has its own layer of caching built in.
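(A sketch of what that means in practice: with RootCAs left nil in the tls.Config, Go validates the server chain against the host's root CA set, which on a macOS build means the system trust store.)

Code:
package main

import (
	"crypto/tls"
	"log"
)

func main() {
	// RootCAs is left nil, so verification uses the host's root CA set
	// (the Keychain on macOS builds) rather than a bundled PEM file.
	conn, err := tls.Dial("tcp", "example.org:443", &tls.Config{})
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()
	log.Println("verified:", conn.ConnectionState().PeerCertificates[0].Subject.CommonName)
}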

 
I am currently trying to make curl work. I have installed Squid with this, got the xi1 (ix1?) certificate and enforced it system-wide, compiled the newest 32-bit versions of curl and OpenSSL, and... well, it does not work. OpenSSL correctly connects to example.org, but curl refuses with error 60, claiming that there is a self-signed certificate in there. I am not good at networking, but I did not see any self-signed certificates whilst reading the OpenSSL logs. So, I am not sure.
I am trying this on Mac OS X 10.6.8 and I am not sure what to do next. I have tried various compilations of curl but nothing is working to fix this. I updated the cacert.pem and that changed nothing, too. I tried MacPorts to compile curl and, surprise, nothing compiles: it tries to compile openssl3 and fails to do so, and since that is a dependency of curl... yeah, it doesn't work.
I'm turning to some (possible) help here because I don't know what the issue might be. I thought it might be Squid doing something, I'm just not sure what it would be...
 
I am currently trying to make curl work. I have installed Squid with this, got the xi1 (ix1?) certificate and enforced it system-wide, compiled the newest 32-bit versions of curl and OpenSSL, and... well, it does not work. OpenSSL correctly connects to example.org, but curl refuses with error 60, claiming that there is a self-signed certificate in there. I am not good at networking, but I did not see any self-signed certificates whilst reading the OpenSSL logs. So, I am not sure.
I am trying this on Mac OS X 10.6.8 and I am not sure what to do next. I have tried various compilations of curl but nothing is working to fix this. I updated the cacert.pem and that changed nothing, too. I tried MacPorts to compile curl and, surprise, nothing compiles: it tries to compile openssl3 and fails to do so, and since that is a dependency of curl... yeah, it doesn't work.
I'm turning to some (possible) help here because I don't know what the issue might be. I thought it might be Squid doing something, I'm just not sure what it would be...

Did you set all of the below environment variables?

HTTPS_PROXY="http://localhost:3128"
SSL_CERT_FILE="/Library/Squid/Certificates/squid.pem"
REQUESTS_CA_BUNDLE="/Library/Squid/Certificates/squid.pem"

The latter two are because curl (like many other UNIX command-line tools, although not all of them) does not read certificates from Keychain Access by default; it will read from one of these environment variables instead. Off the top of my head, I don't remember which of them curl specifically uses.

You could also probably use -k to disable cert checking, but you probably shouldn't. (This basically erases the security advantages of using https over http.)

Mind, if you're using a modern copy of curl compiled with modern OpenSSL, you probably don't need to use a proxy at all.
 
Did you set all of the below environment variables?

HTTPS_PROXY="http://localhost:3128"
SSL_CERT_FILE="/Library/Squid/Certificates/squid.pem"
REQUESTS_CA_BUNDLE="/Library/Squid/Certificates/squid.pem"

The latter two are because curl (like many other UNIX command-line tools, although not all of them) does not read certificates from Keychain Access by default; it will read from one of these environment variables instead. Off the top of my head, I don't remember which of them curl specifically uses.

You could also probably use -k to disable cert checking, but you probably shouldn't. (This basically erases the security advantages of using https over http.)

Mind, if you're using a modern copy of curl compiled with modern OpenSSL, you probably don't need to use a proxy at all.
I wouldn't really call builds from 2019 and 2021, respectively, "modern", but I see some point here. I will try to check this, though, and see if that will work.
 
What about the opposite, then? If you explicitly set HTTPS_PROXY="" so curl doesn't use the proxy (because it's relatively modern and can connect on its own), does that work?

As a sanity check, what happens if you disable the proxy in System Preferences? Curl usually doesn't care about that but maybe your build does?
 
What about the opposite, then? If you explicitly set HTTPS_PROXY="" so curl doesn't use the proxy (because it's relatively modern and can connect on its own), does that work?
Yep it does; I have the proxy enabled and curl doesn't use it (while websites do, because it wouldn't work otherwise). I have a lot of variables in my bash profile, but I don't want to touch it, because it's working so I ain't fixing it. Seems like all this allowed me to finally use the PC normally. Gonna attach an image with my variables for future reference. (It's a photo, because I didn't want to bother with doing this in the browser.)
 

Attachments: IMG_0614.jpeg (1,007.9 KB)
And if it isn't possible, because I'm using my main computer to share Ethernet to my PPC, can I set up a proxy server on my main PC?
 