undefined@lemmy.hogru.ch 3 days ago
This very much bothers me as a web developer. I go hard on Conditional GET request support and compression, as well as using HTTP/2 and newer. I’m tired of using websites (outside of work) that need to load a fuckton of assets (even after I block 99% of advertising and tracking domains).
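The server side of a conditional GET is tiny, for what it’s worth. A minimal sketch in Python — the body and the ETag scheme are made up for illustration, not any real site’s behavior:

```python
import hashlib

BODY = b"<html>hello</html>"
# Strong ETag derived from the content; any stable fingerprint works.
ETAG = '"' + hashlib.sha256(BODY).hexdigest()[:16] + '"'

def respond(if_none_match=None):
    """Handle a GET: return (status, headers, body).

    If the client's If-None-Match matches the current ETag, reply
    304 Not Modified with an empty body instead of resending the asset.
    """
    headers = {"ETag": ETAG, "Cache-Control": "max-age=3600"}
    if if_none_match == ETAG:
        return 304, headers, b""
    return 200, headers, BODY
```

The first request gets a 200 with the full body; every revalidation with the stored ETag gets a 304 and skips the transfer entirely.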
macOS and iOS actually allow updates to be cached locally on the network, and if I remember correctly, Windows has some sort of peer-to-peer mechanism for updates too (I can’t remember whether that works over the LAN; I don’t use Windows).
The part I struggle with is caching HTTP traffic. It used to be easy pre-HTTPS, but now it’s practically impossible. I do think other types of apps do a poor job of caching too, though.
frongt@lemmy.zip 3 days ago
Yes, Windows peer to peer update downloads work over LAN. (In theory, I’ve never verified it.)
HTTP caching still works fine if your proxy performs SSL termination and re-encryption. In an enterprise environment that’s fine; for individuals it’s a non-starter. In that case, you’d want a local CDN mirror instead.
undefined@lemmy.hogru.ch 3 days ago
I couldn’t get SSL bumping working in Squid on Alpine Linux about a year ago, but I’m willing to give it another shot.
My home router is also a mini PC running Alpine Linux. I do transparent caching of plain HTTP (it’s minimal, but it works), but with others using the router I feel uneasy about SSL bumping, not to mention that some apps (banking apps especially) are a lot stricter about it.
frongt@lemmy.zip 3 days ago
Yeah, you’ll need a bypass list for some sites.
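A bump-plus-bypass setup in Squid looks roughly like this — the `ssl_bump` directives are real Squid 4+ syntax, but the CA path and the bank domains are placeholders, and a production config needs more (e.g. `sslcrtd_program`):

```
# squid.conf sketch: bump most TLS, splice (pass through) strict sites
http_port 3128 ssl-bump \
    tls-cert=/etc/squid/bump-ca.pem \
    generate-host-certificates=on

acl no_bump ssl::server_name .mybank.example .otherbank.example
acl step1 at_step SslBump1

ssl_bump peek step1
ssl_bump splice no_bump      # banks etc. go through untouched
ssl_bump bump all            # everything else gets decrypted and cached
```

Anything matched by the `no_bump` ACL never sees the interception CA, which is what keeps certificate-pinning apps from breaking.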
Honestly, unless you’re on a very limited connection, you probably won’t see any real value from it. Even if you do cache everything, each site hosts its own copy of jQuery (or whatever the kids use these days), and your proxy isn’t going to cache that any better than the client already does.
WhyJiffie@sh.itjust.works 8 hours ago
Don’t they always have a short cache timeout? The proxy could just tell the client that the timeout is a long time, and when the client checks whether its copy is really up to date, the proxy would revalidate the asset upstream and just return the right status code (304 Not Modified) if it didn’t actually change.
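That revalidate-on-behalf-of-the-client idea can be sketched in a few lines of Python. The cache dict and the `fetch_upstream` callable are stand-ins for a real proxy’s store and upstream conditional GET, not any real API:

```python
cache = {}  # url -> (etag, body) kept by the proxy

LONG_FRESH = {"Cache-Control": "max-age=86400"}  # what we tell the client

def serve(url, fetch_upstream):
    """Serve url to a client, revalidating the cached copy upstream.

    fetch_upstream(url, etag) stands in for a conditional GET: it
    returns (status, etag, body), with status 304 when the cached
    etag is still current upstream.
    """
    if url in cache:
        etag, body = cache[url]
        status, new_etag, new_body = fetch_upstream(url, etag)
        if status == 304:
            # Upstream unchanged: hand out our copy as long-lived.
            return 200, LONG_FRESH, body
        cache[url] = (new_etag, new_body)
        return 200, LONG_FRESH, new_body
    status, etag, body = fetch_upstream(url, None)
    cache[url] = (etag, body)
    return 200, LONG_FRESH, body
```

The client always sees a 200 with a long freshness lifetime; the short upstream lifetime only costs the proxy a cheap revalidation round-trip.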
All the jQuery copies could also be eliminated with a filesystem that can deduplicate, even if just periodically. Filesystems like XFS and btrfs can do that with reflink copies, and rmlint helps there.
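Finding the duplicates is the easy half; a small Python sketch that groups identical files by content hash (the actual space saving would then come from rmlint, or `cp --reflink` on a copy-on-write filesystem — this code only reports the groups):

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def duplicate_groups(root):
    """Group byte-identical files under root by SHA-256 of content.

    Returns a list of groups, each a list of paths with identical
    content; a dedup tool would replace all but one with a reflink
    or hardlink to shared extents.
    """
    by_hash = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            by_hash[digest].append(path)
    return [paths for paths in by_hash.values() if len(paths) > 1]
```

(A real tool would compare file sizes first and hash incrementally instead of reading whole files into memory.)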
undefined@lemmy.hogru.ch 3 days ago
For my personal setup I’ve been wanting to do it on a VPS I have. I route my traffic through a chain of VPNs from the US to Switzerland, and I end up needing to clear my browser cache often (I’m a web developer testing JavaScript, etc.).
I do this in my own projects (Hotwire), but I wish I could say the same for other websites. I still run into sites that break because they try to import jQuery from Google, for example. That would be another nice thing to have cached.