sunzu@kbin.run 4 months ago
Noscript would fix this issue... Deny most of that shit and internet still works... Mostly
and internet still works… Mostly
That load-bearing “mostly” is doing a lot of work here.
I invite everybody to find out how everything “mostly” works if you disable javascript
I actively do this with uMatrix - granted, I only block non-first-party JavaScript. Most sites I visit only require a few domains to be enabled to function. The ones that don't are mostly ad-riddled news sites.
There are a few exceptions to this - AWS and Atlassian come to mind - but the majority of what I see on the internet actually works more or less fine when you block non-first-party JavaScript, and some of it even works with all JavaScript blocked. uMatrix also has handy built-in bundles for certain things - sites that embed YouTube, for example - that make this much easier.
Blocking non-first-party scripts like I do actually solves this issue for the most part, since, according to the article, only scripts loaded from the cdn.polyfill.io domain itself were the problem.
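As a rough sketch of the first-party/third-party distinction being drawn here (my own illustration, not uMatrix's actual matching logic, and using a naive "last two labels" domain comparison rather than the Public Suffix List that real blockers use):

```python
from urllib.parse import urlparse

def is_first_party(page_url: str, script_url: str) -> bool:
    """Naive first-party check: same registrable domain, approximated
    as the last two labels of the hostname. Real content blockers use
    the Public Suffix List to get this right for domains like co.uk."""
    def base_domain(url: str) -> str:
        host = urlparse(url).hostname or ""
        return ".".join(host.split(".")[-2:])
    return base_domain(page_url) == base_domain(script_url)

# A subdomain CDN of the site itself counts as first-party here:
print(is_first_party("https://example.com/article",
                     "https://static.example.com/app.js"))           # True
# The polyfill CDN is a different registrable domain, so it gets blocked:
print(is_first_party("https://example.com/article",
                     "https://cdn.polyfill.io/v3/polyfill.min.js"))  # False
```

Under a policy like this, the hotlinked polyfill.io script simply never loads, which is why the commenter was unaffected.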
You’re still trusting that the 1st party javascript won’t be vulnerable to supply chain attacks, though
In my experience, first-party JavaScript tends to be updated so rarely that stale bugs and exploits are a bigger risk than supply chain attacks. If I heard about NPM getting attacked as often as I hear about CDNs getting attacked, I'd be more concerned.
I've been using noscript for years.
Yeah, it took me about that long to get my regular websites working right too. And then I had to reinstall for unrelated reasons and all that customisation was gone.
You can back it up, though - and at least once you've suffered the loss a few times, you can get it about 90% back on the first re-visit to each site after a reinstall.
I invite everybody to find out how everything “mostly” works if you disable “most of” javascript – also have fun deciding which parts to enable because you think they’re trustworthy
Having done this for many, many years, I can tell you: if you allow the site's own scripts (which is an acknowledgement of JS, at least) and a few “big” ones like ajax.google.com, jquery.com, ytimg.com, etc., you're left with a smaller subset of annoying-but-necessary-for-individual-websites scripts that you can enable as needed, or just add as trusted if you're into that kind of thing.
After that you have the utter garbage sites with 30 scripts of tracking, data-sucking bullshit (CNN, looking at you), and to those sites I have said “Thou shalt bite my shiny metal ass” and I just don’t go there.
It’s a concession to js, yes, but it’s also not free rein to trample all over the surfing experience. Totally worth the time to figure out.
9point6@lemmy.world 4 months ago
Not a solution. Much of the modern web is reliant on JavaScript to function.
Noscript made sense when the web was pages with superfluous scripts that enhanced what was already there.
Much of the modern web is web apps that fundamentally break without JS. And picking and choosing unfortunately won’t generally protect from this because it’s common practice to use a bundler such as webpack to keep your page weight down. This will have been pulled in as a dependency in many projects and the site either works or does not based on the presence of the bundle.
Not saying this is a great situation or anything, but suggesting noscript as a solution is increasingly anachronistic.
homesweethomeMrL@lemmy.world 4 months ago
“Function” is doing a lot of lifting there. Trackers, ads, and assorted other bullshit are not the kind of functioning anyone needs.
It’s true the average user gets flummoxed quickly when the scripts are blocked, but they can either sink (eat ads and trackers) or swim (learn what scripts to allow). (Spoiler: they almost always sink)
dan@upvote.au 4 months ago
This wasn’t bundled. People inserted a script tag pointing to a third-party CDN into their sites. The output changes depending on the browser (it only serves the polyfills needed for the current browser), so you can’t even use a subresource integrity hash.
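A small illustration of why subresource integrity can't help here (the bundle contents are made up for the example): an SRI `integrity` attribute pins one exact hash of the script bytes, so a CDN that serves different bytes to different browsers can never match a single pinned hash.

```python
import base64
import hashlib

def sri_hash(content: bytes) -> str:
    """Compute an SRI value (sha384, base64-encoded) over exact script bytes."""
    digest = hashlib.sha384(content).digest()
    return "sha384-" + base64.b64encode(digest).decode()

# Hypothetical per-browser responses from a polyfill CDN:
modern_browser_bundle = b"/* no polyfills needed */"
old_browser_bundle = b"/* Array.prototype.at polyfill ... */"

print(sri_hash(modern_browser_bundle))
print(sri_hash(old_browser_bundle))
# The two hashes differ, so a single integrity="sha384-..." attribute
# on the <script> tag would break the page for at least one browser.
```

That's the trade-off the comment is pointing at: the CDN's per-browser tailoring is exactly the feature that rules out hash pinning.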
parpol@programming.dev 4 months ago
9point6@lemmy.world 4 months ago
Flash was orders of magnitude worse than the risk JS poses today; it’s not even close.
Accessibility is orthogonal to JavaScript if the site is being built to modern standards.
Unfortunately, preference is not reality: the modern web uses JavaScript, and NoScript is not an effective enough solution.
homesweethomeMrL@lemmy.world 4 months ago
100% agree. A super-fast, text-only internet layer is approved.
btaf45@lemmy.world 4 months ago
And much of it works better and faster without JavaScript. Some sites don’t work in Noscript, but most sites run faster and work well enough.
PopOfAfrica@lemmy.world 4 months ago
I only allow JS on a whitelist.
9point6@lemmy.world 4 months ago
A whitelist wouldn’t mitigate this issue due to bundling
kautau@lemmy.world 4 months ago
[image]
dan@upvote.au 4 months ago
In this case the script wasn’t bundled at all - it was hotlinked from a third-party CDN.
PopOfAfrica@lemmy.world 4 months ago
Imo, computing, like all other things, requires a little trust and risk. The problem is most people are wayyy too trusting in general.