I think that's impossible to do with an extension; there are far too many holes to plug and it's a moving target. It has to be done in the browser itself. I believe GrapheneOS is trying to do that with Vanadium (their security- and privacy-hardened fork of Chromium for Android).
I don't think that will work, because the farbled data will also feed false information to the site's own legitimate logic.
You'll end up with broken layouts and needless quirks. Moreover, banks use fingerprinting to detect fraud, so you'll have a hard time on those sites as well.
By this logic you cannot use anything, because there's likely at least one website that uses its feature set.
Of course I wouldn't farble on my bank's website, that would be pretty stupid.
But by default I would want trackers to get the farbled data, and only allowlist the websites I trust. Same trust concept as with uBlock Origin, NoScript and others.
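To make that trust model concrete, here's a rough sketch of what a "farble by default, allowlist the rest" content script could look like. Everything in it is illustrative: the allowlist entries, the noise scheme, and the single canvas hook are my own assumptions, not Brave's or any extension's actual implementation (a real farbling engine hooks many more APIs and noises only the returned copy, not the canvas itself).

```typescript
// Hypothetical content script: trusted hosts get real canvas reads,
// everyone else gets slightly noised data on every read.
const ALLOWLIST = new Set(["mybank.example", "github.com"]); // made-up entries

if (!ALLOWLIST.has(location.hostname)) {
  const realToDataURL = HTMLCanvasElement.prototype.toDataURL;
  HTMLCanvasElement.prototype.toDataURL = function (
    this: HTMLCanvasElement,
    ...args: Parameters<HTMLCanvasElement["toDataURL"]>
  ): string {
    const ctx = this.getContext("2d");
    if (ctx && this.width > 0 && this.height > 0) {
      // Flip the low bit of a few random bytes so repeated reads differ,
      // which breaks stable canvas fingerprints without visibly wrecking
      // the image. (A real implementation would noise only the copy that
      // gets returned, not mutate the canvas itself.)
      const img = ctx.getImageData(0, 0, this.width, this.height);
      for (let i = 0; i < 16; i++) {
        const p = Math.floor(Math.random() * img.data.length);
        img.data[p] ^= 1;
      }
      ctx.putImageData(img, 0, 0);
    }
    return realToDataURL.apply(this, args);
  };
}
```

The catch is exactly the objection above: the page's own drawing code sees the noise too, so anything that round-trips pixel data will glitch unless the site is allowlisted.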
Old Opera (before it became just another Chromium shell) had an easy JS on/off toggle in the menu, but I don't remember whether it took effect immediately or only on the next page load.
I tried browsing for a while with the NoScript add-on, but barely any page loads, so you need to whitelist almost every page you visit, which defeats the purpose.
I have been thinking about some kind of render proxy that runs all the JS for you somewhere else in a sandbox and sends you the screenshot or rendered HTML instead. Or maybe we could leverage an LLM to turn the Bloated JS garbage into the actual information you are looking for.
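A minimal sketch of that render-proxy idea, assuming Node with `express` and `playwright` installed; the endpoint name and port are made up. The page's JS runs in a sandboxed headless browser on the proxy, and the client only ever receives the post-execution HTML:

```typescript
// Render proxy sketch: execute the page's JS remotely, return the result.
import express from "express";
import { chromium } from "playwright";

const app = express();

app.get("/render", async (req, res) => {
  const url = String(req.query.url ?? "");
  if (!/^https?:\/\//.test(url)) {
    res.status(400).send("expected ?url=https://...");
    return;
  }
  const browser = await chromium.launch(); // headless by default
  try {
    const page = await browser.newPage();
    // Wait until the page's scripts have settled before snapshotting.
    await page.goto(url, { waitUntil: "networkidle" });
    res.type("html").send(await page.content()); // post-JS DOM snapshot
  } finally {
    await browser.close();
  }
});

app.listen(8080, () => console.log("render proxy on :8080"));
```

Screenshot mode would just swap `page.content()` for `page.screenshot()`, and the LLM variant would pipe the rendered text through a summarizer instead. Launching a fresh browser per request is wasteful, but it keeps the sketch simple and each render isolated.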
> I tried browsing for a while with the NoScript add-on, but barely any page loads, so you need to whitelist almost every page you visit, which defeats the purpose.
Nah, this is just straight-up false. Many pages work fine with NoScript blocking all scripts. For those that don't, you usually only have to allowlist the root domain; you can still leave the other 32 domains they are importing blocked. It's actually surprisingly common for blocking JS to result in a better experience than leaving it enabled (e.g. no popups, no videos, no fade-ins and other stupid animations).
I won't argue if you think that is too much work, and I definitely wouldn't recommend it for a non-technical user, but it's not nearly as bad as you described.
I wasn't clear: this is about my own experience. Maybe you are in a different bubble, but with JS blocked I'm not able to book a hotel, browse GitHub, file my taxes, make a bank transfer, or even look up a restaurant's menu.
The only exceptions for me are HN and a handful of news websites.
uMatrix has a better interface. The problem is the same: one has to find the minimal set of scripts that doesn't break the core functionality of a site. It's a skill that can be trained, but it's the reason I don't install it on my friends' browsers. However, I have considered installing it, keeping it disabled, and using it as a tool to show how much stuff each site loads from so many different sources. Many domain names are very telling even to the uninitiated.
I'm still running NoScript, whitelisting only the page I'm on. That has the benefit of leaving the other domains the page tries to pull stuff from blocked, which 90% of the time is enough to get a working site that's way cleaner than with all the bullshit loaded.
Perhaps it could enable JS only when the user clicks something.