
So given that the only mitigation to Spectre is to isolate each website into its own underlying OS process, it seems very important to know which browsers are doing that.

Chrome is doing it. What about Firefox and Safari? What about Edge? Do they implement site isolation?



Firefox reduced the timer precision, so this demo does not work there: https://developer.mozilla.org/en-US/docs/Web/API/Performance...

On Chrome I have "too many false negatives" on 1st gen Ryzen.


I'm not able to get it to work in firefox either, but it feels like "reduced timer precision" shouldn't be sufficient to stop this particular attack. The authors of the demo here claim to be able to "amplify timing differences" by repeatedly hitting or missing cache and measuring the combined total time.

My searching indicates firefox's performance.now precision is 20us, but the demo claims to be able to get timing differences of 2ms pretty easily.

EDIT: 20us precision might be outdated... running this repeatedly on osx and linux, I'm distinctly getting my results "bucketed" into whatever the unit on the x-axis is. Based on that, I'm guessing timer precision on firefox is actually 1ms.
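A quick way to sanity-check the timer resolution directly, rather than eyeballing the graph, is to take many back-to-back performance.now() readings and keep the smallest nonzero delta. This is just a throwaway sketch you can paste into the console, not part of the demo:

    // Estimate the effective resolution of performance.now() by recording many
    // back-to-back readings and keeping the smallest nonzero delta observed.
    function timerResolution(samples = 100000) {
      let minDelta = Infinity;
      let last = performance.now();
      for (let i = 0; i < samples; i++) {
        const now = performance.now();
        const delta = now - last;
        if (delta > 0 && delta < minDelta) minDelta = delta;
        last = now;
      }
      return minDelta; // a result around 1 would fit the 1ms bucketing I'm seeing
    }
    console.log(timerResolution() + ' ms');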

Increasing the cache timer repetitions gets me to a smooth graph, but still no separation of the peaks.

The fact that I can get the graph to go from discrete buckets to smooth normal curves makes me think this can indeed overcome reduced timer precision.

The fact that I can't get the peaks to separate makes me wonder if some other mitigation is protecting me.
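To spell out the amplification idea as I understand it (a sketch only; the constants are numbers I picked, and this is not the PoC's actual code): a single cached vs. uncached access differs by far less than a 1ms timer can resolve, but walking a large set of cache lines that are either all hot or all cold multiplies that per-access difference into a total the coarse timer can see.

    // Sketch of timer amplification: touch many distinct cache lines per
    // measurement so the per-access hit/miss difference accumulates into a
    // total that even a ~1ms timer can distinguish.
    const LINE = 64;                      // typical cache line size in bytes (assumption)
    const LINES = 4096;                   // lines touched per measurement (arbitrary; raise it for more amplification)
    const probe = new Uint8Array(LINE * LINES);
    const evictor = new Uint8Array(32 * 1024 * 1024); // assumed larger than the last-level cache

    function touchProbe() {               // one access per cache line of probe
      let sink = 0;
      for (let i = 0; i < LINES; i++) sink += probe[i * LINE];
      return sink;
    }

    function measure(hot) {
      if (hot) touchProbe();              // warm the probe buffer
      else for (let i = 0; i < evictor.length; i += LINE) evictor[i]++; // stream a big buffer to evict it
      const start = performance.now();
      const sink = touchProbe();
      return { ms: performance.now() - start, sink }; // sink kept so the loop isn't optimized away
    }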


It is worth noting that this PoC was specifically targeted at Chromium-based browsers. To quote the blog post, they also developed "a PoC which leaks data at 60B/s using timers with a precision of 1ms or worse". So Firefox's protections are likely not sufficient to mitigate all Spectre attacks.


Chromium has site isolation (with some caveats around phones with limited resources), so both Chrome and Edge have it. Firefox is getting very close with Project Fission [1], and I predict they'll ship it relatively soon. Safari currently doesn't have site isolation, and AFAIK Apple has not publicly committed to anything in terms of getting there. They have done some work in this space (search around for Process Swap on Navigation, PSON), but it isn't complete.

[1]: https://wiki.mozilla.org/Project_Fission


Oye. So that means that in Safari, it's possible for any site to run some Spectre Javascript to read the cookies and passwords for any other site that happens to be running in the user's browser, and then log in as that user on other sites.

That's pretty bad.


The proof of concept is in Chrome, so it appears Chrome's mitigations are not sufficient.


No, the attack always works, whether there's an isolated process or not. In Chrome's design you shouldn't be able to access any data of value with the attack, that is, data from other sites (like cookies) or privileged data. I don't know whether that actually holds in Chrome, but that's why it was designed that way.


Chrome's design ensures that Spectre can only access resources that end up in an attacker-controlled process. And this [1] post on "Post-Spectre Web Development" goes into detail about how a given website can ensure that its resources don't end up in an attacker-controlled process. There are also a number of protections, like SameSite cookies and CORB, that protect some resources by default.

[1]: https://w3c.github.io/webappsec-post-spectre-webdev/
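For a concrete flavor of what that looks like in practice (a minimal sketch, not an excerpt from the draft; the header names are the real mechanisms it discusses, the server itself is just illustrative):

    // Minimal Node sketch of the kind of headers "Post-Spectre Web Development"
    // recommends so a site's responses don't end up readable in an attacker's process.
    const http = require('http');

    http.createServer((req, res) => {
      res.setHeader('Cross-Origin-Resource-Policy', 'same-origin'); // other origins can't embed this response
      res.setHeader('Cross-Origin-Opener-Policy', 'same-origin');   // sever window references from other origins
      res.setHeader('Cross-Origin-Embedder-Policy', 'require-corp');// only embed subresources that opt in
      res.setHeader('X-Content-Type-Options', 'nosniff');           // helps CORB block mis-typed cross-site reads
      res.setHeader('Set-Cookie', 'session=opaque; Secure; HttpOnly; SameSite=Lax'); // SameSite limits cross-site sends
      res.setHeader('Content-Type', 'text/html; charset=utf-8');
      res.end('<!doctype html><p>hello</p>');
    }).listen(8080);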


No, the PoC only shows the script leaking memory to JavaScript running within the same process, and thus the same site. Chrome is still preventing the info from leaking across sites.


The big caveat to this is that an attacker can generally get a browser to include a cross-site resource in their process. For example, `<img src="https://sensitive.com/myprofilepic.png">` will cause the image to be loaded in the attacker's process where they can then potentially steal it. The article "Post-Spectre Web Development" goes into details on how sites can defend against this (and other vectors).


That's why there was the recent W3C draft about Spectre and the policies around which sites can frame other sites.


I may be wrong about this, and about how exposed browsers specifically are to Spectre, but since protected memory can be accessed through the same mechanism, the only real mitigation here is disabling branch prediction and CPU caches, or barring those caches from being reused across threads or execution contexts.

Or completely redesigning those aspects of CPU behavior to remove the ability for similar timing attacks.


No. Process isolation still works, assuming your CPU is not broken. The real mitigation is that, going forward, you have to assume that any attacker-controlled code always has full read access to the entire process it runs in, and you need to architect your systems so that this does not result in anything bad. It is entirely possible to do this.

You can still have branch prediction and caches, which is a very good thing, because not having those would cut the performance of modern CPUs to less than 1% of what it currently is.


Or just not accepting and evaling arbitrary code from every single website the user visits, including ones that should only be static documents or forms.

That this is generally considered ok boggles the mind. That browser vendors have made it difficult for users to disable this is insane! Even MS Internet Explorer gave users at least that security tool!


The modern web doesn't work without JavaScript. It's as simple as that.

People like you who want to turn JS off are a very small niche. And you have better options: run your browser on a remote server over RDP or VNC. I think it may be equally safe, and you may actually have a larger chunk of the web working for you.


The "modern web" does not want to work without JS. It definitely could work without JS for the most part.


I don’t know about that. Google Maps (or any “slippy” map) couldn’t work. Instant messaging couldn’t work. Video chat wouldn’t work. Rich document editing wouldn’t work.

I get that some people would like that because they think all those things belong in native apps, but that’s not where the “modern web” is today.


If we didn't complain about Java applets, we still would have Java applets.

If we didn't complain about Flash, we still would have Flash applets.

Yesterday, however, there were people saying "that's how it is, deal with it", or "But, but... without it we can't have this and that", or "The ship has sailed, get over it".

Today, we have reasons to complain about JS. Yes, it enhances interactivity, but it is also abused. There's a lot of unwanted interactivity (the type of one-way interactivity that lets a server know more than necessary about its clients).


Note that I limited what I was talking about to static documents and forms. Executable code should be a permission the page has to explicitly request from the user like notifications and microphones.


Considering the increasing sandboxing and protections, I think we're getting closer to browsers in VMs already. Someday I can see a permission prompt to allow performance-sensitive sites lower-level access.


Chrome desktop does actually allow you to run with JS disabled and re-enable it per site with only a few clicks.


Not only that, but Google Search, Gmail, and other Google services will still work without JavaScript.


Not sure if it's that simple. Chrome (89.0.4389.90 64-bit) runs the tests but leaks memory for me. Firefox (86.0 64-bit) runs the tests but every one of the tests fails. Brave (1.5.112 64-bit) doesn't even run the tests. I'm on Linux (5.11.4-arch1-1).



