tomrittervg's comments

In this context, "a unique fingerprint" means that your fingerprint does not match any other user's. When you visit Sites A and B, you present a fingerprint X that is the same on A and B, but that no one else on the internet has ever sent.

In contrast, a randomized fingerprint means that when you visit A you have a fingerprint X' and on B you have a fingerprint Y'; no one else on the internet has X' or Y', but A and B can't correlate you.

The protections we've put in place first apply API normalization, so that more people share the fingerprint X and it isn't unique. Then they apply API randomization so you present X' and Y'.

If a fingerprinter goes to the extra effort of detecting a randomized fingerprint, and ignores (or removes) the randomization, they will get the X fingerprint, which - hopefully - matches many more users.
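To make the distinction concrete, here's a minimal sketch (not Firefox's actual implementation) of per-site randomization: the noise is keyed on a per-session secret and the visited site, so the same browser produces X' on A and Y' on B, and neither value correlates across sites.

    const crypto = require('crypto');

    // Hypothetical illustration only: mix a per-session secret and the visited
    // site into whatever the fingerprinting script reads before it is hashed.
    function randomizedFingerprint(apiReadout, site, sessionSecret) {
      return crypto.createHash('sha256')
        .update(apiReadout)
        .update(site)
        .update(sessionSecret)
        .digest('hex');
    }

    const sessionSecret = crypto.randomBytes(16);   // never leaves the browser
    const readout = 'canvas-pixels-or-other-API-output';

    const xPrime = randomizedFingerprint(readout, 'site-a.example', sessionSecret);
    const yPrime = randomizedFingerprint(readout, 'site-b.example', sessionSecret);
    // xPrime !== yPrime, so A and B can't correlate the visits by fingerprint.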


It's 'Suspected Fingerprinters' that controls the Fingerprinting Protection feature described in the blog post. But yes, naming and descriptions are hard and never seem to work.

But to disable it on a per-site basis, I would just disable ETP for the entire site. If it's a service or site you use frequently, you probably trust it, or otherwise have a login to it that makes trying to avoid fingerprinting illogical.


This is true, but adding sandboxing to browsers has been a huge part of driving up the difficulty/cost of browser exploits, and driving down the frequency of their use.

And we'll also pay for a bypass of the wasm sandbox. (Actually, looking at our table, I'm going to try and get the bounty amount upped...)


The vulnerability did require JavaScript to trigger.

I think it would be a labor of love and craftsmanship to exploit a content process today without using JavaScript.


> The vulnerability did require JavaScript to trigger.

Can you back this up with a citation?


He works (or recently worked) for Mozilla on security-related projects. The code commit fixing the issue was isolated to the /dom/ directory in the source tree, and Firefox does not support CSS Animation Timelines. The Animation Timelines code is not directly accessed by web devs, and it appears the only way to execute that code is via the JS API for Animation Timelines. I'm not a web security expert, but the signs seem to point to him being correct.
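For context, the animation timeline machinery is reached from web content through JavaScript entry points like these (an illustrative snippet of the Web Animations API, not the exploit path):

    // Script-only entry points into the animation/timeline code.
    const anim = document.body.animate(
      [{ opacity: 0 }, { opacity: 1 }],
      { duration: 1000 }
    );
    console.log(anim.timeline === document.timeline); // true by default
    console.log(document.timeline.currentTime);       // the document's default timeline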

Once again, JS proves to be a security risk.


This is precisely how I feel about Binary Transparency.


There are more details above, but the short version is that it is possible to build extensions for other browsers that work with Firefox Sync. But the only one linked seemed to be for Gnome's browser, so it may be that no one has actually done it.


From the RFC: "Its goal is to take some source of initial keying material and derive from it one or more cryptographically strong secret keys."

In our case, the initial keying material is the output of PBKDF, and the two outputs we derive are used as an encryption key and a bearer token (essentially a password, but I call it an authentication token to avoid confusion with your actual password). There are less complicated ways to do this, but this one is cryptographically conservative.

"essentially requires the server to be able to reverse HMAC-Hash to find the encryption key from the the authentication token" - the server can't do that; which is why the server can't figure out your encryption key from your authentication token. (The best the server could do would be to try a password guessing attack.)


Right, what I'm confused about is that first bit; my understanding from the RFC is that the implementation should look something like

    return pbkdf2.derive(password, email, PBKDF2_ROUNDS, STRETCHED_PASS_LENGTH_BYTES)
      .then((quickStretchedPW) => {
        result.quickStretchedPW = quickStretchedPW;
        // stretch to twice the length necessary
        return hkdf(quickStretchedPW, kw('generated'), HKDF_SALT, HKDF_LENGTH * 2)
          .then((generated) => {
            // split output into two cryptographically strong keys
            result.unwrapBkey = generated.slice(0, HKDF_LENGTH);
            result.authPW = generated.slice(HKDF_LENGTH);
          }
        );
      }
    )
but my read, in pseudocode, of what they end up doing is closer to this:

    hashed_password = hash(password, 'salt1')
    hashed_auth_tok = hash(hashed_password, 'salt2')
    hashed_unwrap_key = hash(hashed_password, 'salt3')
which seems secure because the server can't reverse hashed_unwrap_key to find hashed_password, and thus shouldn't be able to calculate hashed_auth_tok. However, the point of HKDF is to make multiple cryptographic keys, while it looks like in practice we are just using it as a one-way function.


Ah okay, I understand better.

The (second) pseudocode you have is right (the second two 'hash()' calls should be 'hkdf()', and the first should be 'pbkdf()').

The first is an alternate way to do it. But for cryptographic reasons that tend to be buried in formal proofs, you generally don't want to derive twice the key length you need and then split it into two keys. (Besides the necessity for formal proofs (as I understand it), it's just easier to make an indexing mistake and reuse key material. One also becomes more vulnerable to a collision attack, although that might not make sense in this context; it relates to the formal proofs.) I will note that sometimes - especially in embedded spaces - you'll see people taking this shortcut in the name of speed or code size.

Instead you want to fully derive two keys using separate HKDF calls with separate 'labels'. This provides strong domain separation for the keys.
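A minimal Node.js sketch of that pattern (illustrative parameters and label strings, not the exact FxA values), following the kw('...') labeling convention from the snippet above:

    const crypto = require('crypto');

    // Example inputs; real rounds, lengths, and labels come from the protocol spec.
    const password = 'correct horse battery staple';
    const email = 'user@example.com';

    // Stretch the password first (PBKDF2)...
    const stretchedPW = crypto.pbkdf2Sync(password, email, 1000, 32, 'sha256');

    function kw(name) {
      return 'identity.mozilla.com/picl/v1/' + name; // label namespace (illustrative)
    }

    // ...then derive each key with its own HKDF call, using a distinct 'info'
    // label for domain separation. crypto.hkdfSync(digest, ikm, salt, info, keylen)
    const authPW = Buffer.from(
      crypto.hkdfSync('sha256', stretchedPW, Buffer.alloc(0), kw('authPW'), 32));
    const unwrapBkey = Buffer.from(
      crypto.hkdfSync('sha256', stretchedPW, Buffer.alloc(0), kw('unwrapBkey'), 32));

    // authPW goes to the server as the bearer/authentication token;
    // unwrapBkey stays client-side to protect the encryption key.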

But I'm mostly trying to provide you with a pointer to what to read to convince yourself. I'd start at https://crypto.stackexchange.com/search?q=domain+separation

If you find out we're doing something that still seems weird though, please send me an email!


got it, thanks for the response!


I happen to also work on Firefox/Tor Browser's anti-fingerprinting efforts, so yea - we're trying to make improvements there too =)

Containers is a big Firefox feature (exposed through an Add-On) in this category too.

As far as Web Extension APIs go, I don't know much about that, but if you have an API that would enable a use case that Mozilla doesn't have a bug on and hasn't considered, you are welcome to file a bug explaining what you would like and what you would use it for, and the Web Extension team will consider it.


Thanks. Great to read that you work on anti-fingerprinting; I'd name this among the most important subjects today.

I don't code privacy-enhancing extensions myself (that feels like "inventing my own crypto" to me - not enough competence to be sure I won't actually make it worse), I've just noticed Firefox becomes significantly slower when I enable them, so I guess there probably are some sort of bottlenecks in the Web Extension APIs (or maybe not really).


I wasn't aware that any distribution (besides Tor Browser) was building Firefox (or anything really) reproducibly.

There's Debian's https://reproducible-builds.org/ effort, but I thought that wasn't making much progress lately, nor was it deployed.

Could you provide more info on what distro you're using, or how they're doing this?


S/he may be referring to Gentoo Linux.


It is many folks', and we appreciate the feedback. Hopefully things will get better soon: https://github.com/mozilla/multi-account-containers/issues/3...

