Hacker News

Part of the bcrypt process includes a "work factor", which is, crudely, "how many times do I run this?" What you can do is tune your work factor so that it takes a reasonably long time to hash a password (say, 0.05 sec on your webserver = 20 passwords/sec), which won't necessarily heavily impact the performance of your website, but which would be a significant impediment to anyone trying to brute-force those passwords.
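To make the tuning concrete: bcrypt's work factor is exponential, so each +1 to the cost doubles the rounds, and you can just benchmark upward until one hash takes your target time. bcrypt isn't in the Python standard library, so here's the same idea sketched with stdlib PBKDF2 as a stand-in (the iteration count plays the role of the work factor; the function name and 0.05 s target are just illustrative):

```python
import hashlib
import os
import time

def tune_iterations(target_seconds=0.05, start=10_000):
    """Double the PBKDF2 iteration count until a single hash takes at
    least target_seconds on this machine. bcrypt's cost parameter works
    the same way: each increment doubles the number of rounds."""
    salt = os.urandom(16)
    iterations = start
    while True:
        t0 = time.perf_counter()
        hashlib.pbkdf2_hmac("sha256", b"password", salt, iterations)
        elapsed = time.perf_counter() - t0
        if elapsed >= target_seconds:
            return iterations
        iterations *= 2

print(tune_iterations())
```

Run this on your actual production hardware, not your dev box: the number it settles on is machine-specific, which is exactly the point.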

As hardware improves, you implement a scheme where, when the user submits login information, you verify their password against the stored hash with its old work factor, and if it passes, re-hash with your new (higher) work factor and store the updated hash. This lets you make hashing progressively more expensive over the lifetime of your application to compensate for Moore's Law, without ever needing the plaintext outside of a normal login.
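The verify-then-upgrade step might look like the sketch below, again using stdlib PBKDF2 in place of bcrypt (bcrypt stores the cost inside the hash string itself; here it's kept as an explicit field, and the names and `CURRENT_ITERATIONS` value are assumptions for illustration):

```python
import hashlib
import hmac
import os

CURRENT_ITERATIONS = 200_000  # raise this over time as hardware improves

def hash_password(password, iterations=CURRENT_ITERATIONS):
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return {"salt": salt, "iterations": iterations, "digest": digest}

def verify_and_upgrade(password, record):
    """Check the password against the stored (possibly old) work factor;
    on success, transparently re-hash at the current one."""
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(),
                                 record["salt"], record["iterations"])
    if not hmac.compare_digest(digest, record["digest"]):
        return False, record
    if record["iterations"] < CURRENT_ITERATIONS:
        record = hash_password(password)  # upgrade while we have the plaintext
    return True, record
```

Note the constant-time comparison (`hmac.compare_digest`) for the check, and that the upgrade only happens on a successful login, since that's the only moment you have the plaintext.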

You obviously don't want to pick a work factor that's too high for your web server hardware, since that opens you up to DoS attacks, but a reasonable work factor can easily mitigate the key weakness of MD5 and SHA1 for password storage - namely, that they can be computed by the hundreds of millions per second on the right hardware.
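You can see the gap for yourself with a crude single-core benchmark (a sketch only: this measures Python on a CPU, and dedicated GPU/ASIC rigs hash MD5 orders of magnitude faster still, so the real-world gap is much wider than this shows):

```python
import hashlib
import time

def hashes_per_second(hash_once, seconds=0.2):
    """Count how many times hash_once() completes in the given window."""
    n = 0
    t0 = time.perf_counter()
    while time.perf_counter() - t0 < seconds:
        hash_once()
        n += 1
    return n / seconds

# Raw MD5 vs. a deliberately stretched hash (PBKDF2, 100k rounds)
fast = hashes_per_second(lambda: hashlib.md5(b"password").digest())
slow = hashes_per_second(
    lambda: hashlib.pbkdf2_hmac("sha256", b"password", b"salt", 100_000))
print(f"MD5: {fast:,.0f}/s  stretched: {slow:,.0f}/s")
```

Even in this toy setting the stretched hash is thousands of times slower per guess, which is the whole value proposition: negligible for one legitimate login, crippling for billions of brute-force attempts.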



Ok, got it. Thanks. Yes, much better. I was under the impression bcrypt was also a fixed cost operation.

If anyone else is wondering how to implement bcrypt with a cost parameter in PHP, see http://php.net/manual/en/function.crypt.php under CRYPT_BLOWFISH.


It's probably prudent for most developers to instead use phpass [0] rather than attempt to roll their own implementation.

[0] http://www.openwall.com/phpass/



