Part of the bcrypt process includes a "work factor", which is, crudely, "how many times do I run this?" (In bcrypt the cost parameter is exponential: each increment doubles the work.) You can tune the work factor so that it takes a reasonably long time to hash a password (say, 0.05 sec on your webserver = 20 passwords/sec), which won't necessarily heavily impact the performance of your website, but which would be a significant impediment to anyone trying to brute-force those passwords.
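A minimal sketch of that tuning step, using Python's stdlib PBKDF2 as a stand-in for bcrypt (a different but similarly tunable KDF — with a real bcrypt library you'd adjust the cost parameter instead; the function and iteration counts here are illustrative assumptions):

```python
import hashlib
import os
import time

def calibrate_iterations(target_seconds=0.05, start=10_000):
    """Double the iteration count until one hash takes roughly
    target_seconds on THIS machine. With bcrypt you'd bump the
    cost parameter instead, since each +1 already doubles the work."""
    iterations = start
    salt = os.urandom(16)
    while True:
        t0 = time.perf_counter()
        hashlib.pbkdf2_hmac("sha256", b"benchmark-password", salt, iterations)
        elapsed = time.perf_counter() - t0
        if elapsed >= target_seconds:
            return iterations
        iterations *= 2

iterations = calibrate_iterations()
print(iterations)
```

Run this once on your production hardware (not your dev laptop) and bake the result into your configuration.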
As hardware improves, you just implement a system wherein when the user submits login information, you verify their password with your old work factor, and if it passes, re-hash with your new (slower) work factor and store the updated hash. This allows you to effectively use a progressively slower algorithm over the lifetime of your application to compensate for Moore's Law.
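The verify-then-rehash upgrade can be sketched as follows — again using stdlib PBKDF2 in place of bcrypt, with the iteration count stored alongside the salt and hash so old records stay verifiable (the storage format and counts here are assumptions, not a prescribed scheme):

```python
import hashlib
import hmac
import os

CURRENT_ITERATIONS = 200_000  # today's work factor; raise it as hardware improves

def hash_password(password: str, iterations: int = CURRENT_ITERATIONS) -> str:
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    # Store the work factor with the record so it can be checked later.
    return f"{iterations}${salt.hex()}${digest.hex()}"

def verify_and_upgrade(password: str, stored: str):
    """Return (ok, new_record). new_record is a re-hash at the current
    work factor when the stored record used a weaker one, else None."""
    iterations_s, salt_hex, digest_hex = stored.split("$")
    iterations = int(iterations_s)
    candidate = hashlib.pbkdf2_hmac(
        "sha256", password.encode(), bytes.fromhex(salt_hex), iterations
    )
    ok = hmac.compare_digest(candidate, bytes.fromhex(digest_hex))
    if ok and iterations < CURRENT_ITERATIONS:
        return True, hash_password(password)  # transparent upgrade on login
    return ok, None
```

On a successful login with an outdated record, your handler writes the returned `new_record` back to the database; users who never log in keep their old, weaker hashes, which is the main limitation of this approach.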
You obviously don't want to pick a work factor that's too high for your web server hardware, since that opens you up to DoS attacks, but a reasonable work factor easily mitigates the key weakness of MD5 and SHA1 for password storage - namely, that they can be computed by the hundreds of millions per second on the right hardware.