
It's equally easy for both. But people using browsers only do it a few times, while bots need to do it many times. A second for a human every X pages is not much, but it's a death knell for bots operating at scale (and they can't just store the cookies, because you can rate-limit per cookie).

Imagine scraping thousands of pages, but with an X>1 second wait for each. There would be no need for such a solution if crawlers rate-limited themselves, but they don't.
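
To make the asymmetry concrete (all figures below are illustrative assumptions, not measurements from any real deployment):

    # Back-of-the-envelope cost of a per-request challenge.
    pages = 100_000              # assumed size of one crawl
    seconds_per_challenge = 2    # assumed cost of solving one challenge
    human_sessions_per_day = 5   # assumed browsing sessions for one person

    print(f"crawler: {pages * seconds_per_challenge / 3600:.1f} CPU-hours per crawl")    # ~55.6
    print(f"human:   {human_sessions_per_day * seconds_per_challenge} seconds per day")  # 10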



So is the solution to stymying bots just to add a page load delay of a second or two? Enough that people won't care, but that doesn't scale for bots?


Just adding a delay wouldn't achieve anything, because bots can simply do something else while they wait, whereas PoW requires them to actively spend finite resources before they can continue doing whatever they want to do.
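
As a rough illustration of why that costs the bot something real, here's a minimal hashcash-style sketch (SHA-256 and the 20-bit difficulty are assumptions; real schemes differ in the details):

    import hashlib, os, time

    DIFFICULTY_BITS = 20                  # assumed difficulty; ~2^20 hashes on average
    TARGET = 1 << (256 - DIFFICULTY_BITS)

    def solve(challenge: bytes) -> int:
        """Client side: burn CPU until the hash falls below the target."""
        nonce = 0
        while True:
            digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
            if int.from_bytes(digest, "big") < TARGET:
                return nonce
            nonce += 1

    def verify(challenge: bytes, nonce: int) -> bool:
        """Server side: a single hash, so checking stays cheap."""
        digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        return int.from_bytes(digest, "big") < TARGET

    challenge = os.urandom(16)            # server issues a fresh random challenge
    start = time.time()
    nonce = solve(challenge)              # roughly a second or two of CPU in CPython
    print(f"solved in {time.time() - start:.2f}s, verified: {verify(challenge, nonce)}")

The asymmetry is the point: the client pays about half a million hashes on average, the server pays one to verify.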


So if you rate-limit to one request per second per cookie, a bot can just use 100 cookies to make 100 requests per second: one request per second per cookie.
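
For reference, a per-cookie limiter of the kind being described might look like the sketch below, assuming the cookie is the only key, which is exactly the weakness being pointed out:

    import time
    from collections import defaultdict

    RATE = 1.0       # tokens refilled per second, per cookie
    BURST = 1.0      # bucket capacity

    buckets = defaultdict(lambda: {"tokens": BURST, "last": time.monotonic()})

    def allow(cookie: str) -> bool:
        """Token bucket keyed on the cookie alone: 100 cookies buys 100 req/s."""
        b = buckets[cookie]
        now = time.monotonic()
        b["tokens"] = min(BURST, b["tokens"] + (now - b["last"]) * RATE)
        b["last"] = now
        if b["tokens"] >= 1.0:
            b["tokens"] -= 1.0
            return True
        return False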




