It's equally easy for both. But people using browsers only do it a few times, while bots need to do it many times over. A second of waiting every X pages is negligible for a human, but it's a death knell for bots operating at scale (and they can't just hoard the cookies, because then you can rate-limit them per cookie).
Imagine scraping thousands of pages, but with an X>1 second wait for each. There would be no need for such a solution if crawlers rate-limited themselves, but they don't.
Just adding a delay wouldn't achieve anything, because bots can simply do something else while they wait, whereas PoW forces them to actively spend finite compute before they can continue.
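For the curious, here's roughly what that "active spend" looks like: a minimal hashcash-style sketch (SHA-256 with a leading-zero-bits target; the function names and difficulty value are illustrative, not any particular tool's API). The asymmetry is the whole point: solve() has to grind through hashes and can't be slept through or parallelized away, while verify() costs the server a single hash.

```python
import hashlib
import os
import time

DIFFICULTY_BITS = 20  # ~1M hashes on average; illustrative, tune per deployment


def has_leading_zero_bits(digest: bytes, bits: int) -> bool:
    """Check whether the digest starts with at least `bits` zero bits."""
    value = int.from_bytes(digest, "big")
    return value >> (len(digest) * 8 - bits) == 0


def solve(challenge: bytes, bits: int = DIFFICULTY_BITS) -> int:
    """Client side: burn CPU until a nonce satisfies the target."""
    nonce = 0
    while True:
        digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        if has_leading_zero_bits(digest, bits):
            return nonce
        nonce += 1


def verify(challenge: bytes, nonce: int, bits: int = DIFFICULTY_BITS) -> bool:
    """Server side: one hash to check, regardless of how costly the solve was."""
    digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
    return has_leading_zero_bits(digest, bits)


if __name__ == "__main__":
    challenge = os.urandom(16)  # the server issues a fresh random challenge
    start = time.monotonic()
    nonce = solve(challenge)
    print(f"solved in {time.monotonic() - start:.2f}s, nonce={nonce}")
    assert verify(challenge, nonce)
```

Unlike a sleep(1), a scraper can't run ten thousand of these concurrently for free: each solve pins real CPU time, so the per-page cost scales with the number of pages fetched.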