Maybe the teams developing AI crawlers are dogfooding and using the AI itself (with its small context) to keep track of the sites they've already scraped. /s
