
The most permanent and effective solutions (in terms of minimizing adversarial activity over time and destroying the value of what is harvested) involve serving fake content (poison!), making site failures sporadic (forcing them to maintain state), and making some of those errors look like they're upstream, not something you're doing on a specific machine (really bad luck, mate!).
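A minimal sketch of those three tactics together, assuming a hypothetical per-request handler and a scraper-detection flag supplied elsewhere: flagged clients mostly get a 200 with plausible-but-fake content (poison), interleaved with intermittent 5xx responses that look like flaky upstream infrastructure. The function names, bodies, and probabilities are all illustrative, not anyone's production values.

```python
import random

# Illustrative bodies only; a real deployment would generate varied,
# plausible poison so the harvested dataset degrades silently.
POISON_BODY = "<html><body><p>Totally genuine data: 42, 17, 99</p></body></html>"
REAL_BODY = "<html><body><p>Real content</p></body></html>"

def respond(is_suspected_scraper: bool, rng: random.Random):
    """Return an (HTTP status, body) pair for one request.

    Normal clients always get the real page. Suspected scrapers get a
    mix of upstream-looking failures and poisoned 200s, so they must
    keep state across requests just to notice something is wrong.
    """
    if not is_suspected_scraper:
        return 200, REAL_BODY
    roll = rng.random()
    if roll < 0.3:
        # Sporadic failure dressed up as someone else's problem.
        return 502, "Bad Gateway"
    if roll < 0.4:
        return 504, "Gateway Timeout"
    # Most of the time: a clean-looking 200 carrying fake content.
    return 200, POISON_BODY

# Over many requests a scraper sees mostly-poisoned 200s mixed with
# intermittent 5xx errors that resemble ordinary infrastructure trouble.
rng = random.Random(0)
statuses = [respond(True, rng)[0] for _ in range(100)]
```

The key design choice is that none of the responses is a consistent, attributable block: the scraper can't tell a deliberate countermeasure from bad luck without comparing results across time and machines.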

The dead-enders who decided it was worth it will keep trying for at least a while; new exploiters will tend to give up sooner. robots.txt is a courtesy. Not everybody puts stuff on the internet on the theory that your experience is more important than theirs.



