
I think robots.txt should be ignored. Everyone wants people not to do things they don't like, and we don't have to honor every such request. The future is IPFS or something like it, so "crawling" will become a meaningless act.

