
> First of all, the goal is that a website operator would be able to control the use of information they disseminate to the general public via their website, such that it won't be used specifically for AI training.

This isn't the goal; the goal is to punish/demotivate poorly-behaved scrapers that hammer servers instead of moderating their scraping behaviour. At least a few of the organisations deploying Anubis are fine with having their data scraped and made part of an AI model.

They just don't like having their servers flooded with non-organic requests, because the people running the scrapers have enough resources that they don't have to care that they're externalising the costs of their malfeasance onto the rest of the internet.
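
To make the mechanism concrete: Anubis-style tools put a small proof-of-work challenge in front of the page, so each request costs the client some CPU before it gets content. Here's a rough sketch of that general idea in Go; the challenge string, difficulty value, and function names are made up for illustration and aren't taken from Anubis itself:

    // Minimal proof-of-work gate sketch: the server hands the client a
    // challenge, and the client must find a nonce whose SHA-256 hash of
    // (challenge + nonce) has enough leading zero bits before the page is
    // served. Cheap for one human visitor, expensive at scraper volume.
    package main

    import (
        "crypto/sha256"
        "encoding/binary"
        "fmt"
        "math/bits"
    )

    // leadingZeroBits counts the leading zero bits of a hash.
    func leadingZeroBits(sum [32]byte) int {
        n := 0
        for _, b := range sum {
            if b == 0 {
                n += 8
                continue
            }
            n += bits.LeadingZeros8(b)
            break
        }
        return n
    }

    // solve is the client's work: brute-force a nonce meeting the target.
    func solve(challenge string, difficulty int) uint64 {
        for nonce := uint64(0); ; nonce++ {
            var buf [8]byte
            binary.BigEndian.PutUint64(buf[:], nonce)
            sum := sha256.Sum256(append([]byte(challenge), buf[:]...))
            if leadingZeroBits(sum) >= difficulty {
                return nonce
            }
        }
    }

    // verify is the server's work: a single hash per request.
    func verify(challenge string, nonce uint64, difficulty int) bool {
        var buf [8]byte
        binary.BigEndian.PutUint64(buf[:], nonce)
        sum := sha256.Sum256(append([]byte(challenge), buf[:]...))
        return leadingZeroBits(sum) >= difficulty
    }

    func main() {
        challenge := "example-session-token" // would be random per visitor
        difficulty := 16                      // ~65k hashes on average
        nonce := solve(challenge, difficulty)
        fmt.Println("nonce:", nonce, "valid:", verify(challenge, nonce, difficulty))
    }

The asymmetry is the point: verification is one hash for the server, while solving scales with difficulty, so a scraper hammering thousands of pages pays thousands of times the cost a normal visitor does.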



Ah, thanks for the clarification. I guess its pulling double duty against all scraping in general isn't a flaw either, then.



