
Tens of thousands of scraper bots for a single site? Is that really the case? I would have assumed that maybe 3-5 bots send, let's say, 20 requests per second in parallel to scrape. Sure, they might eventually rotate through different IPs and bots as the old ones time out, but the end result is the same: all they'll realize is that they have to increase their timeouts and use headless browsers to cache results, and the entire protection is gone. That said, for big bot farms it will be a somewhat annoying cost increase. This should really be combined with the Cloudflare captcha to make it even more effective.


A lot of the worst offenders seem to be routing their traffic through residential botnets, which means the traffic really does come from a huge number of different origins. It's also really janky: the same resources often get fetched multiple times.

Saving and re-using the JWT cookie isn't that helpful, as you can effectively rate limit using the cookie as identity, so to reach the same request rates you see now they'd still need to solve hundreds or thousands of challenges per domain.
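To make "rate limit using the cookie as identity" concrete, here's a minimal Python sketch of a sliding-window limiter keyed on the challenge token. The cookie name, window, and limit are all assumptions for illustration, not any particular server's implementation:

  import time
  from collections import defaultdict, deque

  WINDOW_SECONDS = 60
  MAX_REQUESTS_PER_WINDOW = 30

  # Request timestamps per token value (a real deployment would use
  # Redis or similar shared storage, not in-process memory).
  _hits: dict[str, deque] = defaultdict(deque)

  def allow_request(jwt_cookie: str | None) -> bool:
      """Return True if this identity is under its rate limit."""
      if jwt_cookie is None:
          return False  # no solved challenge -> serve the challenge page
      now = time.monotonic()
      q = _hits[jwt_cookie]
      # Drop timestamps that have fallen outside the sliding window.
      while q and now - q[0] > WINDOW_SECONDS:
          q.popleft()
      if len(q) >= MAX_REQUESTS_PER_WINDOW:
          return False  # over limit: force the client to solve a fresh challenge
      q.append(now)
      return True

At the hypothetical 30 requests/minute per token above, sustaining 20 requests/second would require roughly 40 live tokens, i.e. 40 freshly solved challenges at any given time, which is the cost increase being described.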


If you're sending 20 requests per second from one IP address you'll hit rate limits quickly; that's why they're using botnets to DDoS these websites.



