
I guess it depends on what world you live in. For example, using ASP.NET Core, I just drop in this https://learn.microsoft.com/en-us/aspnet/core/performance/ra... and boom, I have rate limiting and I don't have to stress about threads or state or whatever.


ASP.NET Core truly is a joy to work with


That's rate limiting locally, i.e. single-instance. It's a fairly trivial thing and isn't the topic here.


While that is true, at that point you should be rate limiting at the reverse proxy or load balancer.

- Nginx https://blog.nginx.org/blog/rate-limiting-nginx

- Caddy https://github.com/mholt/caddy-ratelimit

- Traefik https://doc.traefik.io/traefik/middlewares/http/ratelimit/
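For the Nginx case, a minimal sketch of the `limit_req` approach described at the link above (the zone name, rate, and `backend` upstream are illustrative, not from the original comment):

```nginx
# In the http {} context: a shared-memory zone keyed by client IP, 10 req/s.
limit_req_zone $binary_remote_addr zone=per_ip:10m rate=10r/s;

server {
    listen 80;
    location /api/ {
        # Apply the limit; absorb short bursts of up to 20 extra requests.
        limit_req zone=per_ip burst=20 nodelay;
        proxy_pass http://backend;
    }
}
```

Note the shared-memory zone is still local to one Nginx instance; if you run multiple proxies, their counters diverge just like app-level limiters do.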


That's almost certainly per server instance, though; there's no mention of any kind of synchronization across multiple instances. So if you, e.g., run many small instances or run the service as a Lambda, I'd be surprised if it worked the way you expected.
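To illustrate the point (a hypothetical sketch; the class and numbers are made up): two independent in-memory fixed-window limiters each admit their own full quota, so a client whose requests are spread across both instances gets double the intended rate.

```python
import time


class FixedWindowLimiter:
    """In-memory fixed-window limiter; state is local to one instance."""

    def __init__(self, limit: int, window_seconds: float = 1.0):
        self.limit = limit
        self.window = window_seconds
        self.counts = {}  # (client, window_index) -> request count

    def allow(self, client: str) -> bool:
        window_index = int(time.time() // self.window)
        key = (client, window_index)
        self.counts[key] = self.counts.get(key, 0) + 1
        return self.counts[key] <= self.limit


# Two app instances behind a load balancer, each with its own limiter state.
instance_a = FixedWindowLimiter(limit=5)
instance_b = FixedWindowLimiter(limit=5)

# A client's 10 requests round-robin across both instances: all 10 pass,
# even though the intended global limit was 5 per window.
allowed = sum(
    (instance_a if i % 2 == 0 else instance_b).allow("client-1")
    for i in range(10)
)
print(allowed)  # 10 with per-instance state; 5 if the state were shared
```

The same applies to any limiter whose counters live in process memory, which is exactly what the linked middleware and proxy modules do by default.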


At that point you should be rate limiting at the reverse proxy or load balancer.

- Nginx https://blog.nginx.org/blog/rate-limiting-nginx

- Caddy https://github.com/mholt/caddy-ratelimit

- Traefik https://doc.traefik.io/traefik/middlewares/http/ratelimit/

IMO Lambda is kind of an unfair example, because the author doesn't mention having multiple instances. Plus, a hot take of mine: you should not be building an entire web app as a Lambda or a series of Lambda functions... AWS does not have a solution for this kind of synchronized rate limiting in things like APIG, so you would have to architect it yourself via DynamoDB or ElastiCache, which is the "extra layer or two of overhead" the author mentioned.



