Hacker News

> Finger, on the other hand, would be a very narrow API for a certain service without ANY of the flexibility of HTTP.

That is the point. The flexibility is not free. Every conditional doubles the number of possible execution flows, and that brings complexity. To some extent this is mitigated by economies of scale: since everyone now uses HTTP for something, collectively we get that more complex code more polished. But there is no such thing as bug-free code, so every participant has to deal with the patch cycle and with generally preventing bitrot.
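The "every conditional doubles the flows" point can be made concrete with a quick sketch: n independent branches admit 2^n distinct paths, so path count grows exponentially with branchy flexibility.

```python
from itertools import product

def count_paths(n_conditionals: int) -> int:
    # Each independent if/else is a binary choice, so n independent
    # conditionals yield 2**n distinct execution flows to test.
    return sum(1 for _ in product([True, False], repeat=n_conditionals))

print(count_paths(1), count_paths(5), count_paths(10))  # 2 32 1024
```

Ten independent conditionals already mean over a thousand paths, which is why trimming flexibility out of a protocol buys real testability.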

For a small, well-bounded custom protocol that solves a well-defined specific use case, one can hope to write a dependency-free implementation that can be tested, works well enough, and can be left alone.
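Finger itself is a good example of how small such an implementation can be: per RFC 1288 a client opens a TCP connection (port 79 in real use), sends the query line terminated by CRLF, and reads until EOF. Here is a minimal dependency-free sketch, paired with a toy in-process server on an ephemeral localhost port purely for demonstration; the user data is made up.

```python
import socket
import threading

def finger_query(host: str, user: str, port: int = 79) -> str:
    """Minimal finger client (RFC 1288): connect, send the query
    line terminated by CRLF, read the reply until EOF."""
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(user.encode("ascii") + b"\r\n")
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("ascii", errors="replace")

def _toy_finger_server(server_sock: socket.socket, users: dict) -> None:
    """One-shot toy server: answers a single query and closes."""
    conn, _ = server_sock.accept()
    with conn:
        query = conn.makefile("rb").readline().strip().decode("ascii")
        reply = users.get(query, f"finger: {query}: no such user")
        conn.sendall(reply.encode("ascii"))

# Demo on an ephemeral localhost port (real finger uses port 79).
srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
port = srv.getsockname()[1]
threading.Thread(target=_toy_finger_server,
                 args=(srv, {"alice": "Login: alice  Plan: ship it"}),
                 daemon=True).start()
result = finger_query("127.0.0.1", "alice", port)
print(result)
```

The whole wire protocol fits in one function with no branching beyond the read loop, which is exactly the "test it once and leave it alone" property being argued for.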

I was recently at an event with a few thousand Wi-Fi devices.

About a third of the internet traffic was updates.



No disagreement here, but this was about flexibility, not an absolute judgement of how far-reaching a protocol must be.

I think I like the idea of a "spec" inside the same "protocol" more. For example, if you understand HTTP you can quickly reason about the spec of any REST API with JSON payloads without caring about the HTTP wrapper layer, just as you don't care about the TCP underneath it.
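The layering can be seen by composing a raw request by hand: the HTTP envelope is completely generic, and only the JSON body is specific to the API. A small sketch (the host `api.example.com` and the `/users` path are hypothetical):

```python
import json

def build_request(method: str, path: str, body=None,
                  host: str = "api.example.com") -> bytes:
    """Compose a raw HTTP/1.1 request to show the layering:
    generic HTTP envelope outside, API-specific JSON payload inside."""
    payload = json.dumps(body).encode() if body is not None else b""
    lines = [
        f"{method} {path} HTTP/1.1",
        f"Host: {host}",
        "Content-Type: application/json",
        f"Content-Length: {len(payload)}",
        "Connection: close",
        "",  # blank line separates headers from body
        "",
    ]
    return "\r\n".join(lines).encode() + payload

req = build_request("POST", "/users", {"name": "alice"})
print(req.decode())
```

Everything above the payload is the reusable knowledge; swap the JSON body and path and the same reasoning applies to any other REST API.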


tptacek’s original comment amounted to “the finger protocol is strictly worse than HTTP”, which reads as quite an absolute judgement.

Yeah, reusability and layering of engineering knowledge are useful - they make dealing with complexity easier.

But it also makes it easier to build complexity without spending time thinking about simpler solutions. Because time to market.

And thus we have exhibits at https://mobile.twitter.com/internetofshit



