This is the opposite of most PDF articles on HN. Many are titled with something like "for beginners" or "for everyone", and then I open a LaTeX-formatted document that is impenetrable from the first page, using math symbols I have never seen before. This one is much easier to understand and not loaded up with mathematical symbol vomit all over the page.
For another perspective on "the study of information in the social world", check out "Variational ecology and the physics of sentient systems" (https://www.sciencedirect.com/science/article/pii/S157106451...). Also based on information theory, and released just a few months after the OP's paper.
I'm pretty confused about how this all ties together. Are these deep dives into examples of information theory? Would love some perspective from someone who knows more about the topic.
It's an introduction to the fundamental concepts of information theory (entropy, KL divergence, conditional entropy), using 20 questions as an ongoing source of example applications and interpretations of these concepts. The author also sprinkles in various interpretations coming from their own research focus (the social sciences).
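As a rough illustration of that connection (my own sketch, not from the article): entropy can be read as the average number of yes/no questions an optimal 20-questions player needs to pin down the answer.

```python
import math

def entropy(p):
    """Shannon entropy in bits: the average number of yes/no
    questions an optimal questioner needs to identify the outcome."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# 16 equally likely objects: a good 20-questions player halves the
# candidates with each question, so 4 questions suffice.
uniform = [1 / 16] * 16
print(entropy(uniform))  # 4.0

# A skewed distribution is easier to guess, so entropy drops:
# ask about the likely objects first and you usually finish early.
skewed = [0.5, 0.25, 0.125, 0.125]
print(entropy(skewed))  # 1.75
```

The second case is why smart questioning beats brute force: the optimal strategy asks questions whose answers are maximally uncertain, which is exactly what halving the remaining probability mass does.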
Definitely not a deep dive. The details can probably be found by chasing down the citations.
Why do they use the word "release"? "Release of entropy" suddenly drops into the picture when talking about ice turning to water. Why not just say the info/entropy/uncertainty of the system is increasing or decreasing?
This subject is new to me but I like the 20 questions analogy.
Thinking of the probability distribution over N words as a measure of the information in someone's mind is interesting.
Disciplined minds that have certain concepts at their fingertips have clearly spent time and effort producing that distribution.
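A toy sketch of that idea (my own, with made-up numbers): a practiced mind's distribution over the same concepts is sharper, so its entropy is lower, and lower entropy means less residual uncertainty to resolve.

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# Hypothetical distributions over the same four concepts.
novice = [0.25, 0.25, 0.25, 0.25]  # maximal uncertainty for 4 outcomes
expert = [0.7, 0.2, 0.05, 0.05]    # concentrated by practice

print(entropy(novice))  # 2.0 bits
print(entropy(expert))  # lower: the sharper distribution carries less uncertainty
```

The uniform case is the worst possible, since entropy is maximized (log2 of the number of outcomes) when every option is equally likely.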
True, it does seem a bit weird to use "release" in reference to entropy. I've googled around a bit and can't find a similar phrasing anywhere. A lot of places talk about "energy" being released into the surroundings, and this article talks about using the entropy of the surroundings to predict whether or not ice will melt: http://www.4college.co.uk/a/O/entsurr.php .
You know the book series "Computers for Dummies"? It's a play on that. See also Zytrax's "X for Rocket Scientists", which covers complicated stuff like LDAP and DNS and explains why they are the way they are.
I didn't even make the "for dummies" connection! The name is a huge fail, but it looks like the author meant well. Perhaps someone should tell him that it comes across as intellectual elitism, exclusionary to people who feel a little insecure reading material on the internet that is associated with fancy universities and graduate schools.
I thought it was ironic, because it starts out almost like a children's story and is very approachable. I'd rather read this than a textbook, that's for sure.
Ironically for me, the humanities people that it was aimed at often took it amiss. Dan Dennett told me as much. throwawayjava, thank you for the suggestion; I'll use it in a footnote in the next iteration!
Yes, I thought this article was a very well-written introduction, but I held back on sharing the link with some non-math friends because I didn't want them to feel dumb if they found it difficult to get through.
It's hard to imagine anyone improving on it.
http://math.harvard.edu/~ctm/home/text/others/shannon/entrop...