I’ve previously written notes about an information theory betting “paradox” and Kelly’s optimal betting fraction.
Information theory formalizes the duality between information transmission and probability. Many interesting results come from thinking of probability in terms of information, such as using Kelly betting to determine the maximum profit obtainable from private information about a stock, or to correctly price an option.
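The Kelly connection is concrete enough to sketch. For a binary bet paying net odds b with win probability p (and q = 1 − p), the optimal fraction of bankroll is f* = (bp − q)/b, and at even odds the resulting growth rate of log-wealth equals log 2 − H(p): your edge is exactly the entropy shortfall of the biased coin. Here is a minimal Python sketch of that arithmetic (the function names are my own, not from any of the references below):

```python
import math

def kelly_fraction(p, b):
    """Optimal bankroll fraction for a binary bet with win
    probability p and net fractional odds b (win b per 1 staked)."""
    q = 1.0 - p
    return (b * p - q) / b

def log_growth(f, p, b):
    """Expected per-bet growth rate of log-wealth when betting fraction f."""
    q = 1.0 - p
    return p * math.log(1.0 + f * b) + q * math.log(1.0 - f)

# Example: even odds (b = 1) on a coin you privately know is 60% heads.
p, b = 0.6, 1.0
f = kelly_fraction(p, b)   # 0.2 -> stake 20% of the bankroll
g = log_growth(f, p, b)    # ~0.0201 nats of growth per bet

# At even odds, g = log(2) - H(p): the profit rate from the private
# information is the coin's entropy shortfall.
entropy = -(p * math.log(p) + (1.0 - p) * math.log(1.0 - p))
print(f, g, math.log(2.0) - entropy)  # the last two numbers agree
```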
The following are the three best introductory references I’ve found for someone who has just started learning information theory. (One other, Cover and Thomas’s Elements of Information Theory, is also good but not free except at a library; I have the solutions manual in electronic format if anyone would like it.)
1) Shannon’s original 1948 paper, “A Mathematical Theory of Communication.” Pages 3-7 or so are the good bit if you don’t like math: they explain the information rate of English (also good if you do like math, of course).
2) Information Theory, Inference, and Learning Algorithms by David MacKay. I really like how this one is written, with exercises embedded in the text to work as you go; this is exactly how I teach myself things when I’m reading independently. For anyone interested in machine learning, this will be a fascinating book. Start from page 1. (This book is also available online from the author’s site.)
3) Lecture notes: “An Introduction to Information Theory and Entropy” by Tom Carter. This is the best of the three for anyone reading about information theory for the first time. It’s intuitively explained and goes relatively slowly; it’s the first one I would recommend to a friend.
Please feel free to leave a comment, ask a question, or email me about anything, even unrelated to the above.