Mathematical physicist extraordinaire John Baez digs into Shannon entropy and coding over at Azimuth:
So, I want to understand Shannon’s theorems and their proofs—especially because they clarify the relation between information and entropy, two concepts I’d like to be an expert on. It’s sort of embarrassing that I don’t already know this stuff! But I thought I’d post some preliminary remarks anyway, in case you too are trying to learn this stuff, or in case you can help me.
I’d like to be an expert on these concepts too… even though my math skills, while not non-existent, are pathetic compared to Baez’s. As someone who applies thermodynamic models to problems in gene regulation, I’m very interested in the deep relationship between entropy, information, and computation – particularly when it comes to understanding how regulatory information is encoded in and read out from the genome.
So I’m boning up on the subject by reading Khinchin’s great classic, Mathematical Foundations of Information Theory. In the future I’ll share some findings here, but in the meantime, follow the link to Azimuth and read Baez’s great discussion of the Noisy Channel Coding Theorem.
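For a discrete probability distribution, the Shannon entropy at the heart of these theorems is simple to compute: H = −Σ p log₂ p, measured in bits. A minimal sketch (the function name is my own):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)),
    skipping zero-probability outcomes (0 * log 0 is taken as 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: exactly 1 bit per flip.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so each flip carries less information.
print(shannon_entropy([0.9, 0.1]))   # ~0.469 bits

# A certain outcome carries no information at all.
print(shannon_entropy([1.0]))        # 0.0
```

The same quantity, applied to base frequencies at each position of a binding site, is what makes entropy useful for asking how much regulatory information a genomic sequence can encode.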