Why am I passionate about this?
My primary interest is in brain function. Because the principal job of the brain is to process information, it is necessary to define exactly what information is. For that, there is no substitute for Claude Shannon’s theory of information. This theory is not only remarkable in its own right, but it is also essential for telecoms, computers, and machine learning (and for understanding brain function). I have written ten "tutorial introduction" books, on topics ranging from quantum mechanics to AI. In a parallel universe, I am still an Associate Professor at the University of Sheffield, England.
James' book list on information theory
Why did James love this book?
This is a more comprehensive and mathematically rigorous book than Pierce’s. For the novice, it should be read only after first reading Pierce’s more informal text. Due to its vintage, the layout is fairly cramped, but the content is impeccable. At almost 500 pages, it covers a huge amount of material. This was my main reference book on information theory for many years, but it now sits alongside more recent texts, like MacKay’s book (see below). It is also published by Dover, so it is reasonably priced.
Written for an engineering audience, this book has a threefold purpose: (1) to present elements of modern probability theory — discrete, continuous, and stochastic; (2) to present elements of information theory with emphasis on its basic roots in probability theory; and (3) to present elements of coding theory.
The emphasis throughout the book is on such basic concepts as sets, the probability measure associated with sets, sample space, random variables, information measure, and capacity. These concepts proceed from set theory to probability theory and then to information and coding theories. No formal prerequisites are required other than the usual undergraduate…
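To give the novice a taste of what "information measure" means, here is a minimal sketch in Python of Shannon's entropy, the average information per observation. This is my own illustration, not taken from the book, and the coin-flip probabilities are just example values:

import math

def entropy(probs):
    """Entropy in bits, H(X) = -sum p(x) log2 p(x), of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin yields 1 bit per flip; a biased coin yields less.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # about 0.469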