The most recommended information theory books

Who picked these books? Meet our 12 experts.

12 authors have created book lists connected to information theory; here are their favorite information theory books.

Book cover of Who Wrote the Book of Life?: A History of the Genetic Code

David W. Ussery Author Of Computing for Comparative Microbial Genomics: Bioinformatics for Microbiologists

From my list on the history of heredity and DNA.

Why am I passionate about this?

I love to hear stories about how people solve problems, and I have been curious about how science works since I was 12 years old. A decade later, when I was 22 years old, some of my friends joked that I "spoke DNA," and it’s true that I have been obsessed with trying to understand the physical structures of DNA for more than four decades now. I live my life vicariously through my students and help them to learn to tinker, troubleshoot, and recover from their failures.

David's book list on the history of heredity and DNA

David W. Ussery Why did David love this book?

I love the way that this book puts solving the genetic code in the context of the development of the "information age" in the 1950s.

This book explains the origins of the popular (but wrong) idea that DNA is the "book of life" and some sort of advanced information system. I call this a ‘one-dimensional’ view of life (as opposed to a four-dimensional view, which takes into account not just sequence but secondary and tertiary structures and how they change with time).

I use this book in my introductory bioinformatics lectures to help get across the "new" concept that, contrary to popular cultural beliefs, DNA is not a language, nor is it a "sophisticated computer code". Biology is messy and complicated and very much contextual.

By Lily E. Kay,

Why should I read it?

1 author picked Who Wrote the Book of Life? as one of their favorite books, and they share why you should read it.

What is this book about?

This is a detailed history of one of the most important and dramatic episodes in modern science, recounted from the novel vantage point of the dawn of the information age and its impact on representations of nature, heredity, and society. Drawing on archives, published sources, and interviews, the author situates work on the genetic code (1953-70) within the history of life science, the rise of communication technosciences (cybernetics, information theory, and computers), the intersection of molecular biology with cryptanalysis and linguistics, and the social history of postwar Europe and the United States.

Kay draws out the historical specificity in the…


Book cover of A Mind at Play: How Claude Shannon Invented the Information Age

Rob Conery Author Of The Imposter's Handbook: A CS Primer for Self-taught Developers

From my list on self-taught programmers.

Why am I passionate about this?

I taught myself to code back in 1994 while working the graveyard shift as a geologist in the environmental industry. My job consisted of sitting in a chair during the dark hours of the night in a shopping center in Stockton, CA, watching another geologist take samples from wells in the parking lot. A friend of mine suggested I learn to code because I liked computers. I don’t mean to make this out to be an “it’s so simple anyone can do it!” story. You need to have a relentless drive to learn, which is why I wrote my book, The Imposter’s Handbook, as an active step to learning what I didn’t know I didn’t know.

Rob's book list on self-taught programmers

Rob Conery Why did Rob love this book?

You’ve heard of Einstein, Turing, Newton, and Hawking - but do you know who Claude Shannon is? Would you be surprised if I told you that he’s probably done more for our current way of life than all of the others combined? It’s true, and it’s unbelievable.

Claude Shannon was a quiet, quirky man who had what you might call The Most Genius Move of the last forever years: he took an obscure discipline of mathematics (Boolean Algebra) and applied it to electrical circuits, creating the digital circuit in the process. If you’ve ever wondered how 1s and 0s are turned into if statements and for loops - well, here you go.
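That mapping from Boolean algebra to circuits is easy to see in miniature. The half-adder below is my own toy sketch, not something from the book: two pure Boolean operations, XOR and AND, are enough to add a pair of bits.

```python
# A half-adder built from Boolean operations -- an illustrative sketch
# (not from the book) of Shannon's insight that Boolean algebra
# describes switching circuits.

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two bits: XOR yields the sum bit, AND yields the carry bit."""
    return a ^ b, a & b

# Exhaustive truth table: the circuit's behavior is pure Boolean algebra.
for a in (0, 1):
    for b in (0, 1):
        s, carry = half_adder(a, b)
        print(f"{a} + {b} = sum {s}, carry {carry}")
```

Chain half-adders together and you get full binary arithmetic, which is exactly the bridge from Boolean algebra to working digital circuits.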

Oh, but that’s just the beginning. Dr. Shannon took things much further when he described how these 1s and 0s could be transmitted from point A to point B without loss of data. This was a big problem…
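To make that transmission problem concrete, here is the crudest fix imaginable: a 3x repetition code with majority-vote decoding. This is my own toy sketch, not Shannon's construction; his theorems showed that vastly more efficient codes exist.

```python
import random

# Toy repetition code: a crude way to send bits from point A to point B
# reliably over a noisy channel. Illustrative sketch only -- Shannon's
# actual results concern far more efficient codes.

def encode(bits, r=3):
    """Repeat each bit r times."""
    return [copy for b in bits for copy in [b] * r]

def noisy_channel(bits, p=0.05, rng=random):
    """Flip each bit independently with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

def decode(bits, r=3):
    """Majority vote over each block of r received bits."""
    return [int(sum(bits[i:i + r]) > r // 2) for i in range(0, len(bits), r)]

message = [1, 0, 1, 1, 0]
received = noisy_channel(encode(message))
print(decode(received))  # usually equals message, despite random bit flips
```

Each block of three tolerates one flipped bit, but at a 3x cost in bandwidth; Shannon proved that far better trade-offs between rate and reliability are achievable.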

By Jimmy Soni, Rob Goodman,

Why should I read it?

1 author picked A Mind at Play as one of their favorite books, and they share why you should read it.

What is this book about?

Winner of the Neumann Prize for the History of Mathematics

**Named a best book of the year by Bloomberg and Nature**

**'Best of 2017' by The Morning Sun**

"We owe Claude Shannon a lot, and Soni & Goodman’s book takes a big first step in paying that debt." —San Francisco Review of Books

"Soni and Goodman are at their best when they invoke the wonder an idea can instill. They summon the right level of awe while stopping short of hyperbole." —Financial Times

"Jimmy Soni and Rob Goodman make a convincing case for their subtitle while reminding us that Shannon…


Book cover of The Information: A History, a Theory, a Flood

Michael L. Littman Author Of Code to Joy: Why Everyone Should Learn a Little Programming

From my list on computing and why it’s important and interesting.

Why am I passionate about this?

Saying just the right words in just the right way can cause a box of electronics to behave however you want it to behave… that’s an idea that has captivated me ever since I first played around with a computer at Radio Shack back in 1979. I’m always on the lookout for compelling ways to convey the topic to people who are open-minded, but maybe turned off by things that are overly technical. I teach computer science and study artificial intelligence as a way of expanding what we can get computers to do on our behalf.

Michael's book list on computing and why it’s important and interesting

Michael L. Littman Why did Michael love this book?

This remarkably thorough and well-researched book gives a sense of the historical sweep of the ideas that underpin the digital revolution. These are topics that I know really well, but the book added texture and nuance, and I found myself reading it with eyes wide open and jaw slightly slack.

Gleick is a great storyteller, and he has dug into the topics and their implications so well that I felt like I had a front-row seat to the invention of Morse code, "memes", and the theory of information itself. Quite an accomplishment!

By James Gleick,

Why should I read it?

4 authors picked The Information as one of their favorite books, and they share why you should read it.

What is this book about?

Winner of the Royal Society Winton Prize for Science Books 2012, the world's leading prize for popular science writing.

We live in the information age. But every era of history has had its own information revolution: the invention of writing, the composition of dictionaries, the creation of the charts that made navigation possible, the discovery of the electronic signal, the cracking of the genetic code.

In 'The Information' James Gleick tells the story of how human beings use, transmit and keep what they know. From African talking drums to Wikipedia, from Morse code to the 'bit', it is a fascinating…


Book cover of The Mathematical Theory of Communication

Chris Conlan Author Of Algorithmic Trading with Python: Quantitative Methods and Strategy Development

From my list on mathematics for quant finance.

Why am I passionate about this?

I am a financial data scientist. I think it is important that data scientists are highly specialized if they want to be effective in their careers. I run a business called Conlan Scientific out of Charlotte, NC, where my team of financial data scientists and I tackle complicated machine learning problems for our clients. Quant trading is a gladiator’s arena of financial data science. Anyone can try it, but few succeed at it. I am sharing my top five list of math books that are essential to success in this field. I hope you enjoy them.

Chris' book list on mathematics for quant finance

Chris Conlan Why did Chris love this book?

While studying communication systems, Claude Shannon did something pretty impressive. He reformulated the majority of classical statistics from scratch using the language and concepts of computer science.

Statistical noise? There’s a new word for that: entropy. And it turns out entropy is a good thing, not a bad thing, because it is equal to the information content of a data set. Tired of minimizing the squared error of everything? That’s fine; minimize the negative log-likelihood instead. It does the same thing. This book challenges the assumptions of classical statistics in a way that fits neatly in the mind of a computer scientist. As a quant trader, this book will help you understand and measure the information content of data, which is critical to your success.
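The entropy idea is simple enough to compute in a few lines. The sketch below is my own illustration, not code from the book: entropy measures, in bits per symbol, how unpredictable a data set is.

```python
import math
from collections import Counter

# Shannon entropy of a symbol sequence, in bits per symbol -- an
# illustrative sketch of "entropy as information content".

def entropy(data):
    """H = sum over symbols of p * (-log2 p), using empirical frequencies."""
    counts = Counter(data)
    n = len(data)
    return sum((c / n) * -math.log2(c / n) for c in counts.values())

print(entropy("aaaaaaaa"))  # 0.0 -- a constant source carries no information
print(entropy("abababab"))  # 1.0 -- a fair binary source: one bit per symbol
print(entropy("abcdefgh"))  # 3.0 -- eight equally likely symbols
```

The same quantity reappears in the log-likelihood view: a model that assigns your data higher likelihood needs fewer bits to encode it.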

By Claude E. Shannon, Warren Weaver,

Why should I read it?

2 authors picked The Mathematical Theory of Communication as one of their favorite books, and they share why you should read it.

What is this book about?

Scientific knowledge grows at a phenomenal pace--but few books have had as lasting an impact or played as important a role in our modern world as The Mathematical Theory of Communication, published originally as a paper on communication theory more than fifty years ago. Republished in book form shortly thereafter, it has since gone through four hardcover and sixteen paperback printings. It is a revolutionary work, astounding in its foresight and contemporaneity. The University of Illinois Press is pleased and honored to issue this commemorative reprinting of a classic.


Book cover of The User Illusion: Cutting Consciousness Down to Size

Brian J. McVeigh Author Of The 'Other' Psychology of Julian Jaynes: Ancient Languages, Sacred Visions, and Forgotten Mentalities

From my list on the bicameral mind, mentality, and consciousness.

Why am I passionate about this?

I have always been fascinated by how the human mind adapts, both individually and through history. Julian Jaynes, who taught me while pursuing my PhD in anthropology from Princeton University, provided me with a theoretical framework to explore how the personal and cultural configure each other. Jaynes inspired me to publish on psychotherapeutics, the history of Japanese psychology, linguistics, education, nationalism, the origin of religion, the Bible, ancient Egypt, popular culture, and changing definitions of self, time, and space. My interests have taken me to China and Japan, where I lived for many years. I taught at the University of Arizona and currently work as a licensed mental health counselor. 

Brian's book list on the bicameral mind, mentality, and consciousness

Brian J. McVeigh Why did Brian love this book?

Supported by a wide range of examples drawn from various disciplines, this book demonstrates how we are only conscious of a small amount of what our hidden psychological machinery manufactures nonconsciously.

This work provides a key perspective needed to appreciate Julian Jaynes’s theory of consciousness and, thus, his ideas on bicameral mentality.

By Tor Norretranders,

Why should I read it?

2 authors picked The User Illusion as one of their favorite books, and they share why you should read it.

What is this book about?

As John Casti wrote, "Finally, a book that really does explain consciousness." This groundbreaking work by Denmark's leading science writer draws on psychology, evolutionary biology, information theory, and other disciplines to argue its revolutionary point: that consciousness represents only an infinitesimal fraction of our ability to process information. Although we are unaware of it, our brains sift through and discard billions of pieces of data in order to allow us to understand the world around us. In fact, most of what we call thought is actually the unconscious discarding of information. What our consciousness rejects constitutes the most valuable part…


Book cover of An Introduction to Information Theory

James V. Stone Author Of Information Theory: A Tutorial Introduction

From my list on information theory.

Why am I passionate about this?

My primary interest is in brain function. Because the principal job of the brain is to process information, it is necessary to define exactly what information is. For that, there is no substitute for Claude Shannon’s theory of information. This theory is not only quite remarkable in its own right, but it is essential for telecoms, computers, machine learning (and understanding brain function). I have written ten "tutorial introduction" books, on topics which vary from quantum mechanics to AI. In a parallel universe, I am still an Associate Professor at the University of Sheffield, England.

James' book list on information theory

James V. Stone Why did James love this book?

This is a more comprehensive and mathematically rigorous book than Pierce’s. For the novice, it should be read only after first reading Pierce’s more informal text. Due to its vintage, the layout is fairly cramped, but the content is impeccable. At almost 500 pages, it covers a huge amount of material. This was my main reference book on information theory for many years, but it now sits alongside more recent texts, like MacKay’s book (see below). It is also published by Dover, so it is reasonably priced.

By Fazlollah M. Reza,

Why should I read it?

1 author picked An Introduction to Information Theory as one of their favorite books, and they share why you should read it.

What is this book about?

Written for an engineering audience, this book has a threefold purpose: (1) to present elements of modern probability theory — discrete, continuous, and stochastic; (2) to present elements of information theory with emphasis on its basic roots in probability theory; and (3) to present elements of coding theory.
The emphasis throughout the book is on such basic concepts as sets, the probability measure associated with sets, sample space, random variables, information measure, and capacity. These concepts proceed from set theory to probability theory and then to information and coding theories. No formal prerequisites are required other than the usual undergraduate…


Book cover of Introduction to Information Theory: Symbols, Signals and Noise

James V. Stone Author Of Information Theory: A Tutorial Introduction

From my list on information theory.

James' book list on information theory

James V. Stone Why did James love this book?

Pierce was a contemporary of Claude Shannon (the inventor of information theory), so he learned information theory shortly after it was published in 1949. Pierce writes in an informal style but does not flinch from presenting the fundamental theorems of information theory. Some would say his style is too wordy, and the word-to-equation ratio is certainly very high. Nevertheless, this book provides a solid introduction to information theory. It was originally published in 1961, so it is a little dated in terms of topics covered. However, because it was republished by Dover in 1981, it is also fairly cheap. Overall, this is a sensible first book to read on information theory.

By John R. Pierce,

Why should I read it?

1 author picked Introduction to Information Theory as one of their favorite books, and they share why you should read it.

What is this book about?

"Uncommonly good...the most satisfying discussion to be found." — Scientific American.
Behind the familiar surfaces of the telephone, radio, and television lies a sophisticated and intriguing body of knowledge known as information theory. This is the theory that has permitted the rapid development of all sorts of communication, from color television to the clear transmission of photographs from the vicinity of Jupiter. Even more revolutionary progress is expected in the future.
To give a solid introduction to this burgeoning field, J. R. Pierce has revised his well-received 1961 study of information theory for a second edition. Beginning with the origins…


Book cover of Spikes: Exploring the Neural Code

Mark Humphries Author Of The Spike: An Epic Journey Through the Brain in 2.1 Seconds

From my list on how brains actually work.

Why am I passionate about this?

I’m a British neuroscientist and writer who’s been using computers to study the brain since 1998, and writing about it since 2016. How I ended up a neuroscientist is hard to explain, for my formative years were spent devouring science books that were not about the brain. That’s partly because finding worthwhile books about the brain is so hard – few delve into how the brain actually works, into the kinds of meaty details that, for example, Hawking offered us on physics and Dawkins on evolution. So I wrote one to solve that problem; and the books on my list are just that too: deep, insightful works on how the brain does what it does.

Mark's book list on how brains actually work

Mark Humphries Why did Mark love this book?

A magnificent synthesis of Bialek and colleagues’ research into how spikes from neurons send information. A strong contender for the most readable serious science book ever published. Even if you only understand a quarter of it (as I did on first reading, as a math-shy grad student), the sheer quantity of ideas and the flow of the prose are mind-blowing. As essential a read now as it was in 1997; these ideas have not dated one bit.

By Fred Rieke, David Warland, Rob de Ruyter van Steveninck, William Bialek,

Why should I read it?

1 author picked Spikes as one of their favorite books, and they share why you should read it.

What is this book about?

What does it mean to say that a certain set of spikes is the right answer to a computational problem? In what sense does a spike train convey information about the sensory world? Spikes begins by providing precise formulations of these and related questions about the representation of sensory signals in neural spike trains. The answers to these questions are then pursued in experiments on sensory neurons. Intended for neurobiologists with an interest in mathematical analysis of neural data as well as the growing number of physicists and mathematicians interested in information processing by "real" nervous systems, Spikes provides a…


Book cover of Information Theory, Inference and Learning Algorithms

Simon J.D. Prince Author Of Understanding Deep Learning

From my list on machine learning and deep neural networks.

Why am I passionate about this?

I started my career in neuroscience. I wanted to understand brains. That is still proving difficult, and somewhere along the way, I realized my real motivation was to build things, and I wound up working in AI. I love the elegance of mathematical models of the world. Even the simplest machine learning model has complex implications, and exploring them is a joy.

Simon's book list on machine learning and deep neural networks

Simon J.D. Prince Why did Simon love this book?

The best parts of this book really represent a gold standard in pedagogical clarity.

Although it’s now twenty years old, there is still much to learn from this rather unconventional book that covers the boundary between machine learning, information theory, and Bayesian methods. There are also odd tangents and curiosities, some of which work better than others but are never dull.

Just writing this review makes me want to go back to it and squeeze more out of it.

By David JC MacKay,

Why should I read it?

2 authors picked Information Theory, Inference and Learning Algorithms as one of their favorite books, and they share why you should read it.

What is this book about?

Information theory and inference, taught together in this exciting textbook, lie at the heart of many important areas of modern technology - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics and cryptography. The book introduces theory in tandem with applications. Information theory is taught alongside practical communication systems such as arithmetic coding for data compression and sparse-graph codes for error-correction. Inference techniques, including message-passing algorithms, Monte Carlo methods and variational approximations, are developed alongside applications to clustering, convolutional codes, independent component analysis, and neural networks. Uniquely, the book covers state-of-the-art error-correcting codes, including low-density-parity-check codes, turbo…


Book cover of Elements of Information Theory

James V. Stone Author Of Information Theory: A Tutorial Introduction

From my list on information theory.

James' book list on information theory

James V. Stone Why did James love this book?

This is the modern standard text on information theory. It is both comprehensive and highly technical. The layout is spacious, and the authors make good use of the occasional diagram to explain geometric aspects of information theory. One feature I really like is the set of historical notes and the summary of equations at the end of each chapter.

By Thomas M. Cover, Joy A. Thomas,

Why should I read it?

1 author picked Elements of Information Theory as one of their favorite books, and they share why you should read it.

What is this book about?

The latest edition of this classic is updated with new problem sets and material The Second Edition of this fundamental textbook maintains the book's tradition of clear, thought-provoking instruction. Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory. All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers. The…