Elements of Information Theory, 2nd Edition (Wiley Series in Telecommunications and Signal Processing)
Purchase options and add-ons
The Second Edition of this fundamental textbook maintains the book's tradition of clear, thought-provoking instruction. Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory.
All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers. The historical notes that follow each chapter recap the main points.
The Second Edition features:
* Chapters reorganized to improve teaching
* 200 new problems
* New material on source coding, portfolio theory, and feedback capacity
* Updated references
Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications.
- ISBN-10: 0471241954
- ISBN-13: 978-0471241959
- Edition: 2nd
- Publisher: Wiley-Interscience
- Publication date: July 18, 2006
- Language: English
- Dimensions: 6.4 x 1.62 x 9.5 inches
- Print length: 784 pages
Editorial Reviews
Review
"This book is recommended reading, both as a textbook and as a reference." (Computing Reviews.com, December 28, 2006)
From the Inside Flap
THE LATEST EDITION OF THIS CLASSIC IS UPDATED WITH NEW PROBLEM SETS AND MATERIAL
The Second Edition of this fundamental textbook maintains the book's tradition of clear, thought-provoking instruction. Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory.
All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers. The historical notes that follow each chapter recap the main points.
The Second Edition features:
- Chapters reorganized to improve teaching
- 200 new problems
- New material on source coding, portfolio theory, and feedback capacity
- Updated references
Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications.
About the Author
THOMAS M. COVER, PHD, is Professor in the departments of electrical engineering and statistics, Stanford University. A recipient of the 1991 IEEE Claude E. Shannon Award, Dr. Cover is a past president of the IEEE Information Theory Society, a Fellow of the IEEE and the Institute of Mathematical Statistics, and a member of the National Academy of Engineering and the American Academy of Arts and Sciences. He has authored more than 100 technical papers and is coeditor of Open Problems in Communication and Computation.
JOY A. THOMAS, PHD, is the Chief Scientist at Stratify, Inc., a Silicon Valley start-up specializing in organizing unstructured information. After receiving his PhD at Stanford, Dr. Thomas spent more than nine years at the IBM T. J. Watson Research Center in Yorktown Heights, New York. Dr. Thomas is a recipient of the IEEE Charles LeGeyt Fortescue Fellowship.
Product details
- Publisher : Wiley-Interscience; 2nd edition (July 18, 2006)
- Language : English
- Hardcover : 784 pages
- ISBN-10 : 0471241954
- ISBN-13 : 978-0471241959
- Item Weight : 2.55 pounds
- Dimensions : 6.4 x 1.62 x 9.5 inches
- Best Sellers Rank: #244,376 in Books (See Top 100 in Books)
- #32 in Information Theory
- #303 in Electrical & Electronics (Books)
- #1,903 in Mathematics (Books)
Customer reviews
Customers say
Customers find the book provides an excellent introduction to information theory. They appreciate the clear writing and modern terminology. The book is suitable for self-study, with exercises to test understanding. Readers find it enjoyable and interesting to read.
AI-generated from the text of customer reviews
Customers find the book clear and accessible. They appreciate the modern terminology and authors' effort to make most of the material user-friendly. However, some reviewers mention typos.
"...It's also a very well written edition, and useful in many different fields." Read more
"I recommend this product to anyone studying Information Theory. It is very clear and uses modern nomenclature...." Read more
"Very good book with some minor issues. The authors do a great job of making most of the material accessible to a person with an understanding of..." Read more
"...I feel like the chapters on continuous channels are much tougher to understand and less intuitive than the chapters on discrete channels...." Read more
Customers find this book an excellent introduction to information theory. They say it is readable and suitable for self-study, with many exercises to test understanding of the material and concrete examples, and several call the authors the main reference in information theory. The book is described as a unique and ambitious introduction to a fascinating and complex topic, with topics that are easy to understand; one reader says it provided the groundwork for their physics PhD.
"...Thomas repeatedly introduced us to exciting and unexpected applications of Information Theory, always sending us to the journals for further, more in..." Read more
"...The writing is excellent, and most topics are easy to understand, although I have a few isolated quibbles about how certain topics are..." Read more
"One of the most readable info theory books out there." Read more
"This book has given me the groundwork for my physics PhD. It is a fertile ground for new ideas!..." Read more
Customers find the book enjoyable and interesting.
"...and Thomas have written a unique and ambitious introduction to a fascinating and complex subject; their book must be judged fairly and not compared..." Read more
"...has more concrete examples, and is in my opinion more fun and interesting (which says a lot, because this book is itself quite fun and interesting)..." Read more
"...El libro es muy bueno e interesante!!! Lo recomiendo 100 % !! ." Read more
Top reviews from the United States
- Reviewed in the United States on May 16, 2008
I am writing this review in response to some confusion and unfairness I see in other reviews. Cover and Thomas have written a unique and ambitious introduction to a fascinating and complex subject; their book must be judged fairly and not compared to other books that have entirely different goals.
Claude Shannon provided a working definition of "information" in his seminal 1948 paper, A Mathematical Theory of Communication. Shannon's interest in that and subsequent papers was the attainment of reliable communication in noisy channels. The definition of information that Shannon gave was perfectly fitted to this task; indeed, it is easily shown that in the context studied by Shannon, the only meaningful measure of information content that will apply to random variables with known distribution must be (up to a multiplicative constant) of the now-familiar form h(p) = log(1/p).
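A minimal Python sketch of this measure, assuming base-2 logarithms so that information is counted in bits; the function names are illustrative, not taken from the book:

```python
import math

def self_information(p: float) -> float:
    """Shannon self-information h(p) = log2(1/p) of an outcome with probability p, in bits."""
    return math.log2(1.0 / p)

def bernoulli_entropy(p: float) -> float:
    """Entropy H(p) of a Bernoulli(p) source: the average self-information, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return p * self_information(p) + (1 - p) * self_information(1 - p)

# A fair coin flip carries exactly one bit; a rarer event carries more.
print(self_information(0.5))       # 1.0
print(self_information(1 / 1024))  # 10.0
print(bernoulli_entropy(0.11))     # ~0.50
```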
However, Shannon freely admitted that his definition of information was limited in scope and was never envisioned as being universal. Shannon deliberately avoided the "murkier" aspects of human communication in framing his definitions; problematic themes such as knowledge, semantics, motivations and intentions of the sender and/or receiver, etc., were avoided altogether.
For several decades, Information Theory continued to exist as a subset of the theory of reliable communication. Some classical and highly regarded texts on the subject are Gallager, Ash, Viterbi and Omura, and McEliece. For those whose interest in Information Theory is motivated largely by questions from the field of digital communications, these texts remain unrivalled standards; Gallager, in particular, is so highly regarded by those who learned from it that it is still described as superior to many of its more recent, up-to-date successors.
In recent decades, Information Theory has been applied to problems from across a wide array of academic disciplines. Physicists have been forced to clarify the extent to which information is conserved in order to completely understand black hole dynamics; biologists have found extensive use of Information Theoretic concepts in understanding the human genome; computer scientists have applied Information Theory to complex issues in computational vs. descriptive complexity (the Mandelbrot set, which has been called the most complex set in all of mathematics, is actually extremely simple from the point of view of Kolmogorov complexity); and John von Neumann's brilliant creation, game theory, which has been called "a universal language for the unification of the behavioral sciences," is intimately coupled to Information Theory, perhaps in ways that have not yet been fully appreciated or explored.
Cover and Thomas' book "Elements of Information Theory" is written for the reader who is interested in these eclectic and exciting applications of Information Theory. This book does NOT treat Information Theory as a subset of reliable communication theory; therefore, the book is NOT written as a competitor for Gallager's classic text. Critics who ask for a more thorough treatment of rate distortion theory or convolutional codes are criticizing the authors for failing to include topics that are not even central to their goals for the text!
A very selective list of some of the more interesting topics that Cover and Thomas study includes: (1) the Asymptotic Equipartition Property and its consequences for data compression; (2) Information Theory and gambling; (3) Kolmogorov complexity and Chaitin's Omega; (4) Information Theory and statistics; and (5) Information Theory and the stock market. Item (4) on this list is only briefly introduced in Cover and Thomas's book, and appropriately so; however, readers who wish to pursue the fascinating subject of Fisher Information further should consider B. Roy Frieden's book Physics from Fisher Information: A Unification. Frieden identifies a principle of "extreme physical information" as a unifying theme across all of physics, deriving such classic equations as the Klein-Gordon equation, Maxwell's equations, and Einstein's field equations for general relativity from this information-theoretic principle.
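As a small numeric illustration of item (1), sketched here under the assumption of an i.i.d. Bernoulli source (this is not code from the book): the AEP says that the per-symbol log-probability of a long random sequence concentrates near the entropy, which is what makes compression to roughly H bits per symbol possible.

```python
import math
import random

def per_symbol_log_prob(p: float, n: int, seed: int = 0) -> float:
    """-(1/n) * log2 probability of one length-n i.i.d. Bernoulli(p) sequence.
    By the AEP this converges to the entropy H(p) as n grows."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = 1 if rng.random() < p else 0
        total += math.log2(p if x == 1 else 1.0 - p)
    return -total / n

p = 0.3
entropy = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
print(round(entropy, 3))                          # 0.881 bits/symbol
print(round(per_symbol_log_prob(p, 100_000), 3))  # close to 0.881 for large n
```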
This last point is quite typical of Cover and Thomas's book. I participated in a faculty seminar on Information Theory at my university a few years ago, in which we studied Cover and Thomas as our primary source. We were a diverse group, drawn from five different academic disciplines, and we all found that Cover and Thomas repeatedly introduced us to exciting and unexpected applications of Information Theory, always sending us to the journals for further, more in-depth study.
Cover and Thomas' book has become an established favorite in university courses on information theory. In truth, the book has few competitors. Interested readers looking for additional references might also consider David MacKay's book Information Theory, Inference, and Learning Algorithms, which has as a primary goal the use of information theory in the study of Neural Networks and learning algorithms. George Klir's book Uncertainty and Information considers many alternative measures of information/uncertainty, moving far beyond the classical log(1/p) measure of Shannon and the context in which it arose. Jan Kahre's iconoclastic book The Mathematical Theory of Information is an intriguing alternative in which the so-called Law of Diminishing Information is elevated to primary axiomatic status in deriving measures of information content. I alluded to some of the "murkier" issues of human communication earlier; readers who wish to study some of those issues will find Yehoshua Bar-Hillel's book Language and Information a useful source.
In conclusion, I highly recommend Cover and Thomas' book on Information Theory. It is currently unrivalled as a rigorous introduction to applications of Information Theory across the curriculum. As a person who used to work in the general area of signals analysis, I resist all comparisons of Cover and Thomas' book with the classic text of Gallager; the books have vastly different goals and very little overlap.
- Reviewed in the United States on April 21, 2008
I give this book five stars for its outstanding clarity, thoroughness, and choice of topics. The writing is excellent, and most topics are easy to understand, although I have a few isolated quibbles about how certain topics are presented.
I feel like the chapters on continuous channels are much tougher to understand and less intuitive than the chapters on discrete channels.
The exercises are very useful, but in my opinion, a bit too easy. There are lots of exercises at the end of each chapter, but there are very few that require deep thinking or creative insight. Most of the exercises are fairly routine. I think a few more involved ones would be welcome.
The one thing that is most lacking in this book is examples. The bulk of the text is made up of exposition of new ideas and proofs of theorems. While the exercises give lots of examples, I still feel that something is missing--especially in the chapters on continuous channels.
As a supplement, I would recommend "Information Theory, Inference, and Learning Algorithms" by MacKay. The two books are very different from each other and have less overlap than one might expect; I think everyone would do well to study both books. That book is much more suitable for self-study, has more concrete examples, and is in my opinion more fun and interesting (which says a lot, because this book is itself quite fun and interesting). It also has some more involved exercises. Also, it covers coding theory in more depth than this book (something that one might not realize from its name), and it integrates a Bayesian perspective into things more deeply.
- Reviewed in the United States on October 1, 2021
One of the most readable info theory books out there.
- Reviewed in the United States on January 10, 2015
This book has given me the groundwork for my physics PhD. It is a fertile ground for new ideas! I'm glad to have bought it and read it thoroughly.
It's also a very well written edition, and useful in many different fields.
- Reviewed in the United States on February 15, 2022
The book was already seriously damaged when I received it, but they would not accept it when I sent the book back to them. It is not my problem, yet they ignored it. This is very bad.
- Reviewed in the United States on May 4, 2012
The book came quickly, in two layers of bubble wrap and a layer of paper. Looking at the book, the only way you can tell it's used is that the outside of the book has the not-so-shiny, semi-smudged look of a used book. Excellent experience.
- Reviewed in the United States on December 10, 2011
They often use (or overload) notation without first defining it, leaving the reader to guess at what they mean. When this occurred within the text I was usually able to figure it out by deducing definitions based on the results obtained with them, but it added unnecessary difficulty to reading the text and negated any illustrative value the example would have provided. Worse, they also did this in the problems. I never once had to obtain help solving the problems, but frequently had to obtain clarification of what the problems were asking.
The proofs often skip steps and omit justifications for most steps, including keystone steps of the proof. I would not have been able to follow many of the proofs had it not been for additional details provided in lecture. The preface is clear that this is intentional.
Finally, despite being a second edition, there are a fair number of typos, especially in the problems.
While there is always a trade-off between rigor and accessibility when writing a textbook, each of the above failings cause the book to be both less rigorous and less approachable, greatly increasing the frustration of trying to learn the material.
I have not read any other information theory texts, so I can't compare against those, but compared to other technical books I have read during my undergrad and graduate studies, this was one of the least helpful.
- Reviewed in the United States on March 3, 2014
I recommend this product to anyone studying Information Theory. It is very clear and uses modern nomenclature. It also has many exercises to test your understanding of the material.
Top reviews from other countries
- Subrata nandi, reviewed in India on August 30, 2024
5.0 out of 5 stars Education
Good book
- SGWE, reviewed in Australia on December 26, 2024
5.0 out of 5 stars Quality
Good
- G. D. Conde, reviewed in the United Kingdom on March 6, 2017
1.0 out of 5 stars I am a bit disappointed because some of the expressions are not correct
I am a bit disappointed because some of the expressions are not correct. For example, in some fractions, the horizontal line is missing.
- Gary Bowers, reviewed in Canada on November 24, 2014
5.0 out of 5 stars Absolute Continuity
This book is the 2nd Edition of the most well known North American source on relative entropy in the discrete case. Relative entropy has become quite topical the past ten years because it can be used as a tool to determine the absolute continuity of probability measures. Elements of Information Theory by Cover and Thomas provides some standard proofs in the discrete case, for example for the convexity of relative entropy. The 1st Edition of this book was used by one of my supervisor's former students.
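A minimal Python sketch of the discrete relative entropy the reviewer refers to (the function name is illustrative, not the book's notation): D(P||Q) is finite only when P is absolutely continuous with respect to Q, i.e. when Q assigns positive probability wherever P does.

```python
import math

def relative_entropy(p, q):
    """Discrete relative entropy D(P||Q) = sum_x p(x) * log2(p(x)/q(x)), in bits."""
    d = 0.0
    for px, qx in zip(p, q):
        if px == 0.0:
            continue              # convention: 0 * log(0/q) = 0
        if qx == 0.0:
            return math.inf       # P puts mass where Q does not: not absolutely continuous
        d += px * math.log2(px / qx)
    return d

P = [0.5, 0.25, 0.25]
Q = [1/3, 1/3, 1/3]
print(relative_entropy(P, Q))                # > 0; equals 0 only when P == Q
print(relative_entropy(P, [0.5, 0.5, 0.0]))  # inf: support of P exceeds support of Q
```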
- Günther Itzelsberger, reviewed in Germany on March 28, 2014
4.0 out of 5 stars Information theory
Unfortunately, information theory has developed so strongly into pure theory that even I, who have lived with this theory my whole life, find the book difficult to read. Nevertheless, it shows how extensively this theory has developed.