Elements of Information Theory

The latest edition of this classic is updated with new problem sets and material. The Second Edition of this fundamental textbook maintains the book's tradition of clear, thought-provoking instruction. Readers are once again provided with an instructive mix of mathematics, physics, statistics, and information theory. All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors give readers a solid understanding of the underlying theory and its applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers, and the historical notes that follow each chapter recap the main points.

The Second Edition features:
* Chapters reorganized to improve teaching
* 200 new problems
* New material on source coding, portfolio theory, and feedback capacity
* Updated references

Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications.
What people are saying
User Review
excellent
User Review
This is a good book in general, but there are many places in this book that are not so self-explanatory. The authors do not explain them; they probably think they are very obvious, but they may not seem obvious to readers, even those with the relevant background.
Contents
Multiple-Access Channels | 558 |
Inequalities in Information Theory | 657 |
Bibliography | 689 |
List of Symbols | 723 |