## Elements of Information Theory

The latest edition of this classic is updated with new problem sets and material.

The Second Edition of this fundamental textbook maintains the book's tradition of clear, thought-provoking instruction. Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory.

All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers. The historical notes that follow each chapter recap the main points.

The Second Edition features:

* Chapters reorganized to improve teaching
* 200 new problems
* New material on source coding, portfolio theory, and feedback capacity
* Updated references

Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications.

An Instructor's Manual presenting detailed solutions to all the problems in the book is available from the Wiley editorial department.
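As a small taste of the book's central quantity, here is a minimal sketch (not from the book itself) of Shannon entropy, H(X) = -Σ p(x) log₂ p(x), the measure of uncertainty that opens the text:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum p * log2(p), in bits.

    Terms with p = 0 contribute nothing, by the convention 0 log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of uncertainty.
print(entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so its entropy is lower.
print(entropy([0.9, 0.1]))   # ≈ 0.469 bits
```

The function name and structure here are illustrative only; the book develops entropy, and the related quantities listed in the contents below, with full rigor.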

### What people are saying

User Review

excellent

User Review

This is a good book in general, but there are many places in this book that are not self-explanatory. The authors do not explain; they probably think it is obvious, but it might not seem as obvious to readers, even those with the right background.

### Contents

Asymptotic Equipartition Property

Entropy Rates of a Stochastic Process

Data Compression

Gambling and Data Compression

Channel Capacity

Maximum Entropy

Differential Entropy

Information Theory and Statistics

Universal Source Coding

Kolmogorov Complexity

Network Information Theory

Information Theory and Portfolio Theory

Inequalities in Information Theory

Bibliography

List of Symbols

