Information Theory: From Coding to Learning

Hardback (30 Sep 2024)

  • $90.40

Publisher's Synopsis

This enthusiastic introduction to the fundamentals of information theory builds from classical Shannon theory through to modern applications in statistical learning, equipping students with a well-rounded and rigorous foundation for further study. It introduces core topics such as data compression, channel coding, and rate-distortion theory using a unique finite-blocklength approach. With over 210 end-of-part exercises and numerous examples, it connects these fundamentals to contemporary applications in statistics, machine learning, and modern communication theory. The textbook presents information-theoretic methods with applications in statistical learning and computer science, such as f-divergences, PAC-Bayes and the variational principle, Kolmogorov's metric entropy, strong data processing inequalities, and entropic upper bounds for statistical estimation. Accompanied by a solutions manual for instructors and additional standalone chapters on more specialized topics in information theory, this is the ideal introductory textbook for senior undergraduate and graduate students in electrical engineering, statistics, and computer science.

Book information

ISBN: 9781108832908
Publisher: Cambridge University Press
Imprint: Cambridge University Press
Pub date: 30 Sep 2024
Language: English
Number of pages: 550