
Information Theory

[TOC]

Res

Related Topics

Measures (Measure Theory)

Learning Resources

Courses

🏫 Information and Entropy. This subject is designed for MIT freshmen.

Spring 2008 is the sixth offering of this subject. It was offered in Spring 2003, 2004, 2005, 2006, and 2007, and, before then, three times while being developed under another number, in Spring 2000, 2001, and 2002.

This subject is offered jointly by the Department of Electrical Engineering and Computer Science and the Department of Mechanical Engineering. Students may sign up for either 2.110J or 6.050J.

🎬 [MIT: Information and Entropy] https://www.bilibili.com/video/BV1D441177hA/?p=3&share_source=copy_web&vd_source=7740584ebdab35221363fc24d1582d9d

🏫 Information Theory. The purpose of this course is to present a concise, yet mathematically rigorous, introduction to the main pillars of information theory. It thus naturally focuses on the foundational concepts and indispensable results of the subject for single-user systems, where a single data source or message needs to be reliably processed and communicated over a noiseless or noisy point-to-point channel. The first part of the course comprises six core chapters with accompanying problems, emphasizing the key topics of information measures, lossless and lossy data compression, channel coding, and joint source-channel coding. Two appendices cover necessary and supplementary material in real analysis and in probability and stochastic processes. The second part of the course covers advanced topics concerning the information-theoretic limits of discrete-time single-user stochastic systems with arbitrary statistical memory (i.e., systems that are not necessarily stationary, ergodic, or information stable).
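The information measures mentioned above start from Shannon entropy, which also lower-bounds the average code length of any lossless code. A minimal sketch in Python (the function name and the Bernoulli example are illustrative, not from any of the courses listed here):

```python
import math
from collections import Counter

def shannon_entropy(data):
    """Shannon entropy H(X), in bits, of the empirical distribution of `data`."""
    counts = Counter(data)
    n = len(data)
    # H(X) = -sum_x p(x) * log2 p(x)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A biased binary source with P(0) = 0.9, P(1) = 0.1:
# H = -(0.9*log2(0.9) + 0.1*log2(0.1)) ≈ 0.469 bits/symbol,
# the lower bound on bits/symbol achievable by lossless compression.
p = 0.9
h = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
print(f"{h:.3f} bits/symbol")  # ≈ 0.469
```

A fair coin gives the maximum of 1 bit/symbol; the bias is exactly what a good compressor exploits.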

🎬 [Information Theory (Spring 2019), National Chiao Tung University, Po-Ning Chen] https://www.bilibili.com/video/BV14N41197bN/?p=2&share_source=copy_web&vd_source=7740584ebdab35221363fc24d1582d9d

🏫 Information Theory, Inference, and Learning Algorithms

An instant classic, covering everything from Shannon's fundamental theorems to the postmodern theory of LDPC codes. You'll want two copies of this astonishing book, one for the office and one for the fireside at home.

--- Bob McEliece, California Institute of Technology

For teachers: all the figures are available for download (as well as the whole book).

🎬 [Information Theory, Inference and Learning Algorithms (lectures)] https://www.bilibili.com/video/BV14b411G7wn/?share_source=copy_web&vd_source=7740584ebdab35221363fc24d1582d9d

🏫 Information Theory, Pattern Recognition, and Neural Networks

Books

Information Theory and Network Coding https://iest2.ie.cuhk.edu.hk/~whyeung/post/draft2.pdf

An undergraduate level course on probability is the only prerequisite for this book. For a non-technical introduction to information theory, we refer the reader to Encyclopedia Britannica.

For biographies of Claude Shannon, a legend of the 20th century who made fundamental contributions to the Information Age, we refer the reader to [53] and [307]. The latter is also a complete collection of Shannon's papers. [53] R. Calderbank and N. J. A. Sloane, "Obituary: Claude Shannon (1916-2001)," Nature, 410: 768, April 12, 2001. [307] N. J. A. Sloane and A. D. Wyner, Eds., Claude Elwood Shannon: Collected Papers, IEEE Press, New York, 1993.

Information, Physics, and Computation https://web.stanford.edu/~montanar/RESEARCH/book.html This is an introduction to a rich and rapidly evolving research field at the interface between statistical physics, theoretical computer science/discrete mathematics, and coding/information theory. It should be accessible to graduate students and researchers without specific training in any of these three fields.

Other Resources

📄 https://arxiv.org/pdf/2206.07867 A visual introduction to information theory Henry Pinkard, and Laura Waller University of California, Berkeley, CA, USA

https://web.stanford.edu/~montanar/RESEARCH/BOOK/partA.pdf Introduction to Information Theory, Andrea Montanari, Stanford University

Intro

[!quote] Information & AI and Machine Learning https://chatgpt.com/share/69712203-8028-800f-8e3f-4d56ba7e49bf

Ref