This course focuses on source coding and channel coding, the two central subjects of information theory. Source coding leads to data compression, whereas channel coding realizes reliable data transmission. First, the basics of probability theory are reviewed, and measures of the amount of information, such as entropy and mutual information, are defined. Then, two important relations are introduced: one between the limit of data compression and the entropy, and the other between the limit of the rate of reliable data transmission and the maximum value of the mutual information. Concrete channel-coding methods and the fundamentals of encryption theory are also discussed in this course.
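As a small illustration of the quantities introduced above, the Shannon entropy of a discrete distribution can be computed directly. The sketch below is our own minimal example (the function name `entropy` is not part of the course material):

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit per toss
print(entropy([0.9, 0.1]))   # biased coin is more predictable: about 0.469 bits
```

The more predictable the source, the lower its entropy, and (by the source coding theorem covered later) the further it can be compressed.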
At the end of this course, students will be able to:
1) model information sources and channels as abstract mathematical objects,
2) explain the relation between the limit of data compression and the entropy,
3) explain the relation between the limit of the rate of reliable data transmission and the maximum value of the mutual information,
4) explain basic methods of source coding, channel coding, and encryption.
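To illustrate outcome 3, consider the binary symmetric channel with crossover probability p: its capacity, i.e. the maximum of the mutual information over input distributions, has the closed form C = 1 - H(p), where H is the binary entropy function. A sketch under that assumption (function names are our own):

```python
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity C = 1 - H(p) of a binary symmetric channel:
    the maximum of the mutual information, attained by a uniform input."""
    return 1.0 - h2(p)

print(bsc_capacity(0.0))   # noiseless channel: 1 bit per channel use
print(bsc_capacity(0.5))   # output independent of input: 0 bits
```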
Source coding, channel coding, entropy, source coding theorem, channel coding theorem, encryption
✔ Specialist skills | Intercultural skills | Communication skills | Critical thinking skills | Practical and/or problem-solving skills |
Students are provided with exercise problems on the material explained in each class.
Class | Course schedule | Required learning
---|---|---
Class 1 | Introduction: source coding and channel coding | Understand the course objectives.
Class 2 | Representation of information and basics of probability theory | Review the basics of probability theory.
Class 3 | Entropy, mutual information, divergence | Understand the concepts covered in the class.
Class 4 | Model of information sources and entropy rate: stationary information source, Markov information source, entropy rate | Understand the concepts covered in the class.
Class 5 | Kraft's inequality, source coding theorem (converse part) | Understand the converse part of the source coding theorem. |
Class 6 | The limit of average code length, Huffman coding | Understand the limit of average code length and Huffman coding.
Class 7 | Source coding theorem (direct part) | Understand the direct part of the source coding theorem. |
Class 8 | The midterm examination and the review of the first half | Review the first half of this course. |
Class 9 | Model of channels | Understand the model of channels.
Class 10 | Channel capacity | Explain channel capacity. |
Class 11 | Channel coding theorem | Understand the direct part of the channel coding theorem. |
Class 12 | Error correcting code: Hamming code | Understand error correcting codes and the mechanism of the Hamming code.
Class 13 | Encryption theory: basics and symmetric-key encryption | Understand the model of encryption.
Class 14 | Encryption theory: public-key encryption, summary of this course | Understand the model of public-key encryption.
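As a concrete companion to Class 12, the Hamming (7,4) code encodes 4 data bits into 7 bits and can correct any single bit error. The sketch below is our own minimal implementation (not necessarily the presentation used in class), with parity bits at positions 1, 2 and 4:

```python
def hamming74_encode(d):
    """Encode 4 data bits into a 7-bit Hamming codeword
    (positions 1..7, parity bits at positions 1, 2 and 4)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct at most one bit error and return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]      # checks positions 1, 3, 5, 7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]      # checks positions 2, 3, 6, 7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]      # checks positions 4, 5, 6, 7
    pos = s1 + 2 * s2 + 4 * s3          # syndrome = position of the error, 0 = none
    if pos:
        c[pos - 1] ^= 1                 # flip the erroneous bit
    return [c[2], c[4], c[5], c[6]]

word = [1, 0, 1, 1]
code = hamming74_encode(word)
code[5] ^= 1                            # inject a single bit error
assert hamming74_decode(code) == word   # the error is corrected
```

The syndrome directly encodes the error position in binary, which is why the parity bits sit at the power-of-two positions.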
To enhance effective learning, students are encouraged to spend approximately 100 minutes preparing for class and another 100 minutes reviewing class content afterwards (including assignments) for each class.
They should do so by referring to textbooks and other course material.
All materials used in class can be found on OCW-i, or provided during class.
Tomohiko Uematsu, An Illustrated Guide to Viewpoint of Information Theory, Ohmsha, 2010 (Japanese)
Kohichi Sakaniwa, Kenta Kasai, Introduction to Communication Theory, CORONA PUBLISHING CO., LTD, 2014 (Japanese)
Student achievements are assessed based on the midterm exam (50%) and the final exam (50%).
Note that if lectures are held online, the above exams may be replaced by reports, and the course schedule will be changed accordingly.
none required