Information Theory Core

Posted By: ELK1nG

Published 9/2025
MP4 | Video: h264, 1920x1080 | Audio: AAC, 44.1 KHz
Language: English | Size: 122.60 MB | Duration: 0h 34m

Master Entropy and Mutual Information

What you'll learn

Understand and differentiate modes of convergence - including convergence everywhere, almost everywhere, in mean square (MS), and in probability.

Apply the law of large numbers - both Bernoulli's law and Borel's strong law - to analyze the long-term behavior of random sequences.

Explain and compute information-theoretic measures - entropy, joint entropy, and conditional entropy - and apply the chain rule in problem-solving contexts.

Work with theoretical tools and proofs - making use of lemmas, worked examples, and illustrative problems to strengthen mathematical reasoning.
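As a small taste of the computations promised above, here is a minimal Python sketch (the joint distribution is an invented example, not from the course) that computes entropy, joint entropy, and conditional entropy, then checks the chain rule H(X,Y) = H(X) + H(Y|X):

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector p."""
    return -sum(v * math.log2(v) for v in p if v > 0)

# An invented joint distribution p(x, y) over two binary variables
joint = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}

H_XY = entropy(joint.values())                 # joint entropy H(X,Y)
p_x = [sum(v for (x, _), v in joint.items() if x == i) for i in (0, 1)]
H_X = entropy(p_x)                             # marginal entropy H(X)

# Conditional entropy H(Y|X) = sum_x p(x) * H(Y | X = x)
H_Y_given_X = sum(
    p_x[i] * entropy([joint[(i, j)] / p_x[i] for j in (0, 1)])
    for i in (0, 1)
)

# Chain rule: H(X, Y) = H(X) + H(Y | X)
assert abs(H_XY - (H_X + H_Y_given_X)) < 1e-9
```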

Requirements

Information Theory Fundamentals

Description

This course, Information Theory Core, is the natural sequel to Information Theory Fundamentals, by the same instructor. While the earlier course built the essential foundation of information theory through probability concepts, this course moves decisively into the heart of the subject.

We begin by consolidating probabilistic tools with lectures on convergence in mean square, convergence in probability, convergence in distribution, and the law of large numbers - cornerstones that prepare us for the rigorous study of information. With this background in place, we introduce entropy, the central quantity in information theory, and study its meaning and properties.

The course then explores the non-negativity of entropy, the change-of-base property, and step-by-step worked examples to build deep intuition. These early insights serve as gateways to advanced topics - coding theorems, data compression, and the probabilistic underpinnings of communication systems - which will be addressed in subsequent lectures.

By the end of Information Theory Core, you will not only be comfortable manipulating definitions and rules but will also appreciate their conceptual power. This course is designed for learners in mathematics, computer science, electrical engineering, and related fields who are serious about mastering information theory as both a theoretical and an applied discipline.
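The two entropy properties mentioned above - non-negativity and change of base - can be checked numerically. A minimal sketch, assuming an illustrative four-outcome distribution:

```python
import math

def entropy(p, base=2):
    # Shannon entropy of distribution p in the given logarithm base
    return -sum(v * math.log(v, base) for v in p if v > 0)

p = [0.5, 0.25, 0.125, 0.125]      # an illustrative distribution
H_bits = entropy(p, base=2)        # entropy in bits
H_nats = entropy(p, base=math.e)   # entropy in nats

# Change of base: H_e(X) = H_2(X) * ln 2
assert abs(H_nats - H_bits * math.log(2)) < 1e-9
# Non-negativity: H(X) >= 0 for any distribution
assert H_bits >= 0
```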

Overview

Section 1: Introduction

Lecture 1 Convergence Everywhere

Lecture 2 Convergence Almost Everywhere

Lecture 3 Convergence in the Mean Square Sense

Lecture 4 Convergence in Probability

Lecture 5 Convergence in Distribution

Lecture 6 Law of Large Numbers
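The law of large numbers covered in Lecture 6 can be seen numerically with a short simulation; a minimal sketch in Python, where the success probability and sample size are illustrative choices:

```python
import random

random.seed(0)  # fixed seed for reproducibility

# Bernoulli(p) trials: the sample mean converges to p as n grows,
# illustrating the law of large numbers
p, n = 0.3, 100_000
successes = sum(1 for _ in range(n) if random.random() < p)
sample_mean = successes / n

# The sample mean should land close to the true probability p
assert abs(sample_mean - p) < 0.01
```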

Section 2: Basic Concepts

Lecture 7 Entropy

Lecture 8 Non-Negativity

Lecture 9 Change of Base

Lecture 10 Example 2.1.1

Lecture 11 Example 2.1.2

Section 3: Capstone Concept

Lecture 12 From Joint Entropy to Mutual Information
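The step from joint entropy to mutual information in Lecture 12 rests on the identity I(X;Y) = H(X) + H(Y) - H(X,Y). A minimal numeric sketch with an invented joint distribution:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector p."""
    return -sum(v * math.log2(v) for v in p if v > 0)

# An invented joint distribution over binary X and Y (correlated)
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
p_x = [0.5, 0.5]   # marginal of X
p_y = [0.5, 0.5]   # marginal of Y

# I(X;Y) = H(X) + H(Y) - H(X,Y)
mutual_info = entropy(p_x) + entropy(p_y) - entropy(joint.values())
# Mutual information is non-negative, and strictly positive here
# because X and Y are dependent
assert mutual_info > 0
```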

Who this course is for

Seniors and graduate students looking to move into information theory proper