Information Theory
Description: This quiz covers the fundamental concepts and principles of Information Theory, including entropy, mutual information, channel capacity, and coding theorems.
Number of Questions: 15
Created by: Aliensbrain Bot
Tags: information theory, Shannon entropy, mutual information, channel capacity, coding theorems
What is the unit of information in Information Theory?
The entropy of a random variable $X$ is defined as:
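For reference, the standard Shannon definition for a discrete random variable $X$ with distribution $p(x)$, measured in bits:

$$H(X) = -\sum_{x} p(x)\,\log_2 p(x)$$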
The mutual information between two random variables $X$ and $Y$ is defined as:
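For reference, the standard definition and its equivalent entropy form:

$$I(X;Y) = \sum_{x,y} p(x,y)\,\log_2 \frac{p(x,y)}{p(x)\,p(y)} = H(X) - H(X \mid Y)$$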
The channel capacity of a communication channel is defined as:
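For reference, capacity is the mutual information maximized over all input distributions:

$$C = \max_{p(x)} I(X;Y)$$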
The Shannon-Hartley theorem states that the channel capacity of a band-limited additive white Gaussian noise (AWGN) channel is given by:
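For reference, with bandwidth $B$ in hertz and signal-to-noise ratio $S/N$:

$$C = B\,\log_2\!\left(1 + \frac{S}{N}\right)\ \text{bits per second}$$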
The source coding theorem states that the minimum number of bits required to represent a source with entropy $H$ is:
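For reference, the theorem says the average number of bits per source symbol of any uniquely decodable code, $\bar{L}$, cannot fall below the entropy:

$$\bar{L} \ge H$$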
The channel coding theorem states that it is possible to achieve reliable communication over a noisy channel with a capacity of $C$ by using a code with a rate:
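For reference, reliable communication (arbitrarily low error probability) is achievable whenever the code rate stays below capacity:

$$R < C$$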
The Huffman coding algorithm is a:
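As background, a minimal sketch of Huffman coding in Python (the `huffman_codes` helper is illustrative, not from any particular library): it repeatedly merges the two least-probable subtrees, prepending a bit to the codes on each side.

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a binary prefix code for the symbols of `text` (illustrative sketch)."""
    freq = Counter(text)
    # Heap entries: (frequency, tie-breaker, {symbol: code-so-far}).
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate single-symbol source: give it a 1-bit code
        return {sym: "0" for sym in heap[0][2]}
    tie = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)   # least frequent subtree
        f2, _, right = heapq.heappop(heap)  # second least frequent subtree
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

print(huffman_codes("abracadabra"))  # e.g. {'a': '0', 'b': '110', ...}
```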
The Lempel-Ziv-Welch (LZW) algorithm is a:
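Similarly, a minimal sketch of LZW compression (the `lzw_compress` name is illustrative): the dictionary starts with all single characters and grows with each new string encountered, and the output is a sequence of dictionary indices.

```python
def lzw_compress(data: str) -> list[int]:
    """Encode `data` as a list of LZW dictionary indices (illustrative sketch)."""
    # Initialize the dictionary with all single 8-bit characters.
    dictionary = {chr(i): i for i in range(256)}
    next_code = 256
    w = ""
    output = []
    for c in data:
        wc = w + c
        if wc in dictionary:
            w = wc                        # keep extending the current match
        else:
            output.append(dictionary[w])  # emit code for the longest known prefix
            dictionary[wc] = next_code    # add the new string to the dictionary
            next_code += 1
            w = c
    if w:
        output.append(dictionary[w])
    return output

print(lzw_compress("TOBEORNOTTOBEORTOBEORNOT"))
```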
The JPEG image compression standard uses:
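For reference, the transform stage of baseline JPEG applies a two-dimensional DCT-II to 8×8 pixel blocks:

$$F(u,v) = \tfrac{1}{4}\,C(u)\,C(v) \sum_{x=0}^{7}\sum_{y=0}^{7} f(x,y)\,\cos\!\frac{(2x+1)u\pi}{16}\,\cos\!\frac{(2y+1)v\pi}{16}, \qquad C(0)=\tfrac{1}{\sqrt{2}},\ C(k)=1 \text{ for } k>0$$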
The MP3 audio compression standard uses:
The H.264 video compression standard uses:
The information content of a message is measured in:
The rate of a source code is defined as:
The efficiency of a source code is defined as:
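For reference, with $\bar{L}$ the average codeword length in bits per source symbol, the rate of a binary source code is $R = \bar{L}$ and its efficiency compares the source entropy to that rate:

$$\eta = \frac{H(X)}{\bar{L}}$$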