Full Transcript

# Lecture 18

## Channel Capacity

### Channel Model

The generic communication system has the following form:

$\xrightarrow{X} \text{Channel} \xrightarrow{Y}$

where $X$ is the input and $Y$ is the output.

### Channel Capacity

The capacity of a channel is the maximum rate at which information can be reliably transmitted over the channel:

$C = \max_{p(x)} I(X;Y)$

where:

- $C$ is the channel capacity in bits per channel use.
- $p(x)$ is the probability mass function of the input $X$.
- $I(X;Y)$ is the mutual information between $X$ and $Y$.

### Symmetric Channel

A channel is symmetric if the rows of the channel transition matrix $p(y|x)$ are permutations of each other, and the columns are permutations of each other.

**Example:** Binary Symmetric Channel (BSC)

$\xrightarrow{X} \text{BSC with crossover probability } p \xrightarrow{Y}$

The channel transition matrix is:

$p(y|x) = \begin{bmatrix} 1-p & p \\ p & 1-p \end{bmatrix}$

For the BSC, the optimal input distribution is uniform, i.e., $p(x=0) = p(x=1) = 0.5$, and the capacity is:

$C = 1 - H(p)$

where $H(p)$ is the binary entropy function:

$H(p) = -p\log_2 p - (1-p)\log_2(1-p)$

### Noiseless Binary Channel

For a noiseless binary channel, $p = 0$, and thus:

$C = 1 - H(0) = 1 - 0 = 1 \text{ bit}$

### Noisy Channel with Non-Overlapping Outputs

When distinct inputs produce non-overlapping sets of outputs, the input can be uniquely determined from the output even though the channel is noisy. Hence, the capacity is:

$C = \log_2 |\mathcal{X}|$

where $|\mathcal{X}|$ is the size of the input alphabet.

**Example:** Consider the channel with transition probabilities:

$p(y|x) = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$

Here, the channel capacity is $C = \log_2 3 \approx 1.58 \text{ bits}$.

### Binary Erasure Channel (BEC)

$\xrightarrow{X} \text{BEC with erasure probability } \alpha \xrightarrow{Y}$

In a binary erasure channel, a bit is either received correctly with probability $1 - \alpha$ or erased with probability $\alpha$. With output alphabet $\{0, e, 1\}$, the channel transition matrix is:

$p(y|x) = \begin{bmatrix} 1-\alpha & \alpha & 0 \\ 0 & \alpha & 1-\alpha \end{bmatrix}$

The channel capacity of the BEC is:

$C = 1 - \alpha$

### Parallel Channels

If we have $n$ independent channels used in parallel, the total capacity is the sum of the individual capacities:

$C = \sum_{i=1}^{n} C_i$

### Capacity of the Gaussian Channel

For a discrete-time Gaussian channel with average signal power $P$ and noise power $N$, the capacity per channel use is:

$C = \frac{1}{2} \log_2\left(1 + \frac{P}{N}\right)$

where:

- $C$ is the channel capacity in bits per channel use.
- $P$ is the average signal power.
- $N$ is the average noise power.

The ratio $\frac{P}{N}$ is the signal-to-noise ratio (SNR). The band-limited, continuous-time version of this result is the Shannon-Hartley theorem, $C = B \log_2(1 + P/N)$ bits per second for bandwidth $B$.

### Key Points

- Channel capacity represents the maximum rate of reliable communication over a channel.
- Symmetric channels like the BSC and BEC have closed-form capacity expressions.
- The capacity of a Gaussian channel depends on the signal-to-noise ratio.
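
### Numerical Sketch: Computing Capacity

The maximization $C = \max_{p(x)} I(X;Y)$ can be carried out numerically with the Blahut-Arimoto algorithm, a standard fixed-point iteration for discrete memoryless channels. The sketch below is a minimal illustrative implementation, not part of the lecture; the function name, iteration count, and epsilon guards are our own choices.

```python
import numpy as np

def blahut_arimoto(W, iters=500):
    """Approximate C = max_{p(x)} I(X;Y) for a discrete memoryless channel.

    W[x, y] = p(y|x); each row must sum to 1.
    Returns (capacity in bits per channel use, optimizing input pmf).
    """
    p = np.full(W.shape[0], 1.0 / W.shape[0])            # start from a uniform input
    for _ in range(iters):
        q = p[:, None] * W                                # p(x) * p(y|x)
        q = q / (q.sum(axis=0, keepdims=True) + 1e-300)   # posterior q(x|y)
        # Update step: p(x) proportional to exp( sum_y p(y|x) * log q(x|y) )
        r = np.exp(np.sum(np.where(W > 0, W * np.log(q + 1e-300), 0.0), axis=1))
        p = r / r.sum()
    p_y = p @ W                                           # induced output distribution
    terms = np.where(W > 0,
                     p[:, None] * W * np.log2((W + 1e-300) / (p_y + 1e-300)),
                     0.0)
    return terms.sum(), p

# BSC with crossover probability 0.1: expect C = 1 - H(0.1), about 0.531 bits.
print(blahut_arimoto(np.array([[0.9, 0.1],
                               [0.1, 0.9]])))
# Noiseless 3-input channel (non-overlapping outputs): expect log2(3), about 1.585 bits.
print(blahut_arimoto(np.eye(3)))
```

The two test calls check the closed-form answers derived above: the BSC capacity $1 - H(p)$ and the $\log_2 3$ capacity of the three-input channel with non-overlapping outputs.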
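
### Numerical Sketch: BSC and the Noiseless Binary Channel

A direct evaluation of $C = 1 - H(p)$, using the usual convention $0 \log 0 = 0$ so that $H(0) = H(1) = 0$. The function names here are illustrative.

```python
import math

def binary_entropy(p):
    """H(p) = -p log2 p - (1 - p) log2 (1 - p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """C = 1 - H(p) for a BSC with crossover probability p, in bits per use."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))    # 1.0 bit: the noiseless binary channel (p = 0)
print(bsc_capacity(0.11))   # about 0.5, since H(0.11) is roughly 0.5
print(bsc_capacity(0.5))    # 0.0: the output is independent of the input
```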
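
### Numerical Sketch: BEC

For the BEC, a brute-force sweep over input distributions confirms both claims above: the maximum of $I(X;Y)$ is attained at the uniform input and equals $1 - \alpha$. This is a minimal sketch; the helper `mutual_information_bits` and the 99-point grid are illustrative choices.

```python
import numpy as np

def mutual_information_bits(p_x, W):
    """I(X;Y) in bits for input pmf p_x and channel matrix W[x, y] = p(y|x)."""
    p_xy = p_x[:, None] * W                 # joint distribution p(x, y)
    p_y = p_xy.sum(axis=0)                  # output distribution
    mask = p_xy > 0                         # skip zero-probability terms
    denom = (p_x[:, None] * p_y[None, :])[mask]
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / denom)))

alpha = 0.3
W_bec = np.array([[1 - alpha, alpha, 0.0],        # x = 0 -> y in {0, erasure, 1}
                  [0.0,       alpha, 1 - alpha]]) # x = 1

# Sweep input distributions p(x=0) = t; the maximum should sit at t = 0.5.
best = max((mutual_information_bits(np.array([t, 1 - t]), W_bec), t)
           for t in np.linspace(0.01, 0.99, 99))
print(best)   # about (0.7, 0.5): capacity 1 - alpha at the uniform input
```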
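
### Numerical Sketch: Gaussian Channel

Evaluating $C = \frac{1}{2}\log_2(1 + P/N)$ directly. The dB helper is an illustrative convenience, since SNR is often quoted in decibels.

```python
import math

def gaussian_capacity(P, N):
    """C = 0.5 * log2(1 + P/N) bits per channel use, discrete-time Gaussian channel."""
    return 0.5 * math.log2(1 + P / N)

def capacity_from_snr_db(snr_db):
    """Same formula with the SNR given in decibels (illustrative helper)."""
    return 0.5 * math.log2(1 + 10 ** (snr_db / 10))

print(gaussian_capacity(P=1.0, N=1.0))   # 0.5 bits/use at 0 dB SNR
print(capacity_from_snr_db(20))          # about 3.33 bits/use at 20 dB SNR
```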
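
### Numerical Sketch: Parallel Channels

Since the capacities of independent parallel channels add, combining one of each channel type above gives the following; the specific parameter values are made up for illustration.

```python
import math

# Hypothetical bank of three independent parallel channels:
# a BSC(0.11), a BEC(0.3), and a Gaussian channel at 0 dB SNR.
c_bsc  = 1 - (-0.11 * math.log2(0.11) - 0.89 * math.log2(0.89))  # 1 - H(0.11)
c_bec  = 1 - 0.3                                                 # 1 - alpha
c_awgn = 0.5 * math.log2(1 + 1.0)                                # (1/2) log2(1 + P/N)

print(c_bsc + c_bec + c_awgn)   # about 0.50 + 0.70 + 0.50 = 1.70 bits per parallel use
```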