# Lecture 18

## Channel Models

### Binary Symmetric Channel (BSC)
- Transition probability $P(y|x)$ is the probability of receiving $y$ given that $x$ was transmitted.
- For the Binary Symmetric Channel:
  - $P(y=0|x=0) = 1-p$
  - $P(y=1|x=0) = p$
  - $P(y=0|x=1) = p$
  - $P(y=1|x=1) = 1-p$
- $p$ is the crossover probability.
- Channel matrix:
  $$ \begin{bmatrix} 1-p & p \\ p & 1-p \end{bmatrix} $$
- Channel capacity: $C = 1 - H(p)$, where $H(p) = -p\log_2 p - (1-p)\log_2(1-p)$ is the binary entropy function.
- Capacity is maximized ($C = 1$) when $p = 0$ or $p = 1$: at $p = 0$ the channel is noiseless, and at $p = 1$ every bit is flipped deterministically, so the receiver can simply invert the output. Capacity is minimized ($C = 0$) at $p = 1/2$, where the output is independent of the input.

### Noisy Channel Coding Theorem
- **Theorem:** For a channel with capacity $C$ and any rate $R < C$, there exists a coding scheme of rate $R$ whose probability of error $P_e \rightarrow 0$ as the block length $N \rightarrow \infty$.
- **Converse:** For any rate $R > C$, the probability of error $P_e \rightarrow 1$ as the block length $N \rightarrow \infty$.
- **Implications:**
  - We can transmit information reliably at any rate below the channel capacity.
  - We cannot transmit information reliably at any rate above the channel capacity.
- **Example:** BSC with $p = 0.11$. Since $H(0.11) \approx 0.5$, we get $C = 1 - H(0.11) \approx 0.5$ bits/channel use, so we can transmit reliably at any rate below about 0.5 bits/channel use.

### Additive White Gaussian Noise (AWGN) Channel
- $Y = X + Z$, where $Z \sim \mathcal{N}(0, \sigma^2)$ is Gaussian noise with zero mean and variance $\sigma^2$.
- $X$ is the transmitted signal and $Y$ is the received signal.
- Signal-to-noise ratio: $\mathrm{SNR} = \frac{P}{\sigma^2}$, where $P$ is the average power of the transmitted signal.
- Channel capacity: $C = \frac{1}{2}\log_2\left(1 + \frac{P}{\sigma^2}\right)$ bits/channel use.
- **Explanation:**
  - The capacity increases with the signal power $P$.
  - The capacity decreases with the noise variance $\sigma^2$.
  - For a bandlimited channel of bandwidth $B$ Hz with noise power spectral density $N_0/2$, the Shannon–Hartley form is $C = B\log_2\left(1 + \frac{P}{N_0 B}\right)$ bits/second, so for fixed noise density the capacity grows with bandwidth (sublinearly when $P$ is fixed, since the SNR per hertz drops as $B$ grows).
- **Example:** AWGN channel with $P = 1$ and $\sigma^2 = 1$.
  $C = \frac{1}{2}\log_2\left(1 + \frac{1}{1}\right) = \frac{1}{2}\log_2 2 = \frac{1}{2}$ bit/channel use.

### Fading Channel
- The signal strength varies randomly with time (e.g., due to multipath propagation and mobility).

### Other Channel Models
- Wireless channels
- Underwater acoustic channels
- Optical channels
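As a quick numerical check of the two capacity formulas and the worked examples above, here is a minimal Python sketch (the function names are my own, not from the lecture):

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy H(p) in bits, with H(0) = H(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity of a BSC with crossover probability p: C = 1 - H(p)."""
    return 1.0 - binary_entropy(p)

def awgn_capacity(P: float, sigma2: float) -> float:
    """AWGN capacity in bits/channel use: C = (1/2) log2(1 + P/sigma^2)."""
    return 0.5 * math.log2(1.0 + P / sigma2)

# Worked examples from the notes:
print(bsc_capacity(0.11))   # ≈ 0.50, matching the p = 0.11 example
print(awgn_capacity(1, 1))  # 0.5, matching the P = σ² = 1 example
```

Note that `bsc_capacity(0.11)` comes out slightly above 0.5, which is why the lecture's value is an approximation rather than exact.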