Questions and Answers
In the general model of a communication system, what is the role of the 'channel'?
The channel is the medium through which the encoded message is transmitted from the sender to the receiver. It can be susceptible to noise and interference.
Explain how self-information is related to the probability of an event occurring.
Self-information grows as the probability of an event shrinks: it is the logarithm of the reciprocal of the probability, so lower-probability events carry higher self-information because they are more surprising or informative.
State the formula for self-information, defining all terms.
The formula for self-information is $I(s_k) = \log_b(1/P(s_k))$, where $I(s_k)$ is the self-information of symbol $s_k$, $P(s_k)$ is the probability of symbol $s_k$ occurring, and $b$ is the base of the logarithm. For example, with $b = 2$, a symbol with probability $1/2$ carries $\log_2 2 = 1$ bit of self-information.
What does it mean for a source to be 'discrete' in the context of information theory?
A discrete source generates symbols from a finite alphabet $S = \{s_0, s_1, \ldots, s_{K-1}\}$, where each symbol $s_k$ occurs with probability $P_k$ and symbols are independent.
If two events A and B are independent, how is $I(AB)$ (the self-information of both A and B occurring) related to $I(A)$ and $I(B)$?
For independent events, the self-information adds: $I(AB) = I(A) + I(B)$.
Explain why the self-information of an event with probability 1 is zero.
An event with probability 1 is certain, so observing it resolves no uncertainty and conveys no information; formally, $I = \log_b(1/1) = 0$.
Define 'entropy' in the context of information theory.
Entropy is the average number of bits per symbol required to describe a source; it measures the average information content, or uncertainty, of the source's output.
What condition maximizes the entropy of a source with $N$ symbols?
Entropy is maximized when all $N$ symbols are equally likely, i.e., $P_i = 1/N$ for every symbol, giving $H = \log_b N$.
If a source has an entropy of 0, what does this imply about the source's output?
It implies the output is certain: one symbol occurs with probability 1, so there is no uncertainty and no information is conveyed.
What is the formula for entropy $H$ of a source?
For a source with $N$ independent symbols, $H = \sum_{i=1}^{N} P_i I(s_i) = \sum_{i=1}^{N} P_i \log_b(1/P_i)$.
Differentiate between an 'encoder' and a 'decoder'.
The encoder converts the source output into a signal suitable for transmission, removing source redundancy and adding controlled redundancy for the channel; the decoder reverses this process, exploiting and removing the added redundancy to restore the information to its original form.
How does 'noise' affect the communication process in the general model?
Noise introduces interference and disturbances on the channel, such as cross-talk, thermal noise, and impulsive switching noise, which can corrupt the transmitted signal.
List the components of the general model of a communication system.
The general model consists of a source, an encoder, a channel, a decoder, and a receiver, with noise acting on the channel.
What is the formula for calculating the entropy when all a priori probabilities are equally likely?
When all $N$ symbols are equally likely ($P = 1/N$), the entropy is maximum: $H = \log_b N$. For example, a source with 8 equally likely symbols has $H = \log_2 8 = 3$ bits per symbol.
If the probability of a binary source's output is $P = 1$, how is the value of $H$ interpreted?
$H = 0$: the outcome is certain, so there is no uncertainty and the source conveys no information.
In a communication system, if we know a priori that the receiver already knows everything about the event occurring, what can we conclude?
No information is gained: the event is certain from the receiver's point of view ($P_k = 1$), so its self-information is zero.
If we observe that a system's data is non-uniformly distributed, what can we conclude about its entropy?
Its entropy is below the maximum: entropy reaches its maximum value $\log_b N$ only when all symbols are equally likely, so a non-uniform distribution yields lower entropy.
In the general model of communication, what examples of 'source' are provided?
Voice, words, pictures, and music.
In the general model of communication, what examples of 'channel' are provided?
A telephone line, a high-frequency radio link, or even a biological system.
In the general model of communication, what examples of 'receiver' are provided?
A person, a computer, a disk, an analog radio, or a TV.
Flashcards
General Model of Communication
The general model includes a source, encoder, channel, decoder, and receiver. Noise can affect the channel.
Source (Communication Model)
The starting point of the communication process, which could be voice, words, pictures, or music.
Encoder
Converts the source output into a signal suitable for transmission, removing source redundancy and adding controlled redundancy to suit the channel.
Channel
The medium through which the signal travels; it could be a telephone line, a radio link, or even a biological organism.
Decoder
Reverses the encoding process to restore the information to its original form, exploiting and removing the added redundancy.
Receiver
The destination of the transmitted information, such as a person, computer, or TV.
Noise
Interference or disturbances on the channel that can corrupt the signal.
First Stage (Encoder)
Data reduction: keeping only the important bits by removing source redundancy.
Encoding
Representation of information in another form.
Discrete Information Source
A source that generates symbols from a given alphabet.
Self Information
A function that measures the amount of information gained after observing a symbol.
Unit of Information
The unit depends on the base of the logarithm: bits (base 2), nats (base $e$), or Hartleys (base 10).
Information and Probability
The amount of information in bits about a symbol is closely related to its probability of occurrence.
Low Probability Event
A low-probability event carries a large amount of information because it is more surprising.
Information of Independent Events
The sum of the information of the individual events: $I(AB) = I(A) + I(B)$.
Entropy
The average number of bits per symbol required to describe a source.
Maximum Entropy
Occurs when all a priori probabilities are equally likely ($P = 1/N$); the entropy is then $H = \log_b N$.
Entropy and Uncertainty
Entropy is zero when one outcome is certain; entropy refers to the disorder or uncertainty of a message.
Shannon on Entropy
Entropy measures the average information contained in each message of the source, irrespective of the meaning of the message.
Study Notes
- Information theory is concerned with the theoretical limitations and potential of communication systems, focusing on the compression and communication rates that can be achieved.
- Communication involves sending information across a medium, potentially with errors, from one place and/or time to another.
General Model of Communication
- The model consists of a source, encoder, channel, decoder, and receiver, with noise affecting the channel.
- Source: Originates voice, words, pictures, or music.
- Encoder: Processes information before it enters the channel, reducing the data and inserting redundancy. A code is a mechanism for representing information; encoding is the act of representing information in that form.
- Channel: The medium used for transmission, such as a telephone line, a high-frequency radio link, or even a biological system.
- Noise: Channels with time-varying frequency responses introduce cross-talk, thermal noise, and impulsive switching noise; noise is treated as random but may follow known probability distributions.
- Decoder: Exploits and removes redundancy, fixes transmission errors and restores the information to its original form.
- Receiver: The destination, which can be a person, computer, disk, analog radio, TV, or the internet.
Discrete Information Source
- Generates symbols from a given alphabet $S = \{s_0, s_1, \ldots, s_{K-1}\}$, where each symbol has a probability $P_k$ and symbols are independent.
- No gain of information is obtained when $P_k = 1$, because there is no uncertainty of occurrence.
- Uncertainty, and hence the gain of information upon receiving $s_k$, increases as $P_k$ decreases.
Self Information
- A function that measures the amount of information gained after observing symbol $s_k$: $I(s_k) = \log_b(1/P(s_k))$.
- The unit is determined by the base of the logarithm: base 2 gives bits, base $e \approx 2.718$ gives nats, and base 10 gives Hartleys.
- A symbol's information in bits is closely related to its probability of occurrence, so a low probability event contains much information.
- $I(s_k) \ge 0$: self-information is a real, nonnegative measure and a continuous function of $P(s_k)$.
- $I(s_k) > I(s_l)$ if $P_k < P_l$ (see the sketch below).
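To make the definition and units concrete, here is a minimal sketch in Python (illustrative, not from the original notes; the function name is an assumption):

```python
import math

def self_information(p: float, base: float = 2.0) -> float:
    """Self-information I(s_k) = log_b(1 / p) of a symbol with probability p."""
    if not 0.0 < p <= 1.0:
        raise ValueError("probability must be in (0, 1]")
    return math.log(1.0 / p, base)

# A fair coin flip (p = 1/2) carries exactly 1 bit of information.
print(self_information(0.5))          # 1.0 (bits)
print(self_information(0.5, math.e))  # ~0.693 (nats)
print(self_information(0.5, 10))      # ~0.301 (Hartleys)
# A certain event (p = 1) carries no information.
print(self_information(1.0))          # 0.0
```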
Properties of Self Information
- The information from two independent events is the sum of the information from the individual events: $I(AB) = I(A) + I(B)$. This follows because $P(AB) = P(A)P(B)$ for independent events, and the logarithm turns the product into a sum.
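A quick numerical check of this additivity (a sketch with assumed example probabilities):

```python
import math

def info_bits(p: float) -> float:
    """Self-information in bits."""
    return math.log2(1.0 / p)

p_a, p_b = 0.25, 0.5  # probabilities of independent events A and B

# For independent events P(AB) = P(A) * P(B), so the logarithms add:
assert math.isclose(info_bits(p_a * p_b), info_bits(p_a) + info_bits(p_b))
print(info_bits(p_a * p_b))  # 3.0 bits = 2.0 + 1.0
```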
Entropy
- Entropy is the average number of bits per symbol needed to describe a source. For $N$ independent symbols, entropy is defined as $H = \sum_{i=1}^{N} P_i I(s_i) = \sum_{i=1}^{N} P_i \log_b(1/P_i)$.
- If all a priori probabilities are equally likely ($P_i = 1/N$ for all $N$ symbols), the entropy is maximum: $H = \log_b N$ (illustrated in the sketch after this list).
- Entropy is zero when one outcome is certain.
- Entropy refers to disorder or uncertainty of a message.
- Per Shannon, the entropy is the average of the information contained in each message of the source, irrespective of the meaning of the message.
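A minimal sketch of these entropy properties (illustrative function name, base-2 logs assumed):

```python
import math

def entropy(probs: list[float], base: float = 2.0) -> float:
    """H = sum_i P_i * log_b(1 / P_i); terms with P_i = 0 contribute nothing."""
    assert math.isclose(sum(probs), 1.0), "probabilities must sum to 1"
    return sum(p * math.log(1.0 / p, base) for p in probs if p > 0)

# Uniform source: maximum entropy, H = log2(N) = 2 bits for N = 4 symbols.
print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0

# Non-uniform source: entropy falls below log2(N).
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75

# One certain outcome: entropy is zero.
print(entropy([1.0, 0.0, 0.0, 0.0]))       # 0.0
```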