Information Theory & Communication Model

Questions and Answers

In the general model of a communication system, what is the role of the 'channel'?

The channel is the medium through which the encoded message is transmitted from the sender to the receiver. It can be susceptible to noise and interference.

Explain how self-information is related to the probability of an event occurring.

Self-information is inversely proportional to the probability of an event. Lower probability events have higher self-information because they are more surprising or informative.

State the formula for self-information, defining all terms.

The formula for self-information is $I(s_k) = \log_b(1/P(s_k))$, where $I(s_k)$ is the self-information of symbol $s_k$, $P(s_k)$ is the probability of symbol $s_k$ occurring, and $b$ is the base of the logarithm.
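
A minimal Python sketch of this formula (the function name `self_information` is illustrative, not from the lesson):

```python
import math

def self_information(p, base=2):
    """I(s_k) = log_b(1 / P(s_k)) for a symbol with probability p."""
    if not 0 < p <= 1:
        raise ValueError("probability must lie in (0, 1]")
    return math.log(1 / p, base)

print(self_information(1.0))               # 0.0 bits: a certain event is uninformative
print(self_information(0.5))               # 1.0 bit
print(self_information(0.125))             # ~3.0 bits: rarer events carry more information
print(self_information(0.5, base=math.e))  # ~0.693 nats
```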

What does it mean for a source to be 'discrete' in the context of information theory?

A discrete source produces symbols from a finite alphabet, each with an associated probability. The symbols are distinct and countable.

If two events A and B are independent, how is $I(AB)$ (the self-information of both A and B occurring) related to $I(A)$ and $I(B)$?

If A and B are independent, then $I(AB) = I(A) + I(B)$. The self-information of the joint event is the sum of the individual self-informations.
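
A quick numeric check of this additivity, assuming two hypothetical independent events with $P(A) = 0.5$ and $P(B) = 0.25$:

```python
import math

def self_information(p, base=2):
    """I(s) = log_b(1 / p), in bits for base 2."""
    return math.log(1 / p, base)

p_a, p_b = 0.5, 0.25                   # hypothetical probabilities
i_joint = self_information(p_a * p_b)  # P(AB) = P(A) * P(B) for independent events
i_sum = self_information(p_a) + self_information(p_b)
print(i_joint, i_sum)                  # both 3.0 bits, so I(AB) = I(A) + I(B)
```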

Explain why the self-information of an event with probability 1 is zero.

If an event has a probability of 1, it is certain to occur. Therefore, observing this event provides no new information, and its self-information is zero.

Define 'entropy' in the context of information theory.

Entropy is the average amount of information per symbol required to describe a source. It quantifies the uncertainty associated with a random variable or a source of information.

What condition maximizes the entropy of a source with $N$ symbols?

The entropy is maximized when all $N$ symbols are equally likely. This means each symbol has a probability of $1/N$.

If a source has an entropy of 0, what does this imply about the source's output?

An entropy of 0 indicates that there is no uncertainty in the source's output. One symbol is certain to occur, and all others have zero probability. In other words, the output is fixed and predictable.

What is the formula for entropy $H$ of a source?

The formula for entropy is $H = \sum_{i=1}^{N} P_i\, I(s_i) = \sum_{i=1}^{N} P_i \log_b \frac{1}{P_i}$, where $P_i$ is the probability of symbol $s_i$, $I(s_i)$ is the self-information of $s_i$, and $N$ is the total number of symbols.
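
A small Python sketch of this formula, applied to a hypothetical four-symbol source (the probabilities are made up for illustration):

```python
import math

def entropy(probs, base=2):
    """H = sum_i P_i * log_b(1 / P_i); symbols with zero probability contribute nothing."""
    return sum(p * math.log(1 / p, base) for p in probs if p > 0)

# Hypothetical source with probabilities 1/2, 1/4, 1/8, 1/8.
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits/symbol
```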

Differentiate between an 'encoder' and a 'decoder'.

An encoder transforms the source information into a suitable format for transmission, often reducing redundancy and adding error correction. A decoder reverses this process, restoring the original information from the received signal.

How does 'noise' affect the communication process in the general model?

Noise introduces unwanted signals or disturbances into the channel, potentially corrupting the transmitted message and causing errors at the receiver.

List the components of the general model of a communication system.

The components consist of a source, encoder, channel, decoder, and receiver.

What is the formula for calculating the entropy when all a priori probabilities are equally likely?

When all a priori probabilities are equally likely, the formula for entropy is $H = \log_b N$, where $N$ is the number of symbols and $b$ is the base of the logarithm.
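
A short check of this special case in Python, using a hypothetical 8-symbol source:

```python
import math

def entropy(probs, base=2):
    """H = sum_i P_i * log_b(1 / P_i)."""
    return sum(p * math.log(1 / p, base) for p in probs if p > 0)

N = 8
uniform = [1 / N] * N    # all symbols equally likely
print(entropy(uniform))  # 3.0 bits/symbol
print(math.log2(N))      # 3.0 -> matches H = log_2(N)
```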

If the probability of the binary code is $P = 1$, what is the value of $H$ interpreted as?

The entropy in this case is $H = 0$, meaning the source conveys 0 bits/symbol.

In a communication system if we know a-priori, that the receiver knows everything about the event occurring, what can we conclude?

That there is no information gain and no need for communication.

If we are observing a system, and we notice that it is non uniformly distributed data, what can we conclude about the relationship it has with entropy?

That the entropy of the non-uniformly distributed data is less than $\log_2 N$.
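
A small sketch illustrating this, with a made-up non-uniform distribution over $N = 4$ symbols:

```python
import math

def entropy(probs, base=2):
    """H = sum_i P_i * log_b(1 / P_i)."""
    return sum(p * math.log(1 / p, base) for p in probs if p > 0)

N = 4
skewed = [0.7, 0.1, 0.1, 0.1]  # hypothetical non-uniform source
print(entropy(skewed))         # ~1.357 bits/symbol
print(math.log2(N))            # 2.0 -> the maximum, reached only in the uniform case
```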

In the general model of communication, what examples of 'source' are provided?

Voice, words, pictures, and music are examples of 'source'.

In the general model of communication, what examples of 'channel' are provided?

Telephone line, high-frequency radio link, space communication link, and a biological organism (sending a message from brain to foot, or from ear to brain) are examples of 'channel'.

In the general model of communication, what examples of 'receiver' are provided?

Person, computer, disk, analog radio or TV, and the internet are examples of 'receiver'.

Flashcards

General Model of Communication

The general model includes a source, encoder, channel, decoder, and receiver. Noise can affect the channel.

Source (Communication Model)

The starting point of the communication process, which could be voice, words, pictures, or music.

Encoder

Converts the source into a signal suitable for transmission, typically removing source redundancy and adding controlled redundancy to suit the channel.

Channel

The medium through which the signal travels, and which could be a telephone line, radio link, or even a biological organism.

Decoder

Reverses the encoding process to restore the information to its original form, exploiting and removing the added redundancy.

Receiver

The destination of the transmitted information, such as a person, computer, or TV.

Noise

Interference or disturbances on the channel that can corrupt the signal.

First Stage (Encoder)

Data reduction: keeping the important bits by removing source redundancy.

Encoding

Representation of information in another form.

Discrete Information Source

A source that generates symbols from a given finite alphabet.

Self Information

A function that measures the amount of information gained after observing a symbol.

Unit of Information

Determined by the base of the logarithm: bits (base 2), nats (base e), or Hartleys (base 10).

Information and Probability

The amount of information (in bits) carried by a symbol is inversely related to its probability of occurrence.

Low Probability Event

A low-probability event is surprising, so observing it yields a large amount of information.

Information of Independent Events

The sum of the information of the individual events

Entropy

The average number of bits per symbol required to describe a source

Maximum Entropy

Occurs when all a priori probabilities are equally likely.

Entropy and Uncertainty

Entropy is zero when one outcome is certain; entropy measures the disorder or uncertainty of a message.

Shannon on Entropy

The entropy measures the average of the information contained in each message of the source, irrespective of the meaning of the message.

Study Notes

  • Information theory is concerned with the theoretical limitations and potential of systems that communicate, focusing on what compression or communication rates can be achieved.
  • Communication involves sending information across a medium, potentially with errors, from one place and/or time to another.

General Model of Communication

  • The model consists of a source, encoder, channel, decoder, and receiver, with noise affecting the channel.
  • Source: Originates voice, words, pictures, or music.
  • Encoder: Processes information before it enters the channel, reducing data and inserting redundancy. A code is a mechanism for representing information; encoding is the act of representing information in that form.
  • Channel: The medium used for transmission, such as a telephone line, a high-frequency radio link, or even a biological system.
  • Noise: Channels with time-varying frequency responses introduce cross-talk; other sources include thermal noise and impulsive switching noise. Noise is treated as random but may follow known probability distributions.
  • Decoder: Exploits and removes the added redundancy, fixes transmission errors, and restores the information to its original form.
  • Receiver: The destination, which can be a person, computer, disk, analog radio or TV, or the internet.

Discrete Information Source

  • Generates symbols from a given alphabet $S = \{s_0, s_1, \ldots, s_{K-1}\}$, where each symbol $s_k$ has probability $P_k$ and symbols are independent.
  • No information is gained when $P_k = 1$, because there is no uncertainty about the symbol's occurrence.
  • Uncertainty about $s_k$ grows as $P_k$ decreases, so the information gained on receiving $s_k$ also increases as $P_k$ decreases.

Self Information

  • Self-information is a function that measures the amount of information gained after observing symbol $s_k$: $I(s_k) = \log_b(1/P(s_k))$.
  • The unit is determined by the base of the logarithm: base 2 gives bits, base $e \approx 2.718$ gives nats, and base 10 gives Hartleys (see the sketch after this list).
  • A symbol's information in bits is closely related to its probability of occurrence, so a low-probability event carries much information.
  • $I(s_k) \geq 0$; it is a real, nonnegative measure and a continuous function of $P(s_k)$.
  • $I(s_k) > I(s_l)$ if $P_k < P_l$.
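
As a sketch of how the units relate (the conversion 1 bit = ln 2 nats is standard; the probability value here is made up):

```python
import math

def self_information(p, base=2):
    """I(s) = log_b(1 / p) in the chosen base."""
    return math.log(1 / p, base)

p = 0.2  # hypothetical symbol probability
bits = self_information(p, 2)
nats = self_information(p, math.e)
hartleys = self_information(p, 10)
print(bits, nats, hartleys)                    # ~2.322, ~1.609, ~0.699
print(math.isclose(nats, bits * math.log(2)))  # True: 1 bit corresponds to ln 2 ~ 0.693 nats
```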

Properties of Self Information

  • The information from two independent events is the sum of the information from the individual events: $I(AB) = I(A) + I(B)$.

Entropy

  • Entropy is the average number of bits per symbol needed to describe a source. For $N$ independent symbols it is defined as $H = \sum_{i=1}^{N} P_i\, I(s_i) = \sum_{i=1}^{N} P_i \log_b \frac{1}{P_i}$.
  • If all a priori probabilities are equally likely ($P_i = 1/N$ for all $N$ symbols), the entropy is maximum and given by $H = \log_b N$.
  • Entropy is zero when one outcome is certain (see the binary-source sketch after this list).
  • Entropy refers to the disorder or uncertainty of a message.
  • Per Shannon, the entropy is the average of the information contained in each message of the source, irrespective of the meaning of the message.
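
A binary-source sketch tying these points together: entropy is 0 bits when one outcome is certain and peaks at 1 bit/symbol when both outcomes are equally likely (the probability values are illustrative):

```python
import math

def entropy(probs, base=2):
    """H = sum_i P_i * log_b(1 / P_i); zero-probability symbols are skipped."""
    return sum(p * math.log(1 / p, base) for p in probs if p > 0)

# Binary source with P(0) = p and P(1) = 1 - p.
for p in (0.0, 0.1, 0.5, 0.9, 1.0):
    print(p, round(entropy([p, 1 - p]), 3))
# 0.0 -> 0.0, 0.1 -> 0.469, 0.5 -> 1.0, 0.9 -> 0.469, 1.0 -> 0.0
```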
