Multimedia IS411P Lecture 02 - Digital Audio
24 Questions

Questions and Answers

Which process converts continuous time into discrete values?

  • Sampling (correct)
  • Quantization
  • Filtering
  • Coding

What is the primary function of a Digital to Analog Converter (DAC)?

  • To convert an analog signal into a digital signal
  • To reconstruct the original analog signal from digital data (correct)
  • To quantize an analog signal into discrete values
  • To sample a continuous signal

What does quantization involve during the conversion process?

  • Assigning discrete values to continuous sample values (correct)
  • Dividing the signal into discrete time intervals
  • Enhancing the amplitude of the signal
  • Storing quantized values temporarily

Which term describes the fixed size of each quantization interval?

  Answer: Quantization step

In the context of audio signals, what does a higher sampling rate typically achieve?

  Answer: Better representation of the original signal

What is typically used to hold the sampled value constant until the next sampling interval?

  Answer: Sample-and-hold circuit

Which of the following best describes the coding stage in the analog-to-digital conversion process?

  Answer: Representing quantized values digitally

What is the primary goal of using a low-pass filter in the DAC process?

  Answer: To eliminate high-frequency noise from the reconstructed signal

What does the sampling rate depend on in the ADC process?

  Answer: The maximum frequency of the analog signal

What does increasing the number of bits per sample do to the signal-to-noise ratio (SNR)?

  Answer: Increases the SNR by approximately 6 dB per bit

According to the Nyquist theorem, how many samples are required per cycle to accurately represent a sound wave?

  Answer: Two samples

What is quantization error also referred to as?

  Answer: Quantization noise

What is the relationship between the number of quantization levels (Q) and the bits (b) used in representation?

  Answer: b = log2(Q)

What phenomenon occurs when a signal to be sampled has frequency components higher than half the sampling rate?

  Answer: Aliasing

Which formula expresses the signal-to-noise ratio (SNR) in decibels?

  Answer: SNR = 20 log10(S/N)

How does the number of quantization levels affect the amplitude fidelity of the digital signal?

  Answer: More quantization levels increase fidelity

What determines the volume of sound?

  Answer: Amplitude

Which unit measures the frequency of sound?

  Answer: Hertz (Hz)

What is the range of frequencies that the human ear can typically perceive?

  Answer: 20 Hz - 20 kHz

What effect occurs when a sound source moves toward an observer?

  Answer: Doppler Effect

What are harmonics in relation to sound waves?

  Answer: Whole number multiples of a fundamental frequency

In audio signals, what do we call the difference between the upper and lower limit of the sound range that a person can hear?

  Answer: Dynamic Range

What does the velocity of sound depend on?

  Answer: The medium through which it travels

Which characteristic of sound represents the distance from one crest of a wave to the next?

  Answer: Wavelength

    Study Notes

    Multimedia (IS411P) Lecture Notes

    • Course: Multimedia (IS411P)
    • Faculty: Computers & Information Science
    • Department: Information Systems
    • Academic Year: 2024-2025
    • Lecture: 02
    • Date: 15/10/2024

    Digital Audio

    • Amplitude: Determines sound volume, measured in decibels (dB).
    • Period: Time between two crests (peaks) in a sound wave, measured in seconds.
    • Frequency (pitch): Number of peaks per second, measured in Hertz (Hz).
    • Bandwidth (BW): Difference between highest and lowest frequencies in a signal.
    • Wavelength (λ): Distance from midpoint of one crest to the midpoint of the next.
    • Human Hearing Range: 20 Hz to 20 kHz.
    • Human Maximum Sensitivity Range: 2 kHz - 4 kHz.
    • Velocity of Sound: Varies with the medium; it can be determined by measuring the time sound waves take to travel a known distance. Values for various media are shown in the table on page 5.

    Doppler Effect

    • Sound waves are compressions and rarefactions of air.
    • When a sound source moves towards a listener, the frequency increases.
    • When a sound source moves away from a listener, the frequency decreases.

    Harmonics

    • Most vibrating objects produce complex sounds.
    • Harmonic series: frequencies are whole number multiples of a fundamental frequency.
    • A complex sound wave is comprised of various frequencies.

    Basic Characteristics of Audio Signal

    • Audio is caused by disturbances in air pressure.
    • Frequency range of audible sound: 20-20,000 Hz.
    • Amplitude: dynamic range is large, from audibility threshold to pain threshold; measured in decibels (dB).

    Digital Representation of Audio

    • Continuous audio waveforms converted to an electrical signal by a microphone.
    • Analog signals converted to digital signals for processing/communication (ADC, Analog to Digital Converter). Three stages of ADC: sampling, quantization, coding.

    Sampling

    • Continuous time converted to discrete values.
    • Time axis divided into fixed intervals.
    • Value of analog signal taken at the start of each interval.
    • The interval length is determined by the sampling rate (sampling frequency).

    Quantization

    • Process of converting continuous sample values to discrete values.
    • Signal's range divided into fixed intervals, each interval has a number.
    • The size of each interval is referred to as the quantization step.
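A minimal Python sketch of the quantization step described above; the signal range, the level count, and the function name are illustrative assumptions, not from the lecture:

```python
def quantize(sample, v_min=-1.0, v_max=1.0, levels=8):
    """Map a continuous sample value to the index of its quantization interval."""
    step = (v_max - v_min) / levels          # the quantization step
    # Clamp to the representable range, then find the interval number.
    clamped = min(max(sample, v_min), v_max - 1e-12)
    return int((clamped - v_min) / step)

print(quantize(0.0))   # middle of the range -> level 4
print(quantize(-1.0))  # bottom of the range -> level 0
```

Dividing the range [-1, 1] into 8 levels gives a quantization step of 0.25; every sample falling inside an interval is represented by that interval's number.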

    Coding

    • The process of representing quantized values digitally (e.g. using binary values).
    • Eight quantized levels can be represented via 3 bits.
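The bits-to-levels relationship b = log2(Q) quoted in the quiz can be checked directly (a small sketch; the function name is my own):

```python
import math

def bits_needed(levels):
    """Number of bits required to code `levels` quantization levels."""
    return math.ceil(math.log2(levels))

print(bits_needed(8))    # 8 levels -> 3 bits, as in the notes
print(bits_needed(256))  # 256 levels -> 8 bits
```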

    Digital-to-Analog Converter (DAC)

    • Reconstructs original analog signal from digital data.
    • Each quantized value is held for the sampling interval, resulting in step signals.
    • Step signals passed through a low-pass filter to approximate the original signal.

    Signal-to-Noise Ratio (SNR)

    • Measures signal quality in decibels (dB).
    • Defined as: SNR = 20 log₁₀(S/N), where S is the maximum signal amplitude and N is the quantization noise.
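The SNR formula and the roughly-6-dB-per-bit rule of thumb from the quiz can be sketched in Python (names and example values are assumptions):

```python
import math

def snr_db(signal_amplitude, noise_amplitude):
    """SNR in decibels: 20 * log10(S / N)."""
    return 20 * math.log10(signal_amplitude / noise_amplitude)

# With b bits per sample, S/N is about 2**b, so each extra bit
# adds about 20 * log10(2), i.e. roughly 6.02 dB:
for bits in (8, 16):
    print(bits, "bits ->", round(snr_db(2 ** bits, 1), 1), "dB")
```

This prints about 48.2 dB for 8 bits and about 96.3 dB for 16 bits, consistent with "approximately 6 dB per bit".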

    Nyquist Theorem

    • Two samples per cycle (per wave) are necessary to represent a given wave.
    • To represent a 440 Hz sound, sampling rate must be 880 samples per second at minimum.
    • Sampling rate = 2 x highest frequency.
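The Nyquist rule above reduces to one line of arithmetic; a sketch (the function name is my own):

```python
def min_sampling_rate(highest_frequency_hz):
    """Nyquist: at least two samples per cycle of the highest frequency."""
    return 2 * highest_frequency_hz

print(min_sampling_rate(440))     # 880 samples/s for a 440 Hz tone
print(min_sampling_rate(20_000))  # 40000 samples/s to cover human hearing
```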

    Aliasing

    • Distortion that happens when a signal has frequencies greater than half the sampling rate.
    • A serious problem in sampling systems; it cannot be removed by post-processing after recording.
    • High frequencies filtered out prior to sampling for prevention.
      • f_alias = f_sampling - f_true, for f_true < f_sampling < 2 x f_true
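The alias-frequency formula from the notes can be sketched as follows (the function name and the 30 kHz example are my own):

```python
def alias_frequency(f_true, f_sampling):
    """Apparent frequency when f_true lies between fs/2 and fs (formula from the notes)."""
    assert f_sampling / 2 < f_true < f_sampling, "formula valid only in this range"
    return f_sampling - f_true

# Sampling a 30 kHz tone at 44.1 kHz folds it down to 14.1 kHz:
print(alias_frequency(30_000, 44_100))  # 14100
```

This is why high frequencies must be filtered out before sampling: once the 30 kHz tone has been recorded as 14.1 kHz, nothing distinguishes it from a genuine 14.1 kHz tone.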

    Quality of Sound

    • Telephone conversation bandwidth = 3300 Hz.
    • CD sampling rate = 44.1 kHz per channel.
      • CD-ROMs are important media for multimedia.

    Sound Formats

    • Stereo recordings (2 channels) are more realistic, but take twice the storage space of mono recordings.

    • Formulas to calculate storage space for mono/stereo recordings:

      • Mono: File size = Sampling rate * Duration * (bits per sample/8) * 1
      • Stereo: File size = Sampling rate * Duration * (bits per sample/8) * 2
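The two formulas above differ only in the channel count, so they can be combined into one sketch (function name and the one-minute CD example are my own):

```python
def audio_file_size_bytes(sampling_rate, duration_s, bits_per_sample, channels):
    """File size = sampling rate * duration * (bits per sample / 8) * channels."""
    return sampling_rate * duration_s * (bits_per_sample // 8) * channels

# One minute of CD-quality stereo (44,100 Hz, 16 bits/sample, 2 channels):
size = audio_file_size_bytes(44_100, 60, 16, 2)
print(size, "bytes")  # 10584000 bytes, about 10.1 MiB
```

Note that stereo (channels=2) is exactly twice mono (channels=1), matching the observation that stereo takes twice the storage space.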

    Use of Audio in Multimedia

    • Audio can serve as either content (dialogues, instructions) or ambient (background music, sound effects).

    Video

    • Video delivers more information per second than any other multimedia element.
    • DVDs make distributing large video files easier, much as CDs did for digital audio.

    Analog Video

    • Only analog video is used in broadcasting for now.
    • Some movies/video may be digitally processed prior to broadcast.
    • Three analog video standards globally: NTSC, PAL, SECAM.

    Analog Video Standards

    • NTSC (National Television System Committee): 30 frames/sec, 525 lines, 16 million colors.
    • PAL (Phase Alternation by Line): 25 frames/sec., 625 lines
    • SECAM (Sequential Couleur Avec Mémoire): 25 frames/sec., 625 lines

    Digital Video

    • Increasing trend towards digital video, even in consumer electronics.
    • Digital video is easy to access and edit.
    • Video editing involves various operations: removing/inserting frames, mixing audio, etc.

    Timecode

    • Unit for measuring video clip duration.
    • Can serve as a frame address.
    • SMPTE standard for timecode: hours:minutes:seconds:frames
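Because SMPTE timecode also serves as a frame address, converting it to an absolute frame number is simple arithmetic. A sketch assuming non-drop-frame timecode (the function name is my own):

```python
def timecode_to_frames(timecode, fps):
    """Convert an SMPTE hh:mm:ss:ff timecode into an absolute frame number."""
    hh, mm, ss, ff = (int(part) for part in timecode.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

print(timecode_to_frames("00:01:00:00", 30))  # one minute at 30 fps -> 1800
print(timecode_to_frames("00:00:02:15", 25))  # 2 s plus 15 frames at 25 fps -> 65
```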

    Digitizing Analog Video

    • Video capture cards accept video input from devices, audio is sampled separately, software synchronizes both.
    • Cards must support appropriate frame rates (e.g., 30 fps) to avoid issues.

    Keyframes

    • A keyframe is a complete image frame, unlike delta frames, which record only the differences from preceding frames.
    • Keyframes serve as reference points in video; the remaining frames rely on them to reconstruct the full picture.
    • Keyframe interval indicates how often a keyframe appears in a video stream.
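Given the frame rate and the keyframe interval, counting keyframes per second is a one-liner; a sketch (names and example values are my own, and the interval is assumed to divide the frame rate evenly):

```python
def keyframes_per_second(fps, keyframe_interval):
    """With one keyframe every `keyframe_interval` frames, count keyframes per second."""
    return fps // keyframe_interval

print(keyframes_per_second(30, 10))  # 30 fps with a keyframe every 10 frames -> 3
```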

    Compression

    • Restructuring data to reduce file size.
    • Video files are compressed during capture, decompressed during playback.
    • Various codecs (compression/decompression algorithms) are available for digital videos.
    • Important codec characteristics: whether they are symmetric or asymmetric.

    Factors Affecting Compression

    • Frames per second (fps)
    • Number of keyframes
    • Data rate for playback

    File Formats for Video

    • AVI (PC) and QuickTime (Macintosh) are common formats for saving edited digital videos.
    • Both utilize similar video compression/decompression strategies.
    • Conversion programs are available.

    Video on the Internet

    • Streaming video enables real-time video transmission via the Internet.
    • Quality depends on factors like bandwidth, sound intensity/frequency, and difference in information between successive frames.

    Surround Video

    • Provides photorealistic visuals in Web pages, enabling real-time navigation around a 360-degree image.

    Quiz (Example Questions)

    • Total frames in a clip with a given timecode and frames per second are given.
    • Uncompressed video file size calculation given video dimensions, color depth, and frames per second.
    • Given fps and keyframe interval, calculating number of keyframes in a second.
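The uncompressed-video-size question above follows the same pattern as the audio formula: bytes per frame times frames. A sketch with illustrative figures that are not from the lecture:

```python
def video_file_size_bytes(width, height, color_depth_bits, fps, duration_s):
    """Uncompressed size = width * height * (color depth / 8) * fps * duration."""
    return width * height * (color_depth_bits // 8) * fps * duration_s

# Example (assumed values): 640x480 frames, 24-bit color, 30 fps, 10 seconds.
size = video_file_size_bytes(640, 480, 24, 30, 10)
print(size, "bytes")  # 276480000 bytes, about 263.7 MiB
```

The result illustrates why video files are compressed during capture: ten seconds of modest-resolution uncompressed video already exceeds a quarter of a gigabyte.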


    Description

    This quiz covers key concepts from Lecture 02 of the Multimedia (IS411P) course, focusing on digital audio fundamentals. Topics include amplitude, frequency, bandwidth, and the Doppler Effect. Test your knowledge on the properties of sound and human hearing ranges.
