Image Processing and Human Vision Quiz
42 Questions


Questions and Answers

What is the approximate range of the focal length of the lens in humans?

  • 10 mm to 13 mm
  • 14 mm to 17 mm (correct)
  • 18 mm to 20 mm
  • 12 mm to 15 mm

What phenomenon causes the perceived brightness to appear scalloped at boundaries of different intensities?

  • Intensity modulation
  • Mach bands (correct)
  • Simultaneous contrast
  • Brightness adaptation

In the phenomenon of simultaneous contrast, how does the perceived brightness of a center square change?

  • It becomes darker as the background gets lighter. (correct)
  • It remains unchanged regardless of the background.
  • It becomes the same color as the background.
  • It becomes brighter as the background gets darker.

Which constant is utilized in the energy formula for light, denoted as E = hν?

    Planck's constant

    What determines the colors perceived in an object by humans?

    The nature of the light reflected from the object

    What is the formula to calculate the number of bits required to store a digitized image?

    b = M × N × k

    Which of the following correctly defines intensity resolution?

    It indicates the smallest measurable change in intensity level.

    How is a 'k-bit image' defined?

    An image with L = 2^k gray levels.

    What does spatial resolution measure in digital images?

    The number of pixels per unit area or distance.

    What is the primary purpose of interpolation in image processing?

    To estimate unknown pixel values using known data.

    What unit is used to measure illumination?

    Lumen per square meter (lm/m²)

    How much illumination does the sun produce on a clear day?

    Up to 90,000 lm/m²

    What is the gray level of an image typically represented as?

    An interval from Lmin to Lmax

    What is the reflectance value for snow?

    0.93

    What is the typical illumination level found in a commercial office?

    1000 lm/m²

    What is the reflectance value for black velvet?

    0.01

    What does the digitizing process involve in image sampling?

    Digitizing coordinate values and amplitude values

    In the context of image representation, what does 'M × N' represent?

    The dimensions of the image array

    What is the primary function of the cornea in the human eye?

    To cover the anterior surface of the eye

    How do cones differ from rods in terms of vision?

    Cones are primarily responsible for color vision

    What role does the choroid play in the structure of the human eye?

    It provides a major source of nutrition to the eye

    What occurs at the blind spot in the human eye?

    There are no visual receptors present

    What is the primary characteristic of rod vision?

    It is sensitive to low levels of illumination

    Which of the following best distinguishes the lens of the eye from ordinary optical lenses?

    The lens of the eye is flexible

    Why can humans resolve fine details with their cone vision?

    Each cone is connected to its own nerve ending

    What happens when light from an object is properly focused in the eye?

    It is imaged on the retina

    What is the complement of the set A in a gray-scale image defined as Ac?

    {(x, y, K - z) | (x, y, z) ∈ A}

    Which operation is used to find the union of two gray-scale images A and B?

    {max(a, b) | a ∈ A, b ∈ B}

    In single-pixel operations, how is the new intensity value determined?

    Based solely on the intensity at the pixel itself

    What is the purpose of intensity interpolation in geometric transformations?

    To assign intensity values to newly created pixels after transformation

    Which of these best describes neighborhood operations in image processing?

    They use values from a specified set of neighboring pixels to determine a pixel's value.

    Which transformation is referred to as rubber-sheet transformation?

    A spatial transformation combined with intensity interpolation

    What is the primary element represented by z in the triplet (x, y, z) of a gray-scale image?

    Intensity level of the pixel

    How is the maximum value in the union of two sets A and B determined?

    By selecting the higher pixel value from each image

    What is the main purpose of using interpolation in digital cameras?

    To produce a larger image or create digital zoom

    Which interpolation method uses only the nearest neighbor to determine the intensity of a pixel?

    Nearest Neighbor Interpolation

    In bilinear interpolation, how many nearest neighbors are used to determine the intensity value of a point?

    Four

    What is the equation used to calculate the Euclidean distance between two pixels p and q?

    De(p, q) = [(x-s)² + (y-t)²]¹/²

    In bicubic interpolation, how many coefficients are determined based on the nearest neighbors?

    16

    What property describes that the distance between two pixels is always non-negative?

    Non-negativity

    Which distance measure is characterized by the formula D4(p, q) = |x-s| + |y-t|?

    City Block Distance

    What is the main feature of the distance measure D(p, q) = D(q, p)?

    Symmetry

    Study Notes

    Course Information

    • Course Title: Introduction to Computer Vision
    • Course Code: CPS834/CPS8307
    • Instructor: Dr. Omar Falou
    • University: Toronto Metropolitan University
    • Semester: Fall 2024

    Digital Image

    • Digital Image: A two-dimensional function f(x,y), where x and y are spatial coordinates; the amplitude of f at any point (x,y) is called the intensity or gray level of the image at that point.
    • Pixel: The fundamental elements of a digital image.

    Image Sources

    • Electromagnetic (EM) energy spectrum (Gamma rays, X-rays, ultraviolet, visible, infrared, microwaves, radio waves).
    • Acoustic
    • Ultrasonic
    • Electronic
    • Synthetic images produced by a computer

    Electromagnetic (EM) Spectrum Examples

    • Gamma-ray imaging: Used for nuclear medicine, astronomical observations, microscopy, & biological imaging.
    • X-rays: Used for medical diagnostics, industrial applications, & astronomy.
    • Ultraviolet: Used for lithography, industrial and astronomical observations.
    • Visible & infrared bands: Used for light microscopy, astronomy, remote sensing, industry, and law enforcement.
    • Microwave band: Used for radar.
    • Radio band: Used for medicine (MRI) and astronomy.

    Examples of Imaging Techniques

    • Gamma-ray imaging (Bone scan, PET image, Cygnus Loop, Gamma radiation)
    • X-ray imaging (Chest X-ray, Aortic angiogram, Head CT, Circuit boards)
    • Ultraviolet imaging (Normal corn, Smut corn, Cygnus Loop).
    • Light microscopy imaging (Taxol, Cholesterol, Microprocessor, Nickel oxide, Surface of audio CD, Organic superconductor).
    • Visual and Infrared imaging (LANDSAT satellite images of the Washington, D.C. area).

    Automated Visual Inspection

    • Examples of manufactured goods that are often checked using image processing (A circuit board controller, Packaged pills, Bottles, Air bubbles in plastic products, Cereal, Image of intraocular implant).

    Automated Visual Inspection - Results

    • Results of automated reading of the plate content by the system.
    • The area in which the imaging system detected the plate.

    Example of Radar Image

    • Spaceborne radar image of mountains in Southeast Tibet.

    Example of MRI

    • MRI images of a human knee and spine.

    Example of Ultrasound Imaging

    • Examples of ultrasound imaging (Baby, Another view of baby, Thyroids, Muscle layers showing lesion).

    Image Processing → Computer Vision

    • High-level: Object detection, recognition, shape analysis, tracking, and use of artificial intelligence and machine learning.
    • Image Analysis: Segmentation, image registration, and matching.
    • Low-level: Image enhancement, noise removal, restoration, feature detection, and compression.

    Image Processing Problems

    • Image acquisition, Image enhancement, Image restoration, Morphological processing, Segmentation, Representation & description, Object recognition, Image compression.

    Element of Visual Perception

    • Although the digital image processing field is based on mathematical and probabilistic formulations, human intuition and analysis also play a central role. The choice of technique often depends on subjective visual judgments.
    • Developing a basic understanding of human visual perception is essential.

    Structure of the Human Eye

    • Cornea: Tough, transparent tissue covering the anterior surface of the eye.
    • Sclera: Opaque membrane enclosing the remainder of the optic globe.
    • Choroid: Lies beneath the sclera, containing a network of blood vessels, the major source of nutrition to the eye.
    • Lens: Focuses light onto the retina; it absorbs both infrared and ultraviolet light, which in excessive amounts can damage the eye.
    • Retina: The inner membrane lining the posterior portion of the eye's wall, where light from an object outside the eye is imaged.

    Receptors

    • Receptors (neurons): distributed over the surface of the retina.
    • Divided into cones and rods.
    • Cones (6–7 million): Primarily located in the fovea (the central portion of the retina); highly sensitive to color and fine detail.
    • Rods (75–150 million): Distributed over the retinal surface; highly sensitive to dim light.

    Blind Spot

    • Figure 2.2 shows the density of rods and cones in a cross-section of the right eye, passing through the point where the optic nerve emerges.
    • Blind spot: the region where the optic nerve exits the eye; it contains no receptors. Away from this region, receptor density is radially symmetric about the fovea.

    Image Formation in the Eye

    • The eye's lens is flexible, unlike ordinary optical lenses; its focal length varies (roughly 14 mm to 17 mm) as the controlling muscles change its shape.

    Brightness Adaptation and Discrimination

    • Perceived brightness is not solely determined by intensity.
    • Visual system tends to undershoot or overshoot around the boundary of regions with different intensities (Mach bands).
    • Simultaneous contrast: a region's perceived brightness depends on the surrounding regions.

    Optical Illusions

    • Examples of well-known optical illusions.

    Light and EM Spectrum

    • Colors humans perceive are determined by the reflected light wavelengths from an object.
    • Wavelengths for example: green roughly 500 to 570 nm.

    Light and EM Spectrum – continued

    • Monochromatic light: light devoid of color; its only attribute is intensity, varying from black to white and producing grayscale images.
    • Chromatic light bands: the range of wavelengths that create color.
    • Radiance (measured in watts, W): the total energy emitted by a light source.
    • Luminance (measured in lumens, lm): the amount of light energy perceived by an observer.
    • Brightness: a subjective descriptor of light perception.

    Image Acquisition

    • Transforming light energy into digital images.
    • Components: energy, filters, power source, housing, sensing material, output voltage waveform.
    • Images can be acquired using single sensors or sensor strips (linear or circular).

    Simple Image Formation Model

    • Model of image formation: f(x,y) = i(x,y) · r(x,y).
    • i(x,y): Illumination intensity at the point (x,y).
    • r(x,y): Reflectance (or transmissivity) at (x,y), describing how the object reflects or transmits light; reflectance lies between 0 (total absorption) and 1 (total reflectance).

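As a minimal sketch of the multiplicative model f(x,y) = i(x,y) · r(x,y), the following NumPy snippet (our own illustration, not from the course notes) forms an intensity image from an illumination field and a per-pixel reflectance map, using the reflectance values quoted in these notes (0.93 for snow, 0.01 for black velvet):

```python
import numpy as np

# Office-level illumination (about 1000 lm/m^2) over a tiny 2x2 scene.
illumination = np.full((2, 2), 1000.0)

# Per-pixel reflectance in (0, 1); 0.93 ~ snow, 0.01 ~ black velvet.
reflectance = np.array([[0.93, 0.01],
                        [0.50, 0.25]])

# Image formation model: element-wise product of the two components.
f = illumination * reflectance
print(f)
```

The brightest pixel (930) corresponds to the snow-like reflectance, the darkest (10) to the velvet-like one.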

    Some Typical Illumination Ranges

    • Lumen: a unit of luminous flux (light flow).
    • Lumen per square meter (lm/m²): the metric unit used to quantify the illuminance of a surface.
    • Typical illumination levels: daytime (clear and cloudy), evening (moonlight), commercial office (about 1000 lm/m²).

    Some Typical Reflectance Ranges

    • Different materials have different reflectance values, e.g., 0.01 for black velvet and 0.93 for snow.

    Gray Level

    • The gray level is the intensity of a monochrome image.
    • A common gray scale is the interval [Lmin, Lmax], e.g., [0, 255] with 0 representing black and 255 representing white.

    Image Sampling and Quantization

    • Digitizing an image means converting its continuous coordinate values and amplitude values into discrete values.
    • Sampling: dividing the image into discrete samples along the x and y axes; each sample becomes a pixel.
    • Quantization: dividing the range of light-intensity values assigned to each coordinate into discrete levels, typically represented by the number of bits assigned to each pixel.
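A uniform quantizer can be sketched as follows (our own helper, assuming intensities normalized to [0, 1] and 2^k output levels):

```python
import numpy as np

def quantize(values, k):
    """Map continuous values in [0.0, 1.0] to integer levels 0 .. 2^k - 1."""
    levels = 2 ** k
    # Scale, truncate to an integer level, and clamp the top value.
    return np.minimum((np.asarray(values) * levels).astype(int), levels - 1)

samples = np.array([0.0, 0.25, 0.5, 0.999])
print(quantize(samples, 2))   # quantize to 4 discrete levels
```

With k = 2 the four samples fall into the four available levels 0, 1, 2, 3; a larger k gives a finer intensity resolution.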

    Coordinate Convention

    • Defining the origin (0,0) and orientation (x-axis, y-axis) of the coordinate system, typically positioned at top-left corner within a frame.

    Representing Digital Images

    • Representing images as a matrix where the integer values within the matrix correspond to pixel values (intensity).

    Representing Digital Images – Continued

    • Discrete intensity interval: [0, L−1] with L = 2^k (where k is the bit depth).
    • The number of bits, b, required to store an image of size M×N is b = M × N × k.
    • An image with 256 possible gray levels (k = 8) is referred to as an 8-bit image.
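The storage formula b = M × N × k is direct to compute; a small sketch (helper name is ours):

```python
def storage_bits(M, N, k):
    """Bits needed to store an M x N image with k bits per pixel: b = M*N*k."""
    return M * N * k

# Example: a 1024 x 1024 8-bit image.
bits = storage_bits(1024, 1024, 8)
print(bits, "bits =", bits // 8, "bytes")
```

An 8-bit 1024×1024 image needs 8,388,608 bits, i.e., exactly 1 MiB.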

    Representing Digital Images – Table

    • Table showing number of bits necessary for different image sizes and bit depths.

    Coloured Image

    • Represented by RGB channels.
    • RGB: Red, Green, Blue

    Spatial and Intensity Resolution

    • Spatial resolution: the smallest discernible detail in an image (e.g., dots per inch).
    • Intensity resolution: the smallest discernible change in image intensity (e.g., number of bits per pixel).

    Image Interpolation

    • Process of estimating unknown image values using known values.
    • Methods: Nearest Neighbor (uses the single closest pixel), Bilinear (uses the four nearest neighbors), Bicubic (uses the sixteen nearest neighbors).

    Distance Measures

    • Pixel distance metrics define ways to measure the proximity between pixels p = (x, y) and q = (s, t): Euclidean distance De(p, q) = [(x−s)² + (y−t)²]¹/², City Block distance D4(p, q) = |x−s| + |y−t|, and Chessboard distance D8(p, q) = max(|x−s|, |y−t|).
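These three measures are straightforward to implement; a brief sketch (function names are ours):

```python
def euclidean(p, q):
    """Straight-line (De) distance between pixels p = (x, y) and q = (s, t)."""
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def city_block(p, q):
    """D4 distance: sum of horizontal and vertical steps."""
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def chessboard(p, q):
    """D8 distance: number of king moves on a chessboard."""
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))

p, q = (0, 0), (3, 4)
print(euclidean(p, q), city_block(p, q), chessboard(p, q))
```

For the pair (0, 0) and (3, 4) the three measures give 5.0, 7, and 4, showing how the choice of metric changes which pixels count as "close".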

    Mathematical Operations in Digital Image Processing

    • Array operations are distinct from matrix operations.
    • Matrix product is a mathematical operation involving matrices.
    • Array product involves the element-wise multiplication of corresponding elements in image arrays.
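The distinction is easy to see on two small arrays (an illustrative NumPy sketch):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

elementwise = A * B     # array product: multiply corresponding elements
matrix = A @ B          # matrix product: rows of A times columns of B

print(elementwise)
print(matrix)
```

Image-processing operations such as masking use the array (element-wise) product; the matrix product is a different operation entirely, even though both act on the same pair of arrays.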

    Linear vs. Nonlinear Operations

    • An operator H is linear if it satisfies both of the following:
    • Additivity: H(a1f1(x,y) + a2f2(x,y)) = a1H(f1(x,y)) + a2H(f2(x,y)).
    • Homogeneity: H(af(x,y)) = aH(f(x,y)).
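A quick numerical check (our own example, in the spirit of the definitions above) shows that the sum operator is linear while the max operator is not:

```python
import numpy as np

f1 = np.array([0, 2, 2])
f2 = np.array([0, 2, 3])
a1, a2 = 1, -1

def H_sum(f):
    return np.sum(f)

def H_max(f):
    return np.max(f)

# Additivity test: compare H(a1*f1 + a2*f2) with a1*H(f1) + a2*H(f2).
lhs_sum = H_sum(a1 * f1 + a2 * f2)
rhs_sum = a1 * H_sum(f1) + a2 * H_sum(f2)
lhs_max = H_max(a1 * f1 + a2 * f2)
rhs_max = a1 * H_max(f1) + a2 * H_max(f2)

print(lhs_sum == rhs_sum)   # True: sum passes the test
print(lhs_max == rhs_max)   # False: max fails it
```

Here max(f1 − f2) = 0 while max(f1) − max(f2) = −1, so a single counterexample is enough to prove max nonlinear.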

    Arithmetic Operations

    • Basic operations are used for manipulating pixel values, using addition, subtraction, multiplication, and division.

    Example: Addition of Noisy Images for Noise Reduction

    • Method of reducing noise in images, such as those from astronomical observations, by averaging multiple noisy images.
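A minimal synthetic demonstration (our own data and variable names): averaging K independent noisy copies reduces the noise standard deviation by a factor of about √K.

```python
import numpy as np

rng = np.random.default_rng(0)

# A flat "clean" image plus K independent Gaussian-noise realizations.
clean = np.full((64, 64), 100.0)
K = 100
noisy_stack = clean + rng.normal(0.0, 10.0, size=(K, 64, 64))

# Averaging across the stack suppresses the zero-mean noise.
averaged = noisy_stack.mean(axis=0)

print(np.std(noisy_stack[0] - clean))   # roughly 10
print(np.std(averaged - clean))         # roughly 10 / sqrt(100) = 1
```

The residual noise in the averaged image is about one tenth of the single-frame noise, which is why this technique is common in astronomical imaging.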

    Example: Image Subtraction: Mask Mode Radiography

    • Method of enhancing contrast in images, such as X-ray images, by subtracting a mask image from a live image.

    Example: Image Multiplication

    • Images are multiplied together to change or enhance characteristics within the images, such as modifying highlighting or shading within an image.

    Set and Logical Operations

    • Set-theory operations are used for pixel-based image manipulation.
    • Combined with logical operations, they create new images by comparing pixel values across different image sets.

    Set and Logical Operations – Continued

    • Image values can be represented using set theory, and the resulting sets can be used to perform a variety of image operations on a per-pixel basis.
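The two gray-scale set operations defined in the quiz above translate directly to array operations, sketched here with NumPy (K is the maximum intensity, 255 for an 8-bit image):

```python
import numpy as np

K = 255
A = np.array([[0, 100],
              [200, 255]])
B = np.array([[50, 90],
              [210, 10]])

# Complement: Ac = {(x, y, K - z) | (x, y, z) in A}
complement = K - A

# Union: A ∪ B = {max(a, b)}, the element-wise maximum of the two images.
union = np.maximum(A, B)

print(complement)
print(union)
```

The complement turns black (0) into white (255) and vice versa; the union keeps the brighter of the two pixels at every location.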

    Spatial Operations

    • Two families: single-pixel operations and neighborhood operations. The former change a pixel's value based only on its own intensity; the latter use the values of a specified set of neighboring pixels.
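The contrast between the two families can be sketched on a 3×3 array (our own illustrative example; negation and the 3×3 mean stand in for the many possible operations):

```python
import numpy as np

img = np.array([[0.0, 0.0, 0.0],
                [0.0, 9.0, 0.0],
                [0.0, 0.0, 0.0]])

# Single-pixel operation: each output value depends only on that pixel.
single_pixel = 9.0 - img    # simple intensity negation

def neighborhood_mean(img, y, x):
    """Neighborhood operation: mean of the 3x3 block centered at (y, x)."""
    return img[y - 1:y + 2, x - 1:x + 2].mean()

print(single_pixel[1, 1])           # depends only on img[1, 1]
print(neighborhood_mean(img, 1, 1)) # depends on all nine surrounding values
```

The negation maps the center 9.0 to 0.0 using nothing but that pixel, while the neighborhood mean blends the bright center with its eight dark neighbors, giving 1.0.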

    Geometric Spatial Transformations

    • Transforming coordinate systems from one image to another, such as enlarging, shrinking, or rotating images.

    Geometric Spatial Transformations – Continued

    • Affine transform: A commonly used spatial transformation that encompasses scaling, rotation, translation, and shearing of images.
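An affine transform is conveniently expressed as a 3×3 matrix acting on homogeneous coordinates; the sketch below (a standard convention, not taken verbatim from the notes) rotates a point by 90° and then translates it:

```python
import numpy as np

theta = np.pi / 2   # 90-degree rotation

# Affine matrix in homogeneous coordinates: rotation plus a translation
# of (2, 0). The last row [0, 0, 1] makes the matrix invertible and
# lets transforms be chained by matrix multiplication.
T = np.array([[np.cos(theta), -np.sin(theta), 2.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

point = np.array([1.0, 0.0, 1.0])   # (x, y) = (1, 0) in homogeneous form
x, y, _ = T @ point
print(x, y)
```

The point (1, 0) rotates to (0, 1) and then shifts by (2, 0), landing at (2, 1); composing several such matrices yields a single combined affine transform.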

    Intensity Assignment (Forward and Inverse Mapping)

    • Methods of assigning pixel values in output images based on their corresponding coordinates in input images to maintain spatial integrity during transformations such as rotation. 

    Example: Image Rotation and Intensity Interpolation

    • Different methods of assigning intensities to new locations in rotated images based upon nearest neighbor, bilinear, and bicubic interpolation.

    Image Registration

    • Methods for aligning images, i.e., determining how one image is positioned relative to another, usually based on known correspondences between landmarks in the images.

    Image Registration – continued

    • Tie points (control points) are used to align (register) images. A bilinear approximation model is a mathematical equation that maps one image onto another by finding a transformation function modeling the relationship between them.

    Image Transform

    • Transforming images between domains for different image-processing applications. Forward and inverse transformation kernels map an image into a transform domain and back. The transform domain is usually different from the spatial domain and may facilitate operations that are difficult or impossible to carry out directly on the image.


    Description

    Test your knowledge on image processing concepts and human visual perception. This quiz covers topics such as focal length, brightness perception, color detection, and image resolution. Perfect for students of optics and digital imaging.
