Questions and Answers
What is the approximate range of the focal length of the lens in humans?
- 10 mm to 13 mm
- 14 mm to 17 mm (correct)
- 18 mm to 20 mm
- 12 mm to 15 mm
What phenomenon causes the perceived brightness to appear scalloped at boundaries of different intensities?
- Intensity modulation
- Mach bands (correct)
- Simultaneous contrast
- Brightness adaptation
In the phenomenon of simultaneous contrast, how does the perceived brightness of a center square change?
- It becomes darker as the background gets lighter. (correct)
- It remains unchanged regardless of the background.
- It becomes the same color as the background.
- It becomes brighter as the background gets darker.
Which constant is utilized in the energy formula for light, denoted as E = hν?
What determines the colors perceived in an object by humans?
What is the formula to calculate the number of bits required to store a digitized image?
Which of the following correctly defines intensity resolution?
How is a 'k-bit image' defined?
What does spatial resolution measure in digital images?
What is the primary purpose of interpolation in image processing?
What unit is used to measure illumination?
How much illumination does the sun produce on a clear day?
What is the gray level of an image typically represented as?
What is the reflectance value for snow?
What is the typical illumination level found in a commercial office?
What is the reflectance value for black velvet?
What does the digitizing process involve in image sampling?
In the context of image representation, what does 'M × N' represent?
What is the primary function of the cornea in the human eye?
How do cones differ from rods in terms of vision?
What role does the choroid play in the structure of the human eye?
What occurs at the blind spot in the human eye?
What is the primary characteristic of rod vision?
Which of the following best distinguishes the lens of the eye from ordinary optical lenses?
Why can humans resolve fine details with their cone vision?
What happens when light from an object is properly focused in the eye?
What is the complement of the set A in a gray-scale image defined as Ac?
Which operation is used to find the union of two gray-scale images A and B?
In single-pixel operations, how is the new intensity value determined?
What is the purpose of intensity interpolation in geometric transformations?
Which of these best describes neighborhood operations in image processing?
Which transformation is referred to as rubber-sheet transformation?
What is the primary element represented by z in the triplet (x, y, z) of a gray-scale image?
How is the maximum value in the union of two sets A and B determined?
What is the main purpose of using interpolation in digital cameras?
Which interpolation method uses only the nearest neighbor to determine the intensity of a pixel?
In bilinear interpolation, how many nearest neighbors are used to determine the intensity value of a point?
What is the equation used to calculate the Euclidean distance between two pixels p and q?
In bicubic interpolation, how many coefficients are determined based on the nearest neighbors?
What property describes that the distance between two pixels is always non-negative?
Which distance measure is characterized by the formula D4(p, q) = |x-s| + |y-t|?
What is the main feature of the distance measure D(p, q) = D(q, p)?
Flashcards
Focal Length Variation
The distance between the lens center and the retina is about 17 mm. As the lens's refractive power increases from its minimum to its maximum, the focal length decreases from about 17 mm to about 14 mm.
Mach Bands
Seen as brightness patterns that appear scalloped, even though the intensities are constant. The effect is strongest near the boundaries of intensities.
Simultaneous Contrast
A visual phenomenon where a region’s perceived brightness is not based solely on its intensity. A region's appearance can become brighter or darker based on the surrounding intensity.
Light Perception and Object Colors
Electromagnetic Spectrum (Light and EM)
Cornea
Sclera
Choroid
Retina
Cones
Rods
Blind Spot
Image Formation (Eye)
Digital Image Intensity
Image Bits
Spatial Resolution
Intensity Resolution
Image Interpolation
Illumination (Lumen/m^2)
Reflectance
Gray Level
Image Sampling
Image Quantization
Digital Image Representation
Lumen
Illuminance
Image Complement (Ac)
Image Union (A ∪ B)
Single-Pixel Operations
Neighborhood Operations
Geometric Transformations
Spatial Transformation
Intensity Interpolation
Spatial Operations
Nearest Neighbor Interpolation
Bilinear Interpolation
Bicubic Interpolation
Distance Measures
Euclidean Distance
City Block Distance
Interpolation Types
Image Interpolation in Cameras
Study Notes
Course Information
- Course Title: Introduction to Computer Vision
- Course Code: CPS834/CPS8307
- Instructor: Dr. Omar Falou
- University: Toronto Metropolitan University
- Semester: Fall 2024
Digital Image
- Digital Image: A two-dimensional function f(x,y), where x and y are spatial coordinates; the amplitude of f at the point (x,y) is called the intensity or gray level.
- Pixel: The fundamental element of a digital image.
Image Sources
- Electromagnetic (EM) energy spectrum (Gamma rays, X-rays, ultraviolet, visible, infrared, microwaves, radio waves).
- Acoustic
- Ultrasonic
- Electronic
- Synthetic images produced by a computer
Electromagnetic (EM) Spectrum Examples
- Gamma-ray imaging: Used for nuclear medicine, astronomical observations, microscopy, & biological imaging.
- X-rays: Used for medical diagnostics, industrial applications, & astronomy.
- Ultraviolet: Used for lithography, industrial and astronomical observations.
- Visible & infrared bands: Used for light microscopy, astronomy, remote sensing, industry, and law enforcement.
- Microwave band: Used for radar.
- Radio band: Used for medicine (MRI) and astronomy.
Examples of Imaging Techniques
- Gamma-ray imaging (Bone scan, PET image, Cygnus Loop, Gamma radiation)
- X-ray imaging (Chest X-ray, Aortic angiogram, Head CT, Circuit boards)
- Ultraviolet imaging (Normal corn, Smut corn, Cygnus Loop).
- Light microscopy imaging (Taxol, Cholesterol, Microprocessor, Nickel oxide, Surface of audio CD, Organic superconductor).
- Visual and Infrared imaging (LANDSAT satellite images of the Washington, D.C. area).
Automated Visual Inspection
- Examples of manufactured goods that are often checked using image processing (A circuit board controller, Packaged pills, Bottles, Air bubbles in plastic products, Cereal, Image of intraocular implant).
Automated Visual Inspection - Results
- Results of automated reading of the plate content by the system.
- The area in which the imaging system detected the plate.
Example of Radar Image
- Spaceborne radar image of mountains in Southeast Tibet.
Example of MRI
- MRI images of a human knee and spine.
Example of Ultrasound Imaging
- Examples of ultrasound imaging (Baby, Another view of baby, Thyroids, Muscle layers showing lesion).
Image Processing → Computer Vision
- High-level: Object detection, recognition, shape analysis, tracking, and use of artificial intelligence and machine learning.
- Image Analysis: Segmentation, image registration, and matching.
- Low-level: Image enhancement, noise removal, restoration, feature detection, and compression.
Image Processing Problems
- Image acquisition, Image enhancement, Image restoration, Morphological processing, Segmentation, Representation & description, Object recognition, Image compression.
Element of Visual Perception
- Although the digital image processing field is based on mathematical and probabilistic formulations, human intuition and analysis also play a central role. The choice of technique often depends on subjective visual judgments.
- Developing a basic understanding of human visual perception is essential.
Structure of the Human Eye
- Cornea: Tough, transparent tissue covering the anterior surface of the eye.
- Sclera: Opaque membrane enclosing the remainder of the optic globe.
- Choroid: Lies beneath the sclera, containing a network of blood vessels, the major source of nutrition to the eye.
- Lens: Absorbs infrared and ultraviolet light; excessive amounts of either can damage the eye.
- Retina: The inner membrane lining the posterior portion of the eye's wall, where light from an object outside the eye is imaged.
Receptors
- Receptors (neurons): distributed over the surface of the retina.
- Divided into cones and rods.
- Cones (6-7 million): Concentrated in the fovea (the central portion of the retina); highly sensitive to color and fine detail.
- Rods (75-150 million): Distributed over the retinal surface; highly sensitive to dim light.
Blind Spot
- Figure 2.2 shows the density of rods and cones in a cross-section of the right eye, passing through the optic nerve's emergence point.
- Blind spot: the region where the optic nerve exits the eye and receptors are absent; elsewhere, receptor density is radially symmetric about the fovea.
Image Formation in the Eye
- The eye's lens is flexible, unlike ordinary optical lenses; its focal length varies.
Brightness Adaptation and Discrimination
- Perceived brightness is not solely determined by intensity.
- Visual system tends to undershoot or overshoot around the boundary of regions with different intensities (Mach bands).
- Simultaneous contrast: a region's perceived brightness depends on the surrounding regions.
Optical Illusions
- Examples of well-known optical illusions.
Light and EM Spectrum
- Colors humans perceive are determined by the reflected light wavelengths from an object.
- For example, green light spans wavelengths of roughly 500 to 570 nm.
Light and EM Spectrum – continued
- Monochromatic light: devoid of color. Intensity varies from black to white, resulting in grayscale images.
- Chromatic light bands: the range of wavelengths that create color.
- Radiance (W, watts): the total energy emitted by a light source.
- Luminance (lm, lumens): the amount of energy an observer perceives from a light source.
- Brightness: a subjective descriptor of light perception.
Image Acquisition
- Transforming light energy into digital images.
- Components: energy, filters, power source, housing, sensing material, output voltage waveform.
- Images can be acquired using single sensors or sensor strips (linear or circular).
Simple Image Formation Model
- Model of image formation: f(x,y) = i(x,y) * r(x,y).
- i(x,y): Illumination intensity at a point (x,y).
- r(x,y): Reflectance or transmissivity at (x,y) describing the object's light interaction.
Components of Image Formation Models
- f(x,y) : Intensity image at coordinates (x,y)
- i(x,y) : Illumination component at (x,y).
- r(x,y) : Reflectance component at (x,y), representing the object's light reflection.
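The product model above can be sketched in a few lines of Python. The illumination and reflectance values below are illustrative, drawn from the typical ranges discussed in these notes (sunlight on the order of 90,000 lm/m², snow reflecting about 0.93, black velvet about 0.01):

```python
# Sketch of the simple image formation model f(x, y) = i(x, y) * r(x, y).
# The numeric values are illustrative, not measurements.

def form_image(illumination, reflectance):
    """Multiply illumination i(x,y) by reflectance r(x,y) element-wise."""
    return [[i * r for i, r in zip(irow, rrow)]
            for irow, rrow in zip(illumination, reflectance)]

# 2x2 example: uniform sunlight-level illumination (lm/m^2) over two materials.
i_xy = [[90000.0, 90000.0],
        [90000.0, 90000.0]]
r_xy = [[0.93, 0.93],   # snow reflects roughly 0.93
        [0.01, 0.01]]   # black velvet reflects roughly 0.01
f_xy = form_image(i_xy, r_xy)
print(f_xy[0][0], f_xy[1][0])
```

The snowy region ends up with a far higher intensity f(x,y) than the velvet region under identical illumination, which is the point of separating i(x,y) from r(x,y).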
Some Typical Illumination Ranges
- Lumen: a unit of light flow or luminous flux
- Lumen per square meter (lm/m²): The metric unit used to quantify illuminance of a surface.
- Typical illumination levels: daytime (clear & cloudy), evening (moonlight), commercial office.
Some Typical Reflectance Ranges
- Different materials have different reflectance levels.
Gray Level
- Intensity of a monochrome image.
- Common gray scale is the interval [Lmin, Lmax], often [0, 255] with 0 representing black and 255 white.
Image Sampling and Quantization
- Digitizing image coordinates and amplitude values: converting continuous into discrete values.
- Sampling: dividing the image into discrete samples along the x and y axes; each sample is a pixel.
- Quantization: dividing the continuous range of intensity values at each coordinate into discrete levels; the number of levels is typically set by the number of bits assigned to each pixel.
Coordinate Convention
- Defining the origin (0,0) and orientation (x-axis, y-axis) of the coordinate system, typically positioned at top-left corner within a frame.
Representing Digital Images
- Representing images as a matrix where the integer values within the matrix correspond to pixel values (intensity).
Representing Digital Images – Continued
- Discrete intensity interval: [0, L-1] with L = 2^k (where k is the bit depth).
- The number of bits required to store an M × N image with k bits per pixel is b = M × N × k.
- An image with 256 possible gray levels (k = 8) is referred to as an 8-bit image.
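As a minimal sketch, the two formulas L = 2^k and b = M × N × k translate directly into Python:

```python
def gray_levels(k):
    """Number of intensity levels L = 2**k for a k-bit image."""
    return 2 ** k

def image_bits(M, N, k):
    """Bits b = M * N * k needed to store an M x N image at k bits/pixel."""
    return M * N * k

print(gray_levels(8))             # 256 levels -> an "8-bit image"
print(image_bits(1024, 1024, 8))  # 8388608 bits for a 1024 x 1024 image
```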
Representing Digital Images – Table
- Table showing number of bits necessary for different image sizes and bit depths.
Coloured Image
- Represented by RGB channels.
- RGB: Red, Green, Blue
Spatial and Intensity Resolution
- Spatial resolution: smallest discernible detail in an image (e.g., dots per inch).
- Intensity resolution: the smallest discernible change in an image intensity (e.g., bits).
Image Interpolation
- Process of estimating unknown image values using known values.
- Methods: Nearest Neighbor, Bilinear, Bicubic.
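The first two methods can be sketched on a tiny 2×2 image as below (bicubic, which fits 16 coefficients to the 4×4 neighborhood, is omitted for brevity; the image values are made up for illustration):

```python
def nearest_neighbor(img, x, y):
    """Nearest-neighbor interpolation: take the closest pixel's value."""
    return img[round(y)][round(x)]

def bilinear(img, x, y):
    """Bilinear interpolation: weighted average of the 4 nearest neighbors."""
    x0, y0 = int(x), int(y)
    dx, dy = x - x0, y - y0
    return ((1 - dx) * (1 - dy) * img[y0][x0]
            + dx * (1 - dy) * img[y0][x0 + 1]
            + (1 - dx) * dy * img[y0 + 1][x0]
            + dx * dy * img[y0 + 1][x0 + 1])

img = [[0, 100],
       [100, 200]]
print(nearest_neighbor(img, 0.4, 0.4))  # 0 (snaps to pixel (0, 0))
print(bilinear(img, 0.5, 0.5))          # 100.0 (mean of all four neighbors)
```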
Distance Measures
- Pixel distance metrics, such as Euclidean, City Block, and Chessboard distances, define ways to measure the proximity between pixels in a digital image.
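The three metrics, for pixels p = (x, y) and q = (s, t), can be sketched as:

```python
import math

def euclidean(p, q):
    """D_e(p, q) = sqrt((x - s)^2 + (y - t)^2)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def city_block(p, q):
    """D_4(p, q) = |x - s| + |y - t|."""
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def chessboard(p, q):
    """D_8(p, q) = max(|x - s|, |y - t|)."""
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))

p, q = (0, 0), (3, 4)
print(euclidean(p, q))   # 5.0
print(city_block(p, q))  # 7
print(chessboard(p, q))  # 4
```

Note that all three satisfy the metric properties quizzed above: non-negativity, symmetry D(p, q) = D(q, p), and the triangle inequality.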
Mathematical Operations in Digital Image Processing
- Array operations are distinct from matrix operations.
- Matrix product is a mathematical operation involving matrices.
- Array product involves the element-wise multiplication of corresponding elements in image arrays.
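The distinction can be sketched with plain nested lists (the 2×2 values are arbitrary):

```python
A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]

# Array product: multiply corresponding elements.
array_prod = [[A[i][j] * B[i][j] for j in range(2)] for i in range(2)]

# Matrix product: row-by-column sums.
mat_prod = [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

print(array_prod)  # [[5, 12], [21, 32]]
print(mat_prod)    # [[19, 22], [43, 50]]
```

Image-processing texts generally mean the element-wise (array) product when two images are "multiplied", as in the masking example later in these notes.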
Linear vs. Nonlinear Operations
- Additivity: H(a1f1(x,y)+a2f2(x,y)) = a1H(f1(x,y)) + a2H(f2(x,y)).
- Homogeneity: H(af(x,y)) = aH(f(x,y))
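A toy check of these two properties, using the pixel sum (a linear operator) versus the pixel maximum (a nonlinear one); the tiny "images" are just lists of sample values:

```python
def H_sum(f):
    """Sum of all pixel values: a linear operator."""
    return sum(f)

def H_max(f):
    """Maximum pixel value: a nonlinear operator."""
    return max(f)

f1, f2, a1, a2 = [0, 2, 4], [5, 2, 7], 1, -1
combined = [a1 * u + a2 * v for u, v in zip(f1, f2)]

# The sum satisfies H(a1 f1 + a2 f2) = a1 H(f1) + a2 H(f2):
print(H_sum(combined) == a1 * H_sum(f1) + a2 * H_sum(f2))  # True
# The max does not, so it is nonlinear:
print(H_max(combined) == a1 * H_max(f1) + a2 * H_max(f2))  # False
```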
Arithmetic Operations
- Basic operations are used for manipulating pixel values, using addition, subtraction, multiplication, and division.
Example: Addition of Noisy Images for Noise Reduction
- Method of reducing noise in images, such as those from astronomical observations, by averaging multiple noisy images.
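The idea can be sketched on a single pixel: averaging K noisy observations g_i = f + n_i of the same scene shrinks the noise variance by a factor of K. The pixel value, noise level, and K below are invented for illustration:

```python
import random

random.seed(0)          # fixed seed so the sketch is reproducible

true_value = 100.0      # noise-free pixel intensity (illustrative)
K = 500                 # number of noisy images to average

# K noisy observations with zero-mean Gaussian noise of std 10.
noisy = [true_value + random.gauss(0, 10) for _ in range(K)]
averaged = sum(noisy) / K

print(abs(noisy[0] - true_value))   # error of a single noisy image
print(abs(averaged - true_value))   # far smaller after averaging
```

The averaged error is on the order of 10 / sqrt(500), i.e. well under one intensity level, which is why stacking frames is standard practice in astronomical imaging.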
Example: Image Subtraction: Mask Mode Radiography
- Method of enhancing contrast in images, such as X-ray images, by subtracting a mask image from a live image.
Example: Image Multiplication
- Images are multiplied together to change or enhance characteristics within the images, such as modifying highlighting or shading within an image.
Set and Logical Operations
- Set theory operations are used for pixel-based image manipulation.
- Set operations allow using logical operations to create new images by comparing pixel values from different image sets.
Set and Logical Operations – Continued
- The image values can be represented using set theory and the resulting sets can be used to perform a variety of image operations on a pixel-basis.
Spatial Operations
- Single-pixel operations and neighborhood operations. These affect pixel values based on either the intensity of a single pixel value, or pixel values from neighboring pixels.
Geometric Spatial Transformations
- Transforming coordinate systems from one image to another, such as enlarging, shrinking, or rotating images.
Geometric Spatial Transformations – Continued
- Affine transform: A commonly used spatial transformation that encompasses scaling, rotation, and translation of images.
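An affine transform of a point can be sketched as a 2×3 matrix applied to (x, y); the rotation-plus-translation below is an illustrative choice, not from the slides:

```python
import math

def affine_apply(matrix, x, y):
    """Apply a 2x3 affine matrix [[a, b, tx], [c, d, ty]] to point (x, y)."""
    (a, b, tx), (c, d, ty) = matrix
    return a * x + b * y + tx, c * x + d * y + ty

# Rotate 90 degrees about the origin, then translate by (5, 0).
t = math.radians(90)
M = [[math.cos(t), -math.sin(t), 5.0],
     [math.sin(t),  math.cos(t), 0.0]]

x, y = affine_apply(M, 1.0, 0.0)
print(round(x, 6), round(y, 6))  # 5.0 1.0
```

Scaling and translation are special cases of the same matrix form, which is why the affine transform covers all three operations named above.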
Intensity Assignment (Forward and Inverse Mapping)
- Methods of assigning pixel values in output images based on their corresponding coordinates in input images to maintain spatial integrity during transformations such as rotation.
Example: Image Rotation and Intensity Interpolation
- Different methods of assigning intensities to new locations in rotated images based upon nearest neighbor, bilinear, and bicubic interpolation.
Image Registration
- Methods for aligning images; determining how one image is positioned relative to another, usually based on known correspondences (landmarks) between the images.
Image Registration – continued
- Tie points (control points) are used to align (register) images. A bilinear approximation model is a mathematical equation that maps one image onto another by finding a transformation function modeling the relationship between them.
Image Transform
- Transforming images between domains for different processing applications. Forward and inverse transformation kernels map an image into a transform domain and back. The transform domain often differs from the spatial domain and may facilitate operations that are difficult or impossible to carry out directly on the spatial image.
Description
Test your knowledge on image processing concepts and human visual perception. This quiz covers topics such as focal length, brightness perception, color detection, and image resolution. Perfect for students of optics and digital imaging.