Image Processing and Analysis

Questions and Answers

Which of the following best describes the primary goal of image processing and analysis in the context of remotely sensed data?

  • To enhance the aesthetic appeal of satellite imagery for public consumption.
  • To identify objects and assess their importance through logical analysis of remotely sensed data. (correct)
  • To archive historical satellite images for future reference without analytical interpretation.
  • To precisely measure the distance between physical features on the ground.

A digital number (DN) is associated with each pixel in a digital image, representing what aspect of the area within a scene?

  • The average elevation above sea level.
  • The total amount of electromagnetic radiation emitted.
  • The precise geographic coordinates.
  • The average radiance of a relatively small area. (correct)

What is the primary effect of reducing pixel size in digital representation of a scene?

  • It preserves more scene detail in digital representation. (correct)
  • It diminishes the precision of geographic coordinates.
  • It reduces the radiometric resolution of the image.
  • It decreases the amount of data required for storage.

What is CEOS's (Committee on Earth Observation Satellites) role in data formats for digital satellite imagery?

  • CEOS's format is becoming an accepted standard, although a worldwide standard is yet to be formally agreed upon. (correct)

What is the main advantage of using raster data structures in digital image analysis?

  • It is easy to find and manipulate individual pixel values. (correct)

Which of the following is the MOST accurate definition of 'spatial resolution' in the context of remote sensing?

  • The ability to distinguish between closely spaced point targets. (correct)

Instantaneous Field of View (IFOV) is most commonly defined as which of the following?

  • The angle subtended by a single detector element. (correct)

What primarily determines the temporal resolution of a satellite imaging system?

  • The orbital characteristics and swath width. (correct)

Which of the following is an example of an analog image processing technique?

  • Using optical photogrammetric techniques. (correct)

What is the purpose of 'pre-processing' remotely sensed images?

  • To correct for geometric and radiometric distortions. (correct)

What is the PRIMARY aim of feature extraction in the pre-processing of remotely sensed data?

  • To reduce the data dimensionality by isolating the most useful components. (correct)

Which of the following situations would necessitate radiometric corrections?

  • When the sensor records errors in the measured brightness values of pixels. (correct)

What is the typical method for correcting 'line dropouts' in remotely sensed imagery?

  • Replacing the defective line with a duplicate or average of adjacent lines. (correct)

What is the primary goal of 'de-striping' a remotely sensed image?

  • To remove systematic horizontal banding patterns caused by detector misadjustments. (correct)

Which type of image distortion do 'Geometric Corrections' address?

  • Distortions that arise from earth curvature and sensor motion. (correct)

What is the fundamental principle behind 'Resampling Methods' used in geometric correction?

  • To estimate pixel values at new grid locations to match geometric coordinates. (correct)

Which resampling method preserves the original values in the altered scene but may create noticeable errors, especially in linear features?

  • Nearest Neighbor (correct)

What is 'Scan Skew' in the context of geometric distortions in remotely sensed images?

  • The geometric distortion caused by forward motion of the spacecraft. (correct)

What is the primary effect of atmospheric scattering on remotely sensed data?

  • It leads to radiometric distortion in image data. (correct)

Why is atmospheric correction necessary when deriving ratios of different spectral bands in multispectral imagery?

  • To mitigate the effects of atmospheric scattering, which varies by wavelength. (correct)

What is the main goal of 'Image Enhancement Techniques' in remote sensing?

  • To make satellite imagery more informative and assist image interpretation. (correct)

In spectral enhancement techniques, what does 'Density Slicing' involve?

  • Mapping a range of contiguous grey levels to a point in the RGB color cube. (correct)

Which of the following best describes the underlying principle of 'histogram equalisation'?

  • Ensuring that each level in the displayed image contains roughly the same number of pixels. (correct)

In the context of multi-spectral enhancement techniques, what is 'Band Ratioing' primarily used for?

  • Highlighting spectral variations in different target materials. (correct)

What is the main purpose of 'Principal Component Analysis' (PCA) in image processing?

  • To remove redundancy in multispectral data by transforming correlated bands into uncorrelated components. (correct)

Flashcards

Image Processing and Analysis

Examining images to identify objects and judge their significance, including detecting, identifying, and classifying features.

Digital Image

An array of numbers depicting spatial distribution of field parameters, such as EM radiation reflectivity.

Pixel

A discrete picture element in a digital image, associated with a DN representing average radiance.

Digital Number (DN)

A number representing the average radiance of a relatively small area within a scene.

Image Resolution

The ability of an imaging system to record fine details in a distinguishable manner.

Spectral Resolution

Refers to the width of the spectral bands; helps distinguish materials based on reflectance and emissivity.

Radiometric Resolution

Refers to the number of digital levels used to express the data collected by a sensor.

Spatial Resolution

The geometric properties of the imaging system that define the ability to distinguish between point targets.

Instantaneous Field of View (IFOV)

The angle subtended by the geometrical projection of a single detector element to the Earth's surface.

Temporal Resolution

Refers to the frequency with which images of a given geographic location can be acquired.

Analog Processing Techniques

Applying visual techniques to hard copy data to interpret and analyze images.

Digital Image Processing

A collection of techniques for the manipulation of digital images by computers.

Pre-processing

Operations that prepare data for subsequent analysis by correcting for systematic errors.

Feature Extraction

The process of isolating the most useful components of the data for further study.

Image Enhancement

Operations carried out to improve the interpretability of an image by increasing apparent contrast.

Information Extraction

The last step toward the final output of image analysis, involving quantitative analysis to assign pixels to specific classes.

Image Restoration

Removal of unwanted elements from image data due to limitations in sensing or transmission.

Radiometric Errors

Errors in the measured brightness values of pixels, often due to instruments or atmospheric effects.

Line-Dropouts removal

Cosmetic operations for removing defects like line dropouts from images.

De-striping

Removing the horizontal banding that occurs when one or more detectors go out of adjustment.

Geometric Distortions

Serious geometrical distortions in raw digital images from earth curvature, platform motion, etc.

Rectification

The process of projecting image data onto a plane to conform to a map projection system.

Registration

Process of making image data conform to another image. A map coordinate system is not necessarily involved.

Non-Systematic Distortions

Distortions caused by variations in spacecraft variables.

Atmospheric Effects

Atmospheric scattering and absorption processes degrade image data.

Study Notes

Image Processing and Analysis: An Overview

  • Image Processing and Analysis Definition: Examining images to identify objects and judge their significance.
  • Image analysts use logical processes to detect, identify, classify, measure, and evaluate physical and cultural objects, patterns, and spatial relationships from remotely sensed data.

Digital Data Explained

  • Digital images are arrays of numbers representing the spatial distribution of field parameters like EM radiation reflectivity, emissivity, temperature, or geophysical/topographical elevation.
  • Pixels: Discrete picture elements.
  • DN (Digital Number): A number associated with each pixel that depicts the average radiance of a small area in a scene, typically ranging from 0 to 255.
  • A smaller pixel size preserves more scene detail in digital representation.
  • Remote sensing images are recorded and processed digitally for interpretation, available in photographic film and digital forms.
  • Variations in brightness on photographic film represent scene characteristics: brighter parts of a scene reflect more energy, while darker parts reflect less.
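The idea of a digital image as an array of DNs can be sketched in a few lines (a toy example with invented values, not from the source):

```python
import numpy as np

# A toy 4x4 single-band "digital image": each pixel's DN (0-255)
# is the average radiance of a small ground area.
image = np.array([
    [ 12,  40,  45,  20],
    [ 38, 210, 220,  44],
    [ 35, 205, 215,  41],
    [ 15,  42,  39,  18],
], dtype=np.uint8)

print(image.shape)                          # rows x columns of pixels
print(int(image.min()), int(image.max()))   # DN range in this scene
```

Halving the pixel size would quadruple the number of array elements, which is why finer spatial detail costs more storage.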

Data Formats for Digital Satellite Imagery

  • Digital data is supplied via computer-readable tapes or CD-ROMs.
  • CEOS (Committee on Earth Observation Satellites) format is becoming the accepted standard, despite the lack of a worldwide standard for storage and transfer.
  • Common formats include Band Interleaved by Pixel (BIP), Band Interleaved by Line (BIL), and Band Sequential (BSQ).

Image Resolution: Key Aspects

  • Resolution Definition: The ability of an imaging system to record fine details distinctly.
  • Knowledge of resolution is essential in remote sensing for both practical and conceptual understanding.
  • Resolution characteristics are crucial in determining the suitability of remotely sensed data for specific applications.
  • Key characteristics of imaging remote sensing instruments include spectral, radiometric, spatial, and temporal resolution.

Spectral Resolution

  • Spectral Resolution Definition: Width of the spectral bands.
  • Different materials on Earth exhibit unique spectral reflectance and emissivity, defining spectral position and sensitivity for distinguishing materials.

Radiometric Resolution

  • Radiometric Resolution Definition: The number of digital levels used to express sensor-collected data.
  • It's expressed as the number of bits needed to store the maximum level.

Spatial Resolution

  • Spatial Resolution Definition: The geometric properties of the imaging system, distinguishing point targets, measuring periodicity, and measuring spectral properties in small targets.
  • IFOV (Instantaneous Field of View) is commonly quoted and is the angle subtended by the geometrical projection of a single detector element.
  • IFOV may also be a distance D measured along the ground, dependent on sensor height: D = hb, where h is height and b is the angular IFOV in radians.
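The ground-projected IFOV formula D = hb can be evaluated directly (the altitude and angular IFOV below are illustrative values, not from the source):

```python
# Ground-projected IFOV: D = h * b, with the angular IFOV b in radians.
h = 705_000      # sensor altitude in metres (illustrative, Landsat-like)
b = 42.5e-6      # angular IFOV in radians (illustrative)

D = h * b        # size of the ground resolution element, in metres
print(f"ground IFOV ~ {D:.1f} m")
```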

Temporal Resolution

  • Temporal Resolution Definition: The frequency with which images of a given location are acquired.
  • Satellites offer frequent and regular data coverage.
  • Temporal resolution depends on orbital characteristics and swath width (imaged area width).
  • Swath width formula: 2h·tan(FOV/2), where h is the sensor altitude and FOV is the angular field of view.
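The swath-width formula can be checked numerically (the altitude and FOV below are illustrative values, not from the source):

```python
import math

h = 705_000                 # sensor altitude in metres (illustrative)
fov = math.radians(15.0)    # angular field of view (illustrative)

swath = 2 * h * math.tan(fov / 2)
print(f"swath width ~ {swath / 1000:.0f} km")
```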

Improving Images: Techniques

  • Remotely sensed data analysis uses various image processing techniques, including analog and digital image processing.

Visual or Analog Processing

  • Visual techniques are applied to hard copy data like photographs or printouts.
  • Image analysis adopts interpretation elements that depend on knowledge of the study area and the analyst.
  • Texture is useful for distinguishing objects with similar tones (e.g., water and tree canopy).
  • Association is a powerful tool combined with site knowledge.
  • Multi-concept examination combines multispectral, multitemporal, and multiscale data with multidisciplinary knowledge.
  • Analog techniques include optical photogrammetric techniques for precise measurement of height, width, and location.

Digital Image Processing: An Overview

  • Digital Image Processing Definition: A set of techniques for digital image manipulation by computers.
  • Digital image processing corrects flaws and deficiencies in raw data from satellite platforms.
  • Processing steps depend on image format, initial condition, data of interest, and scene composition.
  • General steps are: Pre-processing, display and enhancement, and information extraction.

Pre-processing Steps

  • Pre-processing Definition: Operations that prepare data for subsequent analysis by correcting or compensating for systematic errors.
  • Digital imageries undergo geometric, radiometric, and atmospheric corrections.
  • The investigator selects relevant pre-processing techniques based on the information sought.
  • Feature extraction, used post-preprocessing, reduces data dimensionality by isolating useful components while discarding the rest.

Image Enhancement: Improving Interpretability

  • Image Enhancement Definition: Operations to improve image interpretability by increasing contrast among features.
  • Enhancement techniques depend on digital data characteristics (spectral bands, resolution) and interpretation objectives.
  • Common enhancements include image reduction, rectification, magnification, transect extraction, contrast adjustments, band ratioing, spatial filtering, and transformations.
  • Often for visual interpretation only.

Information Extraction

  • Information Extraction Definition: The final step toward image analysis output.
  • After pre-processing and enhancement, remotely sensed data undergoes quantitative analysis to assign pixels to classes.
  • Classification is based on known and unknown identities, evaluated for accuracy by comparing classified images with ground-truth areas.
  • Analysis results include maps, data, and reports, providing comprehensive information on source data, method, outcome, and reliability.

Pre-processing Remotely Sensed Images

  • Raw data from satellite sensors contains flaws and deficiencies.
  • Pre-processing includes a range of operations from simple to complex, categorized as feature extraction, radiometric corrections, geometric corrections, and atmospheric correction.
  • Techniques remove unwanted elements like image/system noise, atmospheric interference, and sensor motion.

Restoring Digital Data

  • Removal of effects "restores" digital data to its original condition, although complete accuracy is unattainable.
  • Correction attempts may introduce errors for both radiometric and geometric aspects.

Feature Extraction

  • Feature Extraction Definition: "Statistical" characteristics, not geographical features, such as individual bands or combinations, carrying information about scene variation.
  • Used in multispectral data to highlight necessary image elements and reduce spectral bands for analysis.
  • After feature extraction, the analyst works with desired channels or bands, and individual bandwidths become more informative.
  • Pre-processing increases speed and reduces analysis cost.

Radiometric Corrections

  • Radiometric Corrections Definition: Corrections carried out when errors occur in measured pixel brightness values, referred to as radiometric errors.
  • Errors can result from instruments or atmospheric effects.
  • Radiometric processing corrects for sensor malfunctions and atmospheric degradation by adjusting brightness values.
  • Radiometric distortion includes differences in brightness distribution over an image compared to the ground scene and distortions in relative brightness of a pixel from band to band.

Methods for Defect Removal

  • These cosmetic operations remove defects such as line dropouts, banding or striping, and random noise.

Line-Dropouts

  • Line-Dropouts Definition: A string of adjacent pixels in a scan line containing spurious DN.
  • Occurs when detectors malfunction or are overloaded by sudden high radiance.
  • Correction: Replacing the defective line with a duplicate or average of preceding/subsequent lines.
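The averaging correction can be sketched as follows (a minimal numpy sketch; the image values are invented):

```python
import numpy as np

def fix_line_dropout(img, row):
    """Replace a defective interior scan line with the average
    of the lines immediately above and below it."""
    out = img.astype(float)
    out[row] = (out[row - 1] + out[row + 1]) / 2
    return out.astype(img.dtype)

img = np.array([
    [10, 12, 11],
    [ 0,  0,  0],   # dropped-out scan line (spurious DNs)
    [14, 16, 13],
], dtype=np.uint8)

fixed = fix_line_dropout(img, 1)
print(fixed[1])   # interpolated from the neighbouring lines
```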

De-Striping

  • De-Striping Definition: Banding or striping that occurs when one or more detectors go out of adjustment.
  • Horizontal banding patterns on images from electro-mechanical scanners show lines with consistently high or low DN.
  • De-striping reasons include improved visual appearance and interpretability, and equal pixel values representing equal ground radiance areas.
  • De-Striping Methods: Constructing histograms for each detector, equalizing means and standard deviations across detectors.
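A gain/offset version of the mean-and-standard-deviation equalisation might look like this (a sketch assuming the detectors contribute interleaved scan lines; the data are invented):

```python
import numpy as np

def destripe(img, n_detectors):
    """Rescale each detector's scan lines so their mean and standard
    deviation match those of the whole image."""
    out = img.astype(float)
    target_mean, target_std = out.mean(), out.std()
    for d in range(n_detectors):
        lines = out[d::n_detectors]     # every n-th line comes from detector d
        m, s = lines.mean(), lines.std()
        if s > 0:
            out[d::n_detectors] = (lines - m) * (target_std / s) + target_mean
    return out

striped = np.array([[100, 110, 100, 110],
                    [150, 160, 150, 160]] * 3, dtype=float)
even = destripe(striped, n_detectors=2)
print(even[0::2].mean(), even[1::2].mean())   # detector means now agree
```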

Random Noise

  • Random Noise Definition: Isolated pixels with spurious DNs that appear randomly across an image; such noise is suppressed by spatial filtering.
  • Noisy pixels are identified by their marked DN difference from adjacent pixels.
  • Correction of Noisy Pixels: Replacing each spurious pixel with the average DN of its neighbourhood, computed with a moving window.
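A moving-window substitution can be sketched like this (the threshold and image values are invented):

```python
import numpy as np

def despike(img, threshold=50):
    """Replace any interior pixel whose DN differs from the mean of its
    eight neighbours by more than `threshold` with that neighbourhood mean."""
    out = img.astype(float)
    rows, cols = img.shape
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            window = img[r - 1:r + 2, c - 1:c + 2].astype(float)
            neigh_mean = (window.sum() - window[1, 1]) / 8
            if abs(window[1, 1] - neigh_mean) > threshold:
                out[r, c] = neigh_mean
    return out

img = np.full((5, 5), 100, dtype=np.uint8)
img[2, 2] = 255                  # isolated spurious pixel
clean = despike(img)
print(clean[2, 2])               # restored to the neighbourhood mean
```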

Geometric Corrections

  • Raw digital images contain geometric distortions from earth curvature, platform motion, relief displacement, and scanning motion non-linearities.
  • Distortions are classified as non-systematic and systematic.
  • Rectification: Projecting image data onto a plane to conform to a map projection system.
  • Registration: Making image data conform to another image, regardless of map coordinate system.

Non-Systematic Distortions

  • Non-Systematic Distortions Definition: Distortions caused by variations in spacecraft variables.
  • These are evaluated using resampling methods based on ground control points (GCPs).

Resampling Methods

  • Methodology: Deriving output pixel locations from ground control points (GCPs).
  • Original pixels are resampled to match geometric coordinates using different resampling methods:

Nearest Neighbor Method

  • Methodology: Assigning corrected pixel value from nearest uncorrected pixel with simplicity and original value preservation advantages.
  • Produces noticeable errors, notably severe in linear features where realignment is obvious.

Bilinear Interpolation Method

  • Bilinear Interpolation: Output pixel value calculation based on weighted average of four nearest input pixels creating a natural looking image.
  • Because each output value is a weighted average, brightness values from the input image are altered, and the resampling can slightly reduce the apparent spatial resolution.

Cubic Convolution Method

  • Cubic Convolution: A more sophisticated and computationally complex resampling method that computes each output value from a larger block of adjacent input pixels.
  • It yields attractive-looking images, but the pixel values are altered more than with nearest neighbor or bilinear interpolation.
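The difference between the first two methods is easy to see at a single output location (a sketch on an invented 2x2 image, where x and y are fractional pixel coordinates):

```python
import numpy as np

def nearest(img, x, y):
    """Nearest neighbour: copy the value of the closest input pixel."""
    return img[int(round(y)), int(round(x))]

def bilinear(img, x, y):
    """Bilinear: distance-weighted average of the four surrounding pixels."""
    x0, y0 = int(x), int(y)
    dx, dy = x - x0, y - y0
    return ((1 - dx) * (1 - dy) * img[y0, x0] +
            dx * (1 - dy) * img[y0, x0 + 1] +
            (1 - dx) * dy * img[y0 + 1, x0] +
            dx * dy * img[y0 + 1, x0 + 1])

img = np.array([[10.0, 20.0],
                [30.0, 40.0]])
print(nearest(img, 0.6, 0.6))    # snaps to one original value
print(bilinear(img, 0.5, 0.5))   # averages the four neighbours
```

Nearest neighbour keeps original DNs (good for classification) but shifts features by up to half a pixel; bilinear produces smoother geometry at the cost of averaged DNs.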

Image Correction Using Mapping Polynomial

  • Polynomial equations convert source coordinates to rectified coordinates using 1st and 2nd order transformations.
  • Coefficients (ai and bi) are calculated by least square regression for relating any point in a map to its corresponding image point.
  • The application requires GCPs for calculating transformation matrix and inverse transformation converting reference coordinates to source coordinates, enabling RMS error determination.
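Fitting a first-order transformation to GCPs by least squares, then reporting the RMS error, might look like this (the control points below are invented):

```python
import numpy as np

# First-order mapping fitted to ground control points:
#   x_src = a0 + a1*x_map + a2*y_map
#   y_src = b0 + b1*x_map + b2*y_map
map_pts = np.array([[0, 0], [10, 0], [0, 10], [10, 10]], dtype=float)
src_pts = np.array([[5, 5], [25, 7], [3, 25], [23, 27]], dtype=float)

A = np.c_[np.ones(len(map_pts)), map_pts]        # design matrix [1, x, y]
coeffs, *_ = np.linalg.lstsq(A, src_pts, rcond=None)

residuals = A @ coeffs - src_pts                 # misfit at each GCP
rmse = np.sqrt(np.mean(np.sum(residuals ** 2, axis=1)))
print(f"RMS error: {rmse:.6f}")
```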

Systematic Distortions

  • Systematic Distortions: Geometric distortions that are constant and can be predicted in advance, such as scan skew and known mirror-velocity variation.

Scan Skew

  • Scan Skew Definition: Caused by the spacecraft's forward motion during each mirror sweep, so that the ground swath is not normal to the ground track.

Known Mirror Velocity Variation

  • Known Mirror Velocity Variation Definition: Because the fluctuation in the scan mirror's velocity is known, the minor distortion it causes can be corrected systematically.

Cross Track Distortion

  • Cross Track Distortion Definition: Occurs in unrestored images from cross-track scanners because pixels are sampled at constant time intervals along each scan line.
  • Ground pixel width is proportional to the tangent of the scan angle, so pixels near the scan-line margins cover wider ground areas and appear compressed.
  • This distortion is corrected using trigonometric functions.

Atmospheric Corrections

  • The signal recorded by instruments on a satellite is attenuated and augmented by the atmosphere, with the effect varying across the spectrum.
  • Unless these effects are corrected, the true radiation properties of a target on the Earth's surface cannot be reconstructed.
  • Atmospheric Correction Definition: Correcting for the varying levels of brightness introduced by the atmosphere to improve accuracy during interpretation.

Atmospheric Correction Reasons

  • Needed when deriving band ratios from multispectral images, because atmospheric scattering affects each wavelength channel differently.
  • When determining land surface reflectance or sea surface temperature.
  • To compare or mosaic images taken at different times.

Correction Methods

  • Rectifying the data requires dealing with atmospheric scattering and absorption in one of several ways:
  • Ignoring the atmosphere (acceptable only when its effects are negligible)
  • Collecting ground measurements of the quantities being sensed for calibration
  • Modelling the scattering and absorption effects from the atmosphere's composition and temperature profile
  • Using information about the atmosphere inherent in the remotely sensed data itself

Correction for Atmospheric Scattering

  • One approach involves ratio analysis of two image bands, because atmospheric scattering affects short wavelengths most strongly (producing haze that reduces image contrast).
  • Atmospheric Models: LOWTRAN, MODTRAN, and HITRAN provide standard corrections for a given sensor type, target altitude, and viewing angle.

Image Enhancement Techniques

  • Image Enhancement Techniques Definition: Techniques designed to make satellite imagery more informative and to help achieve the ultimate goal of image interpretation.
  • Enhancement Definition: Altering an image's appearance so that its information can be interpreted more rapidly for a particular need.
  • Image enhancement techniques fall into two groups:
  • Spectral Enhancement Techniques
  • Multi-Spectral Enhancement Techniques

Spectral Enhancement Techniques

  • Spectral Enhancement Techniques: Techniques applied to single-band images or separately to individual bands of a multiband image set.

Density Slicing

  • Density Slicing Definition: Mapping a range of contiguous grey levels of a single-band image to a point in the RGB color cube.
  • The DN range is divided into distinct slices, each displayed as a separate grey level or colour, for applications such as weather forecasting and temperature maps.
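Density slicing reduces to thresholding the DN range into classes (the slice boundaries below are invented):

```python
import numpy as np

def density_slice(band, bounds):
    """Map contiguous DN ranges to discrete class numbers;
    `bounds` are ascending thresholds splitting the DN range."""
    return np.digitize(band, bounds)

band = np.array([[  5,  60, 130],
                 [200,  90, 250]])
classes = density_slice(band, bounds=[50, 100, 150])
print(classes)   # each pixel replaced by its slice number
```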

Contrast Stretching

  • Contrast Stretching Definition: Raw imagery rarely occupies the full dynamic range of the sensor or display, which results in a dull, low-contrast image.
  • Contrast stretching remaps the DN distribution across the full display range, producing more vivid images; methods include the Linear Contrast Stretch, Histogram Equalization, and the Gaussian Stretch.
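A linear contrast stretch is the simplest of the three (a minimal sketch; the input DNs are invented):

```python
import numpy as np

def linear_stretch(band, out_max=255):
    """Remap the band's observed DN range onto the full display range."""
    b = band.astype(float)
    lo, hi = b.min(), b.max()
    return ((b - lo) / (hi - lo) * out_max).astype(np.uint8)

band = np.array([[50, 60],
                 [70, 80]], dtype=np.uint8)   # dull: DNs span only 50-80
stretched = linear_stretch(band)
print(stretched)                              # now spans the full 0-255
```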

Multi-Spectral Enhancement Techniques

  • This includes Image Arithmetic Operations

Image Arithmetic Operations

  • Performed on two or more co-registered images of the same area, using addition, subtraction, multiplication, division, or more complicated algebraic combinations.
  • Sea surface temperatures are sometimes derived from multispectral imagery using so-called split-window or multichannel techniques.

Band Subtraction

  • Definition: Subtraction of co-registered scenes acquired at different times, used for change detection.

Multiplication of Images

  • Definition: Multiplication of a real image by a binary image of ones and zeros, used to mask out or isolate parts of the scene.

Band Ratioing

  • Definition: Division of one spectral band by another for image enhancement in remote sensing (ecological, geological, and similar applications).
  • Ratio images are enhancements produced by dividing the DN values in one spectral band by the corresponding DN values in another.
  • Multiple ratio images are often used to drive the red, green, and blue monitor guns to form a colour composite.
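Band ratioing is element-wise division with a guard against zero DNs (the NIR/red pairing and values below are illustrative):

```python
import numpy as np

def band_ratio(numerator, denominator):
    """Divide one spectral band by another, avoiding division by zero."""
    a = numerator.astype(float)
    b = denominator.astype(float)
    return np.divide(a, b, out=np.zeros_like(a), where=b != 0)

nir = np.array([[80.0, 120.0], [60.0,  0.0]])
red = np.array([[40.0,  40.0], [30.0, 50.0]])
ratio = band_ratio(nir, red)
print(ratio)   # high values where NIR reflectance dominates (vegetation)
```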

Principal Component Analysis (PCA)

  • Spectrally adjacent bands in multispectral remotely sensed imagery are often highly correlated.
  • Multiband visible/near-infrared images of vegetated land show negative correlations between the near-infrared and visible red bands, and positive correlations among the visible bands.
  • Principal Component Analysis targets this redundancy by transforming the correlated bands into uncorrelated components.
  • The original data are projected onto the principal component axes by applying the linear transformation coefficients.
  • The transformation is applied either as an enhancement or prior to classification; in this context 'information' means variance, or scatter about the mean.
  • The resulting data set typically has lower dimensionality than the original set of spectral bands.
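The transformation can be sketched with the eigenvectors of the band covariance matrix (two invented, highly correlated toy bands):

```python
import numpy as np

def pca(bands):
    """Project pixels onto the principal component axes of the
    band-to-band covariance matrix."""
    pixels = bands.reshape(bands.shape[0], -1).T   # (n_pixels, n_bands)
    centred = pixels - pixels.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(centred, rowvar=False))
    order = np.argsort(eigvals)[::-1]              # biggest variance first
    components = centred @ eigvecs[:, order]
    return components.T.reshape(bands.shape), eigvals[order]

bands = np.array([[[10.0, 20.0], [30.0, 40.0]],    # band 1
                  [[11.0, 21.0], [29.0, 41.0]]])   # band 2 (correlated)
pcs, variances = pca(bands)
print(variances)   # nearly all the scatter lies on the first component
```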

Decorrelation Stretch

  • The principal components are contrast-stretched and then transformed back to RGB colours.

Canonical Components

  • Whereas PCA is appropriate when little prior information is available, canonical component analysis locates axes that maximize the separability of user-defined classes.

IHS Transforms

  • Definition: A mapping of the red, green, and blue colour coordinates into intensity, hue, and saturation, where hue identifies the perceived dominant colour.

Fourier Transformation

  • Definition: A transformation that operates on a single band and decomposes an image into its component spatial scales as a sum of sinusoids.
  • Spatial frequency, expressed in cycles per unit distance, is the main quantity of the transformed image.

Spatial Filtering

Definition: Selective emphasis of image detail at different spatial scales, implemented either by convolution in the spatial domain or, via the Fourier transform, in the frequency domain.

The Two Types of Spatial Domain (Convolution) Filters

  • Low-pass filters
  • High-pass filters

Low-Pass Filters

  • Also known as smoothing filters: they pass the long-wavelength (low-frequency) background information at the expense of the higher-frequency spatial detail.

High-Pass Filters

  • Also known as sharpening filters: they emphasise high-frequency detail, typically by subtracting a low-pass (smoothed) version of the image from the original.
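Both filter types can be demonstrated with one 3x3 convolution (a minimal sketch on an invented image):

```python
import numpy as np

def convolve3x3(img, kernel):
    """Apply a 3x3 convolution filter to the interior of the image."""
    out = img.astype(float)
    for r in range(1, img.shape[0] - 1):
        for c in range(1, img.shape[1] - 1):
            out[r, c] = np.sum(img[r - 1:r + 2, c - 1:c + 2] * kernel)
    return out

low_pass = np.full((3, 3), 1 / 9)   # smoothing (mean) kernel
img = np.zeros((5, 5))
img[2, 2] = 9.0                     # one bright pixel of fine detail

smoothed = convolve3x3(img, low_pass)   # low-pass: detail spread out
high_pass = img - smoothed              # high-pass: original minus low-pass
print(smoothed[2, 2], high_pass[2, 2])
```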

Frequency Domain Filters

The Fourier transform decomposes an image into its amplitude spectrum, in which chosen spatial frequencies can be attenuated or emphasised before the image is transformed back.

Image Classification

Image classification is a key step in image analysis: it sorts pixels into classes based on

  • their DN values in single- or multi-channel data
  • the relationships within the data

Forms of Classification

  • Spectral and spatial pattern recognition

Supervised Classification

The analyst supervises the classification by specifying the class types of interest, which a numerical algorithm then uses to label each pixel in the data.

Steps involved

  • Training: identifying representative attributes for each class
  • Classification: matching each pixel to a class using numerical approaches

Parallelepiped Classifier

  • Estimates the minimum and maximum pixel values of each class in every band, and tests whether a pixel falls inside the resulting box; its simplicity limits its usefulness where class ranges overlap.
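The box test can be sketched in a few lines (the class limits below are invented training estimates in two bands):

```python
import numpy as np

def parallelepiped(pixel, class_limits):
    """Assign a pixel to the first class whose per-band min/max
    box contains it; return None if it falls outside every box."""
    for name, (lo, hi) in class_limits.items():
        if np.all(pixel >= lo) and np.all(pixel <= hi):
            return name
    return None

# Per-class (min, max) DN limits in two bands (invented training values)
limits = {
    "water":      (np.array([0, 0]),   np.array([40, 60])),
    "vegetation": (np.array([30, 90]), np.array([70, 200])),
}
print(parallelepiped(np.array([20, 35]), limits))    # inside the water box
print(parallelepiped(np.array([50, 120]), limits))   # inside vegetation
print(parallelepiped(np.array([250, 10]), limits))   # outside every box
```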

Bayesian Probability

Classification based on probability density functions with defined values for each class, used to decide the most likely class for each pixel.

Unsupervised Classification

  • Does not use training data; pixels are grouped into natural spectral clusters on the basis of the data alone.

ISODATA Clustering (Iterative Self-Organising Data Analysis Technique)

  • Repeatedly reassigns pixels to clusters and recalculates the cluster statistics; a poor choice of initial parameters may bias the resulting image.
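A minimal clustering loop in this spirit, with repeated assignment and mean recalculation, can be sketched as plain k-means (ISODATA adds split/merge rules on top; the pixels below are invented):

```python
import numpy as np

def cluster(pixels, k, iters=10, seed=0):
    """Assign each pixel to its nearest cluster mean, recompute the
    means, and repeat (a k-means sketch without ISODATA's split/merge)."""
    rng = np.random.default_rng(seed)
    means = pixels[rng.choice(len(pixels), size=k, replace=False)].copy()
    for _ in range(iters):
        dists = np.linalg.norm(pixels[:, None] - means[None, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                means[j] = pixels[labels == j].mean(axis=0)
    return labels, means

pixels = np.array([[10.0, 12.0], [11.0, 11.0],
                   [90.0, 95.0], [92.0, 93.0]])
labels, means = cluster(pixels, k=2)
print(labels)   # the two spectral groups separate cleanly
```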
