
RS_EE&ASE_1_merged.pdf


Full Transcript


Remote Sensing and Geographic Information Systems (GIS)
Shaurya Rahul Narlanka [email protected]

WHAT IS REMOTE SENSING?
❖ The science and art of obtaining information about an object, area or phenomenon without actual physical contact with the object (for example, reading).
❖ Restricted to methods that employ electromagnetic energy such as light, heat and radio waves as the means of identifying and collecting information about objects.
❖ We can identify and categorize these objects by class or type, substance, and spatial distribution.

REMOTE SENSING: ART AND/OR SCIENCE
❖ Science because:
❖ Mathematical and statistical algorithms are employed to extract information from the data measured by various RS instruments.
❖ It functions harmoniously with other data sources and tools such as mapping (cartography) and GIS.
❖ Combining scientific knowledge with the real world allows the interpreter to develop heuristic rules of thumb so that valuable information can be extracted.
❖ Art because some image analysts are superior to others, because they:
1. Understand the scientific principles better
2. Are more widely travelled and have seen many different landscapes
3. Can synthesize scientific and real-world knowledge to reach logical and correct conclusions because of their experience and exposure

FORMS OF REMOTELY COLLECTED DATA
❖ Variation in gravity force distribution – gravity meter
❖ Acoustic waves – SONAR
❖ Electromagnetic energy – eye

ADVANTAGES OF REMOTE SENSING
❖ Regular revisit capabilities
❖ Broad regional coverage
❖ Good spectral resolution
❖ Good spatial resolution
❖ Ability to manipulate/enhance digital data
❖ Ability to combine satellite data with other digital data
❖ Cost-effective data
❖ Map-accurate data
❖ Possibility of stereo viewing
❖ Large archive of historical data
Figure: Time series – one year of daily AVHRR at 1 km over the Amazon Basin.

KEY MILESTONES IN REMOTE SENSING OF THE ENVIRONMENT
1826 – Joseph Niepce takes first photograph
1858 – Gaspard Tournachon takes first aerial photograph from a balloon
1913 – First aerial photograph collected from an airplane
1935 – Radar invented
1942 – Kodak patents color infrared film
1950s – First airborne thermal scanner
1957 – First high-resolution synthetic aperture radar
1962 – Corona satellite series (camera systems) initiated by the intelligence community
1962 – First airborne multispectral scanner
1972 – ERTS-1 launched – first Landsat satellite
Figures: Early photograph by J. Niepce, circa 1830; Nadar in his balloon; Nadar photograph of Paris.

THADDEUS LOWE'S CIVIL WAR BALLOONS
U.S. Army of the Potomac, 1861–1865. A Massachusetts man, professor and visionary (Lowe Observatory, Calif.).
Platform: Balloon | Sensor: Telescope | Data System: Telegraph
Thaddeus Lowe, circa 1861–1865, used remote sensing for military purposes.
Then, as now, the newest developments are always in the military sphere.

REMOTE SENSING EARLY IN THE AIRPLANE ERA
U-2 spy plane (1954–1960): flew at 70,000' over USSR air defenses. SR-71 Blackbird: supersonic spy plane.

CIA'S CORONA PROGRAM (1960–1972)
>100 missions, followed after the U-2s.
Platform: Spacecraft | Sensor: Camera | Data System: Film drop
Started: August 1960
Coverage: 7.6 billion mi²
Spatial resolution: early missions ≈13 m, later missions ≈2 m
Spectral resolution: visible and visible–near infrared (both film)
Radiometric resolution: equivalent 2⁴ to 2⁶ levels (4 to 6 bits)
Figure: CIA's Corona Program – Washington Monument, 1967.

REMOTE SENSING PROCESSES AND STAGES
BASIC PROCESSES
❖ Statement of the problem
❖ Data acquisition
❖ Data analysis
❖ Information presentation

PROCESS OF REMOTE SENSING
A. Radiation and the atmosphere
B. Interaction with target
C. Energy recorded and converted by sensor
D. Reception and processing
E. Interpretation and analysis

DIFFERENT STAGES IN EM REMOTE SENSING
❖ Origin of electromagnetic radiation (EMR) – Sun – self-emission
❖ Transmission of EMR from source to earth surface and its interaction with the atmosphere
❖ Interaction of EMR with the earth's surface – reflection/absorption/transmission or self-emission
❖ Transmission of reflected/emitted EMR to a remotely placed sensor
❖ Sensor data output
❖ Collection of ground truth and other collateral information
❖ Data processing and analysis

DATA ACQUISITION
❖ Energy sources, their interactions with the atmosphere and earth surface features, and retransmission of energy through the atmosphere
❖ In-situ: field, laboratory and collateral data
❖ Airborne/spaceborne sensors – active and passive
❖ Generation of sensor data in pictorial and/or digital form

DATA ANALYSIS
❖ Data processing – geometric and radiometric corrections, contrast enhancements, derivation of vegetation indices, etc.
❖ Visual/digital interpretation using various devices/methods to extract information, viz. type, extent, location, and condition pertaining to various resources

INFORMATION PRESENTATION
❖ Compilation and output in the form of soft copies, hard copies, graphs and tables
❖ Distribution to users for decision making

MAJOR REMOTE SENSING APPLICATIONS
1. Agriculture 2. Geology 3. Environment 4. Marine-based exploration 5. Infrastructure development 6. Forestry 7. Urban planning 8. Disaster management 9. Water resources planning – and many more.
Fig. 2: Electromagnetic remote sensing of the earth.

REMOTE SENSING (LIGHT SOURCES, LIGHT INTERACTION WITH ATMOSPHERE & OBJECTS, SPECTRAL SIGNATURE & SENSORS)

ELECTROMAGNETIC RADIATION
❖ Two different models – wave model & particle model.

1. WAVE MODEL:
❖ James Clerk Maxwell (circa 1860) conceptualized EMR (electromagnetic radiation) as a wave that travels through space at the speed of light (299,792.458 km/s).
❖ An EM wave consists of two fluctuating fields – electric & magnetic.
❖ The two fields are perpendicular to each other and also perpendicular to the direction of propagation.
❖ Both have the same amplitude (strength) – they reach their maxima & minima at the same time.
❖ EMR can transmit through space and is generated wherever an electric charge is accelerated.
❖ Two important characteristics – wavelength & frequency.

WAVELENGTH – the length of one complete wave cycle, measured as the distance between two successive crests/troughs.
❖ Represented by λ, and measured in metres/km/cm/micrometres, etc.
Figure: Electromagnetic wave (E – electrical component, M – magnetic component), comprising both electric and magnetic fields at 90° to each other, with crest, trough, velocity of light and frequency ν (number of cycles passing a fixed point per second) indicated.
FREQUENCY – the number of cycles a wave passes a fixed point per unit time.
❖ Represented by ν, and measured in hertz (Hz)/kHz/MHz/GHz, etc.
❖ A wave completing one cycle per second has a frequency of 1 Hz.

Relationship between wavelength & frequency: λ = c/ν
❖ Frequency is inversely proportional to wavelength.
❖ The longer the wavelength, the lower the frequency; the shorter the wavelength, the higher the frequency.

2. PARTICLE MODEL:
❖ Light is a stream of particles called photons – similar to subatomic particles like neutrons.
❖ The quantum theory applicable to this type of motion describes light as being transferred in discrete packets called 'quanta' or 'photons'.
❖ Photons – the physical form of a quantum; subatomic, massless particles – the basic particle of the EM force.
❖ They comprise the radiation emitted by matter when it is excited thermally.
❖ The energy of photons is expressed in terms of electron volts.
❖ The rate of energy transfer from one place to another is the flux, measured in watts.
❖ The amount of energy carried by a photon is given by Planck's equation: Q = hν
where Q is the energy of a quantum (joules), h is Planck's constant (6.626 × 10⁻³⁴ J·s) and ν is the frequency.

Light: dual wave & particle nature
❖ Since λ = c/ν, we have ν = c/λ and therefore Q = hν = hc/λ.
❖ Quantum energy is inversely proportional to wavelength.
❖ Photons with shorter wavelengths (higher frequency) are more energetic.

ELECTROMAGNETIC SPECTRUM
The electromagnetic spectrum is the range of frequencies of electromagnetic radiation and their respective wavelengths and photon energies.

Electromagnetic spectrum – useful wavelength bands
❖ Visible range: Blue 0.4–0.5 μm, Green 0.5–0.6 μm, Red 0.6–0.7 μm
❖ Infrared (IR): Near IR (NIR) 0.7–1.3 μm, Mid IR 1.3–3.0 μm, Thermal IR beyond 3 μm up to 14 μm
❖ Microwave: 1 mm – 1 m
❖ The earth's atmosphere absorbs energy in the gamma-ray, X-ray and ultraviolet bands – hence these are not used in remote sensing.

Ultraviolet radiation (below 0.4 μm)
❖ Not much RS activity is done in the UV, since these shorter wavelengths are easily scattered by the atmosphere.

Visible radiation
❖ BLUE (0.4–0.5 μm) ❖ GREEN (0.5–0.6 μm) ❖ RED (0.6–0.73 μm)

Infrared radiation (0.72–15 μm)
1. Near infrared – reflected, can be recorded on film.
2. Mid infrared – reflected, can be detected using electro-optical sensors.
3. Thermal infrared – emitted, can only be detected using electro-optical sensors.

Microwave radiation – radar sensors; wavelengths range from 1 mm to 1 m.

❖ The Sun is the main source of EMR for remote sensing.
❖ However, all matter at a temperature above absolute zero (0 K, or −273 °C) emits EMR.
❖ The amount of energy emitted from an object is a function of the temperature of the object: M = σT⁴ (Stefan–Boltzmann law)
where M = total radiant exitance from the surface of the material (W m⁻²), σ = Stefan–Boltzmann constant (5.6697 × 10⁻⁸ W m⁻² K⁻⁴), and T = absolute temperature (K) of the emitting material.
This law is expressed for an energy source that behaves as a blackbody. A blackbody is a hypothetical, ideal radiator that totally absorbs and re-emits all energy incident on it.
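A minimal sketch of the two relations above, Q = hc/λ and M = σT⁴. The chosen wavelength (0.55 μm, green light) and surface temperature (300 K) are illustrative assumptions, not values from the slides.

```python
# Minimal sketch of Q = h*c/lambda and M = sigma*T^4.
# The 0.55 um wavelength and 300 K temperature are arbitrary illustrations.
H = 6.626e-34      # Planck's constant, J s
C = 2.998e8        # speed of light, m/s
SIGMA = 5.6697e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def photon_energy(wavelength_m: float) -> float:
    """Energy of one photon (J) at the given wavelength: Q = h*c/lambda."""
    return H * C / wavelength_m

def blackbody_exitance(temp_k: float) -> float:
    """Total radiant exitance (W/m^2) of a blackbody at temperature T: M = sigma*T^4."""
    return SIGMA * temp_k ** 4

print(photon_energy(0.55e-6))   # ~3.6e-19 J for green light (0.55 um)
print(blackbody_exitance(300))  # ~459 W/m^2 for a ~300 K surface
```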
Figure: Spectral distribution of energy radiated from blackbodies at various temperatures; the dominant wavelength shifts with temperature.

LIGHT INTERACTION WITH THE ATMOSPHERE
The atmosphere has a profound effect mainly on:
❖ the intensity, and
❖ the spectral composition of radiation available to sensors.
The net effect of the atmosphere varies with:
❖ differences in path length
❖ the magnitude of the energy signal being sensed
❖ the atmospheric conditions present
❖ the wavelengths involved
Atmospheric modification of incoming and outgoing EM radiation includes scattering, refraction and absorption.

ABSORPTION
❖ Caused primarily by three atmospheric gases: ozone, carbon dioxide and water vapour.
❖ The process by which radiant energy is absorbed and converted into other forms of energy.
❖ May take place in the atmosphere and at the targets.
❖ An absorption band is a range of wavelengths in which radiant energy is absorbed by the particles.
❖ The atmosphere can close down RS activity in these bands due to the cumulative absorption at these wavelengths.
Figure: transmission with no attenuation vs with attenuation.
❖ Thus, atmospheric gases absorb EMR in different specific regions of the spectrum.
❖ They influence where in the spectrum we can 'look' for RS purposes.
❖ Those areas of the spectrum which are not severely affected by absorption and are useful for RS activities are called ATMOSPHERIC WINDOWS.
❖ These regions are less affected by absorption and are nearly transparent.
Figure: Atmospheric transmittance and atmospheric windows (note that the wavelength scale is logarithmic).

SCATTERING
- Unpredictable diffusion of radiation by particles in the atmosphere.
- Depends on the relative size of the particles and the radiation wavelength.
Adverse effects in remote sensing:
- Reduces image contrast
- Changes the spectral signature of ground objects as seen by the sensor
Rayleigh scattering – caused by atmospheric molecules and particles much smaller than the radiation wavelength.

DIFFUSE (LAMBERTIAN) REFLECTION
❖ Reflection is scattered and equal in all directions (uniform).
❖ Contains information on the colour of the reflecting surface.
In remote sensing, the "diffuse reflectance" properties of terrain are measured:
❖ energy is reflected equally in all directions
❖ many natural surfaces act as diffuse reflectors to some extent.
Figure: Specular vs diffuse reflectance.

CONCEPT OF SPECTRAL SIGNATURE
Reflectance characteristics of earth surface features are quantified by measuring the portion of the incident energy that is reflected:
Spectral reflectance (%) = [energy of wavelength λ reflected / energy of wavelength λ incident] × 100

SPECTRAL REFLECTANCE CURVES
❖ These curves show the relationship between the EM spectrum and the associated % reflectance for any given material.
❖ Plotted as a chart with wavelength on the horizontal axis and percent reflectance on the vertical axis.
❖ Spectral reflectance is responsible for colour or tone in the visible band.
❖ Signatures are not deterministic, but statistical in nature.

SPECTRAL SIGNATURES
❖ The signal received by the sensor depends on the land cover.
Figure: Spectral signature curves (% reflectance vs wavelength, roughly 0.4–1.4 μm) for healthy vegetation, bare earth and water; green shows the highest visible reflectance for vegetation, hence we see green trees.

SPECTRAL REFLECTANCE OF VEGETATION
Visible band (0.4–0.7 μm)
❖ At the wavelength bands 0.45 μm and 0.67 μm, chlorophyll in the plant absorbs energy.
❖ The human eye perceives healthy vegetation as green.
Infrared band (0.7–1.3 μm)
❖ 40 to 50% of the energy is reflected; the rest is transmitted, and absorption is minimal (≈5%).
❖ Reflectance is mainly due to the internal structure of the plant leaves; hence discrimination among species is possible in this band.
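A minimal sketch of the spectral-reflectance ratio defined above, evaluated for made-up red and NIR measurements of a vegetated pixel; the energy values are illustrative assumptions, not real measurements.

```python
# Minimal sketch of the spectral reflectance ratio above.
# The incident/reflected energies per band are made-up illustrative values.
def spectral_reflectance(reflected: float, incident: float) -> float:
    """Percent reflectance at one wavelength: (reflected / incident) * 100."""
    return reflected / incident * 100.0

# Hypothetical measurements (arbitrary energy units) for one vegetated pixel
bands = {"red (0.65 um)": (12.0, 100.0), "NIR (0.85 um)": (45.0, 100.0)}
for name, (refl, inc) in bands.items():
    print(name, spectral_reflectance(refl, inc), "%")
# Low red and high NIR reflectance is the pattern described for healthy vegetation.
```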
Beyond the infrared band (> 1.3 μm)
❖ Energy is absorbed or reflected, but there is no transmission.
❖ Dips occur at 1.4 μm, 1.9 μm and 2.7 μm, because water present in the leaf absorbs energy at these wavelengths.
❖ Reflectance is inversely proportional to the total water present.
Figure: Typical vegetation reflectance spectrum.

Soil
❖ Fewer peak-and-valley variations in reflectance.
❖ Factors affecting soil reflectance: moisture content; soil texture (sand, silt and clay); surface roughness; presence of iron oxide; organic matter content.

Water
❖ Little reflectance, and only in the blue and green wavebands.
❖ The presence of turbidity increases the reflectance.
❖ Absorbs most of the radiation in the near IR.
❖ The middle infrared region helps in delimiting even small water bodies.
❖ Dissolved gases and many inorganic salts do not manifest any change in the spectral response of water.
Figure: Comparison of reflectance spectra of a few land covers.

❖ Spatial effects – the same type of feature can have different characteristics at different geographic locations at a given point in time; identified by the shape, size & texture of objects.
❖ Temporal effects – changes in the reflectivity or emissivity of a feature over time; help in studying changes in the growing cycle of a crop.
❖ Polarization – refers to changes in the polarization of radiation reflected or emitted by an object; helps to distinguish objects and is useful in microwave remote sensing.
Figure: Wayanad floods, 2024 (source: NRSC).

SENSORS
Sensors are mounted on the platform and capture the reflected energy from the objects.

Classification 1. Based on the type of illumination: two types – active or passive sensors.
❖ Passive: the sensor measures the amount of energy reflected from the earth's surface.
❖ Active: the sensor emits radiation in the direction of the target, then detects and measures the radiation that is reflected or backscattered from the target.
❖ Most systems rely on the sun to generate all the EM energy needed to image terrestrial surfaces – passive sensors.
❖ Other sensors generate their own energy (active sensors), transmit that energy in a certain direction and record the portion reflected back by features within the signal path.

Active remote sensing
❖ Transmits its own signal and measures the energy that is reflected or scattered back from the target.
❖ Advantages: ability to "see" regardless of time of day or season; use of wavelengths not part of the solar spectrum; better control of the way the target is illuminated.

Example of active RS – satellite radar altimetry
❖ Satellite radar altimeters work on the principle of active remote sensing.
❖ They transmit very 'sharp' electromagnetic pulses of a predetermined wavelength and frequency (a typical radar altimeter operates at 13.5 GHz).
❖ The satellite's altitude (i.e. absolute height above the surface) is obtained by measuring the time required for the pulse to travel from the altimeter antenna to the earth's surface and back to the satellite's receiver.
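A minimal sketch of the altimetry timing principle just described: altitude = c·t/2, where t is the measured two-way travel time of the pulse. The 5.34 ms travel time below is an illustrative assumption (roughly an 800 km orbit), not real mission data.

```python
# Radar altimetry principle: altitude = c * t / 2, with t the two-way travel time.
C = 2.998e8  # speed of light, m/s

def altitude_from_two_way_time(t_seconds: float) -> float:
    """Height of the altimeter above the reflecting surface, in metres."""
    return C * t_seconds / 2.0

print(altitude_from_two_way_time(5.34e-3))  # ~800 km for an assumed 5.34 ms echo delay
```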
Passive remote sensing
❖ Measures natural radiation emitted by the target and/or radiation energy from other sources reflected from the target.
❖ Examples: passive microwave radiometers, Landsat, SPOT.
Examples of passive sensors:
❖ Advanced Very High Resolution Radiometer (AVHRR) – sea surface temperature
❖ Sea-viewing Wide Field-of-view Sensor (SeaWiFS) – ocean color

Classification 2. Based upon the process of scanning:
Whiskbroom scanner:
❖ visible / NIR / MIR / TIR
❖ point sensor using a rotating mirror; builds up the image as the mirror scans
❖ e.g. Landsat MSS, TM
Pushbroom scanner:
❖ mainly visible / NIR
❖ an array (line) of sensing elements images simultaneously; builds up the image line by line
❖ e.g. SPOT (Satellite Pour l'Observation de la Terre)

Classification 3. Based upon the EMR range:
1. Optical sensors
❖ Sensors that operate in the optical portion of the spectrum, which extends from approximately 0.3 to 14 μm.
❖ More can be done with these data because we can:
❖ look at differences in colors
❖ look at differences over time
❖ Applications: meteorological, ocean monitoring (e.g. chlorophyll absorption).
❖ Optical images show how much energy from the sun was being reflected or emitted off the earth's surface when the image was taken.
❖ Clear water reflects little radiation, so it looks black.
❖ Pavement and bare ground reflect a lot of radiation, so they look bright.
❖ Urban areas usually look light blue-grey.
❖ Vegetation absorbs visible light but reflects infrared, so it looks red (in standard infrared colour displays).
2. Microwave sensors
❖ Sensors that operate in the microwave portion of the spectrum.
❖ Advantages: capable of penetrating the atmosphere under virtually all conditions; a different view of the environment.
❖ Disadvantage: radar instruments have a hard time identifying water bodies because the wavelength is much longer than the general character of the surface roughness.
❖ Applications: sea ice and snow, geologic features, ocean bottom contours, other planets.

SATELLITE ORBITS
Characteristics of satellite orbits:
1. Orbital period 2. Altitude 3. Apogee and perigee 4. Inclination 5. Nadir, zenith and ground track 6. Swath 7. Side lap and overlap

Orbital period
The time taken by a satellite to complete one revolution around the earth. The spatial and temporal coverage of the imagery depends on the orbital period. It varies from around 100 minutes to 24 hours (see the short sketch after this orbit overview).

Altitude
The height of the satellite in an orbit above the point directly beneath it on the surface of the earth.
Low altitude (< 2000 km), moderate altitude, high altitude (~36,000 km).

Apogee and perigee
Apogee: the point at which the satellite is at the farthest distance from the earth in an orbit.
Perigee: the point at which the satellite is at the closest distance from the earth in an orbit.

Inclination
The inclination of an orbit is its deviation from the equatorial plane, measured in the clockwise direction. This is usually about 99 degrees for remote sensing satellites.

Nadir, zenith and ground track
Nadir – the point on the earth's surface where the imaginary radial line between the centre of the earth and the satellite meets the surface.
Zenith – the point directly opposite the nadir, above the satellite.
Ground track – the projection of a satellite's orbit on the earth's surface.

Swath
The width of the area on the earth's surface which is 'sensed' by a satellite during a single pass.

Overlap
Areas of the earth's surface which are common to two consecutive images along the flight path.

Sidelap
Overlapping areas in the images of the earth's surface on two adjacent flight paths.
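The slides relate altitude and orbital period only qualitatively. As a rough numerical check, Kepler's third law, T = 2π√(a³/μ), which is not stated in the slides, reproduces the roughly 100-minute and 24-hour figures quoted above; Earth's radius and gravitational parameter below are standard values.

```python
# Rough check of the altitude-period relation using Kepler's third law,
# T = 2*pi*sqrt(a^3 / mu). This formula is not in the slides; it is added only
# to show why ~700 km orbits take ~100 minutes and ~36,000 km orbits ~24 hours.
import math

MU_EARTH = 3.986e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6     # mean Earth radius, m

def orbital_period_minutes(altitude_m: float) -> float:
    a = R_EARTH + altitude_m            # semi-major axis of a circular orbit
    return 2 * math.pi * math.sqrt(a**3 / MU_EARTH) / 60.0

print(orbital_period_minutes(700e3))    # ~99 minutes (typical sun-synchronous altitude)
print(orbital_period_minutes(35.786e6)) # ~1436 minutes, i.e. ~24 hours (geostationary)
```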
Sensors in Indian remote sensing satellites
❖ Linear Imaging Self-scanning Sensors (LISS) – for multispectral scanning. This scanner has four generations – LISS-I, LISS-II, LISS-III & LISS-IV.
❖ Panchromatic (PAN) sensors – to collect data in a single band.
❖ Wide Field Sensor (WiFS) – to collect data over a wide swath in two bands.
❖ Advanced Wide Field Sensor (AWiFS) – to collect data over a wide swath in four bands.
❖ Ocean Colour Monitor (OCM) – operating in 8 narrow spectral bands for oceanographic applications.
❖ Multispectral Optoelectronic Scanner (MOS) – operating in 3 & 19 bands for oceanographic applications.
❖ Multi-frequency Scanning Microwave Radiometer (MSMR) – for passive RS at 4 different frequencies.
❖ Synthetic Aperture Radar (SAR) – for active microwave RS.
Figure: IRS-1C sensors overview – PAN, LISS-III, WiFS.

SUN-SYNCHRONOUS ORBITS
Earth observation satellites usually follow sun-synchronous orbits. A sun-synchronous orbit is a near-polar orbit whose altitude is such that the satellite will always pass over a location at a given latitude at the same local solar time. In this way, the same solar illumination condition (except for seasonal variation) can be achieved for the images of a given location taken by the satellite.

POLAR ORBITING SATELLITES
❖ Polar-orbiting satellites are those in which the position of the satellite's orbital plane is kept constant relative to the sun.
❖ Examples: the Landsat satellite series, IRS series.

GEOSTATIONARY ORBIT (approx. 36,000 km)
❖ Geostationary orbiting satellites are those that remain stationary relative to a point on the surface of the earth.
❖ Communications and meteorological satellites.

SATELLITE REMOTE SENSORS: TYPES OF ORBITS
Sun-synchronous polar orbits
❖ Circle the planet in a roughly north–south ellipse while the earth revolves beneath them, so a particular place is imaged repeatedly.
❖ Global coverage, fixed crossing time, repeat sampling.
❖ Typical altitude 500–1,500 km. Ex: Terra/Aqua, Landsat.
Non-sun-synchronous orbits
❖ Tropics, mid-latitude or high-latitude coverage; varying sampling.
❖ Typical altitude 200–2,000 km. Ex: TRMM, ICESat.
Geostationary orbits
❖ Regional coverage, continuous sampling.
❖ Over low–middle latitudes; altitude ~36,000 km. Ex: GOES, INSAT.

Thank you
Figure: Optical image of the Montreal area during the ice storm of 1998. Ice, snow and clouds appear as various shades of white; vegetation is green.

REMOTE SENSING PART 3 (IMAGE AND RESOLUTION)

Nature of the image
❖ Pixel – a picture element having both spatial and spectral properties.
❖ Spatial property – defines the "on ground" height and width.
❖ Spectral property – defines the intensity of the spectral response for a cell in a particular band.

Composition of a pixel – example
How will this mixed pixel appear in the image?

Land cover   % of pixel   Reflectance
Building     15           90
Sand         15           100
Trees        50           50
Water        20           10

Resulting pixel value (area-weighted average reflectance) = 55.5; a numerical check appears in the sketch below.
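A minimal sketch of the weighted-average reasoning behind the 55.5 value above, using exactly the cover fractions and reflectances from the table.

```python
# Mixed-pixel example from the table above: the recorded value is the
# area-weighted average of the reflectances of the covers inside the pixel.
covers = {              # land cover: (fraction of pixel, reflectance)
    "building": (0.15, 90),
    "sand":     (0.15, 100),
    "trees":    (0.50, 50),
    "water":    (0.20, 10),
}

pixel_value = sum(frac * refl for frac, refl in covers.values())
print(pixel_value)  # 55.5
```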
Nature of the image: image resolution
❖ Spatial resolution – what size we can resolve
❖ Spectral resolution – what wavelengths we use
❖ Radiometric resolution – the degree of detail (number of levels) observed
❖ Temporal resolution – how often we observe

Spatial resolution
The fineness of detail visible in an image.
– (coarse) low resolution – the smallest features are not discernible
– (fine) high resolution – small objects are discernible
Factors affecting spatial resolution: atmosphere, haze, smoke, low light, particles, blurred sensor systems, pixel size and instantaneous field of view.

Instantaneous field of view (IFOV)
Defined as the angle subtended by a single detector element on the axis of the optical system. The IFOV has the following attributes:
❖ It is the solid angle through which a detector is sensitive to radiation.
❖ The IFOV and the distance from the target determine the spatial resolution.
❖ A low-altitude imaging instrument will have a higher spatial resolution than a higher-altitude instrument with the same IFOV.
❖ It is the angular cone of visibility of the sensor and depends upon the altitude of the sensor and the viewing angle of the sensor.

Focal length and scale
Shorter focal lengths have wider fields of view, while longer focal lengths have narrower fields of view. Therefore a sensor with a longer focal length will produce an image with a smaller footprint than one with a shorter focal length.

Spatial resolution
Target and background characteristics – contrast and shadows.
Image scale – the ratio of a distance on an image to the corresponding distance on the ground.
Large scale – objects are seen better (e.g. 1:50,000)
Small scale – objects are not clear (e.g. 1:250,000)

Image scale – worked questions (answers in the sketch below)
Q1. A sensor with a 152 mm focal length takes an aerial photograph from an altitude of 2780 m. What is the scale of the photograph? Elevation of the ground = 500 m above MSL.
Q2. The scale of an aerial photograph is 1:15,000. In the photo you measure the length of a bridge to be 0.25 inches; what is the length of the bridge in feet in real life?
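A minimal worked sketch of the two scale questions above, using the usual photo-scale relation, scale = focal length / (flying height above the terrain).

```python
# Worked answers to the two image-scale questions above.
# Q1: photo scale = focal length / (flying height - terrain elevation)
f = 0.152                     # focal length in metres (152 mm)
H, h = 2780.0, 500.0          # flight altitude and ground elevation, metres
scale = f / (H - h)           # 0.152 / 2280 = 1/15,000
print(f"Q1 scale = 1:{round(1 / scale):,}")        # 1:15,000

# Q2: ground length = photo length * scale denominator
photo_in = 0.25
ground_in = photo_in * 15_000                      # 3,750 inches on the ground
print(f"Q2 bridge length = {ground_in / 12} ft")   # 312.5 ft
```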
Spatial resolution – photographic vs digital images
❖ A photographic image has a true scale:
– 1:500 – engineering & surveying
– 1:12,000 – resource management
– 1:50,000 – large-area assessments
❖ A photograph on film cannot be resampled for higher resolution; the image captured cannot be physically manipulated.
❖ A digital image does not have a fixed scale. It is the imaging instrument that has a fixed scale, or Ground Sample Distance (GSD), in the original digital image.
– The GSD is the distance between two consecutive pixel centres measured on the ground.
– The bigger the GSD value, the lower the spatial resolution of the image and the less visible the details.
– The GSD is related to flight height: the higher the altitude of the flight, the bigger the GSD value.
– Digital images are sometimes resampled, where the pixels are modified to suit or change the image size.

Spatial resolution – interpretability criteria
❖ Detectability – the ability to record the presence or absence of an object, although the identity of the object may be unknown. An object may be detected even though it is smaller than the resolving power of the imaging system.
❖ Recognizability – the ability to identify an object from the image. Objects can be detected and resolved and yet not be recognizable. Example: roads, railroads and canals could all look linear and have been detected, but which are they?
❖ Identification – the ability to distinguish category features, such as cars from trucks or species of trees.

Spectral resolution
The term spectral resolution refers to the width of the spectral bands that a satellite imaging system can detect. Satellite imaging systems are often multispectral, meaning that they can detect in several discrete bands; it is the width of these bands that spectral resolution refers to. The narrower the bands, the greater the spectral resolution.

Temporal resolution
Figure: Remote sensor data acquisition on June 1, 2020, June 17, 2020 and July 3, 2020 – a 16-day revisit interval.
Temporal resolution depends on:
❖ the orbital parameters of the satellite
❖ the latitude of the target
❖ the swath width of the sensor
❖ the pointing ability of the sensor

Radiometric resolution
Radiometric resolution, or radiometric sensitivity, refers to the number of digital levels used to express the data collected by the sensor. In general, the greater the number of levels, the greater the detail of information.
❖ 7-bit: 0–127
❖ 8-bit: 0–255
❖ 9-bit: 0–511
❖ 10-bit: 0–1023
Q. Suppose you have a digital image which has a radiometric resolution of 6 bits. What is the maximum value of the digital number which could be represented in that image?
The number of digital values possible in an image is equal to two (2, for binary coding in a computer) raised to the exponent of the number of bits in the image. The number of values in a 6-bit image would be 2⁶ = 2 × 2 × 2 × 2 × 2 × 2 = 64. Since the range of values displayed in a digital image normally starts at zero (0), in order to have 64 values the maximum value possible is 63.
The radiometric resolution of an imaging system describes its ability to discriminate very slight differences in energy. The finer the radiometric resolution of a sensor, the more sensitive it is to detecting small differences in reflected or emitted energy.

Signal strength depends on:
❖ the energy flux from the surface
❖ the altitude of the sensor
❖ the spectral bandwidth of the detector
❖ the IFOV
❖ the dwell time

Signal-to-noise ratio
The signal-to-noise ratio (SNR) is defined as the ratio between the power of the signal and the background noise. The higher the SNR, the easier it is to differentiate between signal and noise.

Resolution trade-offs
❖ Fine spatial resolution → small IFOV → less energy → difficult to detect fine energy differences → poor radiometric resolution, poor spectral resolution.
❖ Narrow spectral bands → high spectral resolution → less energy → difficult to detect fine energy differences → poor radiometric resolution, poor spatial resolution.
❖ Wide spectral band → poor spectral resolution → more reflected energy → good spatial resolution, good radiometric resolution.
These three types of resolution must be balanced against the desired capabilities and objectives of the sensor.

IMAGES
❖ Monochromatic images ❖ Panchromatic images ❖ Multispectral images
❖ Multispectral sensors detect light reflectance in more than one or two bands of the EM spectrum.
❖ These bands represent different data – when combined into the red, green and blue channels of a colour monitor, they form different colours.

Nature of the image
❖ A multispectral image is composed of 'n' rows and 'n' columns of pixels in each of three or more spectral bands.

Multispectral images
The images received from each of the spectral bands of a multispectral sensor can be viewed independently as greyscale images. They can also be combined to form one multispectral image, called a colour composite image. Colour composites are of three kinds:
» True colour composites
» False colour composites
» Natural colour composites

True colour composites
True colour composite images are composed of the three primary colours, i.e. red, green and blue. If a multispectral sensor can detect the three visual colour bands, then the three bands can be combined to give a true colour composite.

False colour composites
The display colour assignment can also be chosen arbitrarily when the multispectral sensor does not sense in the primary visual colour bands, or in the visible range of the electromagnetic spectrum for that matter. Though the colours can be chosen arbitrarily, some sets of colours are used more because they help in distinguishing certain ground features.
Natural composite image
When a multispectral scanner does not sense one or more of the primary colours, the spectral bands that it can sense can be combined to generate an image that closely resembles a visual colour photograph.
Example: The SPOT HRV multispectral scanner does not have a blue band. The three bands it can sense are XS1, XS2 and XS3, which correspond to the green, red and NIR bands. A reasonably good natural composite image can be obtained by combining the three spectral bands as:
R = XS2
G = (3 XS1 + XS3) / 4
B = (3 XS1 − XS3) / 4
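A minimal sketch of the band combination above applied per pixel; the tiny 2×2 band arrays are made-up digital numbers, not SPOT data, and serve only to show the band arithmetic.

```python
# Natural-colour composite from SPOT HRV bands using the weights above:
# R = XS2, G = (3*XS1 + XS3)/4, B = (3*XS1 - XS3)/4.
# The 2x2 arrays are made-up digital numbers, only to show the band arithmetic.
import numpy as np

xs1 = np.array([[40, 42], [38, 41]], dtype=float)  # green band
xs2 = np.array([[55, 60], [52, 58]], dtype=float)  # red band
xs3 = np.array([[90, 95], [88, 92]], dtype=float)  # near-infrared band

r = xs2
g = (3 * xs1 + xs3) / 4
b = (3 * xs1 - xs3) / 4

composite = np.dstack([r, g, b])   # rows x cols x 3 "natural colour" image
print(composite.shape, composite[0, 0])
```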
Hyperspectral sensors
❖ Acquire images in several narrow, contiguous spectral bands in the visible, NIR, MIR and thermal infrared regions of the EMR spectrum.
❖ Typically more than 100 bands are recorded.
❖ This enables the construction of a continuous reflectance spectrum for each pixel.
❖ Hyperspectral sensors are also known as imaging spectrometers.
❖ Hyperspectral scanners may be along-track or across-track.
Examples: Hyperion sensor – 220 bands (from 0.4 to 2.5 μm); AVIRIS sensor – 224 individual CCD detectors, each with 10 nm spectral resolution.

Hyperspectral image interpretation
❖ The spectral curves of the pixels are compared with an existing spectral library to identify the targets.
❖ All pixels whose spectra match the target spectrum to a specified level of confidence are marked as potential targets.
❖ Depending on whether the pixel is a pure feature class or a composition of more than one feature class, the resulting plot will be either a definitive curve of a "pure" feature or a composite curve containing contributions from the several features present.

GEOLOGICAL APPLICATIONS
❖ Surficial deposit / bedrock mapping ❖ Lithological mapping ❖ Structural mapping ❖ Sand and gravel (aggregate) exploration/exploitation ❖ Mineral exploration ❖ Hydrocarbon exploration ❖ Environmental geology ❖ Sedimentation mapping and monitoring ❖ Geo-hazard mapping
Figure: Structural mapping.

HYDROLOGICAL APPLICATIONS
❖ Wetlands mapping and monitoring ❖ Soil moisture estimation ❖ Snow pack monitoring ❖ Measuring snow thickness ❖ River and lake ice monitoring ❖ Flood mapping and monitoring ❖ Glacier dynamics monitoring ❖ Drainage basin mapping and watershed modelling ❖ Irrigation mapping ❖ Groundwater exploration
Figures: Floods and disaster response; drought monitoring, July 2001 vs July 2002, Landsat Thematic Mapper (source: CCRS 2002).

LAND-USE / LAND-COVER APPLICATIONS
❖ Natural resource management ❖ Wildlife habitat protection ❖ Urban expansion / encroachment ❖ Damage delineation (tornadoes, flooding, volcanic, seismic, fire) ❖ Legal boundaries for tax and property evaluation ❖ Target detection – identification of landing strips, roads, clearings, bridges, land/water interfaces
Figure: Land cover classification.

OCEANOGRAPHIC APPLICATIONS
❖ Ocean pattern identification ❖ Storm forecasting ❖ Fish stock and marine mammal assessment ❖ Water temperature monitoring ❖ Water quality ❖ Ocean productivity, phytoplankton concentration and drift ❖ Mapping and predicting oil-spill extent and drift ❖ Strategic support for oil spill emergency response decisions ❖ Shipping navigation routing ❖ Mapping shoreline features / beach dynamics ❖ Coastal vegetation mapping

IMAGE INTERPRETATION AND DATA ANALYSIS
VISUAL IMAGE INTERPRETATION
❖ In order to take advantage of and make good use of remote sensing data, we must be able to extract meaningful information from the imagery – interpretation and analysis.
❖ Interpretation and analysis of remote sensing imagery involves the identification and/or measurement of various targets in an image in order to extract useful information about them.
❖ Targets in remote sensing images may be any feature or object which can be observed in an image, and have the following characteristics:
❖ Targets may be a point, line, or area feature.
❖ They can have any form, from a bus in a parking lot or a plane on a runway, to a bridge or roadway, to a large expanse of water or a field.
❖ The target must be distinguishable; it must contrast with other features around it in the image.
❖ Visual interpretation may also be performed by examining digital imagery displayed on a computer screen.
❖ Both analog and digital imagery can be displayed as black and white (also called monochrome) images, or as colour images by combining different channels or bands representing different wavelengths.
❖ When remote sensing data are available in digital format, digital processing and analysis may be performed using a computer.
❖ Digital processing may be used to enhance data as a prelude to visual interpretation.
❖ Digital processing and analysis may also be carried out to automatically identify targets and extract information completely without manual intervention by a human interpreter. However, digital processing and analysis is rarely carried out as a complete replacement for manual interpretation.
❖ Often, it is done to supplement and assist the human analyst.
❖ Manual interpretation and analysis dates back to the early beginnings of remote sensing, with air photo interpretation.
❖ Digital processing and analysis is more recent, arriving with the advent of digital recording of remote sensing data and the development of computers.
❖ Both manual and digital techniques for interpretation of remote sensing data have their respective advantages and disadvantages.
❖ Generally, manual interpretation requires little, if any, specialized equipment, while digital analysis requires specialized, and often expensive, equipment.
❖ Manual interpretation is often limited to analyzing only a single channel of data or a single image at a time, due to the difficulty of performing visual interpretation with multiple images.
❖ The computer environment is more amenable to handling complex images of several or many channels or from several dates. In this sense, digital analysis is useful for simultaneous analysis of many spectral bands and can process large data sets much faster than a human interpreter.
❖ Manual interpretation is a subjective process, meaning that the results will vary with different interpreters.
❖ Digital analysis is based on the manipulation of digital numbers in a computer and is thus more objective, generally resulting in more consistent results.
❖ However, determining the validity and accuracy of the results from digital processing can be difficult.
❖ It is important to reiterate that visual and digital analyses of remote sensing imagery are not mutually exclusive. Both methods have their merits.
❖ In most cases, a mix of both methods is usually employed when analyzing imagery.
❖ The ultimate decision on the utility and relevance of the information extracted at the end of the analysis process must still be made by humans.
❖ Recognizing targets is the key to interpretation and information extraction.
❖ Observing the differences between targets and their backgrounds involves comparing different targets based on any, or all, of the visual elements of tone, shape, size, pattern, texture, shadow, and association.
ELEMENTS OF IMAGE INTERPRETATION

SHAPE:
❖ Refers to the general form, structure, or outline of individual objects.
❖ Can be a very distinctive clue for interpretation.
❖ Regular geometric shapes are usually indicators of human presence and use.
❖ Some objects can be identified almost solely on the basis of their shapes: for example, the Pentagon building, (American) football fields, cloverleaf highway interchanges.
❖ Straight-edged shapes typically represent urban or agricultural (field) targets.
❖ Natural features, such as forest edges, are generally more irregular in shape, except where man has created a road or clear cuts.
❖ Farm or crop land irrigated by rotating sprinkler systems appears as circular shapes.

SIZE:
❖ Must be considered in the context of the scale of a photograph; thus, size is a function of scale.
❖ The scale will help determine whether an object is a stock pond or a lake.
❖ It is important to assess the size of a target relative to other objects in a scene, as well as its absolute size, to aid in the interpretation of that target.
❖ A quick approximation of target size can direct interpretation to an appropriate result more quickly.

TONE:
❖ Refers to the relative brightness or colour of elements in a photograph.
❖ It is perhaps the most basic of the interpretive elements, because without tonal differences none of the other elements could be discerned.
❖ Thus, it is a fundamental element for distinguishing between different targets or features.
❖ Variations in tone also allow the elements of shape, texture, and pattern of objects to be distinguished.
Three aspects of tone used in photo-interpretation are:
1. Relative tonality (white, light gray, dull gray, dark gray or black)
2. Uniformity of tone (uniform, mottled, banded, scrabbled)
3. Degree of sharpness of tonal variations (sharp, gradual)

TEXTURE:
❖ The impression of "smoothness" or "roughness" of image features, caused by the frequency of change of tone in photographs.
❖ Produced by a set of features too small to identify individually; refers to the arrangement and frequency of tonal variation in particular areas of an image.
❖ Rough textures consist of a mottled tone where the grey levels change abruptly in a small area, whereas smooth textures have very little tonal variation.
❖ Smooth textures are most often the result of uniform, even surfaces, such as fields, asphalt, or grasslands.
❖ A target with a rough surface and irregular structure, such as a forest canopy, results in a rough-textured appearance.
❖ Texture is one of the most important elements for distinguishing features in radar imagery.
❖ Grass, cement, and water generally appear "smooth", while a forest canopy may appear "rough".

SHADOW:
❖ Aids interpreters in determining the height of objects in aerial photographs.
❖ Helpful in interpretation as it may provide an idea of the profile and relative height of a target or targets, which may make identification easier.
❖ However, shadows can also reduce or eliminate interpretation in their area of influence, since targets within shadows are much less (or not at all) discernible from their surroundings.
❖ Shadow is also useful for enhancing or identifying topography and landforms, particularly in radar imagery.

ASSOCIATION:
❖ Some objects are always found in association with other objects.
❖ The context of an object can provide insight into what it is.
❖ For instance, a nuclear power plant is not (generally) going to be found in the midst of single-family housing.
❖ Association takes into account the relationship between other recognizable objects or features in proximity to the target of interest.
❖ The identification of features that one would expect to associate with other features may provide information to facilitate identification.
❖ For example, commercial properties may be associated with proximity to major transportation routes, whereas residential areas would be associated with schools, playgrounds, and sports fields. A lake is associated with boats, a marina, and adjacent recreational land.

PATTERN:
❖ Refers to the spatial arrangement of visibly discernible objects.
❖ The patterns formed by objects in a photo can be diagnostic.
❖ Typically, an orderly repetition of similar tones and textures will produce a distinctive and ultimately recognizable pattern.
❖ Orchards with evenly spaced trees, and urban streets with regularly spaced houses, are good examples of pattern.
❖ Patterns resulting from particular distributions of gently curved or straight lines are common and are of geological significance; they may represent faults, joints, dykes or bedding.
❖ A single line or lineation is also an illustration of pattern and may result from an orderly arrangement of stream segments, trees, depressions or other features.
❖ Drainage patterns are important in the geologic interpretation of aerial photographs; they may reflect underlying structure or lithology.
❖ Vegetation patterns may reflect structural features or the lithologic character of the rock types.
❖ Soil pattern, as used in engineering geology, refers to the combination of surface expressions, such as landforms, drainage characteristics, and vegetation, that are used in the interpretation of ground conditions.

SITE:
❖ Refers to topographic or geographic location.
❖ This characteristic is especially important in identifying vegetation types and landforms.
❖ For example, large circular depressions in the ground are readily identified as sinkholes in central Florida, where the bedrock consists of limestone.

IMAGE INTERPRETATION STRATEGIES
❖ Field observations – identification in the field by observation, photography, GPS, etc.
❖ Direct recognition – intuitive, direct recognition of features.
❖ Inference – inference based on knowledge and possible surrogates.
❖ Interpretive overlays – utilizing additional data in the form of overlays to reveal relationships.
❖ Photomorphic regions – areas of relatively uniform tone and texture.
❖ Interpretation depends on the interpretation keys which an experienced interpreter has established from prior knowledge and study of the current images.
❖ The eight interpretation elements, as well as the time the photograph was taken, the season, the film type and the photo scale, should be carefully considered when developing interpretation keys.
❖ Keys usually include both a written and an image component.
❖ Much interpretation and identification of targets in remote sensing imagery is performed manually or visually, i.e. by a human interpreter.
❖ In many cases this is done using imagery displayed in a pictorial or photograph-type format, independent of what type of sensor was used to collect the data and how the data were collected – in this case the data are referred to as being in analog format.
❖ Remote sensing images can also be represented in a computer as arrays of pixels, with each pixel corresponding to a digital number representing the brightness level of that pixel in the image – digital data.

DIGITAL IMAGE PROCESSING
What is digital image processing?
❖ Digital image processing is the study of the representation and manipulation of pictorial information using a computer.
❖ Improve pictorial information for better clarity (human interpretation). Examples:
1. Enhancing the edges of an image to make it appear sharper
2. Removing "noise" from an image
3. Removing motion blur from an image
❖ Automatic machine processing of scene data (interpretation by a machine/non-human, storage, transmission). Examples:
1. Obtaining the edges of an image
2. Removing detail from an image
3. Assigning meaning to an ensemble of recognized objects
❖ Digital image processing refers to the processing of digital images (DN values) by means of a computer. IT IS A NUMBER PLAY!
❖ Digital image processing focuses on two major tasks:
❖ Improvement of pictorial information for human interpretation
❖ Processing of image data for storage, transmission and representation for autonomous machine perception
❖ There is some argument about digital image processing – "where does image processing end, and where do fields such as image analysis and computer vision start?"

CATEGORIZATION (STEPS) OF DIGITAL IMAGE PROCESSING
Raw digital image → pre-processing → corrected digital image → enhancement / transformation / classification → enhanced / transformed / classified image → image output: analog (hardcopy printout) or digital (digital database).

The continuum from image processing to computer vision can be broken up into low-, mid- and high-level processes:
❖ Low-level process – input: image; output: image. Examples: noise removal, image sharpening.
❖ Mid-level process – input: image; output: attributes. Examples: object recognition, segmentation.
❖ High-level process – input: attributes; output: understanding. Examples: scene understanding, autonomous navigation.

INTERPRETATION AND ANALYSIS
❖ Digital image processing in remote sensing involves formatting and correcting the data, enhancements and classification:
❖ Preprocessing (radiometric & geometric correction)
❖ Image enhancement
❖ Image transformation
❖ Image classification and analysis
❖ In order to make good use of remote sensing data, we must be able to extract meaningful information from the imagery.
❖ Interpretation and analysis of remote sensing images involves the identification and/or measurement of various targets/objects in an image in order to extract useful information about them.
❖ For this, the target must be distinguishable in contrast to its neighbourhood.
❖ Interpretation can be done visually by a human interpreter on an image in analog form (photograph) or in digital form as a display on a monitor screen.
❖ Visual interpretation is based on tone, shape, size, pattern, texture, shadow and association.
❖ Digital processing and analysis may be performed using a computer.
❖ Digital processing is used to enhance images as a preprocessing step for better visual interpretation (see the small sketch below).
❖ Automatic analysis of images (which is difficult) is carried out to identify objects and to extract information.
❖ Normally a mix of visual and digital analysis of remote sensing imagery is done.
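A minimal sketch of the "number play" idea above: a simple linear contrast stretch applied to an array of digital numbers, one common enhancement used before visual interpretation. The tiny 3×3 array is a made-up illustration, not real sensor data.

```python
# "It is a number play": a simple linear contrast stretch on digital numbers (DNs),
# one common enhancement applied before visual interpretation.
# The 3x3 array of DNs is a made-up illustration, not real sensor data.
import numpy as np

dn = np.array([[52, 60, 55],
               [58, 70, 63],
               [49, 66, 71]], dtype=float)

# Stretch the observed DN range [min, max] to the full 8-bit display range [0, 255].
stretched = (dn - dn.min()) / (dn.max() - dn.min()) * 255.0
print(np.round(stretched).astype(int))
```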
DIFFERENT OPERATIONS IN DIGITAL IMAGE PROCESSING

1. IMAGE RECTIFICATION AND RESTORATION:
❖ To correct a distorted/degraded image so that it represents the original scene properly.
❖ Involves initial processing of the raw image to correct for geometric distortion, radiometric calibration & noise removal.
❖ The processes depend mainly on the sensor characteristics.
❖ Also known as preprocessing – these operations precede further manipulation and analysis.

2. IMAGE ENHANCEMENT:
❖ Involves techniques which increase the visual distinction between features.
❖ Applied to image data to display or record the image data more effectively.
❖ New images are created from the original – an increase in usable information.
❖ Helps in visual interpretation.
❖ Several enhancements are often necessary from the same raw image.
❖ Many enhancement methods are available – spatial filtering, convolution, edge enhancement, level slicing, contrast manipulation, spectral ratioing, principal and canonical component analysis, vegetation components, intensity–hue–saturation (IHS) colour space transformation.
❖ There is no simple rule for producing the best enhanced image – it depends on the application.
❖ Enhanced images can be displayed on an interactive monitor or as hard-copy prints.

3. IMAGE CLASSIFICATION:
❖ This is a replacement for visual interpretation.
❖ A quantitative technique for the identification of features.
❖ The process is based on statistical rules to determine the land cover type of each pixel.
❖ The intent is to categorize all the pixels in the digital image into one of several land cover classes.
❖ Thematic maps of land use / land cover are produced at the end.
❖ Two types:
1. 'Spectral pattern recognition' – decision rules are based entirely on the spectral radiance of the data.
2. 'Spatial pattern recognition' – decision rules are based on shape, size and pattern.
❖ Spectral pattern recognition is the more predominant.
❖ Three approaches – supervised, unsupervised and hybrid.

4. DATA MERGING AND GIS INTEGRATION:
❖ The objective is to combine remote sensing data with data from other sources for the same area.
❖ Other source data may also include data generated by the same sensor or other sensors over the same study area on some other date.
❖ E.g. remote sensing data with soil, topographic, ownership, demographic or zoning data, etc.

5. HYPERSPECTRAL IMAGE ANALYSIS:
❖ The dataset is voluminous.
❖ Various image processing techniques developed for multispectral data can be extended to hyperspectral data.

6. BIOPHYSICAL MODELLING:
❖ Establishes quantitative relationships between remote sensing data and biophysical features and phenomena measured on the ground surface.
❖ Remote sensing data coupled with GIS techniques are often used for this purpose. Ex: crop yield estimation, environmental modelling, pollution estimation, water quality and quantity estimation, etc.

7. IMAGE TRANSMISSION AND COMPRESSION:
❖ A high volume of RS data is available on the internet.
❖ Image compression is needed for better transmission of information over the internet; hence image compression is part of DIP.

DIGITAL IMAGE PROCESSING – IMAGE RECTIFICATION AND RESTORATION (IMAGE CORRECTION)
❖ Operations – geometric correction, radiometric correction and noise removal.
❖ The procedures used depend on:
a) the digital image acquisition type (digital camera, along-track scanner, across-track scanner)
b) the platform (airborne or satellite)
c) the total field of view

GEOMETRIC CORRECTION:
❖ While the satellite is taking pictures of the earth along a path, the earth rotates, and therefore the path traced by the satellite track is helical (distorted).
❖ Also, the measured altitude of the sensor and the orbit of the satellite can have errors due to inaccuracy of measurements.
❖ For scanning-mirror type sensors (such as the MSS & Thematic Mapper on the Landsat satellites), there is distortion due to mirror-scan non-linearity (non-uniform velocity).
❖ For sensors covering large swaths, panoramic distortion occurs.

GEOMETRIC CORRECTION – sources of error
Internal: a. systematic errors; b. sensor driven; c. applied to all images from the platform.
External: a. non-systematic errors; b. due to perturbations in the platform or atmospheric/scene characteristics; c. applicable to individual images.
(a) Systematic or predictable errors, e.g. scan skew, platform velocity, aspect ratio, etc.
(b) Non-systematic or random errors, where we cannot easily characterize the type of distortion, e.g. variations in the earth's orbit, attitude (pitch, roll and yaw) of the satellite.

GEOMETRIC CORRECTION
❖ Geometric error correction is done to remove these distortions and to make the image map-compatible.
❖ This is done by undoing the process in the case of systematic errors, and
❖ by using ground control points (reference points) whose latitude and longitude values are known exactly on the ground and in the image.
❖ Geometric registration involves identifying the image coordinates (row, column) of several clearly discernible ground control points (GCPs) in the distorted image and matching them to their true positions in ground coordinates (lat, long).
❖ The true ground coordinates are typically measured from a map; this is also called image-to-map registration.
❖ Geometric registration may also be performed by registering one or more images to another reference image (instead of to geographic coordinates). This is called image-to-image registration and is used for multi-temporal image comparison.
❖ In the actual geometric correction, the original distorted image is resampled to determine the digital image values to place in the new pixel locations of the corrected output image.
❖ The resampling process calculates the new pixel values from the original digital pixel values in the uncorrected image.
❖ Three methods are available: nearest neighbour, bilinear interpolation and cubic convolution.
❖ Nearest neighbour resampling uses the digital value of the pixel in the original image which is nearest to the new pixel location in the corrected image. This is the simplest method and does not alter the original pixel values, but some pixel values may be duplicated while others are lost. This method can produce a disjointed or blocky image appearance.
❖ Bilinear interpolation resampling takes a weighted average of the four pixels in the original image nearest to the new pixel location. The averaging process alters the original pixel values and creates entirely new digital values in the output image. This is not desirable if classification based on spectral values is to be done, in which case resampling can be done after classification.
❖ Cubic convolution resampling calculates a distance-weighted average of a block of sixteen pixels from the original image which surround the new pixel location.
❖ The last two methods, however, produce a much sharper appearance.
Figures: nearest neighbour interpolation; bilinear interpolation; cubic convolution. (A small sketch of bilinear resampling follows.)
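A minimal sketch of bilinear interpolation resampling as described above: the new pixel value is a distance-weighted average of the four nearest original pixels. The 2×2 neighbourhood values and the queried location are illustrative assumptions.

```python
# Bilinear interpolation resampling: the value at a new (fractional) pixel
# location is a weighted average of the four surrounding original pixels.
# The 2x2 neighbourhood values and the query offsets are made up.
def bilinear(p00, p01, p10, p11, dx, dy):
    """Interpolate within a 2x2 block; dx, dy in [0, 1] are the offsets of the
    new pixel location from the upper-left original pixel (column, row)."""
    top = p00 * (1 - dx) + p01 * dx        # interpolate along the top row
    bottom = p10 * (1 - dx) + p11 * dx     # interpolate along the bottom row
    return top * (1 - dy) + bottom * dy    # then blend the two rows

# Four neighbouring DNs and a new pixel location 0.25 to the right, 0.75 down
print(bilinear(50, 60, 70, 80, dx=0.25, dy=0.75))  # 67.5
```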
IMAGE REGISTRATION
❖ Precise image-to-image registration is necessary to form image mosaics, to map temporal changes, to compare two images, or to combine multispectral images into a colour composite.
❖ Ground locations, if taken as references, can cause misregistration, since changing solar elevations will result in different shadow patterns, or the areas of certain objects such as lakes may change with time, etc.
❖ Similarly, multispectral images can differ from each other considerably, making band-to-band registration difficult.
❖ Automatic digital registration can be done using the correlation between two overlapping portions (areas) as a similarity measure.
❖ If the two areas are registered, then the correlation value is maximum.
❖ Since calculation of the correlations is computationally expensive for large areas, one chooses small areas; this works well as long as the x–y shifts between the two images are small.
❖ There are techniques that significantly increase the speed of spatial-domain correlation.
❖ The Sequential Similarity Detection Algorithm (SSDA) uses a small number of randomly chosen pixels within the window and search areas to quickly find the approximate points of registration first, and then performs the full calculation using all the pixels.

RADIOMETRIC CORRECTIONS
Different factors influence the radiance measured: viewing geometry, instrumental response characteristics, atmospheric conditions, and changes in scene illumination.
Two major corrections: earth–sun distance correction and sun elevation correction.

SUN ELEVATION CORRECTION
The correction is done to counter the effect of seasonal variations in the position of the sun relative to the earth. It is done by normalizing the pixel brightness values, assuming the sun was at the zenith on each date of sensing. The correction is applied by dividing each pixel value in a scene by the sine of the solar elevation angle for the particular time and location.

LIMITATION
Both correction procedures ignore topographic and atmospheric effects.

ATMOSPHERIC EFFECTS
Haze: scattered light that reduces the contrast of an image.

HAZE COMPENSATION
The best way to compensate for haze is by observing the radiance over target areas of absolutely known reflectance. Example: water in the near-infrared region has near-zero reflectance. Therefore, any signal observed over such an area can be considered haze, and the value obtained can be subtracted from all the pixels in that image (see the sketch below).
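A minimal sketch of the two radiometric adjustments just described, shown independently: dividing by the sine of the solar elevation angle, and subtracting a haze value estimated over a dark target (water in the NIR). The DN array, the 35° elevation angle and the haze estimate are illustrative assumptions.

```python
# Sketch of the two radiometric adjustments described above.
# All numbers (DNs, 35-degree solar elevation, haze DN of 8) are illustrative.
import math
import numpy as np

dn = np.array([[40.0, 55.0], [62.0, 75.0]])   # made-up pixel values in one band

# Sun elevation correction: divide each pixel by sin(solar elevation angle).
sun_elev_deg = 35.0
normalized = dn / math.sin(math.radians(sun_elev_deg))

# Haze compensation (dark-object subtraction): the signal observed over a
# near-zero-reflectance target (e.g. clear water in the NIR) is treated as haze.
haze_dn = 8.0
dehazed = np.clip(dn - haze_dn, 0, None)      # do not allow negative values

print(normalized)
print(dehazed)
```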
NOISE REMOVAL (PREPROCESSING)
❖ Noise can arise from malfunction of the detector, electronic interference between sensor components, or hiccups in the data transmission and recording sequence.
❖ Noise can potentially degrade or completely mask the true radiometric information of the image.
❖ Therefore, noise removal usually precedes any subsequent enhancement or classification of the image data.

SYSTEMATIC ERRORS
❖ Striping or banding and dropped lines are common problems due to variation and drift in the response of the sensors in remote sensing satellites.

BANDING / DEBANDING
❖ Debanding: normalization with the neighbouring observations.

STRIPING / DE-STRIPING
❖ Compile a set of histograms – one for each sensor (detector) involved in a given band.
❖ The histograms are then compared in terms of their mean and median values to identify the problematic sensor, and grey-scale adjustments are done accordingly.

LINE DROP
❖ Replace the defective DNs with the average of the values of the pixels occurring in the lines above and below. Suitable interpolation can also be performed.

RANDOM NOISE
❖ Characterized by non-systematic variations in grey levels from pixel to pixel; this is called bit error.
❖ Causes the image to have a "salt and pepper" or "snowy" appearance.
❖ Bit errors have values that change more abruptly than true image values.
❖ They can be recognized by comparing each pixel with its neighbours. A threshold is set by the examiner; if the difference exceeds the threshold, the pixel is said to have a bit error.
❖ The DN of the noisy pixel is replaced by the average of the DNs of the neighbouring pixels.
Figures: noise reduction example; de-blurring.

THANK YOU
