LSGI536 Remote Sensing Image Processing PDF
Summary
This document contains the syllabus and opening lecture notes for a remote sensing image processing course at The Hong Kong Polytechnic University. It outlines lecture topics, lab exercises, project work, and final exam details. The course covers remote sensing fundamentals, image pre-processing, enhancement, interpretation, and machine learning applications.
Full Transcript
Orientation LSGI536 Remote Sensing Image Processing Dr. Zhiwei Li Research Assistant Professor Department of Land Surveying and Geo-Informatics The Hong Kong Polytechnic University Email: [email protected] LSGI536 Remote Sensing Image Processing Instructor: Dr. LI Zhiwei Email: [email protected] Office: ZN613, Block Z, PolyU Website: https://zhiweili.net/ Teaching Assistants: YE Longjie, [email protected], Z418 XU Shaofen, [email protected], ZS1015 (For Lab Exercise only) Lecture: 18:30 to 21:20 PM, Wednesday, Room Z406 Lab Exercise: 18:30 to 21:20 PM, Wednesday, Room ZN604 2 Objectives principles and technology for remote sensing image acquisition, characteristics of remote sensing image data; methodology for geometric and radiometric processing of remote sensing images for quality improvement; techniques for interpreting information from remote sensing image data; principles of machine learning; the applications of machine learning for remote sensing image processing and analysis 3 Intended Learning Outcomes be familiar with the basic physical principles of remote sensing imaging; be familiar with common remote sensing platforms, sensors, and images; master practical skills in remote sensing image processing and analysis, including geometric and radiometric pre-processing, image enhancement, and image interpretation; discuss the various factors that influence the accuracy of information and features extracted from remote sensing images; acquire knowledge of machine learning algorithms; master how to apply machine learning for remote sensing image processing and analysis; design and implement research projects in the fields of remote sensing and machine learning; 4 Tentative Teaching Schedule Week 1 Date (Wednesday) Jan 17 2 Jan 24 3 Jan 31 4 Feb 7 Lecture Topic Lab Exercise Lecture 1. Introduction to Remote Sensing Lecture 2. Remote Sensing Fundamentals: Platforms, Sensors, and Image Characteristics Lecture 3. Remote Sensing Image Preprocessing Exercise #1. Remote Sensing Image Pre-processing Lunar New Year Break Lecture 4. Remote Sensing Image Enhancement 5 Feb 21 6 Feb 28 7 Mar 6 Lecture 5. Remote Sensing Image Interpretation 8 Mar 13 Lecture 6. Machine Learning for Remote Sensing Image Processing: Part I 9 Mar 20 10 Mar 27 11 Apr 3 12 Apr 10 13 Apr 17 Apr 25 - May 11 Exercise #2. Remote Sensing Image Enhancement Exercise #3. Remote Sensing Image Interpretation Lecture 7. Machine Learning for Remote Sensing Image Processing: Part II Exercise #4. Deep Learning for Remote Sensing Image Processing Lecture 8. Selected Topics in Advanced Remote Sensing Applications Project presentation Final examination Assessment Methods Tasks Weighting Level Lab exercises 30% Easy Requirements Submit lab report for each lab exercise ▪ Group project 40% (15% for presentation, 25% for report) ▪ ▪ ▪ Final examination 30% Total 100% Easy Each group includes 4-5 students (male & female). Project covers the topic, introduction, data, method, results, discussion, and conclusions. Final PPT presentation (April 17, 15 min for presentation and Q&A). Final project report (April 30, similarity < 15% and AI rate < 15% in Turnitin). 2 hours, close-book exam 6 Assessment Policy To pass this subject, students must attain a minimum grade of "D-" in final examination. There will be 3-5 random checks of class attendance, and their outcomes will be considered in the final grade calculation. Absence from in-class tests must be accompanied by a valid certificate. 
Mark-to-grade conversion: ≥ 93: A+; ≥ 88 and < 93: A; ≥ 83 and < 88: A-; ≥ 78 and < 83: B+; ≥ 73 and < 78: B; ≥ 70 and < 73: B-; ≥ 68 and < 70: C+; ≥ 63 and < 68: C; ≥ 60 and < 63: C-; ≥ 58 and < 60: D+; ≥ 53 and < 58: D; ≥ 50 and < 53: D-; < 50: F 7
Reading List and References. Journals and Magazines: Remote Sensing of Environment; ISPRS Journal of Photogrammetry and Remote Sensing; IEEE Transactions on Geoscience and Remote Sensing; IEEE Geoscience and Remote Sensing Magazine; International Journal of Applied Earth Observation and Geoinformation; GIScience & Remote Sensing. Books: Campbell, J. B., Wynne, R. H., & Thomas, V. A. (2022). Introduction to Remote Sensing (6th Edition). Guilford Press. Gonzalez, R. C. (2017). Digital Image Processing (4th Edition). Pearson. Lillesand, T., Kiefer, R., & Chipman, J. (2015). Remote Sensing and Image Interpretation (7th Edition). Wiley. Morain, S. (2019). Manual of Remote Sensing (4th Edition). American Society for Photogrammetry and Remote Sensing. 8
LSGI536 Remote Sensing Image Processing. Lecture 1: Introduction to Remote Sensing. Dr. Zhiwei Li, Research Assistant Professor, Department of Land Surveying and Geo-Informatics, The Hong Kong Polytechnic University. Email: [email protected]
Acknowledgment: Instructors at LSGI who have taught this subject previously, including Dr. ZHU Rui, provided most of the slides for this lecture. 10
Outlines: 1. Introduction to Remote Sensing 2. Physical Principles of Imaging 3. Digital Image Processing and Analysis 4. Remote Sensing Applications 11
Section 1. Introduction to Remote Sensing 12
What is "Remote Sensing"? "Remote sensing is the art and science of obtaining information about an object, area, or phenomenon through the analysis of data acquired by a device that is not in contact with the object, area, or phenomenon under investigation" — Lillesand and Kiefer, 1987. "Remote sensing is the process of collecting, storing and extracting environmental information from images of the ground acquired by devices not in physical contact with the features being studied" — Robinson et al. 13
What is "Remote Sensing"? Video Link 15
Scope of Remote Sensing: A platform carries instruments that measure electromagnetic radiation reflected or emitted from the Earth-atmosphere system. The sensor may be at different locations with respect to the object under consideration. Yamazaki, F. (2001). Applications of remote sensing and GIS for damage assessment. 16
System Types: Passive or Active. Passive systems record reflected, naturally occurring radiation (e.g., natural light, infrared radiation). Active systems record the return of radiation they emit themselves (e.g., radar, sonar). 17
Remote Sensing Process: 1. Energy source or illumination 2. Radiation and the atmosphere 3. Interaction with the target 4. Recording of energy by the sensor 5. Transmission, reception, and processing 6. Interpretation and analysis 7. Application. Source: Dr. Tanushree Kain, 2023 18
Why "Remote Sensing"?
Because it can Observe and collect data across a large region, in which some areas are extremely difficult or even dangerous to access, e.g. arctic, ocean; Provide global coverage; Allow repetitive measurements over years as historical data and provide long-term change; Cost effective, once data is collected, it can be shared with many people; and Instruments/sensors can detect light that beyond human vision. 19 Brief History ▪ 1910-20 World War I: the beginning of photo interpretation. ▪ 1962 The term "Remote Sensing" first appeared. ▪ 1972 The launch of Landsat-1, originally ERTS-1, remote sensing has been extensively investigated and applied. Today images are recorded in different ways - close to the ground, aircraft, satellites, and ocean vessels. Each of these has many different forms but each records information about the object of interest without contact through the recording of radiation of some form of energy. 20 Aerial image of Paris in 1858 21 First aerial photograph Gaspard-Félix Tournachon (1820 – 1910), known by the pseudonym Nadar, was a French photographer... In 1858, he took the first ever aerial photograph of Paris from a balloon. Experiments were also conducted with pigeon cameras. Afterwards, cameras were mounted in aero planes taking photos for large ground areas. https://en.wikipedia.org/wiki/Nadar 22 Famous photograph: San Francisco in Ruins San Francisco lies in ruins on May 28, 1906, about six weeks after the 1906 San Francisco earthquake and fire. It was taken from a camera suspended on a kite, perhaps 1,000 feet above the city. It is one of the most well-known photographs of George R. Lawrence. https://en.wikipedia.org/wiki/File:San_Francisco_in_ruin 23 Aerial photograph in World War I Some English observers started using cameras to record enemy positions and found aerial photography easier and more accurate than sketching and observing. The aerial observer became the aerial photographer. [Link1, Link2] 24 Aerial photograph in World War II Photo taken June 23, 1943 of the V-2 test lunch site at Peenemunde after bombing raid [link] 25 Spy Planes U-2 Spy Plane 1954-1960 Flew at 21,300 meters over USSR 26 Spy Planes U-2 aerial photograph of an airfield in the Soviet Union [link] 27 Satellites 705 km 28 Section 2 Physical Principles of Imaging Electro-magnetic radiation Energy interactions in the atmosphere 29 Units of measurement Common units for wavelength: Meters (m) Centimeters (cm) Micrometers (μm) Nanometers (nm) For example: Visible light is in the range from about 0.4 μm to approx. 0.7 μm or about 400 nm to approx. 700 nm. Light Year astronomical distances 30 Electro-magnetic radiation https://youtu.be/lwfJPc-rSXw 5 mins > What is crest and what is trough? > What is the wavelength? > What is the frequency of the wave? What is the unit? > What are the shortest and highest energy waves? Video Link 31 Electro-magnetic radiation What is crest and what is trough? What is the wavelength? What is the frequency of the wave? What is the unit? What are the shortest and highest energy waves? 32 Electro-magnetic radiation What is crest and what is trough? The highest and the lowest points of the wave. What is the wavelength? The distance between two consecutive crests or troughs. What is the frequency of the wave? What is the unit? The number of crests/troughs that pass a given point within one second is described as the frequency of the wave. Hertz or Hz What are the shortest and highest energy waves? Gamma rays. 
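The frequency question above follows directly from the relation c = λ × ν between the speed of light, wavelength and frequency. A short sketch using the visible-light range quoted earlier (about 0.4-0.7 µm); the specific wavelengths are just examples:

```python
SPEED_OF_LIGHT = 2.998e8  # m/s

def frequency_hz(wavelength_m):
    """Frequency (in hertz) of an electromagnetic wave: c = wavelength * frequency."""
    return SPEED_OF_LIGHT / wavelength_m

# Blue (~0.4 um) and red (~0.7 um) ends of the visible range quoted above.
for wavelength_um in (0.4, 0.7):
    print(f"{wavelength_um} um -> {frequency_hz(wavelength_um * 1e-6):.2e} Hz")
# Shorter wavelengths mean higher frequencies and more energy per photon (E = h * frequency),
# which is why gamma rays are the shortest, highest-energy waves.
```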
33 Electro-magnetic spectrum 34 Electro-magnetic spectrum 35 Electro-magnetic spectrum Gamma Rays, Approximate locations of the 16 channels we use to peer through the atmosphere with the GOES-R series satellites. > The places where energy passes through are called “atmospheric windows”. > The wavelength ranges in which the atmosphere is particularly transmissive of energy are referred as Atmospheric Windows. > We use these "windows" in remote sensing to peer into the atmosphere from which we can obtain much information concerning the weather. 51 Atmospheric effects in visible (due to scattering) Radiance indicates how much of the power emitted, reflected, transmitted or received by a surface, which will be received by an optical system looking at that surface from a specified angle of view. The atmosphere can scatter light, adding to the radiation that is detected by the sensor. This is known as path radiance. ρ=Reflectance of object E=irradiance, incoming energy T=transmission of atmosphere https://en.wikipedia.org/wiki/Radiance 52 Energy interactions in the atmosphere All radiation detected by remote sensors passes through some distance, or path length, of Earth’s atmosphere. Particles and gases in the atmosphere can affect the incoming light and radiation. The net effect of the atmosphere varies with: Difference in path length Magnitude of the energy signal Atmospheric condition Wavelengths involved Two types of interactions are normally observed: Scattering Absorption 53 Energy interactions in the atmosphere 54 Atmospheric scattering Scattering occurs when particles or large gas molecules present in the atmosphere interact with and cause electromagnetic radiation to be redirected from its original path. The intensity of scattering depends on: the wavelength of the radiation the abundance of particles or gases, and the distance the radiation travels through the atmosphere Three types of scattering Rayleigh or molecular scattering Mie or non-molecular scattering Non-selective scattering 55 Why is the sky blue? https://youtu.be/ehUIlhKhzDA 3 mins 56 Why is the sky blue? The sunlight is scattered in all directions by the gases in the air when it reaches the Earth’s atmosphere. The types of gases mostly scatter the shorter and choppier waves of blue lights. 57 Atmospheric absorption Absorption is the other main type of interaction that electromagnetic radiation interacts with the atmosphere. Ozone: absorb the harmful ultraviolet radiation from the sun. Without Ozone layer our skin would get burnt when exposed to sunlight. Carbon dioxide: absorbs strongly in the far infrared region – the area associated with thermal heating – thus heat is trapped inside the atmosphere. Water vapor: absorbs much of the infrared and microwave radiation (between 22μm and 1m). 62 Section 3 Digital Image Processing and Analysis 63 Atmospheric correlation A MODIS image over South Africa Color processed to Atmospheric Optical Thickness (AOT) 64 Feature enhancement https://apps.sentinel-hub.com/sentinel-playground/ Enhance land features with different image band combinations 65 Contrast enhancement Example of Image Enhancement on IKONOS image 66 Image enhancement False color Infra-red SPOT image over Yuen Long 67 Image enhancement 68 Missing data reconstruction 69 Image fusion A small region of the (a) panchromatic image (0.6 m), (b) multispectral image (2.4 m), and (c) fused image (0.6 m). 
[link] 70 Image super-resolution An example of the 275 m Multi-angle Imaging Spectro Radiometer (MISR) red band image (left) super-resolved to 68.75 m super-resolution restoration (SRR) image (right). [link] 71 Object segmentation A framework for city-scale rooftop solar PV potential estimation was developed. Labor cost of DL was significantly reduced with proposed spatial optimization sampling strategy. Rooftop extraction model was proved to be robust in different districts. 311,853 GWh rooftop solar PV potential was estimated for Nanjing in 2019. 330.36 km2 rooftop area and 66 GW installed capacity were estimated for Nanjing. 72 Remote sensing image analysis tasks Sensor Sun Atmosphere Radiation Reflection Earth surface Imaging process of optical satellite imaging system for Earth Observation (Image Source: Prof. Shen H.F. et al.) 73 Remote sensing image analysis tasks 1. Image pre-processing Fundamental image processing procedures (e.g. image quality/resolution improvement) 2. Image processing and interpretation 3. Geophysical parameters retrieval Identify objects in images from a qualitative perspective Measuring the physical properties of objects from a quantitative perspective (e.g. object detection, image classification) (e.g. estimation of air pollutants, soil moisture) Example tasks of remote sensing image processing of water Image pre-processing (enhance water features, radiometric accuracy, spatial resolution, etc.) Water extent mapping Water quality monitoring (extent and its changes) (color, turbidity, etc.) 74 Remote sensing image analysis tasks 1. Image pre-processing (Restoration) Impulse noise reduction Stripe noise removal Deblurring Image Source: Prof. Shen H.F. et al. 75 Remote sensing image analysis tasks 1. Image pre-processing (Correction) Thin cloud removal Building shadow removal Radiometric normalization Image Source: Prof. Shen H.F. et al. 76 Remote sensing image analysis tasks 1. Image pre-processing (Reconstruction) Gap filling Thick cloud removal Land surface temperature reconstruction Image Source: Prof. Shen H.F. et al. 77 Remote sensing image analysis tasks 2. Image interpretation (Scene classification) General pipeline of three types of scene classification methods AID: A Benchmark Data Set for Performance Evaluation of Aerial Scene Classification 78 Remote sensing image analysis tasks 2. Image interpretation (Object detection) DOTA: A Large-scale Dataset for Object Detection in Aerial Images 79 Remote sensing image analysis tasks 2. Image interpretation (Image classification) Land use and land cover (LULC) classification 80 Remote sensing image analysis tasks 2. Image interpretation (Land use/cover change detection) Landsat data used for illustrating the Continuous Change Detection and Classification (CCDC) results 81 Remote sensing image analysis tasks 3. Geophysical parameters retrieval (Earth surface monitoring) PM2.5 mapping Urban heat island monitoring Monitoring of historical glacier recession Image Source: Prof. Shen H.F. et al. 82 Remote sensing image analysis tasks 3. Geophysical parameters retrieval (Earth surface monitoring) Vegetation index Soil moisture O3 mapping Image Source: Prof. Shen H.F. et al. 83 Section 4 Remote Sensing Applications 84 Land reclamation in Hong Kong Corona (1969) Landsat-8 (2016) http://www.geocarto.com.hk/edu/PJ-HKRECLAM/main_RECL_bootstrap.html 85 Meteorological satellite - FengYun-2 China's first geostationary meteorological satellites were named FengYun-2, or FY-2 satellites. 
https://youtu.be/Q-KXI9Ha3PU?si=-9TwOGXeI6nrAdfE 86 Nighttime lights of Earth Lights of Human Activity Shine in NASA's Image of Earth at Night https://www.youtube.com/watch?v=8dc58ZrOuck&ab_ channel=NASAGoddard What are the phenomena revealed from Earth lights? 87 Nighttime lights of Earth Night Lights Change in the Middle East (2012 & 2016) https://earthobservatory.nasa.gov/images/90100/night-lights-change-in-the-middle-east 88 Lake Chad 1963 to 2013 60,000 sq.km in 1963 40x size of HK The 6th largest lake in the world 89 Farmers around Lake Chad 90 SPOT image of fire Overview 2.5m resolution SPOT 5 images on 8th August 2005 showing smoke from forest fires in Sumatra 93 IKONOS image of fire 1m resolution 94 Terra ASTER image over Chengdu region before and after earthquake February 19, 2003 May 23, 2008 95 SPOT image over New Orleans October 30, 2001 and August 30, 2005 (24hrs after passage of Katrina hurricane) 96 Natural hazards in the World https://earthobservatory.nasa.gov/topic/natural-event 97 Landsat 7 ETM+ Collection 2 level-2 products Left: Landsat 7 level-2 surface reflectance image. Right: Landsat 7 level-2 surface temperature image. The data was acquired on August 19, 2020 (path 179 row 28). 98 Landsat 8 Ground-level PM2.5 Concentration https://doi.org/10.1117/12.2068886 99 Nuclear reactor in North Korea July 2012 US Military satellite images 101 Questions What are the remote sensing applications introduced? Can you introduce some different applications? 102 Summary Introduction to Remote Sensing Physical Principles of Imaging Digital Image Processing and Analysis Remote Sensing Applications 103 Homework - Satellite sensor specifications ❑ Landsat-8/9 ❑ Sentinel-1/2 ❑ MODIS ❑ SPOT ❑ ASTER ❑ PlanetScope ❑ WorldView 104 End of Lesson 1 105 LSGI536 Remote Sensing Image Processing Lecture 2 Remote Sensing Fundamentals: Platforms, Sensors, and Image Characteristics Dr. Zhiwei Li Research Assistant Professor Department of Land Surveying and Geo-Informatics The Hong Kong Polytechnic University Email: [email protected] Outline 1. Basic Specifications of Satellite and Sensor 2. Common Satellite Platforms and Sensors 3. Characteristics of Remote Sensing Images 2 Section 1 Basic Specifications of Satellite and Sensor 3 Basic Specifications of Satellite and Sensor Sensors and Imaging Scanners Satellite Orbit Swath Width Repeat Cycle and Revisit Time FOV and IFOV 4 Measurement Techniques 5 Sensors Sensors are instruments that collect data about Earth processes or atmospheric components. Along with being carried aboard satellites or aircraft, sensors also can be installed on the ground (in situ). There are two types of sensors: o Active sensors provide their own source of energy to illuminate the objects they observe. o Passive sensors detect energy emitted or reflected from the environment. How do sensors work? [link1, link2] 6 Multispectral Scanners The imaging technologies utilized in satellite programs have ranged from traditional cameras to mechanical scanners that record images of the earth’s surface by moving the instantaneous field of view of the instrument across the earth’s surface to record the upwelling energy. The system that can collect data in many spectral bands and over a wider range of EM spectrum is called multispectral scanner. Multispectral scanners are either ✓ Across-track (whisk broom) or ✓ Along-track (push broom) 7 Multispectral Scanners Across-track scanner (whisk broom) Along-track scanner (push broom) 8 Multispectral Scanners Across-track scanner (whisk broom) A. 
Oscillating mirror B. Detectors C. IFOV D. GSD E. Angular field of view (θ) F. Swath (2H*tan(θ/2)) Along-track scanner (push broom) A. Linear array of detectors B. Focal plane of the image C. Lens D. GSD 9 Satellite Orbit Types of Satellite Orbits https://www.youtube.com/watch?v=n70zjMvm8L0 5 min (The 6th and 7th chapters) 10 Satellite Orbit Geostationary orbit Polar orbit Sun-synchronous orbit 11 Geostationary orbits (GEO) About 35,780 km above ground Geostationary orbits ▪ Period of rotation equal to that of Earth (24 hours) so the satellite always stays over the same location on Earth. ▪ Constant spatial coverage. ▪ Ideal orbit for telecommunications or for monitoring continent-wide weather patterns and environmental conditions. ▪ Multiple observations per day. Kepler’s Third Law 12 Geostationary orbit example - metrological satellite Real-time image of Himawari-9 https://www.data.jma.go.jp/mscweb/data/himawari/sat_img.php?area=fd_ 13 Polar orbits Pass over the Earth’s polar regions from north to south. The orbital track of the satellite does not have to cross the poles exactly for an orbit to be called polar, an orbit that passes within 20 to 30 degrees of the poles is still classed as a polar orbit. Global coverage Larger swath size means higher temporal resolution. Sometimes orbital gaps. 14 Sun Synchronous orbits Near-polar orbits Cover each area of the world at a constant local time of day called local sun time. Since there are 365 days in a year and 360 degrees in a circle, it means that the satellite has to shift its orbit by approximately one degree per day. This ensures consistent illumination conditions when acquiring images. 15 Swath Width As a satellite revolves around the Earth, the sensor “sees” a certain portion of the Earth’s surface. The area imaged on the surface, is referred to as the swath. Imaging swaths for spaceborne sensors generally vary between tens and hundreds of kilometres wide. As the satellite orbits the Earth from pole to pole, its east-west position wouldn’t change if the Earth didn't rotate. However, as seen from the Earth, it seems that the satellite is shifting westward because the Earth is rotating (from west to east) beneath it. This apparent movement allows the satellite swath to cover a new area with each consecutive pass. The satellite’s orbit and the rotation of the Earth work together to allow complete coverage of the Earth’s surface, after it has completed one complete cycle of orbits. 17 Swath Width Landsat 8 Swath Animation https://youtu.be/xBhorGs8uy8 ~ 1 min 18 Repeat Cycle A satellite generally follows a path around the Earth. The time taken to complete one revolution of the orbit is called the orbital period. The satellite traces out a path on the earth surface, called its ground track, as it moves across the sky. As the Earth is also rotating, the satellite traces out a different path on the ground in each subsequent cycle. The time interval in which nadir point of the satellite passes over the same point on the Earth’s surface for a second time (when the satellite retraces its path) is called the repeat cycle of the satellite. For instance, Radarsat-1 was designed in a Sunsynchronous orbit with a 343/24 repeat cycle that implies after 343 revolutions and 24 nodal days, the satellite shall, within an error, return to the same spot over the Earth. 19 Revisit Time i.e. temporal resolution The satellite revisit time is the time elapsed between observations of the same point on earth by a satellite. 
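The across-track scanner slide above gives the swath as 2H*tan(θ/2), and the FOV/IFOV slides express the IFOV as a small cone angle β, so the ground spot is roughly H × β. A minimal sketch of both relations; the 705 km altitude appears on the slides, while the 15° angular field of view and the IFOV values are assumptions chosen only to illustrate the arithmetic:

```python
import math

def swath_width_m(altitude_m, angular_fov_deg):
    """Ground swath of an across-track scanner: 2 * H * tan(theta / 2)."""
    theta = math.radians(angular_fov_deg)
    return 2.0 * altitude_m * math.tan(theta / 2.0)

def ground_ifov_m(altitude_m, ifov_mrad):
    """Ground spot seen by one detector (small-angle approximation): H * beta."""
    return altitude_m * ifov_mrad * 1e-3

H = 705_000.0  # a 705 km orbit, as quoted on the slides
print(f"Swath for a 15 deg angular field of view: {swath_width_m(H, 15) / 1000:.0f} km")   # ~186 km
print(f"Ground IFOV for beta = 0.043 mrad: {ground_ifov_m(H, 0.043):.1f} m")               # ~30 m
print(f"Ground IFOV for beta = 2.5 mrad at 3 km flying height: {ground_ifov_m(3000, 2.5):.1f} m")
```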
Different from the repeat cycle which only depends on the orbit, the revisit time is relevant to the payload of the satellite. It depends on the satellite's orbit, target location, and swath of the sensor. Repeat cycle and revisit time https://hsat.space/wp-content/uploads/2020/09/SSO.mp4?_=1 20 FOV and IFOV FOV, or Field of View, is the whole area that your sensor can see at a set distance. IFOV, or Instantaneous Field of View is the smallest detail within the FOV that can be detected or seen in an instant. The IFOV is normally expressed as the cone angle (β) within which incident energy is focused on the detector. FOV and IFOV are related to spatial resolution. 21 Size of IFOV Most airborne and satellite systems IFOV=0.5-5 mRad Small IFOV good for high spatial detail (i.e., high spatial resolution). Large IFOV means large amount of energy focused on the detector. ✓ more sensitive to scene radiance; ✓ better radiometric resolution; ✓ can distinguish very slight energy differences; Thus, there is a trade-off between high spatial resolution and high radiometric resolution in the design of multispectral scanner systems. 22 IFOV and signal-to noise ratio For large IFOV, signal much greater than background electronic noise - thus higher S/N ratio than one with a small IFOV. Thus, spatial resolution is sacrificed for these higher signal levels. System noise in Landsat image band 2 23 Questions Which type of orbit is commonly used for weather forecasting? List and describe the four types of image resolutions. Explain the difference between FOV and IFOV. 24 Section 2 Common Satellite Platforms and Sensors 25 Common Satellite Platforms and Sensors Medium Resolution Satellites - Landsat Moderate Resolution Satellites - Terra (MODIS) Copernicus Satellites constellation - Sentinels High Resolution Satellites o IKONOS o QuickBird o WorldView Small Satellites and Sensors 26 Medium Resolution Sensors Landsat A joint program of National Aeronautics and Space Administration (NASA) and United States Geological Survey (USGS) www.nasa.gov/landsat landsat.usgs.gov 27 Landsat History Landsat: Celebrating 50 Years (5 mins) https://www.youtube.com/watch?v=7XKVSTX1vdE 28 Landsat Missions Overview Landsat Missions Timeline 29 Landsat Missions Overview Spectral Bandpasses for all Landsat Sensors 30 Landsat 7 Landsat 7 carries the Enhanced Thematic Mapper Plus (ETM+) sensor, an improved version of the Thematic Mapper instruments that were onboard Landsat 4 and Landsat 5. Eight spectral bands, including a pan and thermal band: 1. 2. 3. 4. 5. 6. 7. 8. Band 1 Blue (0.45 - 0.52 µm) 30 m Band 2 Green (0.52 - 0.60 µm) 30 m Band 3 Red (0.63 - 0.69 µm) 30 m Band 4 NIR (0.77 - 0.90 µm) 30 m Band 5 SWIR-1 (1.55 - 1.75 µm) 30 m Band 6 Thermal (10.40 - 12.50 µm) 60 m Low Gain / High Gain Band 7 SWIR-2 (2.08 - 2.35 µm) 30 m Band 8 Panchromatic (PAN) (0.52 - 0.90 µm) 15 m 31 Landsat 7 On May 31, 2003 The Scan Line Corrector (SLC) that compensates for the forward motion of the satellite was failed The failure is permanent The sensor’s line of sight traces a zig-zag pattern Repeat cycle 16 days 32 Landsat 7 Daily Landsat-scale evapotranspiration estimation over a forested landscape in North Carolina, USA [link] 33 Landsat 8 Operational Land Imager (OLI) - Built by Ball Aerospace & Technologies Corporation Nine spectral bands, including a pan band: 1. 2. 3. 4. 5. 6. 7. 8. 9. 
Band 1 Coastal/Aerosol (0.43 - 0.45 µm) 30 m Band 2 Blue (0.450 - 0.51 µm) 30 m Band 3 Green (0.53 - 0.59 µm) 30 m Band 4 Red (0.64 - 0.67 µm) 30 m Band 5 NIR (0.85 - 0.88 µm) 30 m Band 6 SWIR-1 (1.57 - 1.65 µm) 30 m Band 7 SWIR-2 (2.11 - 2.29 µm) 30 m Band 8 Panchromatic (PAN) (0.50 - 0.68 µm) 15 m Band 9 Cirrus (1.36 - 1.38 µm) 30 m Thermal Infrared Sensor (TIRS) - Built by NASA Goddard Space Flight Center Two spectral bands: 1. 2. Band 10 TIRS 1 (10.6 - 11.19 µm) 100 m Band 11 TIRS 2 (11.5 - 12.51 µm) 100 m Repeat cycle 16 days 34 Landsat 7 vs Landsat 8 36 Landsat 9 OLI-2 sensor TIRS-2 sensor Illustration of Landsat 9 Observatory Source: Landsat 9 Data Users Handbook 37 Landsat 9 ❑ The OLI–2 improves radiometric precision (14-bit quantization increased from 12 bits for Landsat 8). Nine spectral bands: 1. 2. 3. 4. 5. 6. 7. 8. 9. Band 1 Coastal/Aerosol (0.43 - 0.45 µm) 30 m Band 2 Blue (0.450 - 0.51 µm) 30 m Band 3 Green (0.53 - 0.59 µm) 30 m Band 4 Red (0.64 - 0.67 µm) 30 m Band 5 Near-Infrared (0.85 - 0.88 µm) 30 m Band 6 SWIR 1(1.57 - 1.65 µm) 30 m Band 7 SWIR 2 (2.11 - 2.29 µm) 30 m Band 8 Panchromatic (PAN) (0.50 - 0.68 µm) 15 m Band 9 Cirrus (1.36 - 1.38 µm) 30 m ❑ Thermal Infrared Sensor 2 (TIRS-2) 1. Band 10 TIRS 1 (10.6 - 11.19 µm) 100 m 2. Band 11 TIRS 2 (11.5 - 12.51 µm) 100 m 38 Landsat 8 + Landsat 9 Landsat 9 will replace Landsat 7 (launched in 1999), taking its place in orbit (8 days out of phase with Landsat 8). The combined Landsat 8 and Landsat 9 revisit time for data collection with be every 8 days. 39 Worldwide Reference System (WRS) The Worldwide Reference System (WRS) is a global notation system for Landsat data. It enables a user to inquire about satellite imagery over any portion of the world by specifying a nominal scene center designated by PATH and ROW numbers. Landsat satellites 1, 2 and 3 follow WRS-1, and Landsat satellites 4, 5, 6, 7, 8, and 9 follow WRS-2. [link] 40 Worldwide Reference System (WRS) A map of the Worldwide Reference System-2 [link] 41 Worldwide Reference System (WRS) [link] 42 Worldwide Reference System (WRS) A B 2 1 3 C Landsat WRS Path Row that covers HK 43 Download Landsat imagery https://earthexplorer.usgs.gov/ 44 Terra Instruments Since 1999, the Terra satellite has been continually observing Earth. Terra is an international mission carrying five instruments that observe Earth’s atmosphere, ocean, land, snow and ice, and energy budget. United States: MODIS (Moderate-resolution Imaging Spectroradiometer) CERES (Clouds and Earth’s Radiant Energy System) MISR (Multi-angle Imaging SpectroRadiometer) Japan: ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer) Canada: MOPITT (Measurements of Pollution in the Troposphere) 45 Terra Instruments Twenty Years of Terra in Our Lives 3 min 46 Moderate Resolution Imaging Spectroradiometer (MODIS) MODIS is a key instrument aboard the Terra (originally known as EOS AM-1) and Aqua (originally known as EOS PM-1) satellites. Terra MODIS and Aqua MODIS are viewing the entire Earth's surface every 1 to 2 days, acquiring data in 36 spectral bands, or groups of wavelengths. Orbit: 705 km, 10:30 a.m. descending node (Terra) or 1:30 p.m. ascending node (Aqua), sun-synchronous, near-polar, circular. Swath Dimensions: 2330 km (cross track) by 10 km (along track at nadir). Design Life: 6 years These data will improve our understanding of global dynamics and processes occurring on the land, in the oceans, and in the lower atmosphere. 
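For quick reference while reading the Landsat 8/9 OLI band lists above, the band numbers, wavelength ranges and pixel sizes can be kept in a small lookup table; the helper below is only an illustration, not part of any library:

```python
# Landsat 8 OLI bands as listed on the slide: (name, wavelength range in um, pixel size in m).
OLI_BANDS = {
    1: ("Coastal/Aerosol", (0.43, 0.45), 30),
    2: ("Blue",            (0.45, 0.51), 30),
    3: ("Green",           (0.53, 0.59), 30),
    4: ("Red",             (0.64, 0.67), 30),
    5: ("NIR",             (0.85, 0.88), 30),
    6: ("SWIR-1",          (1.57, 1.65), 30),
    7: ("SWIR-2",          (2.11, 2.29), 30),
    8: ("Panchromatic",    (0.50, 0.68), 15),
    9: ("Cirrus",          (1.36, 1.38), 30),
}

def bands_covering(wavelength_um):
    """Band numbers whose spectral range contains the given wavelength."""
    return [num for num, (_, (lo, hi), _) in OLI_BANDS.items() if lo <= wavelength_um <= hi]

print(bands_covering(0.55))  # [3, 8]: green light falls in both the Green and Panchromatic bands
```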
https://modis.gsfc.nasa.gov/about/specifications.php 47 Moderate Resolution Imaging Spectroradiometer (MODIS) Spatial Resolution: 1. 250m (bands 1-2) 2. 500m (bands 3-7) 3. 1000m (bands 8-36) Wavelengths of 36 bands: 1. 1-19 from 405 nm to 2155 nm 2. 20-36 from 3.66 µm to 14.28 µm Four different categories of bands for earth observation Bands 1 to 7: Land Bands and Cloud Bands, Bands 8 to 16: Ocean Colour Bands, Bands 17 to 36: Atmosphere and Cloud Bands, Bands 20 to 36: Thermal Bands (collect measurements in night mode). 48 Moderate Resolution Imaging Spectroradiometer (MODIS) 36 channels in MODIS image [link] 49 Application examples of MODIS imagery “Massive sandstorm forms over Mongolia, hits China's capital Beijing as the worst in a decade.” Early Season Dust Storm Hits Beijing (March 15, 2021, Link) 50 ESA Sentinels for Copernicus ESA (European Space Agency) is developing a new family of missions called Sentinels specifically for the operational needs of the Copernicus programme. Each Sentinel mission is based on a constellation of two satellites to fulfil revisit and coverage requirements, providing robust datasets for Copernicus Services. These missions carry a range of technologies, such as radar and multispectral imaging instruments for land, ocean and atmospheric monitoring. Sentinel family 51 ESA Sentinels for Copernicus Sentinel-1 is a polar-orbiting, all-weather, day-and-night synthetic aperture radar for land and ocean services. Sentinel-1A was launched on 3 April 2014 and Sentinel-1B on 25 April 2016. Sentinel-2 is a polar-orbiting, multispectral high-resolution imaging mission for land monitoring to provide, for example, imagery of vegetation, soil and water cover, inland waterways and coastal areas. Sentinel-2A was launched on 23 June 2015 and Sentinel-2B followed on 7 March 2017. https://rus-copernicus.eu/portal/ 52 ESA Sentinels for Copernicus Sentinel-2 Data Products 53 ESA Sentinels for Copernicus Sentinels for Copernicus (Optional) https://youtu.be/xcflQZJ5n88 ~ 5 mins Know all about the Sentinel Mission https://youtu.be/W3fv7TUmqf8 ~ 3 mins 55 DigitalGlobe Commercial satellites DigitalGlobe Satellite Imagery: Worldview, GeoEye and IKONOS 56 IKONOS Commercial satellite Panchromatic: 0.45-0.90 μm, 0.82 m. Multispectral (3.2 m): #1: Blue 0.45–0.52 μm, #2: Green 0.52–0.60 μm, #3: Red 0.63–0.69 μm, #4: Near IR 0.76–0.90 μm Data quantization: 11-bit (2 bytes per pixel) pixel values 0-2047 Swath width: 11.3 km Areas of interest: single image at 13 km x 13 km Revisit time: less than 3 days Decommissioned in March 2015 57 QuickBird Panchromatic resolution 0.6m Multispectral resolution 2.4m Mission duration: 13 years and 2 months De-orbited on January 27, 2015 58 WorldView-4 Specification Details Orbit Sun-synchronous Altitude 617km Mission Lifetime Decommissioned Spatial Resolution Panchromatic 31 cm at nadir (GSD)* Multispectral 1.24 m at nadir (GSD)* *Ground Sample Distance Accuracy < 4 m CE90 Spectral Bands Panchromatic: 450 – 800 nm Blue: 450 – 510 nm Green: 510 – 580 nm Red: 655 – 690 nm Near infrared: 780 – 920 nm Stereo Available Yes Largest Scale 1:1000 Dynamic Range 11 bit Coverage: Up to 680,000km² per day Pan sharpened WorldView-4 image at 0.3m resolution 59 PROBA-1 (Project for On-Board Autonomy) Compact high-resolution imaging spectrometer (CHRIS) weight 14kg This small (60×60×80 cm; 95 kg) boxlike system, with solar panel collectors on its surface, has remarkable image-making qualities. 
It hosts two instruments including a hyperspectral system (200 narrow bands) that images at 17 m resolution, and a monochromatic camera that images visible light at 5 m resolution. 60 CubeSat | Mini cube satellites What is a CubeSat? https://youtu.be/nsdMcqiBmvY (2 mins) Cubesats | Mini cube satellites https://youtu.be/-BGXRGoEnAc (15 mins) Educational CubeSat project: https://github.com/oresat/getting-started 61 Unmanned Aerial Vehicle (UAV) – Micro-drones [link] How are drones helping farmers keep an eye on crops? [link] (2:10 - 3:15) 62 Questions Describe the strengths and limitations of the satellites introduced 63 Section 3 Characteristics of Remote Sensing Images 64 Characteristics of Remote Sensing Images Raster image Binary data Color Image Image resolutions 65 Digital data A data model is a way of defining and representing reality in a system, different models are used for different systems. In particular, GIS or geospatial data models are used to describe geographic features in the reality. In general, we have the following two models Vector data model represents features in term of discrete points, lines, and polygons Raster data model represents features in term of a matrix of cells In raster, each cell carries value representing the geographic characteristic. 66 Raster Image 67 Raster Image 68 Raster Image 2 20 35 50 43 35 50 43 35 50 50 73 96 119 119 134 28 43 58 50 43 58 50 43 58 58 81 103 126 126 141 50 66 81 73 66 81 73 66 81 81 103 126 149 149 149 66 81 96 88 81 96 88 81 96 96 119 141 164 164 157 50 66 81 73 66 81 73 66 81 81 103 126 149 149 164 58 73 88 81 73 88 81 73 88 88 111 134 157 157 172 66 81 96 88 81 96 88 81 96 96 119 141 164 164 179 73 88 103 96 88 103 96 88 103 103 126 149 172 172 187 66 81 96 88 81 96 88 81 96 96 119 141 164 164 194 66 81 96 88 81 96 88 81 96 96 119 141 164 164 202 73 88 103 96 88 103 96 88 103 103 126 149 172 172 210 88 103 119 111 103 119 111 103 119 119 141 164 187 187 217 111 126 141 134 126 141 134 126 141 141 164 187 210 210 225 119 134 149 141 134 149 141 134 149 149 172 194 217 217 232 126 141 157 149 141 157 149 141 157 157 179 202 225 225 240 69 Comparisons between remote sensing images and natural images Remote Sensing Images Landsat-8 OLI true color images illustrating the coverage over the coastal waters of French Guiana (4 scenes) Natural Images Examples in the ImageNet dataset Image Source: Zorrilla et al., 2019, Optics Express; Deng et al., 2009, CVPR 70 Comparisons between remote sensing images and natural images Image resolution (size) Remote Sensing Images The width/height of a Landsat-8 image is 6000-8000 Large image size Natural Images The average size of an ImageNet image is 469x387 Relatively small image size Image Source: Zorrilla et al., 2019, Optics Express; Deng et al., 2009, CVPR 71 Comparisons between remote sensing images and natural images Image channel Remote Sensing Images 11bands (Coastal aerosol, B, G, R, NIR, SWIR1, SWIR2, Panchromatic, Cirrus, TIRS1, TIRS2) Multiple channels/bands Natural Images 3 R-G-B bands RGB channels Image Source: Zorrilla et al., 2019, Optics Express; Deng et al., 2009, CVPR 72 Comparisons between remote sensing images and natural images Geo-location information Remote Sensing Images With geo-location information Natural Images Without geo-location information (might contain geo-tags) Image Source: Zorrilla et al., 2019, Optics Express; Deng et al., 2009, CVPR 73 Comparisons between remote sensing images and natural images Remote Sensing Images Natural Images Large image size 
vs. relatively small image size. Multiple channels/bands vs. R-G-B channels. Large data value range (e.g. 0-65535) vs. small data value range (0-255). With geo-location (georeferenced) information vs. without geo-location information. 74
Spectral characteristics: bits and bytes. All instructions carried out within a computer are in binary code, which consists of the digits 0 and 1. These are called binary digits, or bits. This machine code is executed by a series of electrical pulses which send signals off (0) or on (1). The number 2 is the base of the binary number system, just as 10 is the base of the decimal number system. Any binary number comprises a string of bits that are worked from right to left in increasing powers of 2. Thus the binary number 101 is interpreted in decimal (our number system) as 5 (1 × 2⁰ = 1, 0 × 2¹ = 0, 1 × 2² = 4 → 5). 75
Examples of binary data: MODIS data product Quality Assessment flags; Landsat-8 Quality Assessment Band. Bits: Bit 0 = 0 = not fill; Bit 1 = 0 = not a dropped frame; Bit 2 = 0 = not terrain occluded; Bit 3 = 0 = not determined; Bits 4-5 = 01 = not water; Bits 6-7 = 00 = not determined; Bits 8-9 = 00 = not determined; Bits 10-11 = 01 = not snow/ice; Bits 12-13 = 10 = could be cirrus cloud; Bits 14-15 = 11 = cloudy. Source: MODIS Surface Reflectance User's Guide (Collection 6); Landsat 8 Data Users Handbook 76
Conversion between binary digits and decimal numbers: (10111)₂ = (1 × 2⁰) + (1 × 2¹) + (1 × 2²) + (0 × 2³) + (1 × 2⁴) = (23)₁₀. Mod(x, y) is a function that returns the remainder after dividing x by y: Mod(23,2) = 1, Mod(11,2) = 1, Mod(5,2) = 1, Mod(2,2) = 0, Mod(1,2) = 1. 77
Conversion between binary digits and decimal numbers: (1010101)₂ = (?)₁₀; (99)₁₀ = (?)₂ 78
Conversion between binary digits and decimal numbers: (1010101)₂ = (85)₁₀; (99)₁₀ = (1100011)₂ 79
Counting in binary: With 6 bits the maximum number that can be represented is 63. What is the maximum number with 7 bits? What is the maximum number with 8 bits? (8-bit scale) 80
Counting in binary: With 6 bits the maximum number that can be represented is 63. With 7 bits it is 63 + 1 × 2⁶ = 127, and with 8 bits it is 127 + 1 × 2⁷ = 255. The number of bits needed to represent most remote sensing data is 8. This is convenient since the standard storage unit in computers is 8 bits. This corresponds to 1 byte, which takes up only 1 unit of computer storage space (memory or disk). 81
Bytes: Thus 1 byte = 1 unit, 1 KB = 1024 bytes (2¹⁰), and 1 MB = 1024 × 1024 bytes = 1,048,576 bytes. Using this system, some numbers require more storage space than 1 byte, e.g. integers above 255 and real numbers such as 270, 3.456, 0.24. In remote sensing (and also GIS) we recognize that data will be stored, in order of preference, as 1. BYTE data (least storage space, smaller file sizes) 2. INTEGER 3. REAL (largest file sizes). Some remotely sensed data are measured on a scale of 0-1023, and their storage requires 10 bits. Most computer systems allow integers to be stored as 8-, 16-, or 32-bit quantities. 82
Data type: byte / int8, 1 byte = 8 bits, -128 to 127 (signed) or 0 to 255 (unsigned); int16, 2 bytes = 16 bits, -32768 to 32767 (signed) or 0 to 65535 (unsigned); float, 4 bytes = 32 bits, about -3.4e38 to +3.4e38, 'single precision' floating point number according to IEEE 754; double, 8 bytes = 64 bits, -Inf to +Inf, 'double precision'. Data types define the kind of value and its range; a variable can be operated on and transformed in a computer program. 83
Color Image: A color image is produced by using three raster arrays, i.e.
RGB Each array element holds pixel values that represent the levels of one the three primary Colors of light. Normally, each pixel value has 0-255 levels or 8-bit data but could be 10-bit, e.g. AVHRR 84 Color allocation for Color composite images True Color – 3 bands False Color – 3 bands Pseudo Color – single band True Color image False Color NIR image 85 Color allocation for Color composite images True Color image Flase Color image 86 Color allocation for Color composite images Flase Color images 87 Color allocation for Color composite images Pseudo Color images Use 1 band of data Image data values 0-255 Color LUT can be greyscale or any Color palette Used mainly for classified images 88 Image Resolutions The size the sensor can resolve The ability of the sensor to differentiate variations in brightness The ability to detect changes over time The ability of the sensor to define fine wavelength intervals The four Remote Sensing resolutions that define the image data 89 Spatial resolution The fineness of spatial detail visible in an image. “Fine detail” means that small objects can be identified on an image. System’s spatial resolution is expressed in meters (or feet) of the ground-projected instantaneous field of view (IFOV) Finer spatial resolution → greater the resolving power of the sensor system 91 Spatial resolution 92 Spatial resolution and pixel size The spatial resolution and pixel size are often used interchangeably. In reality, they may not be equivalent. An image sampled at a small pixel size does not necessarily have a high resolution. Why? 240px by 240px 10 m resolution, 10 m pixel size 80px by 80px 30 m resolution, 10 m pixel size 30px by 30px 80 m resolution, 10 m pixel size 93 Spatial resolution vs extent However, there is trade-off between spatial resolution and spatial extent. The higher the spatial resolution, the smaller area it can be covered by one single image. Landsat-7 ETM+ 30m MODIS 250m to 1km 94 Temporal resolution How frequently a satellite observe the same area on the Earth. It is also known as Repeat Cycle. Depends on a variety of factors, including the satellite/sensor capabilities (as some satellites can point its sensor to specified area), orbits, the swath overlap, and latitude (higher latitude gives increasing overlap in adjacent swaths). high temporal resolution = better source for change detection. 95 Temporal resolution 96 Spectral resolution Spectral resolution describes the ability of a sensor to define fine wavelength intervals. Coarse – sensitive to large portion of EM spectrum contained in a small number of wide bands Fine – sensitive to same portion of EM spectrum but have many narrow bands Goal – Recording very fine spectral details to distinguish between scene objects and features Landsat-7 image contains 7 channels/bands. Landsat-8 image contains 11 channels/bands. Hyperion can resolve 220 spectral bands (from 0.4 to 2.5 µm) Airborne Visible / Infrared Imaging Spectrometer (AVIRIS): 224 contiguous spectral channels (bands) with wavelengths from 400 to 2500 nano-meters. 97 Multispectral vs Hyperspectral The main difference between multispectral and hyperspectral imaging is the number of wavebands being imaged and how narrow the bands are. Multispectral imagery generally refers to 3 to 10 discrete “broader” bands. Hyperspectral imagery consists of much narrower bands (10-20 nm). A hyperspectral image could have hundreds of thousands of bands. 
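The bits-and-bytes material above (the largest value an n-bit count can hold, binary/decimal conversion, and packed quality-assessment flags) can be checked with a few lines of Python. The QA value at the end is made up purely to reproduce the bit layout shown on the slide:

```python
def max_value(n_bits):
    """Largest unsigned integer representable with n bits: 2**n - 1."""
    return 2 ** n_bits - 1

def to_binary(value):
    """Decimal -> binary by repeated division, mirroring the Mod(x, 2) steps on the slide."""
    bits = ""
    while value > 0:
        bits = str(value % 2) + bits
        value //= 2
    return bits or "0"

def extract_bits(packed_value, start_bit, n_bits):
    """Pull a small bit field out of a packed quality-assessment word."""
    return (packed_value >> start_bit) & max_value(n_bits)

print(max_value(7), max_value(8))        # 127 255
print(to_binary(99), int("1010101", 2))  # 1100011 85
# Hypothetical QA word whose bits 14-15 are set to 11 ("cloudy" in the slide's example layout):
print(extract_bits(0b1100000000000000, 14, 2))  # 3, i.e. binary 11
```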
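To tie the colour-composite slides to the 8-bit display range discussed in the bits-and-bytes section, here is a sketch of building a true-colour (R, G, B) and a false-colour NIR (NIR, R, G) composite with a simple percentile stretch. The band arrays are random stand-ins, and the 2%/98% stretch limits are an assumption, not a fixed rule:

```python
import numpy as np

def stretch(band, low_pct=2, high_pct=98):
    """Linear contrast stretch of one band to the 0-255 (8-bit) display range."""
    lo, hi = np.percentile(band, [low_pct, high_pct])
    scaled = np.clip((band - lo) / (hi - lo), 0.0, 1.0)
    return (scaled * 255).astype(np.uint8)

# Stand-in bands (random DNs); with real data these would be the blue, green,
# red and near-infrared bands of a multispectral image.
rng = np.random.default_rng(0)
blue, green, red, nir = (rng.integers(0, 10_000, (100, 100)) for _ in range(4))

true_color = np.dstack([stretch(red), stretch(green), stretch(blue)])      # R, G, B
false_color_nir = np.dstack([stretch(nir), stretch(red), stretch(green)])  # NIR, R, G
print(true_color.shape, true_color.dtype)  # (100, 100, 3) uint8
```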
98 Spatial and spectral resolution of panchromatic versus visible-light bands: A panchromatic image (B&W) records the whole visible portion of the electromagnetic spectrum, which allows a high signal-to-noise ratio (SNR). Its spectral resolution is fairly coarse, so adequate signal can be collected by smaller detectors, giving higher spatial resolution. Color film is also sensitive to the reflected energy over the visible portion of the spectrum but has higher spectral resolution: it is individually sensitive to the reflected energy at the blue, green, and red wavelengths. However, it has lower spatial resolution. 99
Spectral reflectance: An example showing the spectral reflectance of typical ground objects 100
Spectral reflectance: Image Source: Sun et al., 2017 101
Spectral reflectance: Comparison of spectra for alunite from four sensors with different spectral resolutions (Image Source: USGS) 102
Materials spectra collection: Collection of field and laboratory measurements for the USGS Spectral Library (Image Source: USGS) 103
Materials spectra collection: Reflectance = Reflected radiance / Incoming radiance. White reference (left), measured reflected radiance of a white reference and target surface (middle), and calculated reflectance of the target surface (right) 104
Radiometric resolution: The ability of the sensor to differentiate very slight variations in brightness/energy. The maximum number of brightness levels available depends on the number of bits used in representing the energy recorded. The finer the radiometric resolution of a sensor, the more sensitive it is to small differences in reflected or emitted energy. Coarse radiometric resolution: a landscape is recorded using only a few brightness levels, i.e. few bits (very high contrast). Fine radiometric resolution: the same landscape is recorded using many brightness levels, i.e. more bits, and is thus more sensitive to small differences in reflected energy. 105
Radiometric resolution: (a) 1 bit – 2 levels, (b) 2 bits – 4 levels, (c) 3 bits – 8 levels, (d) 4 bits – 16 levels 106
Radiometric resolution: with n bits there are 2ⁿ brightness levels, ranging from 0 to 2ⁿ - 1 (1 bit – 2 levels, 2 bits – 4 levels, 3 bits – 8 levels, 4 bits – 16 levels, ...) 107
Radiometric resolution: Multispectral Scanner System (Landsat MSS), Landsat TM, AVHRR 108
Radiometric resolution: However, there is a trade-off between radiometric resolution and spatial/spectral resolution. Finer radiometric resolution requires sufficient signal strength and a desirably high signal-to-noise ratio to give a correct signal allowing discrimination of very slight energy differences. In general, high flux per unit area and signal correction are necessary. Finer radiometric resolution can be obtained by reducing spatial resolution (a larger GSD) or by broadening the band incident upon a sensor. SNR is the ratio of desired signal power to noise power, i.e. SNR = signal power / noise power. 109
Trade-offs between image resolutions: It is very difficult to obtain extremely high spectral, spatial, temporal, and radiometric resolution at the same time. 110
Trade-offs – a short summary: It is very difficult to obtain extremely high spectral, spatial, temporal, and radiometric resolution at the same time. Several sensors can obtain global coverage every one to two days because of their wide swath width, which means lower spatial resolution. Higher-spatial-resolution polar/non-polar sensors may take 8-16 days to attain global coverage.
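As a small illustration of the 2ⁿ brightness levels mentioned above, the sketch below requantizes a smooth ramp to different bit depths and computes a reflectance ratio as defined on the materials-spectra slide; all of the numbers are arbitrary:

```python
import numpy as np

def quantize(signal, n_bits):
    """Requantize a 0-1 signal to 2**n_bits brightness levels (0 .. 2**n_bits - 1)."""
    levels = 2 ** n_bits
    return np.floor(np.clip(signal, 0, 1) * (levels - 1) + 0.5).astype(int)

def reflectance(reflected_radiance, incoming_radiance):
    """Reflectance = reflected radiance / incoming radiance (white-reference method)."""
    return reflected_radiance / incoming_radiance

ramp = np.linspace(0, 1, 9)
print(quantize(ramp, 1))        # 1 bit,  2 levels: [0 0 0 0 1 1 1 1 1]
print(quantize(ramp, 3))        # 3 bits, 8 levels: [0 1 2 3 4 4 5 6 7]
print(reflectance(12.5, 50.0))  # 0.25
```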
Geostationary satellites obtain much more frequent observations but at lower spatial resolution due to the much greater orbital distance and only over a fraction of the earth. SNR can also be increased by broadening the wavebands and thus increasing radiometric quality. However, this also scarifies spectral resolution, i.e. the ability to discriminate fine spectral differences. 111 Questions Why it is very difficult to obtain extremely high spectral, spatial, temporal, and radiometric resolution at the same time? Try to explain in your own words. 112 Summary 1. Basic Specifications of Satellite and Sensor Sensors and Imaging Scanners Satellite Orbit Swath Width Repeat cycle and Revisit time FOV and IFOV 2. Introduction to the Satellites/Sensors Medium Resolution Satellites - Landsat Moderate Resolution Satellites - Terra (MODIS) Copernicus Satellites constellation - Sentinels High Resolution Satellites Small Satellites and Sensors 3. Characteristics of Remote Sensing Images Raster image Binary data Color Image Image resolutions 113 Discussion 114 Group Project Members and topic for the group project 115 Which kind of satellite images should I select for my research? Data availability (accessibility, time range, etc.) Scale issue (spatial/temporal resolutions) Others o Are there any existing products that meet the requirements? o What image processing methods/techniques should be used? o… 116 Homework Read the following satellite data user handbooks before the next lecture. ❑ Landsat 8 Data Users Handbook (Recommended) https://d9-wret.s3.us-west-2.amazonaws.com/assets/palladium/production/s3fspublic/atoms/files/LSDS-1574_L8_Data_Users_Handbook-v5.0.pdf ❑ MODIS Surface Reflectance User’s Guide https://modis-land.gsfc.nasa.gov/pdf/MOD09_UserGuide_v1.4.pdf ❑ Sentinel-2 User Handbook https://sentinels.copernicus.eu/documents/247904/685211/Sentinel2_User_Handbook.pdf/8869acdf-fd84-43ec-ae8c-3e80a436a16c?t=1438278087000 117 End of Lesson 2 118 LSGI536 Remote Sensing Image Processing Lecture 3 Remote Sensing Image Pre-processing Dr. Zhiwei Li Research Assistant Professor Department of Land Surveying and Geo-Informatics The Hong Kong Polytechnic University Email: [email protected] Outlines 1. Processes of Image Pre-processing 2. Radiometric Pre-processing 3. Geometric Pre-processing 4. Other Pre-processing Processes 2 Section 1 Processes of Image Pre-processing 3 How to collect and exploit Remote Sensing data? Space-borne instruments can only measure the properties of electromagnetic waves emitted, reflected or scattered by the Earth’s surface. Scientists need to understand where these waves originate from, how they interact with the environment, and how they propagate towards the sensor. This is based on understanding the radiation sources and how radiation propagates in the atmosphere so that sensor for measuring radiation can be developed. More precisely, sensors on space platforms are developed that record target scene by receiving photons or EM waves in term of variations in electrical currents or voltages. The electrical variations are immediately converted into digital numbers forming a pixel value in an image. 4 How to collect and exploit Remote Sensing data? The ultimate goal is that users can obtain reliable biogeophysical quantities with respect to the environment such as atmospheric pollution, oceanic currents, crop productivity, etc. However, extract useful and reliable information from remote sensing data is a big challenge. 
To know the value of the radiation measurements captured by the sensors in space, the next question is How can we derive the bio-geophysical quantities correctly subject to systematic distortion, random error and characteristics of the environment that are responsible for the radiation to reach the sensor? All these are related to pre-processing of remote sensing data. 5 Image pre-processing The goal of pre-processing is to correct distorted (geometric) or degraded (radiometric) image data. to create a more faithful representation of the original image. depict the reality prior to any further enhancement, interpretation, or analysis. 6 Image pre-processing Signal Noise Removal This is mainly related to the senor and its platform and should be removed before applying other corrections. Geometric Correction Due to altitude, orientation and velocity of the platform Earth curvature and rotation. Cloud Screening Radiometric Correction Conversion from DN to radiance, related to sensor sensitivity and calibration. Sun Angle correction apply to images acquired under different solar illumination angles by normalization assuming the sun was at the zenith on each of the sensing. Atmospheric Correction Radiation attenuation due to scattering and absorption May not be necessary for some applications such as classification for single date. But crucial for spectral comparison of image acquired on different dates. 7 After image pre-processing Image enhancement contrast enhancement (make features look more clearly by optimal use of colour) and image filtering (enhance or suppress) specific spatial patterns in an image to improve the appearance of the imagery to assist in visual interpretation, automated feature extraction, and analysis Image classification digitally identify and classify pixels and each objects in the image Data merging/data fusion combine and transform the original bands into “new” images that better display or highlight certain features in the scene, e.g., pan sharpening. 8 Digital Image Processing The process of extracting information from digital images obtained from satellites. Information regarding each pixel is fed into an algorithm and the result of the computation stored for that pixel. Thus, for each image being processed by a particular algorithm there is an input and output image. Order of processing is important. 10 What are tiers? Landsat Collections — What are Tiers? (3 mins) https://www.youtube.com/watch?v=NruC3z4peBc What are the differences between the three tiers? 11 What are tiers? – Real Time All the data are immediately provided after acquisition in less than 12 hours (4-6 hours typically) – nearly in real time. Be useful when monitoring events and natural disasters. Transition from RT to Tier 1 or Tier 2 is 14-26 days delay. 12 What are tiers? – Tier 1 RMSE smaller than or equal to 12 meters Landsat scenes with the highest available data quality are placed into Tier 1 13 What are tiers? – Tier 2 Adhere to the same radiometric standard as Tier 1 scenes Do not meet the Tier 1 geometry specification RMSE higher than 12 meters caused by significant cloud cover and/or insufficient ground control points 14 Section 2 Radiometric Pre-processing 15 Radiometric pre-processing Noise removal Radiometric correction Atmospheric correction Topographic correction 16 Noise removal Noise It is unwanted disturbance in image data that is due to limitations in the sensing, signal digitation, or data recording process. 
The effects of noise range from a degradation to total masking of the true radiometric information content of the digital image. It may either be systematic (banding of multispectral images) to dropped lines or parts of lines, or random speckle. Critical to the subsequent processing and classification of an image. Produce an image that is as close to the original radiometry of the scene as possible. 17 Example of systematic noise removal Line drop A number of adjacent pixels along a line (or an entire line) may contain spurious DNs. Solution? 18 Example of systematic noise removal Line drop A number of adjacent pixels along a line (or an entire line) may contain spurious DNs. Solution Can be solved by replacing the defective DNs with the average of the values of the pixels occurring in the lines just above and below. Alternatively The DNs from the preceding line can simply be inserted in the defective pixels. 19 Example of random noise removal Bad pixels (bit errors): Detector does not record spectral data for an individual pixel. Such noise is often referred to as being “spikey” in character, and it causes images to have a “salt and pepper” or “snowy” appearance. Solution? 20 Example of random noise removal Bad pixels (bit errors): Detector does not record spectral data for an individual pixel. Such noise is often referred to as being “spikey” in character, and it causes images to have a “salt and pepper” or “snowy” appearance. Solution: Noise can be identified by comparing each pixel in an image with its neighbours. Normally change much more abruptly than true image values. 21 Example of random noise removal The noisy pixel value can then be replaced by the average of its neighbouring values. Moving neighbourhoods or windows of 3 х 3 or 5 x 5 pixels are typically used in such procedures. 22 Stripe noise Sixteen-line frequency noise in a Landsat TM band 2 – Sumatra coastline De-striping is using low pass filter/noise removal filter 23 Radiometric correction Need to calibrate data radiometrically due to: (i) System effects Different sensors and bands convert differently to byte scale Systems have noise on pixel values Systematic differences in the digital numbers, e.g. striping Convert the DN value to Radiance, then to the Top of Atmosphere Reflectance (TOA) (ii) Sun angle and atmospheric effects Scene illumination (time of day, season) Atmospheric scattering and absorption 24 Radiometric correction: DN-to-Radiance conversion Two formulas to convert DNs to radiance Depends on the scene calibration data available in the header file(s) 1. Gain & Bias Method Use Gain and Bias (or Offset) values from the header file 2. Spectral Radiance Scaling method Use the LMin and LMax spectral radiance scaling factors High gain mode is used when the surface brightness is low and low gain mode is used when surface brightness is high. 25 Gain change Anomaly occurs on Landsat-7 A gain change occurs in a scene. Generally this change occurs on the beginning or the end of the scene. Usually, it is splitting the image band into two areas with different brightness. The upper one is brighter than the lower one in the case of the gain switching from high to low gain and vice versa. https://earth.esa.int/c/document_library/get_file?folderId=25717&name=DLFE-522.pdf 26 Landsat 7 ETM+ DN to Radiance LMAXs will ensure accurate conversion to radiance units. Source: Landsat 7 Data Users Handbook 27 Radiometric correction: DN-to-Radiance conversion for Landsat 7 This is spectral specific. 
A form of radiometric correction is the conversion of the digital numbers to absolute radiance values. Any DN in a particular band can be converted to absolute units of spectral radiance in that band if LMAX and LMIN are known from the sensor calibration. 28 Landsat 7 ETM+ DN to Radiance 29 Landsat 7 ETM+ DN to Radiance 30 Radiometric correction: Sun angle effects due to seasonal change Radiance to Reflectance In order to normalize the sun angle effect, the radiance at the sensor’s aperture can be divided by the cosine of the sun angle from zenith for the particular time/location. This is a correction applied to images acquired under different solar illumination angles by normalization, assuming the sun was at the zenith on each date of sensing: $L_{\lambda\perp} = \dfrac{L_\lambda}{\cos\theta_s}$ We can use Top of Atmosphere (TOA) Reflectance, which is a unitless measurement that provides the ratio of radiation reflected to the incident solar radiation on a given surface. It can be computed from satellite-measured spectral radiance using the mean solar spectral irradiance and the solar zenith angle (Sun Elevation Angle): $\rho_\lambda = \dfrac{\pi \times L_\lambda \times d^2}{ESUN_\lambda \times \cos\theta_s}$ where: ρλ = unitless planetary reflectance (TOA Reflectance) Lλ = spectral radiance received at the sensor’s aperture, converted from DN d = Earth–Sun distance in astronomical units (1 AU ≈ 1.496 × 10^8 km) ESUNλ = mean solar exo-atmospheric irradiance θs = solar zenith angle Effects of Seasonal Changes on Solar Elevation Angle 31 Radiometric correction: Sun angle effects due to seasonal change 32 Radiometric correction: Band 6 Conversion to Temperature Thermal Infrared (TIR) Band Conversion to Temperature 33 Atmospheric Correction The effects of the atmosphere include: Radiation attenuation before reaching the ground due to absorption and scattering Increasing the amount of energy reaching the sensor by scattering the radiation (diffuse radiation) Decrease in thermal radiation due to water vapour absorption Atmospheric correction can be handled by 1) Dark object subtraction/dark pixel method 2) Empirical methods 3) Radiative transfer models/methods Reflected radiance (E_I/π) = incident irradiance over π, with the assumption that the ground surface is a Lambertian surface (radiance is constant across all incident directions in the hemisphere). Reflectance (ρ): surface reflectance Transmittance (T): transmission of the atmosphere Path Radiance (Lp): attributed to atmospheric disturbance and spectrally specific 34 Atmospheric Correction: Dark object subtraction Assumes that the darkest objects in the image should have a DN of zero; therefore, any response over them is attributed to path radiance (Lp), with the assumption that it affects the entire scene uniformly. For example: the reflectance of deep clear water is essentially zero in the near-infrared region. Any signal observed over this area represents the path radiance. This value can be subtracted from all pixels in that band. 35 Atmospheric Correction: Dark object subtraction Find the minimum pixel value from each band (using histograms) (the minimum vs the lowest value that has a significant number of counts; dark objects should be examined from the entire image, not a subset of it) Subtract that value from all of the pixels in the band Must be done separately for each band However, this is not always a correct assumption, because it depends on the uniformity of the atmosphere over the scene. For example, haze is viewing-angle dependent; for extreme viewing angles it is necessary to normalize the radiance value to the nadir position rather than applying dark object subtraction.
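The sun-angle normalization, TOA reflectance formula, and dark object subtraction described above can be sketched as follows. This is an illustrative NumPy version; in particular, the percentile-based choice of the "dark" value is an assumption (the lecture suggests examining the band histogram instead), and all numeric inputs in the example are placeholders.

```python
import numpy as np

def toa_reflectance(radiance, esun, sun_elevation_deg, d_au):
    # rho = pi * L * d^2 / (ESUN * cos(theta_s)); theta_s = 90 deg - sun elevation
    theta_s = np.deg2rad(90.0 - sun_elevation_deg)
    return np.pi * radiance * d_au ** 2 / (esun * np.cos(theta_s))

def dark_object_subtraction(band, percentile=0.01):
    # Estimate path radiance as a very low percentile of the band (a stand-in for
    # reading the dark-object value off the histogram) and subtract it everywhere.
    dark_value = np.percentile(band, percentile)
    return np.clip(band - dark_value, 0, None)

# Illustrative use on a fake radiance band (numbers are placeholders, not sensor values)
L = np.full((3, 3), 80.0)
rho = toa_reflectance(L, esun=1536.0, sun_elevation_deg=45.0, d_au=1.0)
corrected = dark_object_subtraction(L)
```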
Overall brightness increases; this reduces image contrast and may be due to haze. (Figure: the same area viewed at 70.5° forward, 60.0° forward, 45.6° forward, and 0° nadir.) 36 Atmospheric Correction: Empirical method Absolute atmospheric correction may also be performed using empirical line calibration (ELC), which forces the remote sensing image data to match in situ spectral reflectance measurements. Empirical line calibration is based on the equation: reflectance (field spectrum) = gain × radiance (image) + offset. Multi-spectral Ground Calibration Targets (source: link1, link2) 37 Atmospheric Correction: Radiative Transfer Models (RTMs) RTMs simulate the radiative transfer interactions of light scattering and absorption through the atmosphere. These models are typically used for the atmospheric correction of airborne/satellite data and allow for retrieving atmospheric composition. They require parameters of the atmospheric condition at the time of image acquisition, such as visibility and pressure, which can be obtained from a local meteorological station. Available numerical models include LOWTRAN, MODTRAN, ATREM, ATCOR and 6S. 38 Atmospheric Correction: Bottom Of Atmosphere Reflectance (BOA) Once atmospheric correction is applied to TOA reflectance, Bottom Of Atmosphere (BOA) Reflectance / Surface Reflectance (SR) is obtained. Look from space vs see what is on the surface. Sentinel-2 TOA Level-1C image data (left) and associated Level-2A BOA image data (right) [link] 39 Atmospheric Correction Atmospheric Correction in QGIS https://youtu.be/myBn8u9MbjM ~ 6 mins 40 Topographic correction Cosine Correction for Terrain Slope: $L_H = L_T \dfrac{\cos\theta_0}{\cos i}$ where: LH = radiance observed for a horizontal surface (i.e., slope-aspect corrected remote sensor data) LT = radiance observed over sloped terrain (i.e., the raw remote sensor data) θ0 = sun’s zenith angle i = sun’s incidence angle in relation to the normal on a pixel 41 Section 3 Geometric Pre-processing 42 Geometric correction Remove geometric distortion so that individual picture elements (pixels) are in their proper planimetric (x, y) map locations. Internal and external geometric error Systematic (predictable) Non-systematic (random) 43 Geometric correction Various geometric distortions: Systematic distortions (predictable) Panoramic distortion Skew distortion due to earth rotation during sweep of IFOV Earth curvature – orbit variation due to ellipsoid After applying geometric correction, you will obtain a geometrically accurate image, registered to a ground coordinate system – georeferenced. 44 Geometric correction Various geometric distortions: Non-systematic (random) Variations in altitude, attitude, and velocity of the sensor platform Variable speed of scanning mirror Atmospheric refraction Relief displacement 45 Reasons to apply geometric correction Using a coordinate system Perform accurate distance and area measurements Allow co-registration of images for change detection Mosaics of many image sets Overlay image with GIS data 46 Systematic distortions Panoramic Distortion (or Tangential Scale Distortion) The ground area imaged is proportional to the tangent of the scan angle rather than to the angle itself. Because data are sampled at regular intervals, this produces along-scan distortion. Change in scale at edge of scan (tangential scale distortion) 47 Systematic distortions Panoramic Distortion (or Tangential Scale Distortion) 49 Systematic distortions 50 Systematic distortions Skew Distortion Earth rotates as the sensor scans the terrain.
This results in a shift of the ground swath being scanned, meaning each sweep covers an area slightly to the west of the previous sweep, causing along-scan distortion. Deskewing involves offsetting each scan line successively to the west, which gives images a skewed parallelogram appearance. 51 External distortions Caused by the attitude of the sensor or the shape of the object. 52 Terrain relief displacement 53 Geometric distortion due to change in altitude and platform attitude a) Altitude increase: smaller-scale imagery. Decrease: larger-scale imagery. b) An aircraft flies in the x-direction. Roll: directional stability is maintained but the wings move up or down, i.e. they rotate about the x-axis by angle omega (ω). Pitch: the wings are stable but the nose or tail moves up or down, i.e., they rotate about the y-axis by angle phi (φ). Yaw: the wings remain parallel but the fuselage is forced by wind to be oriented at some angle to the left or right of the intended line of flight, i.e., it rotates about the z-axis by angle kappa (κ). Imagery suffers from a combination of changes in altitude and rotation (roll, pitch, and yaw). 54 Correction for geometric distortions Most systematic distortions can be corrected at the ground station using mathematical modelling. Most random distortions can be corrected using Ground Control Points (GCPs) identified in the image to register the image to the ground coordinate system (geo-referencing). For image data, geometric correction comprises two parts: geocoding and resampling. 56 Ground Control Points (GCP) Image coordinates specified in i rows and j columns, and map coordinates (e.g., x, y measured in degrees of latitude and longitude, or meters in a Universal Transverse Mercator (UTM) projection). A GCP is a location on the surface of the Earth (e.g., a road intersection) that can be identified on the imagery and located accurately on a map. 57 Types of Geometric Correction Two common methods: image-to-map rectification, and image-to-image registration 58 Image to Map Rectification Image-to-map rectification is the process by which the geometry of an image is made planimetric. 59 Image to Image Registration Instead of a map, a previously rectified image can be used for the rectification. An unrectified image can also be used. 60 Example: transformation equation (affine transformation) Image to Map: X = f1(X’, Y’), Y = f2(X’, Y’) X = aX’ + bY’ + c Y = dX’ + eY’ + f X’, Y’: image; X, Y: map a, b, c, d, e, f are transformation parameters. To find the 6 parameters, at least 3 control points are required. The relationship is inverted to find new map locations for image pixels: x = g1(X’, Y’), y = g2(X’, Y’) The statistical technique of least squares regression is generally used to determine the coefficients for the coordinate transformation equations. 61 Performance evaluation The performance of the transformation is often evaluated by the computation of the root-mean-square error (RMS error) for each of the ground control points, and finally by the RMSE of the total model, which measures the difference between the estimated/computed values and the observed/control values. The smaller the error, the better the model. 62 RMS Error: Example $RMS_{error} = \sqrt{(x' - x_{orig})^2 + (y' - y_{orig})^2}$ where: xorig and yorig are the original row and column coordinates of the GCP in the image; x’ and y’ are the computed or estimated coordinates in the original image when we utilize the six coefficients. The closer these paired values are to one another, the more accurate the algorithm. 63 RMS Error: Example 64 Intensity Interpolation (Resampling) An empty output matrix is created with the desired coordinates.
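The affine transformation and RMS-error evaluation above can be prototyped with ordinary least squares. The sketch below is illustrative only; note that it computes residuals in map coordinates, whereas the lecture's example measures them back in the original image space, but the idea is the same. The GCP values are made-up placeholders.

```python
import numpy as np

def fit_affine(img_xy, map_xy):
    # Least-squares fit of X = a X' + b Y' + c and Y = d X' + e Y' + f
    # img_xy: (n, 2) image coordinates (X', Y'); map_xy: (n, 2) map coordinates; n >= 3
    n = img_xy.shape[0]
    A = np.column_stack([img_xy[:, 0], img_xy[:, 1], np.ones(n)])
    coef_x, *_ = np.linalg.lstsq(A, map_xy[:, 0], rcond=None)  # a, b, c
    coef_y, *_ = np.linalg.lstsq(A, map_xy[:, 1], rcond=None)  # d, e, f
    return coef_x, coef_y

def rms_errors(img_xy, map_xy, coef_x, coef_y):
    # Per-GCP residual distance and the RMSE of the total model
    A = np.column_stack([img_xy[:, 0], img_xy[:, 1], np.ones(len(img_xy))])
    pred = np.column_stack([A @ coef_x, A @ coef_y])
    residual_sq = np.sum((pred - map_xy) ** 2, axis=1)
    return np.sqrt(residual_sq), np.sqrt(residual_sq.mean())

# Four made-up GCP pairs: image (col, row) -> map (easting, northing)
img = np.array([[10.0, 10.0], [200.0, 12.0], [15.0, 180.0], [210.0, 190.0]])
gcp = np.array([[800010.0, 820000.0], [800960.0, 819990.0],
                [800035.0, 819150.0], [801010.0, 819100.0]])
cx, cy = fit_affine(img, gcp)
per_gcp_rms, total_rmse = rms_errors(img, gcp, cx, cy)
```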
Interpolation needs to transfer brightness value from an x’, y’ location in the original (distorted) to the rectified output image. The practice is commonly referred to as resampling. Three resampling methods are commonly used: Nearest neighbor Bilinear interpolation Cubic convolution 66 Resampling – Nearest-neighbour Substitutes in DN value of the closest pixel. Transfers original pixel brightness values without averaging them. Keeps extremes. Advantages: Keep original values and no averaging. Since original data are retained, this method is recommended before classification. Easy to compute and therefore fastest to use. Disadvantages: Produces a “stair-stepped” effect meaning features may be offset spatially. Data values may be lost, while other values may be duplicated. The brightness value closest to the x’, y’ coordinate is assigned to the output x, y coordinate. 67 Resampling – Bilinear interpolation Advantages: More spatially accurate than nearest neighbour. Stair-step effect reduced, the image looks smooth and less blocky. Still fast to compute. Disadvantages: Alters original data and reduces contrast by averaging neighbouring values together. Computationally more expensive than the nearest neighbour. Distance weighted average of the DN’s of the four closest pixels where Zk are the surrounding four data point values, and D2k are the distances squared from the point in question (x’, y’) to the these data points. 68 Resampling – Cubic convolution Determines the value from the weighted average of the 16 closest pixels to the specified input coordinates and assigns that value to the output coordinates. Advantages: More spatially accurate than nearest neighbor; Stair-step effect reduced, image looks smooth and sharpening as well Disadvantages: Alters original data and reduces contrast by averaging neighbouring values together; Computation takes a longer time when compared with the other methods 69 Nearest neighbour vs Bilinear vs Cubic convolution 70 Section 4 Other Pre-processing Processes 71 Cloud screening Cloud affects information retrieval from image Thick and bright clouds block all optical bands reflected from the Earth’s surface. Optically thin clouds affects retrieval of real reflectance from ground. Clouds over bright surfaces creates confusion for identifying snow, ice, and bright sand. It is difficult to separate between clouds and heavy aerosol loading due to similarities between the spectral reflectance of large aerosol particles (e.g., dust) and clouds. 72 Cloud screening Detection of cloud is spectral dependence, e.g. cirrus band 9 in Landsat 8 Many different algorithms have been developed. OLI bands 4,3,2 composite cirrus band 9 “Landsat 8’s Band 9 (1.360-1.390 µm) provides detection of high-altitude cloud contamination that may not be visible in other spectral bands.” [link] 73 Cloud screening Screening clouds and cloud shadows in optical satellite image False color composite of the four representative Landsat MSS images in the Puerto Rico site and their cloud and shadow masks by ATSA (gray: clear pixels; black: shadows; white: clouds) (Zhu et al., 2018) [link] 74 Cloud removal Remove Clouds in Landsat-8 image using ArcGIS https://youtu.be/vx6VYLm48DQ ~ 5 mins 75 Questions What are the basic processes in pre-processing of image data? List and explain the geometric distortion. What is radiometric correction? List and explain image interpolation methods. Explain the reasons for atmospheric correction by the radiance/energy equation. 
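The nearest-neighbour and bilinear resampling rules described above can be written directly in NumPy. In this sketch, xp and yp are assumed to be the back-projected (x′, y′) locations in the distorted input image for each output cell, as produced by the inverted transformation; the small test arrays are illustrative only.

```python
import numpy as np

def resample_nearest(img, xp, yp):
    # Nearest-neighbour: take the DN of the closest input pixel (original values kept)
    rows = np.clip(np.rint(yp).astype(int), 0, img.shape[0] - 1)
    cols = np.clip(np.rint(xp).astype(int), 0, img.shape[1] - 1)
    return img[rows, cols]

def resample_bilinear(img, xp, yp):
    # Bilinear: distance-weighted average of the four pixels surrounding (xp, yp)
    img = img.astype(np.float64)
    x0 = np.clip(np.floor(xp).astype(int), 0, img.shape[1] - 2)
    y0 = np.clip(np.floor(yp).astype(int), 0, img.shape[0] - 2)
    dx, dy = xp - x0, yp - y0
    top = (1 - dx) * img[y0, x0] + dx * img[y0, x0 + 1]
    bottom = (1 - dx) * img[y0 + 1, x0] + dx * img[y0 + 1, x0 + 1]
    return (1 - dy) * top + dy * bottom

# Back-projected sub-pixel locations for three output cells (illustrative values)
src = np.arange(25, dtype=np.uint8).reshape(5, 5)
xp = np.array([0.2, 2.5, 3.9])
yp = np.array([1.1, 2.5, 0.4])
nn = resample_nearest(src, xp, yp)
bl = resample_bilinear(src, xp, yp)
```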
List and explain methods for atmospheric correction. What is RMSE? 76 End of Lecture 3 79 LSGI536 Remote Sensing Image Processing Lecture 4 Remote Sensing Image Enhancement Dr. Zhiwei Li Research Assistant Professor Department of Land Surveying and Geo-Informatics The Hong Kong Polytechnic University Email: [email protected] Outline 1. Contrast Enhancement What is it and why? 2. Contrast Stretch Linear stretch Histogram Equalization Histogram Matching Gaussian stretch 3. Spatial filtering Low pass filters – mean filter, median filter, majority filter, etc. High pass filters – edge enhancement, directional first differencing 4. Density slicing and Image Arithmetic Density slicing Band Ratioing – Simple ratios Normalized difference indices 2 Section 1 Contrast Enhancement 3 Contrast enhancement: definition Technology for increasing the visual distinction between features in a scene Done by spectral feature manipulation Producing the ‘best’ image for a particular application Applied to image data after the appropriate preprocessing Noise removal must be done before contrast enhancement, because without it the image interpreter is left with the prospect of analyzing enhanced noise. 4 Why is contrast enhancement necessary? Display and recording devices operate in a range of 0-255 grey levels Sensor data in a single scene rarely extend over this whole range It is thus necessary to expand the narrow range of brightness levels in any one scene to take advantage of the whole range of 0-255 display levels available This maximizes the contrast between features In short, contrast enhancement changes the image value distribution to cover a wide range 6 Why is contrast enhancement necessary? For a low-contrast image, image values are concentrated in a narrow range (mostly dark, or mostly bright, or mostly medium values) The contrast of an image can be revealed by its histogram 7 Generic appearance of histogram 8 Graphical representation of image histogram SPOT XS Band 3 (NIR) Image histogram 9 Intensity transformation Contrast enhancement can be realized by a grey-level (intensity) transformation function of the form s = T(r) r denotes the grey level of point (x, y) of an input image s denotes the grey level of point (x, y) of a processed image Since the enhancement is applied at any given point in an image, the technique is referred to as point processing Darkening the levels below m, and brightening the levels above m This is a threshold function. When r = m, s = 1 (white) 10 Types of contrast stretch Principle of contrast stretch enhancement 18 Section 2 Contrast Stretch Linear Contrast Stretch 19 Linear contrast stretch Translates the image pixel values from the observed range (Dmin to Dmax), scaling Dmin to 0 and Dmax to 255 Intermediate values retain their relative positions so that, e.g., the median input pixel maps to 127 Algorithm for linear stretch: $DN' = \dfrac{DN - D_{min}}{D_{max} - D_{min}} \times 255$. For example, if the original range is 50-100, $DN' = \dfrac{DN - 50}{50} \times 255$, where DN ∈ [50,100] 20 Transformation function for Linear Contrast Stretch 21 Images and plots of histogram for Linear Contrast Stretch 22 Images and plots of histogram for Linear Contrast Stretch 24 Disadvantage of Linear Contrast Stretch It may assign a large range of display values for a small amount of input values. e.g.
refer to below image histogram DN values 60-108 represent few pixels in an image but linear stretch allocates DN output values of 0-127 (half the output range) Most of pixels in image confined to only half output range 25 Section 2 Contrast Stretch Histogram Equalization Stretch 26 Histogram Equalization stretch Image values are assigned to display levels based on frequency of occurrence in the image Image range 109-158 now stretched over a large portion of display levels (39-255) Smaller portion (0-38) reserved for infrequently occurring values of 60-108 In general, the goal is to transform gray-level value distribution of input image to uniform distribution. Probability Density Function (PDF) Arbitrary PDF Uniform PDF 27 Histogram Equalization stretch However Relative brightness in the original image is not maintained. The number of levels used is reduced. Result may be unsatisfactory compared with linear stretch if few occupied LUT (lookup table) entries in output stretch 28 Histogram Equalization Consider a continuous transformation function, and let variable r represent the gray levels of an input image, r is normalized to the interval [0,1] such that r=0 when black, and r=1 when white. Consider the following form of the continuous transformation function for 0 ≤ r ≤ 1. 𝑠 = 𝑇(𝑟) [𝑒𝑞.1] The goal is to obtain a uniformed histogram in the result, i.e., h(i) = constant for pixel value i. 29 Histogram Equalization 𝑠 = 𝑇(𝑟) produces grey level s for every value r in the input image and satisfies the following two conditions: 1) T(r) is single-valued and increasing in the interval 0 ≤ r ≤ 1; and 2) 0 ≤ T(r) ≤ 1 for 0 ≤ r ≤ 1 Condition 1 guarantees inverse transformation is possible and the increasing order from black to white in the output image, i.e. T(r) i.e. 𝑟 = 𝑇−1(𝑠) for 0≤𝑠≤1 Condition 2 guarantees output grey levels will be the same range as the input levels. 30 Histogram Equalization for Continuous Random Variables Consider the gray levels in the image as continuous Random Variables (RV) s and r. Let pr(r) and ps(s) be a probability density function (PDF) of the RVs r and s, respectively. Since T(r) satisfies the conditions, we can formulate the following formula: Transformation from varying probability to equal probability 31 Histogram Equalization for Discrete Values For discrete values, let n be the total number of pixels in the image, nk is the number of pixels having gray level rk, L is number of gray levels. The probability of gray level rk in an image can be estimated as follows The discrete version of the transformation function (also a CDF - cumulative distribution function) is The transformation given is called histogram equalization or histogram linearization. 32 Histogram Equalization 33 https://en.wikipedia.org/wiki/Histogram_equalization Histogram Equalization Implementation procedure Given an input image of size M × N with Level L, e.g., a 8x8 pixel image with 8-bit level Generate list of image value and count Compute Cumulative Number for each value Compute new value h(νk) for each image value in the input image using the following equation: 𝑤ℎ𝑒𝑟𝑒 𝑘=0,1, … 𝐿−1, 𝑛=𝑀х𝑁, 𝑣𝑖 is the number of pixel shaving level = i 34 Histogram Equalization Consider an example L = 16 35 Histogram Equalization - Summary The goal is to produce output image that has a uniform histogram. However, discrete transformation cannot produce uniform histogram. It can produce a histogram-equalized image that has a full range of the grayscale. Results are predictable and simple to implement. 
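A compact version of the discrete histogram-equalization procedure above is sketched below. It uses the simple mapping s_k = round((L − 1) · CDF(r_k)); the lecture's worked example subtracts the minimum CDF value first, so the two can differ slightly at the dark end. The random test image is illustrative only.

```python
import numpy as np

def histogram_equalize(img, levels=256):
    # Map each grey level through the scaled cumulative distribution function (CDF),
    # implemented as a lookup table (LUT): s_k = round((L - 1) * CDF(r_k))
    img = img.astype(np.int64)
    hist = np.bincount(img.ravel(), minlength=levels)
    cdf = np.cumsum(hist) / img.size
    lut = np.round((levels - 1) * cdf).astype(np.uint8)
    return lut[img]

# Illustrative low-contrast 8-bit image confined to DNs 100-140
rng = np.random.default_rng(0)
low_contrast = rng.integers(100, 141, size=(64, 64))
equalized = histogram_equalize(low_contrast)   # values now spread toward the full 0-255 range
```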
However, there are situations in which basing the enhancement on a uniform histogram is not the best approach. For example, when the input image has a high concentration of pixels very near 0, the net effect is to map a very narrow interval of dark pixels into the upper end of the output image, resulting in a light, washed-out appearance. 36 Section 2 Contrast Stretch Histogram Matching 37 Histogram Matching Assume we have two images and each has its specific histogram. So we want to answer this question before going further: is it possible to modify one image based on the contrast of the other one? The answer is YES. In fact, this is the definition of histogram matching. In other words, given images A and B, it is possible to modify the contrast level of A according to B. 38 Histogram Matching Histogram matching is useful when we want to unify the contrast level of a group of images. In fact, histogram equalization can also be taken as histogram matching, since we modify the histogram of an input image to be similar to the uniform distribution. In order to match the histogram of images A and B: 1. we need to first equalize the histogram of both images; 2. we need to map each pixel of A to B using the equalized histograms; 3. we modify each pixel of A based on B. 39 Histogram Matching 40 Histogram Matching 41 Histogram Matching This method (which allows specifying the shape of the histogram of the image that we want to process) is called histogram matching or histogram specification. Consider pr(w) as the PDF of the input image, and pz(t) as the PDF that we wish the output image to have. 42 Histogram Matching The goal is to replace rk with zk. 43 Histogram Matching Implementation 1. Obtain the histogram of the input image 2. Perform histogram equalization on the input image, i.e. compute sk for each rk 3. Obtain the transformation function G 4. Compute zk for each value of sk such that G(zk) = sk Then, for each pixel in the original image, if the value of that pixel is rk, map this value to its corresponding level sk; then map level sk into the final level zk. 44 Histogram Matching – Example 45 Histogram Matching – Example 46 Comparison between histogram equalization and histogram matching If there is a large concentration of pixels in the input histogram having levels very near 0, a uniform histogram would only map a very narrow interval of dark pixels into the upper end of the grey scale of the output image. (See that the starting grey level is over 128.) 47 Comparison between histogram equalization and histogram matching Curve 1: based on histogram equalization Curve 2: based on histogram matching 48 Section 2 Contrast Stretch Gaussian Stretch 49 Normal (Gaussian) distribution (Figure: normal distribution curves for different values of μ and σ².) μ is the mean or expectation of the distribution (and also its median and mode) σ is its standard deviation σ² is the variance of the distribution https://www.mathsisfun.com/data/standard-normal-distribution-table.html https://www.intmath.com/counting-probability/normal-distribution-graph-interactive.php https://en.wikipedia.org/wiki/Normal_distribution 50 The Gaussian stretch procedure i. Original pixel value ii. Target distribution (z value) iii. Probability of each class (target distribution) iv. Target number of pixels in each class, e.g. probability × total number of pixels v. Cumulative target number of pixels vi. Observed number of pixels from the input image vii. Cumulative observed number of pixels of the input image viii.
New pixel value Example: Original 0 class, Cumulative observed number of pixels is 1311, then find the closest class by minimum difference, i.e., 1311-530=781, 1311-1398=-87, therefore, the new class is class 1 Why? 51 Gaussian stretch Fits observed histogram to a normal distribution (Gaussian) form Normal distribution gives probability of observing a value if Mean and Standard Deviation are known, e.g., assume no. of pixels = 262144 no. of quantization levels = 16 Then target no. of pixels in each class for normal distribution (column iv) = probability (column iii) × 262144 Then cumulative no. of pixels at each level is calculated Output pixel value determined by comparing v & vii Once value of column viii determined, written to LUT 52 Result of Gaussian stretch 53 Raw data SPOT XS Band 3 (NIR) – output limits set to input limits 54 Linear stretch of actual data range 55 Result of Histogram Equalization 56 Result of Gaussian transform 57 Questions What is the pre-requisite of image enhancement? List reasons to perform contrast enhancement What are the advantages and disadvantages of histogram equalization? Compare histogram equalization and histogram matching Explain the procedure to perform histogram equalization for discrete values 62 Section 3 Spatial Filtering 63 Image filtering Spatial Filters emphasize or de-emphasize image data of various spatial frequencies Spatial frequency refers to “roughness” of brightness variations or changes of pixel value in an image Degree of change in tonal variation Rapid change → high spatial frequency → rough Slow change → low spatial frequency → smooth Areas of high spatial frequency are tonally rough Grey levels change abruptly over small distances or a small number of pixels. e.g. across roads or field borders Smooth areas have low spatial frequency e.g. large fields or water bodies Grey levels vary only gradually over a relatively large number of pixels. e.g. large agricultural fields or water bodies. 64 Image filtering Low pass filters: Smoothing emphasize low-frequency changes in brightness and de-emphasize high-frequency local detail smoothing filter (mean, median) noise removal filter, noise is usually scattered and different to surrounding→high frequency High pass filters: Sharpening emphasize high-frequency components of images and de-emphasize more general, low-frequency detail edge enhancement filter directional first differencing filter Feature (edge) detection Edges tend to be high frequency Can specify enhancement to look for specific edges 65 Image filtering Original Spot Pan 66 Image filtering Smooth 67 Image filtering Sharpen 68 Image filtering Find Edges 69 Image filtering 70 Image filtering - convolution The image processing operation applying spatial filtering is called convolution. Convolution involves two inputs An image A moving window, also called kernel or convolution matrix. Pixels modified on basis of grey level of neighbouring pixels in 3 stages: Input image Moving window (kernel) Output image Convolution (Padding, no strides) 71 Kernel Kernel is a square matrix (moving window) which is moved pixel-by-pixel over the input image Consider a m x n matrix kernel. If m < n or m > n, central element of the filter should locate at the intersection of the central row and column of the filter window When m=n, the filter window has an odd numbered array of elements (3x3, 5x5, etc.) those elements represent a weight to be applied to each corresponding digital number of the input image: result is summarized for central pixel e.g. 
mean. Convolution Kernels 72 Convolution Input Image 1st convolution operation Kernel 9th convolution operation 73 Padding image borders Original 4×4 image: [2 6 5 4; 8 9 8 6; 7 7 8 7; 6 8 7 6] Replication padding: [2 2 6 5 4 4; 2 2 6 5 4 4; 8 8 9 8 6 6; 7 7 7 8 7 7; 6 6 8 7 6 6; 6 6 8 7 6 6] Zero padding: [0 0 0 0 0 0; 0 2 6 5 4 0; 0 8 9 8 6 0; 0 7 7 8 7 0; 0 6 8 7 6 0; 0 0 0 0 0 0] 74 Convolution Output Kernel Input 75 Convolution A convolution kernel can be used for blurring, sharpening, embossing, edge detection, and more. 76 Section 3 Spatial Filtering Low pass filters 77 Low pass filters: Smoothing Mean filter: 1/9×1 + 1/9×1 + 1/9×1 + 1/9×1 + 1/9×2 + 1/9×1 + 1/9×1 + 1/9×1 + 1/9×1 ≈ 1.1 Kernel Input image Output image 78 Low pass filters: Smoothing Unequal-weighted smoothing filters: [0.25 0.50 0.25; 0.50 1 0.50; 0.25 0.50 0.25] and [1 1 1; 1 2 1; 1 1 1] 79 Low pass 3×3 Mean filter (Moving average filter) The output value for the central image pixel covered by the kernel (k) is the sum of the products of each of the surrounding input pixel values and their corresponding kernel weights (W). Other frequencies may be smoothed by altering the size of the kernel or the weighting factors. 80 Traverse of pixel values across raw image and after mean filter (Plots: pixel value vs pixel number, before and after filtering.) 81 Effects of mean filter Reduces the overall variability of the image and lowers its contrast. Pixels that have larger or smaller values than their neighbourhood average are respectively decreased or increased in value, so that local detail is lost. It retrieves the overall pattern of values that is of interest, rather than the details of local variation. (Example grids: a 3×3 neighbourhood of 4s containing a single 8.) 82 Image smoothing using kernel sizes 3×3, 5×5, 7×7 IKONOS panchromatic image of Shau Kei Wan 3×3 5×5 7×7 83 Median smoothing filters Superior to the mean filter, as the median is an actual number in the dataset (kernel) Less sensitive to errors or extreme values e.g., 3, 1, 2, 8, 5, 3, 9, 4, 27 are pixel values in a 3×3 kernel: median = ? mean = 6.89, rounded to 7, but 7 is not present in the original dataset The mean is larger than six of the nine observed values because it is influenced by the extreme value 27 (3 times higher than the next highest value in the dataset) Therefore, isolated extreme pixels, which may represent noise or spikes, can be removed by the median Preserves edges better than the mean, which blurs edges (see the figure in the next slide) 84 Comparison of median and mean filters Both the median and the moving average filters remove high-frequency oscillations The median filter more successfully removes isolated spikes and better preserves edges, defined as pixels at which the gradient or slope of grey level value changes remarkably. 85 Majority smoothing filter In this case, as the kernel passes over the image, the central pixel is set to the majority value within the kernel. Used for identifying classes in the data, where mean and median are irrelevant for classification. Majority filter Removes misclassified “salt and pepper” pixels 87 Gaussian smoothing filter Gaussian filter (weights generated by the Gaussian function) 88 Smoothing filters in software 89 Stripe noise removal Sixteen-line banding noise (brighter or darker than the others) in LANDSAT Band 2 (green) of 3.3.96: Deep Bay, Hong Kong Landsat band 2: Deep Bay after 7×7 median filter 90 Line drop removal Dropped line removed by averaging pixels on each side of the line using a 1-dimensional 3×1 vertical filter with a threshold of 0 (i.e. detect values of 0 or No Data) Question: How would you design a filter to achieve the above processing?
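The smoothing filters above (mean, median, and weighted kernels) can be reproduced with SciPy's ndimage module, assuming SciPy is available in the lab environment; the random test image and the kernel shown are illustrative only, not tied to a specific dataset.

```python
import numpy as np
from scipy import ndimage

img = np.random.default_rng(1).integers(0, 256, size=(100, 100)).astype(float)

# 3x3 mean (moving-average) filter: each output pixel is the average of its neighbourhood
mean3 = ndimage.uniform_filter(img, size=3, mode="nearest")   # "nearest" ~ replication padding

# 3x3 median filter: better at removing isolated spikes while preserving edges
median3 = ndimage.median_filter(img, size=3, mode="nearest")

# Explicit kernel convolution, e.g. the unequal-weighted smoothing kernel from the slides
kernel = np.array([[1, 1, 1],
                   [1, 2, 1],
                   [1, 1, 1]], dtype=float)
kernel /= kernel.sum()                      # normalize so overall brightness is preserved
smoothed = ndimage.convolve(img, kernel, mode="nearest")
```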
91 Section 3 Spatial Filtering High pass filters 92 High pass filters: Sharpening Edge enhancing filter Kernel Input image Output image 93 High pass filters: Sharpening High pass filters that sharpen edges: [−1 −1 −1; −1 9 −1; −1 −1 −1] and [1 −2 1; −2 5 −2; 1 −2 1] 94 High pass (sharpening) filters Emphasize the high-frequency components by exaggerating local contrast, highlighting fine detail or enhancing detail that has been blurred. A sharpening filter seeks to emphasize changes. When the kernel is applied over a region of uniform values, the output value does not change. It has maximum output when the centre pixel differs significantly from the surrounding pixels. 95 Before and after 3×3 high pass filter IKONOS Panchromatic, Mt. Butler 97 Image subtraction method for high pass filter Image Source: Kriti Bajpai et al., 2017, [link] 100 Directional first differencing Emphasizing edges in image data Determines the derivative of grey levels with respect to a given direction Compares each pixel to one of its neighbours The result can be positive or negative and may fall outside the byte range, so we need to rescale the range Because pixel-to-pixel differences are often very small, a contrast stretch must be applied. 101 Y-directional filter emphasising E-W trends LANDSAT band 2 (green): west of Guangzhou 102 Directional first differencing (Example input: six identical rows of [0 0 0 9 8 0 0 0], i.e., a narrow vertical feature.) 103 Edge Enhancement Using Laplacian Convolution Kernel The Laplacian is a second derivative (as opposed to the gradient, which is a first derivative) and is invariant to rotation, meaning that it is insensitive to the direction in which the discontinuities (points, lines, and edges) run. Example Laplacian kernels: [0 −1 0; −1 4 −1; 0 −1 0], [−1 −1 −1; −1 8 −1; −1 −1 −1], [1 −2 1; −2 4 −2; 1 −2 1] 104 Edge enhancement Edge enhancement through directional first differencing: (a) original image; (b) vertical first difference; (c) horizontal first difference; (d) left diagonal first difference; (e) right diagonal first difference; (f) Laplacian edge detector. 105 Section 4 Density Slicing and Image Arithmetic 106 Density slicing DNs along the x axis of an image histogram are divided into a series of analyst-specified intervals or “slices” DNs falling within a given interval in the input image are then displayed at a single DN in the output image. If six different slices are established, the output image contains only six different gray levels. Density slice for surface temperature visualization 108 Spectral index Indices may be used to enhance a particular single feature in an image. Different indices can be used to enhance different earth surface features in an image, such as vegetation, water, snow, and buildings. Band ratio Normalized Difference Vegetation Index (NDVI) Normalized Difference Water Index (NDWI) Normalized Difference Snow Index (NDSI) Normalized Difference Built-up Index (NDBI) … 110 Image Arithmetic: band ratioing Band ratioing is a process of dividing pixel values in one image by the corresponding pixel values in another image. Reasons to use band ratioing are: To enhance the image by bringing out Earth’s surface cover types from the image To provide independence from variations in scene illumination 111 Image Arithmetic: band ratioing Why can a band ratio highlight subtle spectral changes? Because a band ratio emphasizes the differences between two bands (for different materials), e.g.
vegetation is darker in the visible, but brighter in the NIR than soil, thus the ratio difference is greater than either band individually 112 Ratio images Band Ratios are commonly used as vegetation indices aimed at identifying greenness and biomass A ratio of NIR/Red is most common The number of ratios possible from n bands is n(n-1), thus for SPOT 5 (3 bands) = 6, Landsat ETM (6 bands) = 30 It can be used to generate false colour composites by combining 3 monochromatic ratios A band ratio using TM near-infrared band (band 4) divided by the visiblered band (band 3), which created a vegetation index. 113 Normalized difference vegetation index (NDVI) NDVI is used to quantify vegetation greenness and is useful in understanding vegetation density and assessing changes in plant health. NDVI is calculated as a ratio between the red (R) and near infrared (NIR) values in traditional fashion. NDVI is defined as NDVI takes on values between -1.0 and 1.0, in which values equal or less than zero mean non-vegetated area. Not an absolute value but sum and difference of bands Well correlate to biomass, leaf chlorophyll levels, leaf area index values and so on. 118 NDVI In Landsat 4-7, NDVI = (Band ? – Band ?) / (Band ? + Band ?). In Landsat 8/9, NDVI = (Band ? – Band ?) / (Band ? + Band ?). https://www.usgs.gov/media/images/landsat-surface-reflectance-and-normalized-difference-vegetation-index 119 NDVI In Landsat 4-7, NDVI = (Band 4 – Band 3) / (Band 4 + Band 3). In Landsat 8/9, NDVI = (Band 5 – Band 4) / (Band 5 + Band 4). https://www.usgs.gov/media/images/landsat-surface-reflectance-and-normalized-difference-vegetation-index 120 NDVI 121 China and India Lead the Way in Greening https://earthobservatory.nasa.gov/images/1445 40/china-and-india-lead-the-way-in-greening 122 Questions Write down the kernel and explain low pass filter and high pass filter Explain band ratioing 124 Homework – Histogram equalization and matching Please conduct histogram equalization and matching by yourself using the shared Excel documents on Blackboard. 125 Preliminary group project presentation The presentation date is March 6 in Lecture 5. 3 min presentation with PPT. Mainly introduce the topic, literature review, and research plan. Existing problems can also be listed for discussion. 126 End of Lecture 4 127 LSGI536 Remote Sensing Image Processing Lecture 5 Remote Sensing Image Interpretation Dr. Zhiwei Li Research Assistant Professor Department of Land Surveying and Geo-Informatics The Hong Kong Polytechnic University Email: [email protected] Outlines 1. Introduction to Image Classification & Machine Learning 2. Supervised Classification k-Nearest Neighbour Classifier (k-NN) Minimum Distance to Means Classifier (MDM) Maximum Likelihood Classifier (MLC) 3. Unsupervised Classification K-means ISODATA 4. Accuracy Assessment Confusion matrix Accuracy metrics 5. Change Detection Visual Inspection Post-classification comparison Temporal image differencing 2 Section 1 Image Classification & Machine Learning 3 Image classification Image to Information Earth Observation Land-Use/Land-Cover Classification Objective: automatically categorize all pixels using numerical, Spectral Pattern for each pixel. Uses more than one band: otherwise = Density Slice (just rely on brightness) This is a learning problem, i.e. learn from data, then predict what we want to know. 
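As a first taste of the supervised classifiers listed in the outline, below is a minimal NumPy sketch of a Minimum-Distance-to-Means (MDM) classifier. The array shapes, class labels, and numeric values are assumptions for illustration only, not the exact workflow used in the lab exercises.

```python
import numpy as np

def train_mdm(samples, labels):
    # The "model" is simply the mean spectral vector of each training class
    # samples: (n_pixels, n_bands); labels: (n_pixels,)
    classes = np.unique(labels)
    means = np.array([samples[labels == c].mean(axis=0) for c in classes])
    return classes, means

def classify_mdm(image, classes, means):
    # Assign each pixel to the class whose mean spectrum is closest (Euclidean distance)
    # image: (rows, cols, n_bands)
    pixels = image.reshape(-1, image.shape[-1]).astype(np.float64)
    dists = np.linalg.norm(pixels[:, None, :] - means[None, :, :], axis=2)
    return classes[np.argmin(dists, axis=1)].reshape(image.shape[:2])

# Toy example: two classes ("water" = 0, "vegetation" = 1) in a 2-band image
train_pix = np.array([[10.0, 5.0], [12.0, 6.0], [40.0, 80.0], [42.0, 85.0]])
train_lbl = np.array([0, 0, 1, 1])
classes, means = train_mdm(train_pix, train_lbl)
scene = np.array([[[11.0, 5.5], [41.0, 82.0]]])   # shape (1, 2, 2): one row of two pixels
predicted = classify_mdm(scene, classes, means)    # -> [[0, 1]]
```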
4 Image classification Land Use refers to what people do on the land surface (e.g., agriculture, commerce, settlement) Land Cover refers to the type of material present on the landscape (e.g., water, sand, crops, forest, wetland, human-made materials such as asphalt) 5 Machine learning Think about the tremendous size of the remote sensing images available from the Internet (e.g. 50 years of Landsat images); this deluge of data calls for automated methods of data analysis, which is what Machine Learning provides. Machine Learning is a set of methods that can automatically detect patterns in data (a given image), and then use the uncovered patterns to predict future data (a new image), or to perform other kinds of decision making under uncertainty (such as planning how to collect more data) 8 Machine learning Supervised learning Unsupervised learning 9 Remote Sensing Image Classification Methods Parameter: Parametric / Non-Parametric / Non-Metric Training?: Supervised / Unsupervised / Hybrid Output Classes: Hard / Soft (fuzzy) Target Type: Per-pixel / Object-oriented 13 Section 2 Supervised Classification 14 Supervised classification There are four distinct stages: Data preparation Training – requires splitting the sample data into a training sample and a test sample Classification – select and apply the model and algorithm Output – production of thematic maps, tables or statistics The classification output becomes a GIS input Needs knowledge of the area and/or reference data 15 Supervised classification 17 Training stag