RS LEC 5

Cards (48)

  • Digital image interpretation and analysis
    Manipulation and interpretation of digital images with the aid of a computer
  • Digital Image Processing
    Used to enhance data as a prelude to visual interpretation, and to automatically identify targets and extract information entirely by computer, without manual intervention by a human interpreter
  • Central Idea of Digital Image Processing
    1. The digital image is fed into a computer one pixel at a time, using image processing software
    2. The software inserts each pixel's data into an equation, or series of equations, and stores the result of the computation for each pixel
    3. The results form a new digital image that may be displayed or recorded in pictorial format, or may itself be further manipulated by additional programs (see the sketch below)
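A minimal sketch of this pixel-by-pixel idea, assuming the image is already loaded as a NumPy array of digital numbers (DNs); the gain and offset of the "equation" are made-up illustration values:

```python
# Sketch: feed every pixel through the same equation and build a new image.
# The gain/offset values are invented for illustration only.
import numpy as np

def apply_equation(image, gain=1.2, offset=10):
    """new_DN = gain * DN + offset, applied to every pixel."""
    out = gain * image.astype(np.float32) + offset       # the "equation"
    return np.clip(out, 0, 255).astype(np.uint8)         # keep results in 8-bit range

original = np.random.randint(0, 256, (100, 100), dtype=np.uint8)   # stand-in image
result = apply_equation(original)   # a new digital image, ready to display or process further
```

The loop over pixels is vectorized here, but the logic is the same: each pixel passes through the equation and the results form a new image that can be displayed or manipulated by further programs.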
  • Elements of Visual Interpretation
    • Tone
    • Shape
    • Size
    • Pattern
    • Texture
    • Shadow
    • Association
  • Tone
    The relative brightness or color of objects in an image; it is the fundamental element for distinguishing between different targets or features. Variations in tone also allow the shape, texture, and pattern of objects to be distinguished
  • Shape
    The general form, structure, or outline of individual objects; it can be a very distinctive clue for interpretation
  • Size
    The size of objects in an image is a function of scale; it is important to assess the size of a target relative to other objects in the scene, as well as its absolute size, to aid in the interpretation of that target
  • Pattern
    The spatial arrangement of visibly discernible objects; typically, an orderly repetition of similar tones and textures produces a distinctive and ultimately recognizable pattern, which can be a very distinctive clue for interpretation
  • Texture
    The arrangement and frequency of tonal variation in particular areas of an image. Rough textures consist of a mottled tone where the grey levels change abruptly in a small area, while smooth textures have very little tonal variation. Texture is one of the most important elements for distinguishing features in radar imagery
  • Shadow
    Helpful in interpretation as it may provide an idea of the profile and relative height of a target, which may make identification easier. However, shadows can also hinder or prevent interpretation within their area of influence, since targets inside shadows are much less (or not at all) discernible from their surroundings. Shadow is also useful for enhancing or identifying topography and landforms, particularly in radar imagery
  • Association
    Takes into account the relationship between other recognizable objects or features in proximity to the target of interest; identifying features that one would expect to be associated with other features may provide information that facilitates identification
  • Digital image processing
    May involve numerous procedures, including formatting and correcting the data, digital enhancement to facilitate better visual interpretation, or even automated classification of targets and features entirely by computer
  • Common image processing functions
    • Preprocessing (Geometric Correction, Radiometric Calibration)
    • Image Enhancement (Spatial, Spectral)
    • Image Transformation
    • Image Classification and Analysis
  • Image Pre-processing
    Operations, sometimes referred to as image restoration and rectification, that are intended to correct for sensor- and platform-specific radiometric and geometric distortions of the data
  • Radiometric Correction
    Correcting the data for sensor irregularities and unwanted sensor or atmospheric noise, and converting the data so they accurately represent the reflected or emitted radiation measured by the sensor
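Part of radiometric calibration is converting raw DNs into physical units with a linear gain/offset model. A minimal sketch, where the gain and offset values are hypothetical stand-ins for the figures normally supplied in the image metadata:

```python
# Sketch of linear radiometric calibration: L = gain * DN + offset.
# GAIN and OFFSET below are hypothetical; real values come from the sensor metadata.
import numpy as np

GAIN = 0.05    # hypothetical gain  (W m^-2 sr^-1 um^-1 per DN)
OFFSET = -1.5  # hypothetical offset (W m^-2 sr^-1 um^-1)

def dn_to_radiance(dn):
    """Convert raw digital numbers to at-sensor spectral radiance."""
    return GAIN * dn.astype(np.float32) + OFFSET

band = np.random.randint(0, 256, (512, 512), dtype=np.uint8)   # stand-in band
radiance = dn_to_radiance(band)
```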
  • Groups of radiometric corrections
    • 'Cosmetic' rectification to compensate for data errors
    • Relative atmospheric correction based on ground reflectance properties
    • Absolute atmospheric correction based on atmospheric process information
  • Cosmetic Correction
    Corrections typically executed (if required) at the satellite data receiving stations or image pre-processing centers, before the data reach the final user, to address issues such as periodic line dropouts, line striping, and random noise or spikes
  • Periodic line dropouts
    • Occur due to recording problems when one of the detectors of the sensor either gives wrong data or stops functioning
  • Line striping
    • Often occurs due to non-identical detector response, with some detectors drifting to higher or lower levels over time
  • Random noise or spike
    • May be due to errors during transmission of data or to a temporary disturbance, where individual pixels acquire DN-values that are much higher or lower than the surrounding pixels
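A minimal sketch of one common cosmetic fix for random noise or spikes, assuming the band is a NumPy array of DNs: pixels whose values differ from the median of their 3x3 neighbourhood by more than a threshold are replaced with that median. The window size and threshold are arbitrary illustration values.

```python
# Sketch of despiking: replace pixels that deviate strongly from the median
# of their 3x3 neighbourhood. Window size and threshold are arbitrary choices.
import numpy as np
from scipy.ndimage import median_filter

def remove_spikes(band, threshold=50):
    """Return a copy of the band with spike pixels replaced by the local median."""
    med = median_filter(band.astype(np.float32), size=3)
    spikes = np.abs(band - med) > threshold      # pixels far from their neighbours
    cleaned = band.astype(np.float32)
    cleaned[spikes] = med[spikes]
    return cleaned.astype(band.dtype)

band = np.random.randint(0, 256, (256, 256), dtype=np.uint8)   # stand-in band
cleaned = remove_spikes(band)
```

Line striping is usually handled in a similar spirit, for example by adjusting each detector's line statistics (mean and standard deviation) towards a reference detector; that step is omitted here for brevity.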
  • Geometric Correction
    Correcting for geometric distortions due to sensor-Earth geometry variations, and conversion of the data to real world coordinates on the Earth's surface
  • Factors causing geometric distortions
    • Perspective of the sensor optics
    • Motion of the scanning system
    • Motion of the platform
    • Platform altitude, attitude, and velocity
    • Terrain relief
    • Curvature and rotation of the Earth
  • Systematic Distortions
    Distortions that can be rectified using data from the platform ephemeris and knowledge of internal sensor distortion; they include panoramic distortion, platform velocity, curvature of the Earth, Earth rotation, scan skew, and mirror-scan velocity variance
  • Nonsystematic Distortions
    Distortions that arise from variations in the sensor system's attitude, velocity, and altitude and can be corrected only through the use of ground control points (GCPs); they include topographic or relief displacement due to terrain variation
  • Ground swath
    • Not normal to the ground track but slightly skewed, producing cross-scan geometric distortion
  • Mirror scanning rate
    • Usually not constant across a given scan, producing along-scan geometric distortion
  • Geometric distortions
    • Systematic distortions
    • Nonsystematic distortions
  • Topographic, or relief displacement
    • Due to terrain variation, usually the most serious of the displacement types, especially in mountainous terrain
  • Orthorectification
    Can correct terrain-related distortions using a digital elevation model (DEM)
  • Rubbersheet rectification
    Can correct terrain-related distortions based on ground control points
  • Georeferencing
    The process of aligning an image with ground control points (GCPs) on the Earth's surface so that the image adopts a particular map coordinate system
  • Ground control points (GCPs)
    • Points that can be clearly identified in the image and in a source that is in the required map projection system
  • Root Mean Square (RMS) error
    The distance between the input (source) location of a GCP and the retransformed location for the same GCP
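A minimal sketch of how the RMS error could be computed, using made-up GCP coordinates: a first-order (affine) polynomial is fitted between the GCP map and image locations by least squares, each GCP is retransformed through it, and the RMS of the residual distances (in pixels) is reported.

```python
# Sketch: fit a first-order (affine) transform from GCPs and report RMS error.
# All GCP coordinates below are invented for illustration.
import numpy as np

# image (column, row) locations of the GCPs and their map (x, y) coordinates
img = np.array([[10.0, 12.0], [200.0, 15.0], [190.0, 180.0], [20.0, 175.0], [100.0, 90.0]])
geo = np.array([[500010.0, 4200050.0], [500480.0, 4200040.0],
                [500470.0, 4199630.0], [500030.0, 4199640.0], [500240.0, 4199840.0]])

# least-squares fit of col = a0 + a1*x + a2*y and row = b0 + b1*x + b2*y
A = np.column_stack([np.ones(len(geo)), geo])
a, *_ = np.linalg.lstsq(A, img[:, 0], rcond=None)
b, *_ = np.linalg.lstsq(A, img[:, 1], rcond=None)

# retransformed image locations of the GCPs and their residual distances (pixels)
retransformed = np.column_stack([A @ a, A @ b])
residuals = np.linalg.norm(retransformed - img, axis=1)
rms = np.sqrt(np.mean(residuals ** 2))
print(f"per-GCP residuals (pixels): {np.round(residuals, 2)}  RMS error: {rms:.2f}")
```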
  • Resampling
    • The process of interpolating data values from the original image onto the new grid of the corrected output image
  • Nearest neighbour resampling
    Uses the digital value from the pixel in the original image which is nearest to the new pixel location in the corrected image
  • Bilinear interpolation resampling
    Takes a weighted average of four pixels in the original image nearest to the new pixel location
  • Cubic convolution resampling
    Calculates a distance weighted average of a block of sixteen pixels from the original image which surround the new output pixel location
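A minimal sketch of the three resampling options using scipy.ndimage.map_coordinates, where order=0 behaves as nearest neighbour, order=1 as bilinear interpolation, and order=3 as a cubic spline (a close cousin of, though not identical to, classic cubic convolution):

```python
# Sketch: resample a band onto a new (here, finer) grid at three interpolation orders.
# order=0 ~ nearest neighbour, order=1 ~ bilinear, order=3 ~ cubic spline.
import numpy as np
from scipy.ndimage import map_coordinates

band = np.random.randint(0, 256, (100, 100)).astype(np.float32)   # stand-in band

# coordinates of the output pixels expressed in the input image's pixel space;
# here the output grid simply has twice as many rows and columns
rows, cols = np.meshgrid(np.linspace(0, 99, 200), np.linspace(0, 99, 200), indexing="ij")
coords = np.array([rows, cols])

nearest  = map_coordinates(band, coords, order=0)   # nearest neighbour
bilinear = map_coordinates(band, coords, order=1)   # bilinear interpolation
cubic    = map_coordinates(band, coords, order=3)   # cubic spline
```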
  • Image enhancement procedures
    1. Contrast manipulation
    2. Spatial feature manipulation
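A brief sketch of the two procedures, assuming a single NumPy band: a 2% linear contrast stretch for contrast manipulation, and a 3x3 mean filter as one simple spatial feature manipulation. The clip percentiles and window size are arbitrary illustration choices.

```python
# Sketch: linear contrast stretch (contrast manipulation) and a 3x3 mean
# filter (a basic spatial feature manipulation). Parameter values are arbitrary.
import numpy as np
from scipy.ndimage import uniform_filter

def linear_stretch(band, low_pct=2, high_pct=98):
    """Stretch the DN range between the given percentiles to 0-255."""
    lo, hi = np.percentile(band, [low_pct, high_pct])
    stretched = (band.astype(np.float32) - lo) / (hi - lo) * 255.0
    return np.clip(stretched, 0, 255).astype(np.uint8)

def smooth(band, size=3):
    """Low-pass (mean) filter over a size x size window."""
    return uniform_filter(band.astype(np.float32), size=size)

band = np.random.randint(30, 180, (256, 256), dtype=np.uint8)   # stand-in band
enhanced = linear_stretch(band)
smoothed = smooth(band)
```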