RS 5A

Cards (50)

  • Digital image interpretation and analysis
    Manipulation and interpretation of digital images with the aid of a computer
  • Digital Image Processing
    Used to enhance data as a prelude to visual interpretation, or to automatically identify targets and extract information without manual intervention by a human interpreter
  • Central Idea of Digital Image Processing
    1. The digital image is fed into a computer one pixel at a time using image processing software
    2. The software inserts the data into an equation or series of equations and stores the results of the computation for each pixel
    3. The results form a new digital image that may be displayed or recorded in pictorial format, or may itself be further manipulated by additional programs
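    A rough sketch of this per-pixel idea, assuming numpy and an invented gain/offset equation (none of the values come from the cards):

```python
# Minimal illustration of per-pixel processing: every pixel of a digital image
# is run through the same (hypothetical) equation and the results form a new
# digital image that can be displayed or processed further.
import numpy as np

image = np.random.randint(0, 256, size=(100, 100), dtype=np.uint8)  # stand-in image

gain, offset = 1.2, -10.0                                 # made-up equation parameters
result = gain * image.astype(np.float32) + offset         # equation applied to each pixel
new_image = np.clip(result, 0, 255).astype(np.uint8)      # new digital image
```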
  • Elements of Visual Interpretation
    • Tone
    • Shape
    • Size
    • Pattern
    • Texture
    • Shadow
    • Association
  • Tone
    The relative brightness or color of objects in an image; fundamental for distinguishing between different targets or features, and variations in tone also allow shape, texture, and pattern to be distinguished
  • Shape
    The general form, structure, or outline of individual objects, can be a very distinctive clue for interpretation
  • Size
    The size of objects in an image is a function of scale; it is important to assess both the size of an object relative to other objects in the scene and its absolute size to aid interpretation
  • Pattern
    The spatial arrangement of visibly discernible objects, an orderly repetition of similar tones and textures, can be a very distinctive clue for interpretation
  • Texture
    The arrangement and frequency of tonal variation in particular areas of an image.
  • Shadow
    Helpful in interpretation as it may give an idea of the profile and relative height of a target; shadows can also reduce or eliminate interpretation within their area of influence, and are useful for enhancing or identifying topography and landforms in radar imagery
  • Association
    Takes into account the relationship between other recognizable objects or features in proximity to the target of interest; identifying features that one would expect to be associated with certain other features may provide information to facilitate identification
  • Digital Image Processing procedures
    • Formatting and correcting of the data
    • Digital enhancement to facilitate better visual interpretation
    • Automated classification of targets and features entirely by computer
  • Common image processing functions
    • Preprocessing (Geometric Correction, Radiometric Calibration)
    • Image Enhancement (Spatial, Spectral)
    • Image Transformation
    • Image Classification and Analysis
  • Image Pre-processing
    Intended to correct for sensor- and platform-specific radiometric and geometric distortions of data, normally required prior to the main data analysis and extraction of information
  • Radiometric Correction groups
    • Cosmetic rectification to compensate for data errors
    • Relative atmospheric correction based on ground reflectance properties (e.g. dark-object subtraction, sketched after this list)
    • Absolute atmospheric correction based on atmospheric process information
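    A minimal sketch of one such relative correction, dark-object subtraction, assuming numpy and an invented multiband array; the idea is that the darkest value in each band approximates the atmospheric contribution:

```python
# Dark-object subtraction sketch: the minimum value of each band is taken as a
# "dark object" whose reflectance should be near zero, so that value (largely
# atmospheric path radiance) is subtracted from the whole band.
import numpy as np

bands = np.random.randint(10, 256, size=(4, 100, 100)).astype(np.float32)  # stand-in 4-band image

dark_values = bands.min(axis=(1, 2), keepdims=True)   # per-band dark-object value
corrected = np.clip(bands - dark_values, 0, None)     # relative atmospheric correction
```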
  • Cosmetic Correction
    Corrections typically executed at the satellite data receiving stations or image pre-processing centers, before the data reach the final user, to address issues such as periodic line dropouts, line striping, and random noise or spikes
  • Periodic line dropouts
    • Occur due to recording problems when one of the detectors of the sensor stops functioning or gives wrong data
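    A minimal numpy sketch of the usual fix, replacing a dropped line with the average of its neighbours (the image and line index are invented):

```python
# Line-dropout correction sketch: a dropped scan line is restored as the mean
# of the scan lines immediately above and below it.
import numpy as np

image = np.random.randint(1, 256, size=(50, 50)).astype(np.float32)
bad_line = 20
image[bad_line, :] = 0                                       # simulate a dropped line

image[bad_line, :] = 0.5 * (image[bad_line - 1, :] + image[bad_line + 1, :])
```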
  • Line striping
    • Occurs due to non-identical detector response, with some detectors drifting to higher or lower levels over time
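    A simple destriping sketch, assuming numpy and a hypothetical six-detector scanner; each detector's lines are rescaled to the image-wide mean and standard deviation:

```python
# Destriping sketch: lines recorded by the same detector are normalised so
# their mean and standard deviation match the whole image, removing striping
# caused by drifting detector response.
import numpy as np

image = np.random.normal(100.0, 20.0, size=(60, 60))
n_detectors = 6                                   # assumed number of detectors

target_mean, target_std = image.mean(), image.std()
for d in range(n_detectors):
    lines = image[d::n_detectors, :]              # every n-th line comes from detector d
    image[d::n_detectors, :] = (lines - lines.mean()) / lines.std() * target_std + target_mean
```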
  • Random noise or spike
    • May be due to errors during data transmission or temporary disturbances, individual pixels acquire values much higher or lower than surrounding pixels
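    A rough despiking sketch using a 3x3 median filter (scipy assumed; the threshold value is arbitrary):

```python
# Spike removal sketch: pixels that differ from the local median by more than
# a threshold are treated as noise and replaced with that median.
import numpy as np
from scipy.ndimage import median_filter

image = np.random.normal(100.0, 5.0, size=(80, 80))
image[10, 10] = 900.0                             # simulate a noise spike

local_median = median_filter(image, size=3)
spikes = np.abs(image - local_median) > 50.0      # arbitrary threshold
image[spikes] = local_median[spikes]
```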
  • Geometric corrections
    Correcting for geometric distortions due to sensor-Earth geometry variations, and conversion of the data to real world coordinates on the Earth's surface
  • Factors causing geometric distortions
    • Perspective of the sensor optics
    • Motion of the scanning system
    • Motion of the platform
    • Platform altitude, attitude, and velocity
    • Terrain relief
    • Curvature and rotation of the Earth
  • Types of geometric distortions
    • Systematic distortions
    • Nonsystematic distortions
  • Systematic distortions
    • Panoramic distortion
    • Cross-track scan error
    • Earth's rotation
    • Platform velocity
    • Scan skew
    • Mirror scan velocity variance
  • Panoramic distortion
    Due to spacing of detectors and regular sampling, the ground area imaged is proportional to the tangent of the scan angle rather than the angle itself
  • Cross-track scan error
    A function of the distance from the sensor to the target, the instantaneous field of view, and the scan angle off nadir
  • Earth's rotation
    Each scan line is offset slightly to the west of the previous one because of the time taken to build up the image as the sensor scans the Earth's surface
  • Platform velocity
    Changes in platform speed and ground track cause along-track scale distortion
  • Scan skew
    Caused by the forward motion of the platform during the time required for each mirror sweep, producing cross-scan geometric distortion
  • Mirror scan velocity variance
    Mirror scanning rate is usually not constant across a given scan, producing along-scan geometric distortion
  • Nonsystematic distortions
    Arise from variations in the sensor system's attitude, velocity, and altitude; can be corrected only through the use of ground control points
  • Terrain-related distortions
    Due to local variations in terrain elevation and aspect; can be corrected by orthorectification using a digital elevation model
  • Georeferencing
    The process of aligning an image with ground control points on the Earth's surface so that the image adopts a particular coordinate system
  • Ground control points (GCPs)
    Points that can be clearly identified in the image and in a source that is in the required map projection system, used to determine the transformation parameters
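    A minimal sketch of how GCPs can be used to estimate transformation parameters, here a 6-parameter affine transform fitted by least squares with numpy (all coordinates are invented):

```python
# Affine georeferencing sketch: least-squares fit of map (x, y) coordinates as
# a linear function of image (col, row) coordinates using a set of GCPs.
import numpy as np

img_xy = np.array([[10.0, 12.0], [200.0, 15.0], [190.0, 180.0], [20.0, 175.0]])   # image (col, row)
map_xy = np.array([[500010.0, 4200120.0], [500200.0, 4200115.0],
                   [500195.0, 4199950.0], [500015.0, 4199960.0]])                 # map (x, y)

A = np.hstack([img_xy, np.ones((len(img_xy), 1))])     # design matrix [col, row, 1]
coeffs, *_ = np.linalg.lstsq(A, map_xy, rcond=None)    # 6 affine parameters (3 per output coordinate)
map_predicted = A @ coeffs                             # retransformed GCP locations
```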
  • Root Mean Square (RMS) error
    The distance between the input (source) location of a GCP and the retransformed location for the same GCP; that is, the difference between the desired output coordinate for a GCP and the actual output coordinate
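    A short sketch of the RMS error computation for a set of GCPs (coordinates invented; numpy assumed):

```python
# RMS error sketch: the distance between each GCP's desired output coordinate
# and the coordinate actually produced by the fitted transformation, combined
# as a root-mean-square value.
import numpy as np

desired = np.array([[500010.0, 4200120.0], [500200.0, 4200115.0], [500195.0, 4199950.0]])
actual  = np.array([[500011.5, 4200118.0], [500198.0, 4200116.5], [500196.0, 4199951.0]])

residuals = actual - desired
rms_per_gcp = np.sqrt((residuals ** 2).sum(axis=1))          # error of each individual GCP
total_rms = np.sqrt(((residuals ** 2).sum(axis=1)).mean())   # overall RMS error
```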
  • Resampling methods
    • Nearest neighbour
    • Bilinear interpolation
    • Cubic convolution
  • Nearest neighbour resampling
    Uses the digital value from the pixel in the original image which is nearest to the new pixel location in the corrected image, does not alter the original values but may result in some pixel values being duplicated while others are lost
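    A rough nearest-neighbour resampling sketch with numpy (sizes are arbitrary):

```python
# Nearest neighbour sketch: each output pixel copies the closest input pixel,
# so original values are preserved but some are duplicated and others dropped.
import numpy as np

src = np.random.randint(0, 256, size=(50, 50), dtype=np.uint8)
out_rows, out_cols = 80, 80

rows = np.round(np.linspace(0, src.shape[0] - 1, out_rows)).astype(int)
cols = np.round(np.linspace(0, src.shape[1] - 1, out_cols)).astype(int)
out = src[rows[:, None], cols[None, :]]
```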
  • Bilinear interpolation resampling
    Takes a weighted average of four pixels in the original image nearest to the new pixel location, alters the original pixel values and creates entirely new digital values in the output image
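    A rough bilinear resampling sketch with numpy; note that the output contains entirely new values:

```python
# Bilinear interpolation sketch: each output pixel is a distance-weighted
# average of the four nearest input pixels.
import numpy as np

src = np.random.randint(0, 256, size=(50, 50)).astype(np.float32)
out_rows, out_cols = 80, 80

r = np.linspace(0, src.shape[0] - 1, out_rows)
c = np.linspace(0, src.shape[1] - 1, out_cols)
r0, c0 = np.floor(r).astype(int), np.floor(c).astype(int)
r1, c1 = np.minimum(r0 + 1, src.shape[0] - 1), np.minimum(c0 + 1, src.shape[1] - 1)
fr, fc = (r - r0)[:, None], (c - c0)[None, :]

out = (src[r0][:, c0] * (1 - fr) * (1 - fc) + src[r0][:, c1] * (1 - fr) * fc
       + src[r1][:, c0] * fr * (1 - fc) + src[r1][:, c1] * fr * fc)
```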
  • Cubic convolution resampling
    Calculates a distance weighted average of a block of sixteen pixels from the original image which surround the new output pixel location, produces images with a much sharper appearance and avoids the blocky appearance of the nearest neighbour method
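    For comparison, OpenCV's resize offers all three methods; its INTER_CUBIC option uses a 4x4 (16-pixel) bicubic kernel in the spirit of cubic convolution (output size arbitrary, cv2 assumed to be installed):

```python
# Side-by-side resampling of the same stand-in image with OpenCV.
import cv2
import numpy as np

src = np.random.randint(0, 256, size=(50, 50), dtype=np.uint8)

nearest  = cv2.resize(src, (80, 80), interpolation=cv2.INTER_NEAREST)
bilinear = cv2.resize(src, (80, 80), interpolation=cv2.INTER_LINEAR)
bicubic  = cv2.resize(src, (80, 80), interpolation=cv2.INTER_CUBIC)   # 4x4 neighbourhood
```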
  • Common image enhancement procedures
    • Contrast manipulation (Gray-level thresholding, Level slicing, Contrast stretching; a contrast stretch is sketched after this list)
    • Spatial feature manipulation (Spatial filtering/convolution, Edge enhancement)
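    A minimal contrast stretching sketch (the 2nd/98th percentile cutoffs are arbitrary; numpy assumed):

```python
# Linear contrast stretch sketch: values between the chosen lower and upper
# percentiles are stretched to fill the full 0-255 display range.
import numpy as np

image = np.random.normal(100.0, 15.0, size=(100, 100))

lo, hi = np.percentile(image, (2, 98))
stretched = np.clip((image - lo) / (hi - lo) * 255.0, 0, 255).astype(np.uint8)
```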
  • Spatial filtering
    Digital processing functions used to enhance the appearance of an image by highlighting or suppressing specific features based on their spatial frequency
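    A small spatial filtering sketch (scipy assumed): a 3x3 averaging kernel suppresses high spatial frequencies, and subtracting the smoothed result from the original keeps them, a crude edge enhancement:

```python
# Spatial filtering sketch: low-pass smoothing by convolution with a 3x3 mean
# kernel, and a simple high-frequency (edge) component obtained by subtraction.
import numpy as np
from scipy.ndimage import convolve

image = np.random.randint(0, 256, size=(100, 100)).astype(np.float32)

low_pass_kernel = np.full((3, 3), 1.0 / 9.0)
smoothed = convolve(image, low_pass_kernel, mode="nearest")
edges = image - smoothed
```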