Pixel Level Data Fusion at Cape Chiniak, Kodiak Island, Alaska


[Image: Shaded relief map]

[Image: Fused color view of Kodiak]

In the remote sensing context, data fusion refers to the process of merging data from different sensors in some beneficial way. Fusion is commonly categorized into one of three types: pixel-level, feature-level, or decision-level. The simplest approach is to merge two or more images at the pixel level. Although this approach does not automate the extraction of information, the results can be visually striking and useful as a data visualization technique.
The grey-scale image shown here is a shaded relief map depicting the topography over a 600 x 1000 meter patch at Cape Chiniak on Kodiak Island, Alaska. The original data set was produced using photogrammetric methods. The color image is a pixel-level fusion of the shaded relief map with an orthophotograph and a height-as-intensity map. When viewed through Chromadepth glasses, the red elevations give a vivid three-dimensional effect. Importantly, the technique is not limited to the display of physical variables; one might, for example, depict change over time in the third dimension.
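The height-as-intensity fusion described above can be sketched in a few lines. This is a minimal illustration, not the original processing chain: it assumes the elevation and shaded-relief layers are co-registered 2-D arrays, and the particular hue ramp (red for high ground, blue for low) is chosen here only to echo the Chromadepth convention.

```python
import colorsys
import numpy as np

def normalize(layer):
    """Scale a 2-D array to the range [0, 1]."""
    span = layer.max() - layer.min()
    return (layer - layer.min()) / span if span else np.zeros_like(layer, dtype=float)

def fuse_pixels(elevation, shaded_relief):
    """Pixel-level fusion: height drives hue, shaded relief drives intensity.

    Both inputs are 2-D arrays on the same grid; returns an (H, W, 3)
    RGB array with values in [0, 1]. High ground maps to red and low
    ground to blue, echoing the Chromadepth-style display.
    """
    elev = normalize(elevation)
    shade = normalize(shaded_relief)
    rgb = np.empty(elevation.shape + (3,))
    for idx in np.ndindex(elevation.shape):
        hue = (1.0 - elev[idx]) * 2.0 / 3.0  # 0 = red (high), 2/3 = blue (low)
        rgb[idx] = colorsys.hsv_to_rgb(hue, 1.0, shade[idx])
    return rgb
```

An orthophotograph can be folded in the same way, for instance by modulating the intensity channel with the orthophoto's grey levels before the HSV-to-RGB conversion.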
[Image: Pixel level data fusion]

Pixel Level Data Fusion at Harrisburg International Airport


The color images above depict another example of pixel-level data fusion. These images of Harrisburg International Airport in Harrisburg, Pennsylvania display the fusion of NASA ATM LIDAR data with AVIRIS hyperspectral data. For this experiment, a 4 meter resolution AVIRIS scene was merged with a LIDAR shaded relief model and a LIDAR digital elevation model. When viewed through Chromadepth glasses, the red elevations give a vivid three-dimensional effect.
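The page does not say how the higher-resolution LIDAR layers were brought onto the 4 meter AVIRIS grid before the per-pixel merge. A minimal nearest-neighbor resampling sketch, assuming both rasters share an origin and orientation (a real workflow would warp via the rasters' georeferencing), might look like:

```python
import numpy as np

def resample_to_grid(src, src_res, dst_shape, dst_res):
    """Nearest-neighbor resample of a raster onto another grid.

    src: 2-D array; src_res / dst_res: cell sizes in the same units.
    Assumes both grids share the same origin and orientation -- a real
    workflow would use the rasters' georeferencing instead.
    """
    rows = np.minimum((np.arange(dst_shape[0]) * dst_res / src_res).astype(int),
                      src.shape[0] - 1)
    cols = np.minimum((np.arange(dst_shape[1]) * dst_res / src_res).astype(int),
                      src.shape[1] - 1)
    return src[np.ix_(rows, cols)]
```

Once the DEM and shaded relief are on the image grid, they can be fused pixel by pixel exactly as in the Kodiak example.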

 


This page updated on March 28, 2000