Georsgis Blog


Friday 9 October 2020

Remote Sensing Image Fusion, pdf eBook


Preface of the Book



Nowadays, approximately four TB of image data are collected daily by instruments mounted on satellite platforms, not to mention the data produced by a myriad of specific campaigns carried out with airborne instruments. Very high-resolution (VHR) multispectral scanners, such as IKONOS, QuickBird, GeoEye, WorldView, and Pléiades, to mention only the most popular, and especially the related monospectral panchromatic instruments are responsible for a large part of this amount.


Imaging spectrometers with tens to hundreds of bands will contribute significantly after the launch of the upcoming PRISMA and EnMAP missions. In parallel, synthetic aperture radar (SAR) satellite constellation systems, TerraSAR-X/TanDEM-X, COSMO-SkyMed, RadarSat-2, and the upcoming RadarSat-3 and Sentinel-1, are taking high-resolution microwave images of the Earth with ever improving temporal repetition capabilities.


The availability of image data with spectral diversity (visible, near infrared, shortwave infrared, thermal infrared, X- and C-band microwaves with related polarizations) and complementary spectral-spatial resolution, together with the peculiar characteristics of each image set, has fostered the development of fusion techniques specifically tailored to remotely sensed images of the Earth.


Fusion aims at producing added value beyond what is separately available from the individual datasets. Though the results of fusion are most often analyzed by human experts to solve specific tasks (detection of landslides, flooded and burned areas, to mention just a few examples), partially supervised and even fully automated systems, most notably thematic classifiers, have started benefiting from fused images instead of separate datasets.


The most prominent fusion methodology specifically designed for remote sensing images is the so-called panchromatic sharpening, or pansharpening, which in general requires the presence of a broad band in the visible or visible near-infrared (V-NIR) wavelengths, with a ground resolution two to six times finer than that of the narrow spectral bands.
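As an illustration of the general idea only, and not of any algorithm from the book, the following Python sketch applies the classic Brovey-transform component substitution: each multispectral band, already upsampled to the panchromatic grid, is rescaled by the ratio between the panchromatic image and the mean of the bands, so that panchromatic spatial detail is injected while the band ratios are roughly preserved. The array shapes, the function name, and the eps safeguard are illustrative assumptions.

import numpy as np

def brovey_pansharpen(ms_up, pan, eps=1e-6):
    # ms_up: multispectral bands already upsampled to the PAN grid, shape (H, W, B)
    # pan:   broadband panchromatic image, shape (H, W)
    intensity = ms_up.mean(axis=2, keepdims=True)      # synthetic low-resolution intensity
    gain = pan[..., np.newaxis] / (intensity + eps)    # per-pixel detail-injection gain
    return ms_up * gain                                # sharpened bands, same shape as ms_up

# Toy usage with random data standing in for a 4-band V-NIR scene and its PAN image
rng = np.random.default_rng(0)
ms_up = rng.random((256, 256, 4))
pan = rng.random((256, 256))
fused = brovey_pansharpen(ms_up, pan)
print(fused.shape)   # (256, 256, 4)

Component substitution of this kind is only one family of pansharpening methods; it tends to sharpen strongly but can distort spectra when the panchromatic band does not overlap the multispectral bands well.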


Multispectral pansharpening can be traced back to the launch of the first SPOT instrument, which was the first to be equipped with a panchromatic scanner alongside the multispectral one. Recently, hyperspectral pansharpening has also started being investigated by an increasing number of scientists, with the goal, for example, of coupling spatial and spectral detection capabilities in a unique image.


The fusion of images from heterogeneous datasets, that is, of images produced by independent modalities that share neither wavelengths nor imaging mechanisms, is a further task pursued in remote sensing applications. A notable example is the sharpening of thermal bands through simultaneous visible near-infrared observations. This issue has also been largely addressed outside the scope of remote sensing, for detection applications, both military and civilian.


A more unconventional and specific topic is the fusion of optical and SAR image data. Several fusion methods, in which an optical image is enhanced by a SAR image, have been developed for specific applications. The all-weather acquisition capability of SAR systems, however, makes a product with the opposite features appear even more attractive.


The availability of a SAR image spectrally and/or spatially enhanced by an optical observation with possible spectral diversity is invaluable, especially because the optical data may not always be available, particularly in the presence of cloud cover, whereas SAR platforms capture images regardless of sunlight and meteorological conditions.


Download Link

Remote Sensing Image Fusion, pdf eBook
