Paper
Multiscale edge representation applied to image fusion (4 December 2000)
Abstract
In this paper, we aim at fusing multimodal images into a single greylevel image. A multiresolution technique based on the wavelet multiscale edge representation is applied. The fusion retains only the modulus maxima of the wavelet coefficients from the different bands and combines them. After reconstruction, a synthetic image is obtained that contains the edge information from all bands simultaneously. Noise reduction is applied by removing the noise-related modulus maxima. In several experiments on test images and multispectral satellite images, we demonstrate that the proposed technique outperforms mapping techniques such as PCA and SOM, as well as other wavelet-based fusion techniques.
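The combine-and-reconstruct idea in the abstract can be sketched in code. The snippet below is a simplified stand-in, not the paper's method: it uses a one-level Haar wavelet transform and selects, per detail coefficient, the band with the largest modulus, whereas the paper operates on the modulus maxima of a multiscale edge (Mallat-Zhong) representation. All function names are invented for this illustration.

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2D Haar transform of an even-sized greylevel image.
    Returns approximation a and detail subbands h, v, d."""
    p00, p10 = img[0::2, 0::2], img[1::2, 0::2]
    p01, p11 = img[0::2, 1::2], img[1::2, 1::2]
    a = (p00 + p10 + p01 + p11) / 4  # local average
    h = (p00 - p10 + p01 - p11) / 4  # horizontal-edge detail
    v = (p00 + p10 - p01 - p11) / 4  # vertical-edge detail
    d = (p00 - p10 - p01 + p11) / 4  # diagonal detail
    return a, h, v, d

def haar_idwt2(a, h, v, d):
    """Exact inverse of haar_dwt2."""
    out = np.empty((a.shape[0] * 2, a.shape[1] * 2))
    out[0::2, 0::2] = a + h + v + d
    out[1::2, 0::2] = a - h + v - d
    out[0::2, 1::2] = a + h - v - d
    out[1::2, 1::2] = a - h - v + d
    return out

def fuse(bands):
    """Fuse several greylevel bands: average the approximations, keep
    each detail coefficient from the band where its modulus is largest
    (a per-coefficient proxy for modulus-maxima selection), then
    reconstruct a single synthetic image."""
    coeffs = [haar_dwt2(b) for b in bands]
    a = np.mean([c[0] for c in coeffs], axis=0)
    fused_details = []
    for k in range(1, 4):                     # h, v, d subbands
        stack = np.stack([c[k] for c in coeffs])
        idx = np.argmax(np.abs(stack), axis=0)
        fused_details.append(np.take_along_axis(stack, idx[None], axis=0)[0])
    return haar_idwt2(a, *fused_details)
```

Fusing a band with itself reproduces the band exactly, since the Haar transform here is invertible; with distinct bands, the output inherits the strongest edge response from each.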
© (2000) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Paul Scheunders "Multiscale edge representation applied to image fusion", Proc. SPIE 4119, Wavelet Applications in Signal and Image Processing VIII, (4 December 2000); https://doi.org/10.1117/12.408573
CITATIONS
Cited by 17 scholarly publications.
KEYWORDS
Image fusion; Wavelets; Wavelet transforms; Denoising; Neurons; Reconstruction algorithms; Multispectral imaging