Paper
Deep snow: synthesizing remote sensing imagery with generative adversarial nets
19 May 2020
Abstract
In this work we demonstrate that generative adversarial networks (GANs) can be used to generate realistic pervasive changes in RGB remote sensing imagery, even in an unpaired training setting. We investigate transformation quality metrics based on deep embeddings of the generated and real images, which enable visualization and understanding of the GAN's training dynamics and provide a useful quantitative measure of how distinguishable the generated images are from real images. We also identify artifacts introduced by the GAN in the generated images; these likely contribute to the differences observed between real and generated samples in the deep embedding feature space, even in cases where the samples appear perceptually similar.
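The paper does not publish code, but the embedding-based distinguishability measure described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a pretrained ResNet-18 from torchvision as the embedding network and uses the Fréchet distance between Gaussian fits of the real and generated embedding sets (an FID-style comparison); the paper's actual embedding model and metric may differ.

# Illustrative sketch: quantify how distinguishable GAN-generated images are
# from real images in a deep embedding feature space. Assumptions (not from
# the paper): ResNet-18 embeddings and a Frechet distance between Gaussians.
import numpy as np
import torch
import torchvision.models as models
from scipy.linalg import sqrtm

def embed(images: torch.Tensor, model: torch.nn.Module) -> np.ndarray:
    """Map a batch of RGB images (N, 3, H, W) to deep feature vectors."""
    with torch.no_grad():
        feats = model(images)  # (N, D) embeddings
    return feats.cpu().numpy()

def frechet_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Frechet distance between Gaussians fit to two embedding sets."""
    mu_a, mu_b = a.mean(axis=0), b.mean(axis=0)
    cov_a = np.cov(a, rowvar=False)
    cov_b = np.cov(b, rowvar=False)
    covmean = sqrtm(cov_a @ cov_b)
    if np.iscomplexobj(covmean):  # discard numerical imaginary residue
        covmean = covmean.real
    diff = mu_a - mu_b
    return float(diff @ diff + np.trace(cov_a + cov_b - 2.0 * covmean))

# Embedding network: pretrained ResNet-18 with its classifier head removed,
# so the forward pass returns 512-dimensional feature vectors.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()
backbone.eval()

# Placeholders for real image chips and GAN outputs; real use would apply
# the pretrained weights' preprocessing and far more samples for a stable
# covariance estimate.
real = torch.rand(16, 3, 224, 224)
generated = torch.rand(16, 3, 224, 224)
score = frechet_distance(embed(real, backbone), embed(generated, backbone))
print(f"embedding-space distance: {score:.3f}")

A score near zero indicates the generated images are hard to distinguish from real ones in this feature space; GAN-introduced artifacts of the kind identified in the paper would inflate it even when the images look perceptually similar.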
© 2020 Society of Photo-Optical Instrumentation Engineers (SPIE).
Christopher X. Ren, Amanda Ziemann, James Theiler, and Alice M. S. Durieux "Deep snow: synthesizing remote sensing imagery with generative adversarial nets", Proc. SPIE 11392, Algorithms, Technologies, and Applications for Multispectral and Hyperspectral Imagery XXVI, 113920T (19 May 2020); https://doi.org/10.1117/12.2560716
CITATIONS
Cited by 2 scholarly publications.
KEYWORDS
Remote sensing, RGB color model, Image quality, Clouds, Data modeling, Image processing, Neural networks