Presentation + Paper
Cross-modal knowledge distillation in deep networks for SAR image classification
27 May 2022
Chowdhury Sadman Jahan, Andreas Savakis, Erik Blasch
Abstract
We propose Cross-KD, a novel framework for knowledge distillation across modalities, from Electro-Optical (EO) to Synthetic Aperture Radar (SAR). The Cross-KD approach is response-based and accounts for the differences in network size and feature representation between the two modalities. The proposed training consists of two stages: i) EO network training, and ii) SAR network training with transfer learning and knowledge distillation from the EO network. Knowledge distillation (KD) is performed at the soft output level, allowing the features in the EO and SAR networks to differ. The Cross-KD model is agnostic to the choice of network backbone and places no constraints on the network architecture, making knowledge transfer applicable even from a smaller network to a larger one. We evaluate our framework on a recent coupled EO-SAR dataset with promising results on SAR image classification. Cross-KD achieves performance gains from each component of the model, as evidenced by our ablation studies, resulting in 93.98% mean per-class accuracy.
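To illustrate the response-based (soft output) distillation described in the abstract, the following is a minimal sketch in PyTorch. It is not the authors' implementation: the toy backbones, the temperature T, and the weighting alpha are illustrative assumptions; the only point it demonstrates is that distillation on soft outputs lets the EO teacher and SAR student use different architectures.

# Minimal sketch of response-based (soft-output) cross-modal distillation.
# Assumptions: PyTorch, hypothetical toy EO/SAR backbones, illustrative T and alpha.
import torch
import torch.nn as nn
import torch.nn.functional as F

def distillation_loss(sar_logits, eo_logits, labels, T=4.0, alpha=0.5):
    """Combine hard-label cross-entropy with soft-label KL against the EO teacher."""
    soft_targets = F.softmax(eo_logits / T, dim=1)        # teacher soft outputs
    soft_student = F.log_softmax(sar_logits / T, dim=1)   # student log-probabilities
    kd = F.kl_div(soft_student, soft_targets, reduction="batchmean") * (T * T)
    ce = F.cross_entropy(sar_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

if __name__ == "__main__":
    num_classes = 10
    # Any classifiers can stand in for the EO and SAR networks, since the
    # distillation acts only on their soft outputs (logits).
    eo_teacher = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, num_classes))
    sar_student = nn.Sequential(nn.Flatten(), nn.Linear(1 * 32 * 32, num_classes))

    eo_batch = torch.randn(8, 3, 32, 32)   # EO images (e.g., RGB)
    sar_batch = torch.randn(8, 1, 32, 32)  # paired SAR images (single channel)
    labels = torch.randint(0, num_classes, (8,))

    with torch.no_grad():                  # stage i): EO network already trained
        eo_logits = eo_teacher(eo_batch)
    sar_logits = sar_student(sar_batch)    # stage ii): SAR network being trained
    loss = distillation_loss(sar_logits, eo_logits, labels)
    loss.backward()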
© (2022) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Chowdhury Sadman Jahan, Andreas Savakis, and Erik Blasch "Cross-modal knowledge distillation in deep networks for SAR image classification", Proc. SPIE 12099, Geospatial Informatics XII, 1209904 (27 May 2022); https://doi.org/10.1117/12.2624257
KEYWORDS: Synthetic aperture radar, Data modeling, Image classification, Image fusion, Data fusion, Electro optical modeling, Network architectures
