Paper
Quantization method for bipolar morphological neural networks
Published: 3 April 2024
Proceedings Volume 13072, Sixteenth International Conference on Machine Vision (ICMV 2023); 1307204 (2024); https://doi.org/10.1117/12.3023272
Event: Sixteenth International Conference on Machine Vision (ICMV 2023), 2023, Yerevan, Armenia
Abstract
In this paper, we present a quantization method for bipolar morphological neural networks. Bipolar morphological neural networks use only addition, subtraction, and maximum operations inside the neuron, with the exponent and logarithm as the layers' activation functions. These operations allow fast and compact gate implementations on FPGAs and ASICs, which makes such networks a promising solution for embedded devices. Quantization achieves a further increase in computational efficiency and reduces the complexity of hardware implementation by performing computations on low-bitwidth integers. We propose an 8-bit quantization scheme based on integer maximum, integer addition, and lookup tables for the non-linear functions, and experimentally demonstrate that basic image-classification models can be quantized without noticeable accuracy loss. More advanced models still provide high recognition accuracy but would benefit from further fine-tuning.
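The scheme the abstract describes, integer maximum and addition inside the neuron with lookup tables standing in for the exponent and logarithm, can be sketched as follows. The scales, LUT layout, weight encoding, and sentinel value below are illustrative assumptions for a single bipolar morphological neuron, not the paper's exact quantization scheme.

```python
import numpy as np

# Illustrative 8-bit quantized bipolar morphological (BM) neuron.
# All in-neuron arithmetic is integer addition and maximum; the
# exp/log activations become lookup tables. Scales and encodings
# here are assumptions, not the paper's exact scheme.

BITS = 8
QMAX = (1 << BITS) - 1                 # 255
LOG_SCALE = QMAX / np.log(QMAX + 1)    # maps log codes onto [0, QMAX]
SENTINEL = -(1 << 16)                  # "minus infinity" for absent weights

# log LUT: integer activation a in [0, QMAX] -> integer log-domain code
LOG_LUT = np.round(LOG_SCALE * np.log(np.arange(1, QMAX + 2))).astype(int)

def exp_lut(code):
    """Inverse LUT: log-domain code -> integer activation (clamped at 0)."""
    code = max(code, 0)
    return int(round(np.exp(code / LOG_SCALE))) - 1

def quantize_weight(w):
    """Split a real weight into (positive-branch, negative-branch) codes."""
    if w > 0:
        return int(round(LOG_SCALE * np.log(w))), SENTINEL
    if w < 0:
        return SENTINEL, int(round(LOG_SCALE * np.log(-w)))
    return SENTINEL, SENTINEL

def bm_neuron(acts, weights):
    """One quantized BM neuron: integer max/add only, LUTs at the ends."""
    x_codes = [LOG_LUT[a] for a in acts]            # log LUT on inputs
    wp, wn = zip(*(quantize_weight(w) for w in weights))
    pos = max(x + v for x, v in zip(x_codes, wp))   # positive branch
    neg = max(x + v for x, v in zip(x_codes, wn))   # negative branch
    return exp_lut(pos) - exp_lut(neg)              # exp LUTs + combine
```

For example, `bm_neuron([100, 50], [2.0, -1.0])` returns approximately 100·2 − 50·1 = 150, up to a small quantization error from rounding in the LUTs. In a BM neuron each branch takes the maximum over log-domain sums rather than summing products, which is what allows the multiplier-free hardware implementation.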
(2024) Published by SPIE. Downloading of the abstract is permitted for personal use only.
Elena Limonova, Michael Zingerenko, Dmitry Nikolaev, and Vladimir V. Arlazarov "Quantization method for bipolar morphological neural networks", Proc. SPIE 13072, Sixteenth International Conference on Machine Vision (ICMV 2023), 1307204 (3 April 2024); https://doi.org/10.1117/12.3023272
KEYWORDS
Quantization, Neurons, Neural networks, Field programmable gate arrays, Education and training, Tunable filters, Instrument modeling