Multifaceted perception-based blind omnidirectional image quality assessment
Hongxi Liu, Chaoyong Wang, Jingyu Jiang, Yun Liu
Abstract

With the rapid development of computer vision, omnidirectional images (OIs), which offer a wider field of view and higher resolution, provide humans with an immersive visual experience and have attracted growing research attention. However, existing image quality assessment models perform poorly on OIs because they ignore the differences between panoramic and traditional images, which limits the further development of OIs. Motivated by this problem, we propose a blind OI quality assessment model based on multi-frequency features, color features, and human visual system-based saliency features. Specifically, we map the image to the frequency domain and use its high- and low-frequency information to measure frequency-domain loss. A local binary pattern (LBP) operator encodes information from the different color channels. We then extract local natural scene statistics and entropy features from the input image and use them to weight the saliency map produced by our model, reducing its computational complexity. Finally, support vector regression is employed to predict the OI quality scores. Experimental results on the CVIQD and OIQA databases show that our method outperforms state-of-the-art OIQA models.
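The abstract's color-feature step applies an LBP operator to each color channel. As a rough illustration of that idea (not the authors' exact implementation; the paper does not specify its LBP variant, neighborhood, or histogram binning, so all of those are assumptions here), the following numpy-only sketch computes a basic 3x3 LBP code map and its normalized histogram, which could serve as one per-channel feature vector:

```python
import numpy as np

def lbp_8neighbors(img):
    """Basic 3x3 local binary pattern: compare each interior pixel's 8
    neighbors against the center and pack the comparison bits into a
    code in [0, 255]. (Illustrative variant; the paper's exact LBP
    configuration is not specified.)"""
    img = np.asarray(img, dtype=np.float64)
    center = img[1:-1, 1:-1]
    # Neighbor offsets, clockwise from top-left; bit i encodes neighbor i.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(center, dtype=np.uint8)
    h, w = img.shape
    for bit, (dy, dx) in enumerate(offsets):
        neighbor = img[1 + dy : h - 1 + dy, 1 + dx : w - 1 + dx]
        code |= (neighbor >= center).astype(np.uint8) << bit
    return code

def lbp_histogram(channel, bins=256):
    """Normalized histogram of LBP codes for one color channel,
    usable as a fixed-length feature vector."""
    codes = lbp_8neighbors(channel)
    hist, _ = np.histogram(codes, bins=bins, range=(0, bins))
    return hist / max(codes.size, 1)
```

In a pipeline like the one described, such per-channel histograms would typically be concatenated with the frequency-domain and saliency-weighted statistics before being fed to the SVR regressor (e.g., `sklearn.svm.SVR`).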

© 2023 SPIE and IS&T
Hongxi Liu, Chaoyong Wang, Jingyu Jiang, and Yun Liu "Multifaceted perception-based blind omnidirectional image quality assessment," Journal of Electronic Imaging 32(4), 043034 (24 August 2023). https://doi.org/10.1117/1.JEI.32.4.043034
Received: 11 March 2023; Accepted: 9 August 2023; Published: 24 August 2023
KEYWORDS
Image quality, Performance modeling, Databases, Panoramic photography, Color, Data modeling, Image processing