Eye Scanpath Prediction-Based No-Reference Quality Assessment of Omnidirectional Images


Abstract:

Omnidirectional images (OIs) are a crucial form of virtual reality (VR) content, capable of capturing 360° information. While OIs provide a novel way for producers and consumers to create, use, and interact with visual information, their high resolution and high fidelity require compression before storage or transmission. When viewed in close proximity to the eyes, low-quality OIs can induce discomfort, nausea, and motion sickness, so a precise assessment of OI quality is imperative. Our idea is to introduce the eye scanpath, which represents the visual attention mechanism, into our model, and we propose an eye scanpath prediction-based no-reference OI quality assessment (OIQA) method. The main idea of the proposed network is to combine subjective perception with objective features to obtain quality features. We obtain a subjective heatmap from the density of gaze points along the eye scanpath and design an adaptive algorithm that extracts saliency viewport images at salient regions to obtain feature vectors. Finally, we feed the feature vectors of the OI path, the subjective heatmap path, and the saliency viewport path simultaneously into a fully connected regression network, which outputs the quality score. Experimental results demonstrate that our model outperforms existing full-reference (FR) and no-reference (NR) methods on four OI databases.
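The abstract describes two concrete steps: building a subjective heatmap from the density of gaze points along the scanpath, and fusing the feature vectors of the three paths in a fully connected regression network. The following is a minimal sketch of both ideas, not the authors' implementation; the Gaussian kernel, its spread, and all feature dimensions are assumptions, since the abstract does not specify them.

import numpy as np
import torch
import torch.nn as nn

def gaze_density_heatmap(gaze_points, height, width, sigma=25.0):
    # Accumulate a Gaussian bump at each (x, y) gaze point to form a
    # density heatmap, then normalize to [0, 1]. The kernel choice and
    # sigma are assumptions, not taken from the paper.
    heat = np.zeros((height, width), dtype=np.float32)
    ys, xs = np.mgrid[0:height, 0:width]
    for x, y in gaze_points:
        heat += np.exp(-((xs - x) ** 2 + (ys - y) ** 2) / (2 * sigma ** 2))
    if heat.max() > 0:
        heat /= heat.max()
    return heat

class ThreePathRegressor(nn.Module):
    # Concatenate feature vectors from the OI path, the subjective
    # heatmap path, and the saliency viewport path, then regress a
    # single quality score with fully connected layers. All layer
    # widths here are hypothetical.
    def __init__(self, dim_oi=512, dim_heat=512, dim_vp=512):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(dim_oi + dim_heat + dim_vp, 256),
            nn.ReLU(),
            nn.Linear(256, 1),
        )

    def forward(self, f_oi, f_heat, f_vp):
        return self.fc(torch.cat([f_oi, f_heat, f_vp], dim=-1))

# Toy usage with random stand-in features.
heat = gaze_density_heatmap([(120, 80), (130, 90), (300, 200)], 256, 512)
model = ThreePathRegressor()
score = model(torch.randn(1, 512), torch.randn(1, 512), torch.randn(1, 512))
print(heat.shape, score.item())

In this reading, the heatmap encodes subjective attention while the OI and viewport branches supply objective features, and the concatenation-plus-regression head is what combines the two into a quality prediction.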
Article Sequence Number: 5034315
Date of Publication: 30 September 2024
