Dynamic Expert-Knowledge Ensemble for Generalizable Video Quality Assessment | IEEE Journals & Magazine | IEEE Xplore

Abstract:

Despite the impressive progress of supervised methods in quality assessment for in-the-wild videos, models trained on one domain often fail to generalize to others due to domain shifts caused by distortion diversity and content variation. Domain-generalizable video quality assessment (VQA) remains an open research challenge. Although combining more data under a mixed-domain training strategy can improve generalization to a certain extent, this principle ignores the domain-specific knowledge of each source domain, which could be useful for generalizing to unseen domains. Motivated by this, we propose a domain-generalizable VQA method named Dynamic Ensemble of Expert-Knowledge (DEEK), a novel framework that dynamically exploits the expert knowledge from each source domain to produce a generalizable ensemble prediction. Specifically, with multiple experts each trained to specialize in a particular source domain, we propose a quality-sensitive InfoNCE loss that regularizes the collaborative training of all experts in a contrastive-learning formulation, so that the ensemble exploits the complementary information the experts provide. By dynamically integrating the experts according to their relevance to the target data, this expert knowledge can be leveraged for better generalization. Experiments on five VQA datasets verify that our approach outperforms the state of the art by large margins.
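The dynamic integration step described in the abstract can be illustrated with a minimal sketch: each source-domain expert produces a quality score, and the scores are combined with weights derived from each expert's relevance to the target sample. Note that the relevance measure here (cosine similarity to a per-expert domain prototype, softmaxed with a temperature) and all function names are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def dynamic_expert_ensemble(target_feat, expert_protos, expert_preds, tau=0.1):
    """Combine per-domain expert predictions into one ensemble score.

    target_feat:   feature vector of the target video (hypothetical).
    expert_protos: one prototype feature vector per source-domain expert
                   (hypothetical stand-in for "relevance to the target data").
    expert_preds:  each expert's predicted quality score for the target.
    tau:           softmax temperature controlling weight sharpness.
    """
    # Relevance of each expert = cosine similarity between the target
    # feature and that expert's domain prototype.
    sims = np.array([
        np.dot(target_feat, p) / (np.linalg.norm(target_feat) * np.linalg.norm(p))
        for p in expert_protos
    ])
    # Softmax over relevances -> dynamic ensemble weights.
    w = np.exp(sims / tau)
    w /= w.sum()
    # Weighted combination of the experts' quality predictions.
    return float(np.dot(w, np.asarray(expert_preds)))
```

With a low temperature, the weights concentrate on the expert whose domain most resembles the target video, so its specialized knowledge dominates the ensemble prediction; a higher temperature blends the experts more evenly.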
Page(s): 2577 - 2589
Date of Publication: 30 November 2022

