
Mutual Learning Between Saliency and Similarity: Image Cosegmentation via Tree Structured Sparsity and Tree Graph Matching


Abstract:

This paper proposes a unified mutual learning framework based on image hierarchies, which integrates structured sparsity with tree-graph matching to address the problem of weakly supervised image cosegmentation. We focus on the interaction between two common-object properties, saliency and similarity, whereas most existing cosegmentation methods emphasize only one of them. The proposed method learns the prior knowledge for structured sparsity with the help of tree-graph matching, which makes it capable of generating object-oriented salient regions. Meanwhile, structured sparsity in turn reduces the search space and computational complexity of tree-graph matching. In this way, the hierarchical geometric relationships of coherent objects are thoroughly exploited. Experimental comparisons with state-of-the-art methods on benchmark data sets confirm that the mutual learning framework effectively delineates co-existing object patterns in multiple images.
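
The abstract describes an alternation between two components: a tree-structured sparsity step that produces saliency over a region hierarchy, and a tree-graph matching step whose candidate set is restricted to salient regions, with matches feeding back into the saliency prior. The following is a minimal conceptual sketch of such an alternation, not the authors' implementation: the region features and hierarchies are toy data, and all names (prox_tree_sparsity, match_trees, the thresholds tau and lam) are hypothetical.

# Conceptual sketch only -- not the paper's method. It alternates a simplified
# tree-structured sparsity proximal step with a greedy saliency-restricted
# node matching between two toy region hierarchies.
import numpy as np

def prox_tree_sparsity(v, groups, lam):
    # Simplified proximal operator of a tree-structured group norm:
    # visit groups leaves-to-root and apply group soft-thresholding.
    v = v.copy()
    for g in groups:  # groups ordered so children precede parents
        idx = np.array(g)
        norm = np.linalg.norm(v[idx])
        scale = max(0.0, 1.0 - lam / norm) if norm > 0 else 0.0
        v[idx] *= scale
    return v

def match_trees(feats_a, feats_b, sal_a, sal_b, tau=0.2):
    # Greedy node matching restricted to salient regions; a stand-in for
    # the paper's tree-graph matching. Returns (i, j, score) triples.
    cand_a = [i for i, s in enumerate(sal_a) if s > tau]
    cand_b = [j for j, s in enumerate(sal_b) if s > tau]
    matches = []
    for i in cand_a:
        if not cand_b:
            break
        d = [np.linalg.norm(feats_a[i] - feats_b[j]) for j in cand_b]
        k = int(np.argmin(d))
        matches.append((i, cand_b[k], np.exp(-d[k])))
    return matches

# Toy data: two 7-region hierarchies with similar features.
rng = np.random.default_rng(0)
feats_a = rng.normal(size=(7, 8))
feats_b = feats_a + 0.1 * rng.normal(size=(7, 8))
# Each group is a node plus its descendants, children listed before parents.
groups = [[3], [4], [5], [6], [1, 3, 4], [2, 5, 6], [0, 1, 2, 3, 4, 5, 6]]
sal_a = np.abs(rng.normal(size=7))
sal_b = np.abs(rng.normal(size=7))

for it in range(3):  # mutual learning loop: saliency <-> matching
    sal_a = prox_tree_sparsity(sal_a, groups, lam=0.1)
    sal_b = prox_tree_sparsity(sal_b, groups, lam=0.1)
    matches = match_trees(feats_a, feats_b, sal_a, sal_b)
    for i, j, s in matches:  # matched regions reinforce each other's prior
        sal_a[i] += 0.5 * s
        sal_b[j] += 0.5 * s
    print(f"iter {it}: {len(matches)} matches")

The leaves-to-root group ordering mirrors the standard composition of group soft-thresholding operators for hierarchical norms; the feedback step, where match scores boost the saliency of matched regions, is the "mutual learning" intuition in miniature.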
Published in: IEEE Transactions on Image Processing ( Volume: 27, Issue: 9, September 2018)
Page(s): 4690 - 4704
Date of Publication: 31 May 2018

PubMed ID: 29993547
