In Domain Adaptation (DA), reducing the distributional differences across domains and preserving the data structure are two critical issues in obtaining domain-invariant features. Existing DA methods preserve either the Local Manifold Structure (LMS) or the Global Discriminative Consistency (GDC), but fail to account for both simultaneously. As a result, the extracted features either lack discriminative ability or are sensitive to multimodally distributed data. Moreover, the local neighbor relationships among data points are mostly established in the original data space, which is unreliable, especially for data with large noise. This paper therefore proposes a novel DA approach, Adaptive Local Neighbors for Transfer Discriminative Feature Learning, which leverages LMS and GDC in a unified transfer feature learning model. By enforcing GDC only between local neighbors, the extracted features are more discriminative and more robust to multimodally distributed data. In addition, each data point's local neighbors are revealed adaptively in the learned subspace, making the method insensitive to data noise. Compared with state-of-the-art methods, the proposed approach achieves higher performance on various cross-domain image classification tasks, notably a 3.0% improvement on the Office10+Caltech10 dataset.
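The core idea of finding neighbors in the learned subspace rather than the original (possibly noisy) data space can be sketched minimally as follows. This is an illustrative sketch only: the function name `local_neighbors_in_subspace` and the fixed projection `W` are assumptions for demonstration, whereas the paper's method learns the projection and the neighbor graph jointly and adaptively.

```python
import numpy as np

def local_neighbors_in_subspace(X, W, k=3):
    """Return each point's k nearest neighbors after projecting into a
    subspace Z = X @ W, rather than in the original space.
    (Sketch only; the actual method learns W jointly with the graph.)"""
    Z = X @ W                      # project data into the subspace
    # pairwise squared Euclidean distances in the subspace
    sq = np.sum(Z ** 2, axis=1)
    D = sq[:, None] + sq[None, :] - 2 * Z @ Z.T
    np.fill_diagonal(D, np.inf)    # exclude each point from its own neighbor set
    # indices of the k closest points for each row
    return np.argsort(D, axis=1)[:, :k]

# Toy demo: two well-separated clusters; neighbors found in a 1-D subspace.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.1, (5, 3)),   # cluster A (rows 0-4)
               rng.normal(5, 0.1, (5, 3))])  # cluster B (rows 5-9)
W = np.ones((3, 1))                # stand-in for a learned projection
nbrs = local_neighbors_in_subspace(X, W, k=2)
print(nbrs.shape)                  # (10, 2): two neighbors per point
```

In the toy demo each point's neighbors come from its own cluster, since the projection keeps the two clusters separated; in the paper's setting the projection is optimized so that such neighborhoods remain reliable even under noise.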