
On the Hardness of Domain Adaptation and the Utility of Unlabeled Target Samples [chapter]

Shai Ben-David, Ruth Urner
2012 Lecture Notes in Computer Science  
The contributions of this paper are two-fold: On the one hand we show that Domain Adaptation in this setup is hard.  ...  Even under very strong assumptions about the relationship between source and target distribution and, on top of that, a realizability assumption for the target task with respect to a small class, the required  ...  Acknowledgements We thank Shai Shalev-Shwartz for insightful discussions on this work and Shalev Ben-David for deriving the bound of Lemma 5.  ... 
doi:10.1007/978-3-642-34106-9_14 fatcat:vo3pa2fesreatbfphzvab4o4x4

Cross-Domain Adaptive Clustering for Semi-Supervised Domain Adaptation [article]

Jichang Li, Guanbin Li, Yemin Shi, Yizhou Yu
2021 arXiv   pre-print
This could lead to disconnection between the labeled and unlabeled target samples as well as misalignment between unlabeled target samples and the source domain.  ...  In semi-supervised domain adaptation, a few labeled samples per class in the target domain guide features of the remaining target samples to aggregate around them.  ...  In the context of semi-supervised domain adaptation, the presence of few labeled target samples is utilized to help features of unlabeled target samples from different classes be guided to aggregate in  ... 
arXiv:2104.09415v1 fatcat:lmtocnxxlnd6pahcxgk5oahq7i
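A minimal sketch of the aggregation idea described in the snippet above: class prototypes are built from the few labeled target samples, and unlabeled target features are pulled toward their most similar prototype. The prototype construction, temperature value, and pseudo-label rule below are illustrative assumptions, not the paper's exact clustering objective.

```python
import torch
import torch.nn.functional as F

def prototype_attraction_loss(feat_l, y_l, feat_u, num_classes, tau=0.05):
    """Pull unlabeled target features toward class prototypes built from
    the few labeled target samples (sketch; assumes every class has at
    least one labeled sample in feat_l)."""
    feat_l = F.normalize(feat_l, dim=1)
    feat_u = F.normalize(feat_u, dim=1)
    # Class prototypes: mean labeled feature per class.
    protos = torch.stack([feat_l[y_l == c].mean(dim=0) for c in range(num_classes)])
    protos = F.normalize(protos, dim=1)
    # Similarity of each unlabeled feature to every prototype.
    logits = feat_u @ protos.t() / tau
    pseudo = logits.argmax(dim=1)  # most similar prototype as pseudo label
    return F.cross_entropy(logits, pseudo)
```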

Unsupervised Intra-domain Adaptation for Semantic Segmentation through Self-Supervision [article]

Fei Pan, Inkyu Shin, Francois Rameau, Seokju Lee, In So Kweon
2020 arXiv   pre-print
First, we conduct the inter-domain adaptation of the model; from this adaptation, we separate the target domain into an easy and hard split using an entropy-based ranking function.  ...  To tackle this issue, previous works have considered directly adapting models from the source data to the unlabeled target data (to reduce the inter-domain gap).  ...  Acknowledgments This research was partially supported by the Shared Sensing for Cooperative Cars Project funded by Bosch  ... 
arXiv:2004.07703v4 fatcat:hzuyfhl2f5asbgarbeltmizn2y
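The entropy-based ranking step lends itself to a short sketch: score each target image by the mean pixel-wise entropy of the model's softmax output, rank, and split at a chosen ratio. The split ratio and the loader yielding (images, names) pairs are assumptions for illustration.

```python
import torch

@torch.no_grad()
def entropy_rank_split(model, target_loader, split_ratio=0.67, device="cuda"):
    """Rank target images by mean prediction entropy and split into an
    easy (low-entropy) and hard (high-entropy) subset (sketch)."""
    scores, paths = [], []
    for images, names in target_loader:
        probs = torch.softmax(model(images.to(device)), dim=1)  # (B, C, H, W)
        ent = -(probs * torch.log(probs + 1e-8)).sum(dim=1)     # per-pixel entropy
        scores.extend(ent.mean(dim=(1, 2)).cpu().tolist())      # mean over pixels
        paths.extend(names)
    order = sorted(range(len(scores)), key=scores.__getitem__)
    cut = int(len(order) * split_ratio)
    easy = [paths[i] for i in order[:cut]]
    hard = [paths[i] for i in order[cut:]]
    return easy, hard
```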

Adaptive Semantic Segmentation with a Strategic Curriculum of Proxy Labels [article]

Kashyap Chitta, Jianwei Feng, Martial Hebert
2018 arXiv   pre-print
Our architecture then allows selective mining of easy samples from this set of proxy labels, and hard samples from the annotated source domain.  ...  We conduct a series of experiments with the GTA5, Cityscapes and BDD100k datasets on synthetic-to-real domain adaptation and geographic domain adaptation, showing the advantages of our method over baselines  ...  The strategy we use is based on easy and hard sample mining from the target and source domains, while also gradually shifting the overall distribution of data presented to the network towards the target  ... 
arXiv:1811.03542v1 fatcat:ieuw362knfhq7fj6xzhdtoggnu
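One plausible reading of the easy/hard mining strategy, sketched below: score each sample by its loss, then keep low-loss target samples (easy, with trusted proxy labels) and high-loss source samples (hard, most informative). The fixed keep-fractions are assumed hyperparameters, not the paper's curriculum schedule.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def mine_batch(logits_tgt, proxy_tgt, logits_src, y_src,
               easy_frac=0.5, hard_frac=0.5):
    """Select easy target samples (low loss vs. proxy labels) and hard
    source samples (high loss vs. ground truth). Sketch only."""
    loss_t = F.cross_entropy(logits_tgt, proxy_tgt, reduction="none")
    loss_s = F.cross_entropy(logits_src, y_src, reduction="none")
    k_t = max(1, int(easy_frac * len(loss_t)))
    k_s = max(1, int(hard_frac * len(loss_s)))
    easy_idx = loss_t.topk(k_t, largest=False).indices  # lowest target losses
    hard_idx = loss_s.topk(k_s, largest=True).indices   # highest source losses
    return easy_idx, hard_idx
```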

Learning Invariant Representation with Consistency and Diversity for Semi-supervised Source Hypothesis Transfer [article]

Xiaodong Wang, Junbao Zhuo, Shuhao Cui, Shuhui Wang
2021 arXiv   pre-print
Semi-supervised domain adaptation (SSDA) aims to solve tasks in the target domain by utilizing transferable information learned from the available source domain and a few labeled target data.  ...  data and maintaining the prediction diversity when adapting the model to the target domain.  ...  The goal of SSHT is to adapt the source model to the target domain with only a few labeled target samples and unlabeled target samples.  ... 
arXiv:2107.03008v2 fatcat:rse5gdh6unfe5etka5ze3mjmpa
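The "consistency plus diversity" recipe in the snippet is commonly implemented as below: consistency between predictions on two augmented views of the same unlabeled image, and diversity as the entropy of the batch-mean prediction, which discourages collapse to a few classes. This is a generic formulation, not necessarily the paper's exact losses.

```python
import torch
import torch.nn.functional as F

def consistency_diversity_loss(logits_weak, logits_strong, lam=1.0):
    """Consistency: predictions on two views should agree.
    Diversity: the batch-mean prediction should stay high-entropy (sketch)."""
    p_w = torch.softmax(logits_weak, dim=1)
    log_p_s = torch.log_softmax(logits_strong, dim=1)
    consistency = F.kl_div(log_p_s, p_w, reduction="batchmean")
    p_mean = p_w.mean(dim=0)
    diversity = (p_mean * torch.log(p_mean + 1e-8)).sum()  # negative entropy
    return consistency + lam * diversity
```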

CALDA: Improving Multi-Source Time Series Domain Adaptation with Contrastive Adversarial Learning [article]

Garrett Wilson, Janardhan Rao Doppa, Diane J. Cook
2021 arXiv   pre-print
Similar to prior methods, CALDA utilizes adversarial learning to align source and target feature representations.  ...  Based on results from human activity recognition, electromyography, and synthetic datasets, we find utilizing cross-source information improves performance over prior time series and contrastive methods  ...  Importance of Unlabeled Target Domain Data Finally, in the problem of unsupervised domain adaptation, we have unlabeled target domain data available for use during training.  ... 
arXiv:2109.14778v1 fatcat:7cn2m2d3dfevjfshgrmdcwrwle
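The adversarial alignment component shared with prior methods can be sketched with a gradient reversal layer: a domain classifier learns to tell source from target features while the reversed gradient pushes the encoder toward domain-invariant representations. CALDA adds a contrastive term on top of this, which is omitted here.

```python
import torch
from torch import nn
from torch.autograd import Function

class GradReverse(Function):
    """Identity on the forward pass, gradient negated on the backward pass."""
    @staticmethod
    def forward(ctx, x, alpha):
        ctx.alpha = alpha
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_out):
        return -ctx.alpha * grad_out, None

class DomainDiscriminator(nn.Module):
    def __init__(self, feat_dim, alpha=1.0):
        super().__init__()
        self.alpha = alpha
        self.net = nn.Sequential(
            nn.Linear(feat_dim, 128), nn.ReLU(), nn.Linear(128, 2))

    def forward(self, feats):
        # The reversed gradient makes the encoder *maximize* domain confusion.
        return self.net(GradReverse.apply(feats, self.alpha))
```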

Unsupervised Person Re-Identification: A Systematic Survey of Challenges and Solutions [article]

Xiangtan Lin and Pengzhen Ren and Chung-Hsing Yeh and Lina Yao and Andy Song and Xiaojun Chang
2021 arXiv   pre-print
This survey reviews recent works on unsupervised person Re-ID from the perspective of challenges and solutions.  ...  While supervised person Re-ID methods achieve superior performance over unsupervised counterparts, they cannot scale to large unlabelled datasets and new domains due to the prohibitive labelling cost.  ...  In this setting, a model is first pre-trained on the labelled source domain and then adapted to the unlabelled target domain.  ... 
arXiv:2109.06057v2 fatcat:epfow7w3trevff5iku2uvb4ov4

Unsupervised Intra-Domain Adaptation for Semantic Segmentation Through Self-Supervision

Fei Pan, Inkyu Shin, Francois Rameau, Seokju Lee, In So Kweon
2020 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)  
First, we conduct the inter-domain adaptation of the model; from this adaptation, we separate the target domain into an easy and hard split using an entropy-based ranking function.  ...  To tackle this issue, previous works have considered directly adapting models from the source data to the unlabeled target data (to reduce the inter-domain gap).  ...  This work was also partially supported by the Korea Research Fellowship Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT and Future Planning (2015H1D3A1066564  ... 
doi:10.1109/cvpr42600.2020.00382 dblp:conf/cvpr/PanSRLK20 fatcat:du3q4yucdjhhbpnfp2ay6yvl2y

Heterogeneous Domain Adaptation via Soft Transfer Network [article]

Yuan Yao, Yu Zhang, Xutao Li, Yunming Ye
2019 arXiv   pre-print
To circumvent negative transfer, STN aligns the conditional distributions by using the soft-label strategy of unlabeled target data, which prevents the hard assignment of each unlabeled target data to  ...  Heterogeneous domain adaptation (HDA) aims to facilitate the learning task in a target domain by borrowing knowledge from a heterogeneous source domain.  ...  The soft-label strategy avoids the hard assignment of each unlabeled target data to only one class that may be incorrect.  ... 
arXiv:1908.10552v1 fatcat:owcelg242fhb5mbxxbjigijwpe
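A hedged sketch of the soft-label conditional alignment described above: instead of hard pseudo labels, each unlabeled target feature contributes to every class centroid in proportion to its predicted probability, and the soft target centroids are matched to the source centroids. The squared-distance matching below is an illustrative choice, not necessarily STN's exact alignment term.

```python
import torch

def soft_conditional_alignment(feat_s, y_s, feat_t, prob_t, num_classes):
    """Align per-class centroids: hard labels on source, soft
    probability-weighted assignments on target (sketch)."""
    loss = feat_s.new_zeros(())
    for c in range(num_classes):
        mask = (y_s == c)
        if mask.sum() == 0:
            continue  # class absent from this source batch
        mu_s = feat_s[mask].mean(dim=0)
        w = prob_t[:, c]                      # soft weight of each target sample
        mu_t = (w.unsqueeze(1) * feat_t).sum(0) / (w.sum() + 1e-8)
        loss = loss + ((mu_s - mu_t) ** 2).sum()
    return loss / num_classes
```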

Complementary Pseudo Labels For Unsupervised Domain Adaptation On Person Re-identification [article]

Hao Feng, Minghao Chen, Jinming Hu, Dong Shen, Haifeng Liu, Deng Cai
2021 arXiv   pre-print
One of the most successful approaches predicts neighbors of each unlabeled image and then uses them to train the model.  ...  Although the predicted neighbors are credible, they always miss some hard positive samples, which may hinder the model from discovering important discriminative information of the unlabeled domain.  ...  "CE": cross-entropy loss. Table V: ablation studies of the number of unlabeled samples for adaptation; the percentage denotes the number of target domain data. Table VI: ablation studies of e_1 and e_2.  ... 
arXiv:2101.12521v2 fatcat:yizuuwzvsbeajea653sksqbmie
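The neighbor-prediction step can be sketched as a cosine k-NN lookup over a feature memory bank; the top-k entries act as pseudo positives for each unlabeled image. The value of k and the bank construction are assumptions, and the paper's complementary labels are not shown.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def predict_neighbors(feat_batch, memory_bank, k=10):
    """Return indices of the k most similar memory-bank entries for each
    unlabeled image; these serve as pseudo positives (sketch)."""
    q = F.normalize(feat_batch, dim=1)
    bank = F.normalize(memory_bank, dim=1)
    sim = q @ bank.t()                  # cosine similarities, shape (B, N)
    return sim.topk(k, dim=1).indices   # (B, k) neighbor indices
```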

OVANet: One-vs-All Network for Universal Domain Adaptation [article]

Kuniaki Saito, Kate Saenko
2021 arXiv   pre-print
In this paper, we propose a method to learn the threshold using source samples and to adapt it to the target domain.  ...  Then, we adapt the open-set classifier to the target domain by minimizing class entropy.  ...  Unsupervised domain adaptation (UDA) [29] aims to learn a good classifier for a target domain given labeled source and unlabeled target data.  ... 
arXiv:2104.03344v4 fatcat:n7yaj2rbxbcntjri7xfspqbuvm
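A rough sketch of the adaptation step described in the snippet: each class has a one-vs-all binary head, and for a target sample the entropy of the in/out distribution of its most likely class is minimized, sharpening the known/unknown decision. The head layout and indexing below are assumptions that follow the paper's description only at a high level.

```python
import torch

def ova_target_entropy(ova_logits, cls_probs):
    """ova_logits: (B, C, 2) binary in/out logits, one head per class.
    cls_probs: (B, C) closed-set class probabilities.
    Minimize entropy of the OVA head of the most likely class (sketch)."""
    top_cls = cls_probs.argmax(dim=1)                   # (B,)
    idx = torch.arange(ova_logits.size(0))
    p = torch.softmax(ova_logits[idx, top_cls], dim=1)  # (B, 2) in/out probs
    ent = -(p * torch.log(p + 1e-8)).sum(dim=1)
    return ent.mean()
```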

Adaptive Consistency Regularization for Semi-Supervised Transfer Learning [article]

Abulikemu Abuduweili, Xingjian Li, Humphrey Shi, Cheng-Zhong Xu, Dejing Dou
2021 arXiv   pre-print
Adaptive Knowledge Consistency (AKC) on the examples between the source and target model, and Adaptive Representation Consistency (ARC) on the target model between labeled and unlabeled examples.  ...  as well as labeled/unlabeled data in the target domain.  ...  Domain Adaptation. Different from fine-tuning, domain adaptation [38] copes with the problem of sample selection bias between the training and test data.  ... 
arXiv:2103.02193v2 fatcat:efoyh2qo5zhypbflukcy5zxvhy
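One way to read the two consistency terms, sketched below: AKC keeps the adapted model's predictions close to those of the frozen source-pretrained model on the same inputs, while ARC matches feature statistics between labeled and unlabeled target batches. The specific divergences used here (KL and mean-feature distance) are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def akc_loss(student_logits, teacher_logits):
    """Adaptive Knowledge Consistency: stay close to the frozen
    source-pretrained model's predictions (sketch)."""
    p_teacher = torch.softmax(teacher_logits.detach(), dim=1)
    return F.kl_div(F.log_softmax(student_logits, dim=1),
                    p_teacher, reduction="batchmean")

def arc_loss(feat_labeled, feat_unlabeled):
    """Adaptive Representation Consistency: match first-moment feature
    statistics of labeled vs. unlabeled target data (sketch)."""
    return (feat_labeled.mean(dim=0) - feat_unlabeled.mean(dim=0)).pow(2).sum()
```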

Two-phase Pseudo Label Densification for Self-training based Domain Adaptation [article]

Inkyu Shin, Sanghyun Woo, Fei Pan, InSo Kweon
2020 arXiv   pre-print
The self-training scheme involves iterative processing of target data; it generates target pseudo labels and retrains the network.  ...  Recently, deep self-training approaches emerged as a powerful solution to unsupervised domain adaptation.  ...  Acknowledgement This research is supported by the National Cancer Center (NCC).  ... 
arXiv:2012.04828v1 fatcat:xlwlzyw475grbo6omckhzhuzqi
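The iterative self-training scheme in the snippet reduces to the loop below: infer on target data, keep confident predictions as pseudo labels, retrain, repeat. The confidence threshold and round count are assumed, and the paper's actual contribution, the two-phase densification of sparse pseudo labels, is not reproduced here.

```python
import torch

def self_training(model, train_fn, target_loader, rounds=3, conf=0.9, device="cuda"):
    """Basic self-training loop: pseudo-label confident target predictions,
    then retrain on them (sketch; train_fn is a user-supplied trainer)."""
    for _ in range(rounds):
        pseudo_set = []
        model.eval()
        with torch.no_grad():
            for images, _ in target_loader:
                probs = torch.softmax(model(images.to(device)), dim=1)
                score, label = probs.max(dim=1)
                keep = score >= conf                 # only confident predictions
                pseudo_set.append((images[keep.cpu()], label[keep].cpu()))
        model = train_fn(model, pseudo_set)          # retrain on pseudo labels
    return model
```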

Divide to Adapt: Mitigating Confirmation Bias for Domain Adaptation of Black-Box Predictors [article]

Jianfei Yang, Xiangyu Peng, Kai Wang, Zheng Zhu, Jiashi Feng, Lihua Xie, Yang You
2022 arXiv   pre-print
Domain Adaptation of Black-box Predictors (DABP) aims to learn a model on an unlabeled target domain supervised by a black-box predictor trained on a source domain.  ...  This is enabled by a new divide-to-adapt strategy. BETA divides the target domain into an easy-to-adapt subdomain with less noise and a hard-to-adapt subdomain.  ...  We thank TACC (Texas Advanced Computing Center) for supporting us to get access to the Longhorn supercomputer and the Frontera supercomputer.  ... 
arXiv:2205.14467v1 fatcat:kj6er6xawrdnbactrnpnq63kay
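The divide-to-adapt step can be sketched as follows: score each target sample by the cross-entropy between the student's prediction and the black-box predictor's pseudo label, then fit a two-component Gaussian mixture over the scores so the low-loss mode forms the easy-to-adapt subdomain. This GMM criterion mirrors common noisy-label splits and is an assumption about BETA's exact rule.

```python
import numpy as np
import torch
import torch.nn.functional as F
from sklearn.mixture import GaussianMixture

@torch.no_grad()
def divide_target(student_logits, blackbox_labels):
    """Split target samples into easy/hard subdomains by fitting a
    2-component GMM on per-sample losses (sketch)."""
    losses = F.cross_entropy(student_logits, blackbox_labels,
                             reduction="none").cpu().numpy().reshape(-1, 1)
    gmm = GaussianMixture(n_components=2).fit(losses)
    low_comp = int(np.argmin(gmm.means_.ravel()))   # cleaner, low-loss mode
    p_easy = gmm.predict_proba(losses)[:, low_comp]
    easy = np.where(p_easy >= 0.5)[0]
    hard = np.where(p_easy < 0.5)[0]
    return easy, hard
```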

Improve conditional adversarial domain adaptation using self‐training

Zi Wang, Xiaoliang Sun, Ang Su, Gang Wang, Yang Li, Qifeng Yu
2021 IET Image Processing  
The model's predictions on unlabelled samples are leveraged to pseudo-label target samples. The training procedure consists of two alternating steps.  ...  Domain adaptation for image classification is one of the most fundamental transfer learning tasks and a promising solution to overcome the annotation burden.  ...  Here, we utilize a self-training framework [10] to exploit information conveyed in unlabelled target samples for domain adaptive image classification tasks.  ... 
doi:10.1049/ipr2.12184 fatcat:exhsxnsxhzgpvkagotixezxale
Showing results 1 — 15 out of 8,735 results