
Unsupervised Domain Adaptation for Zero-Shot Learning

Elyor Kodirov, Tao Xiang, Zhenyong Fu, Shaogang Gong
2015 2015 IEEE International Conference on Computer Vision (ICCV)  
Zero-shot learning (ZSL) can be considered as a special case of transfer learning where the source and target domains have different tasks/label spaces and the target domain is unlabelled, providing little  ...  In this paper a novel ZSL method is proposed based on unsupervised domain adaptation.  ...  Note that our unsupervised domain adaptation method is also transductive. This is because of the unique nature of zero-shot learning: there is no separate training data in the target domain.  ... 
doi:10.1109/iccv.2015.282 dblp:conf/iccv/KodirovXFG15 fatcat:r2o6prdiabbileemh4dmtddcoq
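
The entry above frames zero-shot learning as transfer learning with disjoint label spaces and stresses that the method is transductive, i.e. all unlabelled target data are available at once. As a hedged illustration of that transductive idea only (not the authors' actual dictionary-learning formulation), the sketch below projects unlabelled target features into a semantic space with a source-trained linear map, assigns each sample to the nearest unseen-class prototype, and re-estimates the prototypes from those assignments; the variable names and the cosine/self-training choices are assumptions.

    import numpy as np

    def transductive_zsl(X_tgt, W, S_unseen, n_iters=10):
        """Generic transductive ZSL sketch (illustrative, not the paper's exact method).

        X_tgt    : (n, d) unlabelled target-domain visual features
        W        : (d, k) source-trained projection from visual to semantic space
        S_unseen : (c, k) semantic prototypes (e.g. attribute vectors) of the unseen classes
        Returns predicted class indices for the target samples.
        """
        Z = X_tgt @ W                                             # project into semantic space
        Zn = Z / np.linalg.norm(Z, axis=1, keepdims=True)
        P = S_unseen.astype(float).copy()
        for _ in range(n_iters):
            Pn = P / np.linalg.norm(P, axis=1, keepdims=True)
            y = np.argmax(Zn @ Pn.T, axis=1)                      # nearest prototype by cosine similarity
            for c in range(P.shape[0]):                           # self-training: refit prototypes
                if np.any(y == c):
                    P[c] = Z[y == c].mean(axis=0)
        Pn = P / np.linalg.norm(P, axis=1, keepdims=True)
        return np.argmax(Zn @ Pn.T, axis=1)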

Unifying Unsupervised Domain Adaptation and Zero-Shot Visual Recognition [article]

Qian Wang, Penghui Bu, Toby P. Breckon
2019 arXiv   pre-print
We present a unified domain adaptation framework for both unsupervised and zero-shot learning conditions.  ...  traditional zero-shot learning.  ...  unsupervised domain adaptation condition; source data and labelled target data for zero-shot learning condition).  ... 
arXiv:1903.10601v2 fatcat:d35y4p4tgrh5tcvrbwwrn6euwu

Learning Robust Visual-Semantic Embeddings [article]

Yao-Hung Hubert Tsai and Liang-Kang Huang and Ruslan Salakhutdinov
2017 arXiv   pre-print
A novel technique of unsupervised-data adaptation inference is introduced to construct more comprehensive embeddings for both labeled and unlabeled data.  ...  The proposed method combines representation learning models (i.e., auto-encoders) together with cross-domain learning criteria (i.e., Maximum Mean Discrepancy loss) to learn joint embeddings for semantic  ...  From Zero to Few-Shot Learning In this subsection, we extend our experiments from transductive zero-shot to transductive few-shot learning.  ... 
arXiv:1703.05908v2 fatcat:bmvr3bbvavepbg7k7gwgks7gne
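
The snippet above mentions pairing auto-encoders with a Maximum Mean Discrepancy (MMD) criterion across domains. Below is a minimal PyTorch sketch of an RBF-kernel MMD term between source and target embedding batches; the bandwidth mixture and the way it would be weighted against the reconstruction loss are assumptions rather than the paper's settings.

    import torch

    def mmd_rbf(x, y, sigmas=(1.0, 5.0, 10.0)):
        """Biased estimate of squared MMD between batches x (n, d) and y (m, d), mixture-of-RBF kernel."""
        def kernel(a, b):
            d2 = torch.cdist(a, b) ** 2
            return sum(torch.exp(-d2 / (2.0 * s ** 2)) for s in sigmas)
        return kernel(x, x).mean() + kernel(y, y).mean() - 2.0 * kernel(x, y).mean()

    # usage sketch: add the discrepancy to an auto-encoder objective, e.g.
    #   loss = recon_loss + lambda_mmd * mmd_rbf(z_source, z_target)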

Zero-Shot Learning Through Cross-Modal Transfer [article]

Richard Socher, Milind Ganjoo, Hamsa Sridhar, Osbert Bastani, Christopher D. Manning, Andrew Y. Ng
2013 arXiv   pre-print
Most previous zero-shot learning models can only differentiate between unseen classes.  ...  In our zero-shot framework distributional information in language can be seen as spanning a semantic basis for understanding what objects look like.  ...  [27] learn when to transfer knowledge from one category to another for each instance. Domain Adaptation.  ... 
arXiv:1301.3666v2 fatcat:cluvoeqj6zb2tfoimh7sshwdva
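
The cross-modal transfer idea above maps images into a word-vector space and labels unseen classes by proximity to their class word vectors (the paper additionally uses a novelty detector to decide between seen and unseen classes, omitted here). A minimal sketch with illustrative inputs; img_to_word stands in for whatever trained regressor maps visual features to the word space.

    import numpy as np

    def zero_shot_predict(img_feat, img_to_word, class_word_vectors, class_names):
        """Nearest-word-vector zero-shot classification (sketch).

        img_feat           : (d,) visual feature of one image
        img_to_word        : callable mapping a (d,) feature to a (k,) word-space embedding
        class_word_vectors : (c, k) word vectors of the candidate (unseen) classes
        class_names        : list of c class names
        """
        z = img_to_word(img_feat)
        z = z / np.linalg.norm(z)
        V = class_word_vectors / np.linalg.norm(class_word_vectors, axis=1, keepdims=True)
        return class_names[int(np.argmax(V @ z))]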

MixNorm: Test-Time Adaptation Through Online Normalization Estimation [article]

Xuefeng Hu, Gokhan Uzunbas, Sirius Chen, Rui Wang, Ashish Shah, Ram Nevatia, Ser-Nam Lim
2021 arXiv   pre-print
Unsupervised Domain Adaptation and Zero-Shot Classification.  ...  However, in practice, these two assumptions may not hold, which is why we propose two new evaluation settings in which batch sizes are arbitrary and multiple distributions are considered.  ...  Therefore, we just use one augmentation for faster computation. Test-time adaptation in the zero-shot learning setting: for zero-shot classification, samples are evaluated separately at a batch size of 1.  ... 
arXiv:2110.11478v1 fatcat:po2tg35wpfciroi434rv6gkrx4
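
The entry describes estimating normalization statistics online at test time so that adaptation still works with arbitrary, even size-1, batches. One plausible reading, sketched below and not taken from the paper's code: keep exponential-moving-average estimates of the test-stream mean and variance and mix them with the stored source (training-time) statistics; the momentum and mixing weight here are illustrative.

    import torch

    class OnlineNormEstimator:
        """Normalizes activations by mixing source statistics with running test-time estimates (sketch)."""

        def __init__(self, source_mean, source_var, momentum=0.05, mix=0.5, eps=1e-5):
            self.mu_src, self.var_src = source_mean, source_var
            self.mu_run, self.var_run = source_mean.clone(), source_var.clone()
            self.momentum, self.mix, self.eps = momentum, mix, eps

        def __call__(self, x):                     # x: (batch, channels, ...) activations
            dims = [0] + list(range(2, x.dim()))   # reduce over batch and any spatial dims
            mu_b = x.mean(dim=dims).detach()
            var_b = x.var(dim=dims, unbiased=False).detach()
            # update the running test-time estimates; valid even at batch size 1
            self.mu_run = (1 - self.momentum) * self.mu_run + self.momentum * mu_b
            self.var_run = (1 - self.momentum) * self.var_run + self.momentum * var_b
            # mix source and online statistics, then normalize
            mu = self.mix * self.mu_src + (1 - self.mix) * self.mu_run
            var = self.mix * self.var_src + (1 - self.mix) * self.var_run
            shape = [1, -1] + [1] * (x.dim() - 2)
            return (x - mu.view(shape)) / torch.sqrt(var.view(shape) + self.eps)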

Improving Zero and Few-Shot Abstractive Summarization with Intermediate Fine-tuning and Data Augmentation [article]

Alexander R. Fabbri, Simeng Han, Haoyuan Li, Haoran Li, Marjan Ghazvininejad, Shafiq Joty, Dragomir Radev, Yashar Mehdad
2021 arXiv   pre-print
In this work, we introduce a novel and generalizable method, called WikiTransfer, for fine-tuning pretrained models for summarization in an unsupervised, dataset-specific manner.  ...  WikiTransfer models achieve state-of-the-art, zero-shot abstractive summarization performance on the CNN-DailyMail dataset and demonstrate the effectiveness of our approach on three additional diverse  ...  However, little work has focused on domain adaptation in summarization.  ...  examine domain adaptation for extractive summarization.  ... 
arXiv:2010.12836v2 fatcat:pjvoiv7srjgtxklj36bowxbsim

Zero-Shot Domain Adaptation via Kernel Regression on the Grassmannian [article]

Yongxin Yang, Timothy Hospedales
2015 arXiv   pre-print
However, even for the setting of unsupervised domain adaptation, where the target domain is unlabelled, collecting data for every possible target domain is still costly.  ...  Domain adaptation algorithms aim to ameliorate domain shift, allowing a model trained on a source domain to perform well on a different target domain.  ...  We refer to this scenario as zero-shot domain adaptation (ZSDA), distinct from supervised and unsupervised domain adaptation.  ... 
arXiv:1507.07830v2 fatcat:b2ttnev7fncktokmkqasogu7ai
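
The method above regresses from a continuous domain descriptor to a domain-specific model, so that a model for an unseen target domain can be synthesised without any target data. The paper works on the Grassmannian; the sketch below illustrates the same "predict the model from the domain descriptor" idea with ordinary kernel ridge regression from descriptors straight to flattened model weights, which is a simplification, not the paper's construction.

    import numpy as np

    def kernel_ridge_fit(z_train, W_train, gamma=1.0, lam=1e-3):
        """z_train: (n, p) domain descriptors; W_train: (n, q) flattened per-domain model weights."""
        d2 = ((z_train[:, None, :] - z_train[None, :, :]) ** 2).sum(-1)
        K = np.exp(-gamma * d2)                                      # RBF kernel between training domains
        alpha = np.linalg.solve(K + lam * np.eye(len(K)), W_train)   # (n, q) dual coefficients
        return z_train, alpha, gamma

    def kernel_ridge_predict(model, z_new):
        """Synthesise model weights for an unseen domain described by z_new (p,)."""
        z_train, alpha, gamma = model
        k = np.exp(-gamma * ((z_train - z_new) ** 2).sum(-1))        # (n,) kernel to the new domain
        return k @ alpha                                             # (q,) predicted weights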

Deep Discriminative Learning for Unsupervised Domain Adaptation [article]

Rohith AP and Ambedkar Dukkipati and Gaurav Pandey
2019 arXiv   pre-print
We perform additional experiments when the source data has fewer labeled examples, and also on the zero-shot domain adaptation task, where no target domain samples are used for training.  ...  We show that this simple approach to performing unsupervised domain adaptation is indeed quite powerful.  ...  We use the target domain samples only at test time. This is equivalent to zero-shot domain adaptation and is reported as 'Zero-shot' in Table 7.  ... 
arXiv:1811.07134v2 fatcat:dtgvbpluffeczcvk5sbbmaoehu

Self-Learning for Zero Shot Neural Machine Translation [article]

Surafel M. Lakew, Matteo Negri, Marco Turchi
2021 arXiv   pre-print
to optimize the initial model to the zero-shot pair, where the latter two constitute a self-learning cycle.  ...  This work proposes a novel zero-shot NMT modeling approach that learns without the now-standard assumption of a pivot language sharing parallel data with the zero-shot source and target languages.  ...  the S − T pair while learning the zero-shot U − T ).  ... 
arXiv:2103.05951v1 fatcat:ccpndv5rwbhj5e5jli2omwb7y4

Invariance Through Latent Alignment [article]

Takuma Yoneda, Ge Yang, Matthew R. Walter, Bradly Stadie
2022 arXiv   pre-print
ILA performs unsupervised adaptation at deployment-time by matching the distribution of latent features on the target domain to the agent's prior experience, without relying on paired data.  ...  Although simple, we show that this idea leads to surprising improvements on a variety of challenging adaptation scenarios, including changes in lighting conditions, the content in the scene, and camera  ...  The authors would also like to acknowledge the MIT SuperCloud and Lincoln Laboratory Supercomputing Center for providing high performance computing resources.  ... 
arXiv:2112.08526v3 fatcat:6ydpc2hgcja6xdtnaq5du5i5ru
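
ILA, as summarised above, adapts at deployment time by matching the distribution of latent features in the target domain to the agent's prior (source) experience, without paired data. A hedged stand-in for that alignment objective: fine-tune only the encoder so that the batch mean and variance of its latents match stored source statistics. Moment matching is an assumption here; the paper's actual alignment criterion may differ.

    import torch

    def latent_alignment_step(encoder, target_obs, src_mean, src_var, optimizer):
        """One adaptation step: pull target latent moments towards stored source statistics (sketch)."""
        z = encoder(target_obs)                                   # (batch, latent_dim)
        loss = ((z.mean(dim=0) - src_mean) ** 2).sum() \
             + ((z.var(dim=0, unbiased=False) - src_var) ** 2).sum()
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()                                          # only encoder parameters are updated
        return loss.item()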

Generalized Zero-Shot Domain Adaptation via Coupled Conditional Variational Autoencoders [article]

Qian Wang, Toby P. Breckon
2020 arXiv   pre-print
For this particular problem, neither conventional domain adaptation approaches nor zero-shot learning algorithms directly apply.  ...  In this paper, we formulate this particular domain adaptation problem within a generalized zero-shot learning framework by treating the labelled source domain samples as semantic representations for zero-shot  ...  In the scenario of image classification, the relation of the novel generalized zero-shot domain adaptation problem to traditional zero-shot learning and unsupervised/supervised domain adaptation  ... 
arXiv:2008.01214v1 fatcat:xok5crh7frdixax6pakbhcjot4
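
The coupled conditional VAE formulation above aligns class-conditional latent distributions across the two domains so that labelled source samples can act as semantic supervision for target classes. Shown below is one plausible coupling term only: a closed-form KL divergence between the diagonal Gaussian posteriors produced by the two domain encoders for samples of the same class. How it is weighted against the two VAE losses follows the paper, and the pairing of the (mu, logvar) tensors is an assumption.

    import torch

    def gaussian_kl(mu1, logvar1, mu2, logvar2):
        """KL( N(mu1, diag var1) || N(mu2, diag var2) ), summed over latent dims, averaged over the batch."""
        var1, var2 = logvar1.exp(), logvar2.exp()
        kl = 0.5 * (logvar2 - logvar1 + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)
        return kl.sum(dim=-1).mean()

    # usage sketch: for same-class batches encoded by the source- and target-domain encoders,
    #   loss = vae_loss_source + vae_loss_target + lambda_align * gaussian_kl(mu_s, logvar_s, mu_t, logvar_t)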

Unsupervised Domain Adaptation for Cross-Subject Few-Shot Neurological Symptom Detection [article]

Bingzhao Zhu, Mahsa Shoaran
2021 arXiv   pre-print
This paper introduces an unsupervised domain adaptation approach based on adversarial networks to enable few-shot, cross-subject epileptic seizure detection.  ...  Modern machine learning tools have shown promise in detecting symptoms of neurological disorders. However, current approaches typically train a unique classifier for each subject.  ...  We ran the proposed domain adaptation approach for 5 independent trials and reported the average performance (AUC scores) ± standard deviation. 3-shot learning on two patients (Study 004-2 and Study 029  ... 
arXiv:2103.00606v1 fatcat:6szov6yuifhcjmj7cfv4hjubum
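
The adversarial adaptation described above can be realised, in its most common form, with a gradient reversal layer plus a domain discriminator that tries to tell features of the source subjects from those of the new (target) subject. The sketch below shows that standard ingredient; the 64-dimensional feature size and the discriminator architecture are illustrative, not taken from the paper.

    import torch
    import torch.nn as nn

    class GradReverse(torch.autograd.Function):
        """Identity on the forward pass; flips (and scales) the gradient on the backward pass."""
        @staticmethod
        def forward(ctx, x, lambd):
            ctx.lambd = lambd
            return x.view_as(x)
        @staticmethod
        def backward(ctx, grad_output):
            return -ctx.lambd * grad_output, None

    def grad_reverse(x, lambd=1.0):
        return GradReverse.apply(x, lambd)

    # illustrative domain discriminator over 64-d features from a shared encoder
    domain_head = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 2))

    def domain_confusion_loss(features, domain_labels, lambd=1.0):
        """features: (n, 64); domain_labels: 0 = source subjects, 1 = target subject."""
        logits = domain_head(grad_reverse(features, lambd))
        return nn.functional.cross_entropy(logits, domain_labels)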

Nearly Zero-Shot Learning for Semantic Decoding in Spoken Dialogue Systems [article]

Lina M.Rojas-Barahona, Stefan Ultes, Pawel Budzianowski, Iñigo Casanueva, Milica Gasic, Bo-Hsiang Tseng, Steve Young
2018 arXiv   pre-print
The unsupervised tuning (i.e. the risk minimisation) improves the F-Measure when recognising nearly zero-shot data on the DSTC3 corpus.  ...  First, we learn features by using a deep learning architecture in which the weights for the unknown and known categories are jointly optimised.  ...  Recently, feature-based adaptation has been refined with unsupervised auto-encoders that learn features that can generalise between domains (Glorot et al., 2011; Zhou et al., 2016) .  ... 
arXiv:1806.05484v2 fatcat:sir4bp7fhvfqjbjxnkajvcxdti

GPL: Generative Pseudo Labeling for Unsupervised Domain Adaptation of Dense Retrieval [article]

Kexin Wang, Nandan Thakur, Nils Reimers, Iryna Gurevych
2022 arXiv   pre-print
In this paper, we propose the novel unsupervised domain adaptation method Generative Pseudo Labeling (GPL), which combines a query generator with pseudo labeling from a cross-encoder.  ...  We further investigate the role of six recent pre-training methods in the scenario of domain adaptation for retrieval tasks, of which only three yield improved results.  ...  Whether these pre-training approaches can be used for unsupervised domain adaptation of dense retrieval was so far unclear.  ... 
arXiv:2112.07577v3 fatcat:55n5p6tpencplduxb4ik3htssy
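
GPL, as summarised above, builds training data for a target-domain dense retriever by generating queries for target passages and labeling (query, positive, negative) triples with a cross-encoder; the bi-encoder is then trained so that its score margin matches the cross-encoder's (MarginMSE). The sketch below shows only the MarginMSE objective and, in comments, how the teacher margin might be obtained; the cross-encoder checkpoint named in the comment is an example, and the query-generation and negative-mining steps are omitted.

    import torch

    def margin_mse_loss(q_emb, pos_emb, neg_emb, teacher_margin):
        """Train the bi-encoder so its dot-product margin matches the cross-encoder's margin.

        q_emb, pos_emb, neg_emb : (n, d) query / positive-passage / negative-passage embeddings
        teacher_margin          : (n,) cross-encoder score(q, pos) - score(q, neg)
        """
        student_margin = (q_emb * pos_emb).sum(-1) - (q_emb * neg_emb).sum(-1)
        return torch.nn.functional.mse_loss(student_margin, teacher_margin)

    # pseudo-labeling sketch (teacher scores from an off-the-shelf cross-encoder):
    #   from sentence_transformers import CrossEncoder
    #   teacher = CrossEncoder("cross-encoder/ms-marco-MiniLM-L-6-v2")   # example checkpoint
    #   margin = teacher.predict([(q, p_pos)]) - teacher.predict([(q, p_neg)])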

Source Data-absent Unsupervised Domain Adaptation through Hypothesis Transfer and Labeling Transfer [article]

Jian Liang and Dapeng Hu and Yunbo Wang and Ran He and Jiashi Feng
2021 arXiv   pre-print
To effectively utilize the source model for adaptation, we propose a novel approach called Source HypOthesis Transfer (SHOT), which learns the feature extraction module for the target domain by fitting  ...  Unsupervised domain adaptation (UDA) aims to transfer knowledge from a related but different well-labeled source domain to a new unlabeled target domain.  ...  Grauman, “Geodesic flow kernel supervised semi-supervised learning,” in Proc. ICCV, 2019, pp. for unsupervised domain adaptation,” in Proc.  ... 
arXiv:2012.07297v3 fatcat:sofsnofvpjegnhzumy277rjo3a
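
SHOT, per the snippet above, freezes the source classifier head (the "hypothesis") and adapts only the target feature extractor; a central ingredient is an information-maximization objective that makes target predictions individually confident yet globally diverse. A minimal sketch of that objective follows; the pseudo-labeling term and the loss weighting used in the paper are omitted.

    import torch
    import torch.nn.functional as F

    def information_maximization_loss(logits, eps=1e-6):
        """Per-sample entropy minimization plus a diversity term on the mean prediction (sketch)."""
        p = F.softmax(logits, dim=1)                              # (n, num_classes)
        ent = -(p * torch.log(p + eps)).sum(dim=1).mean()         # individual entropy: minimize
        p_mean = p.mean(dim=0)
        div = (p_mean * torch.log(p_mean + eps)).sum()            # negative marginal entropy: minimize
        return ent + div

    # usage sketch: logits = frozen_source_classifier(feature_extractor(x_target));
    # only the feature extractor's parameters receive gradient updates.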
Showing results 1 — 15 out of 5,558 results