
Learning Classifiers for Domain Adaptation, Zero and Few-Shot Recognition Based on Learning Latent Semantic Parts [article]

Pengkai Zhu, Hanxiao Wang, Venkatesh Saligrama
2019 arXiv   pre-print
At test-time we freeze the encoder and only learn/adapt the classifier component to limited annotated labels in FSL; new semantic attributes in ZSL.  ...  In computer vision applications, such as domain adaptation (DA), few shot learning (FSL) and zero-shot learning (ZSL), we encounter new objects and environments, for which insufficient examples exist to  ...  In DA we have training samples from both source domain D s and target domain D t , where D t has no class label available during training.  ... 
arXiv:1901.09079v3 fatcat:synn6rzpkjaa3iiy2trkcih6gi
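
The record above describes freezing a pre-trained encoder at test time and adapting only the classifier to a handful of labels (FSL) or new attributes (ZSL). A minimal PyTorch sketch of that general freeze-the-encoder pattern; the layer sizes, class count, and optimizer settings are illustrative assumptions, not the paper's architecture:

```python
import torch
import torch.nn as nn

# Stand-ins for the paper's components; the feature sizes (2048 -> 512) and
# class count are assumptions for illustration only.
encoder = nn.Sequential(nn.Linear(2048, 512), nn.ReLU())
classifier = nn.Linear(512, 10)

# Freeze the encoder: only the classifier is updated at adaptation time.
for p in encoder.parameters():
    p.requires_grad_(False)
encoder.eval()

optimizer = torch.optim.Adam(classifier.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def adapt_step(x_few, y_few):
    """One adaptation step on a small batch of labeled target examples."""
    with torch.no_grad():              # encoder stays fixed
        feats = encoder(x_few)
    loss = loss_fn(classifier(feats), y_few)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```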

Simultaneous Deep Transfer Across Domains and Tasks [article]

Eric Tzeng, Judy Hoffman, Trevor Darrell, Kate Saenko
2015 arXiv   pre-print
We propose a new CNN architecture to exploit unlabeled and sparsely labeled target domain data.  ...  Fine-tuning deep models in a new domain can require a significant amount of labeled data, which for many applications is simply not available.  ...  Acknowledgements This work was supported by DARPA; AFRL; DoD MURI award N000141110688; NSF awards 113629, IIS-1427425, and IIS-1212798; and the Berkeley Vision and Learning Center.  ... 
arXiv:1510.02192v1 fatcat:75c3elan45c3rgtsakm3uca2ta

Simultaneous Deep Transfer Across Domains and Tasks

Eric Tzeng, Judy Hoffman, Trevor Darrell, Kate Saenko
2015 2015 IEEE International Conference on Computer Vision (ICCV)  
We propose a new CNN architecture to exploit unlabeled and sparsely labeled target domain data.  ...  Fine-tuning deep models in a new domain can require a significant amount of labeled data, which for many applications is simply not available.  ...  Acknowledgements This work was supported by DARPA; AFRL; DoD MURI award N000141110688; NSF awards 113629, IIS-1427425, and IIS-1212798; and the Berkeley Vision and Learning Center.  ... 
doi:10.1109/iccv.2015.463 dblp:conf/iccv/TzengHDS15 fatcat:uyllj3o26zenfml5lje6suctsm

Simultaneous Deep Transfer Across Domains and Tasks [chapter]

Judy Hoffman, Eric Tzeng, Trevor Darrell, Kate Saenko
2017 Advances in Computer Vision and Pattern Recognition  
We propose a new CNN architecture to exploit unlabeled and sparsely labeled target domain data.  ...  Fine-tuning deep models in a new domain can require a significant amount of labeled data, which for many applications is simply not available.  ...  Acknowledgements This work was supported by DARPA; AFRL; DoD MURI award N000141110688; NSF awards 113629, IIS-1427425, and IIS-1212798; and the Berkeley Vision and Learning Center.  ... 
doi:10.1007/978-3-319-58347-1_9 fatcat:nv6xcl5finbzbienksxptptksq
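
The three records above are versions of the same paper. The snippets say only that the CNN exploits unlabeled and sparsely labeled target data; one common way to do this, used here purely as an assumed illustration, is to pair the supervised task loss with a domain-confusion term that makes source and target features indistinguishable to a domain classifier:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative modules; the dimensions and heads are assumptions, not the
# paper's actual CNN architecture.
feature_net = nn.Sequential(nn.Linear(256, 128), nn.ReLU())
task_head = nn.Linear(128, 31)      # e.g. 31 object categories
domain_head = nn.Linear(128, 2)     # source vs. target

def adaptation_losses(x_src, y_src, x_tgt):
    f_src, f_tgt = feature_net(x_src), feature_net(x_tgt)
    f_all = torch.cat([f_src, f_tgt])

    # Supervised task loss on the labeled source batch (any sparsely labeled
    # target examples could be appended here as well).
    task_loss = F.cross_entropy(task_head(f_src), y_src)

    # Domain-classifier loss: trains the domain head to separate domains
    # (features detached so this term does not update the feature network).
    d_labels = torch.cat([torch.zeros(len(f_src)), torch.ones(len(f_tgt))]).long()
    domain_loss = F.cross_entropy(domain_head(f_all.detach()), d_labels)

    # Confusion loss: pushes the domain head's output toward uniform,
    # encouraging domain-invariant features.
    confusion = -F.log_softmax(domain_head(f_all), dim=1).mean()
    return task_loss, domain_loss, confusion
```

In this kind of setup the two adversarial terms are typically optimized alternately: `domain_loss` updates only the domain head, while `task_loss + confusion` updates the feature network and task head.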

A survey of transfer learning

Karl Weiss, Taghi M. Khoshgoftaar, DingDing Wang
2016 Journal of Big Data  
Therefore, there is a need to create a high-performance learner for a target domain trained from a related source domain. This is the motivation for transfer learning.  ...  Therefore, there is a need to create high-performance learners trained with more easily obtained data from different domains. This methodology is referred to as transfer learning.  ...  data, and unsupervised transfer learning as the case of having no labeled source and no labeled target domain data.  ... 
doi:10.1186/s40537-016-0043-6 fatcat:auxq6aafwfhgtjaofb5lf45v4u

A survey on heterogeneous transfer learning

Oscar Day, Taghi M. Khoshgoftaar
2017 Journal of Big Data  
in a target domain, which has little or no labeled target training data.  ...  Utilizing a labeled source, or auxiliary, domain for aiding a target task can greatly reduce the cost and effort of collecting sufficient training labels to create an effective model in the new target  ...  Competing interests The authors declare that they have no competing interests. Availability of data and materials Not applicable. Consent for publication Not applicable.  ... 
doi:10.1186/s40537-017-0089-0 fatcat:bpfjycwlkrawzdyyfv2ugle5cy

Domain Adaptation Extreme Learning Machines for Drift Compensation in E-Nose Systems

Lei Zhang, David Zhang
2015 IEEE Transactions on Instrumentation and Measurement  
This paper proposes a unified framework, referred to as Domain Adaptation Extreme Learning Machine (DAELM), which learns a robust classifier by leveraging a limited number of labeled data from target domain  ...  To our best knowledge, ELM with cross-domain learning capability has never been studied.  ...  and the dashed arrow denotes the labeled data from target domain for classifier learning.  ... 
doi:10.1109/tim.2014.2367775 fatcat:pg7ekbo4bna2hptog2kkolwusu
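
The DAELM record above learns a classifier from source data plus a limited number of labeled target samples. Below is a bare-bones extreme learning machine in NumPy (random hidden layer plus ridge-regression output weights); the handling of the few labeled target samples shown here is a crude stand-in, not the paper's DAELM objective:

```python
import numpy as np

def elm_fit(X, Y, n_hidden=200, reg=1e-2, seed=0):
    """Basic extreme learning machine: random hidden layer + ridge regression.

    X: (n, d) features, Y: (n, k) one-hot targets. This is a generic ELM,
    not the exact DAELM formulation in the paper.
    """
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)                                   # hidden layer
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ Y)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Crude stand-in for leveraging a few labeled target samples: up-weight them
# by repetition (the paper instead adds explicit target-domain terms to the
# learning objective).
# X_train = np.vstack([X_src, np.repeat(X_tgt_few, 10, axis=0)])
# Y_train = np.vstack([Y_src, np.repeat(Y_tgt_few, 10, axis=0)])
# W, b, beta = elm_fit(X_train, Y_train)
```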

Sampling for unsupervised domain adaptive object detection

Fatemeh Mirrashed, Vlad I. Morariu, Larry S. Davis
2013 2013 IEEE International Conference on Image Processing  
Motivated by traditional semisupervised learning algorithms that aim for better classification using both labeled and unlabeled data, we propose a variation of co-learning technique that automatically  ...  constructs a more balanced set of samples from the target domain.  ...  Often, however, we have sufficient labeled training data from single or multiple source domains but wish to learn a classifier which performs well on a target domain with a different distribution and no  ... 
doi:10.1109/icip.2013.6738677 dblp:conf/icip/MirrashedMD13 fatcat:ugunwmaqsjhhpj6z2wzgcte734
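
The snippet above highlights constructing a "more balanced set of samples from the target domain". A small sketch of class-balanced pseudo-label selection; only the sampling idea is shown, not the paper's co-learning procedure:

```python
import numpy as np

def balanced_pseudo_labels(probs, per_class=50):
    """Pick an equal number of the most confident target samples per class.

    probs: (n, k) softmax outputs of a source-trained model on unlabeled
    target data. Returns (indices, pseudo_labels). The per-class quota is an
    illustrative assumption.
    """
    preds, conf = probs.argmax(axis=1), probs.max(axis=1)
    idx, lbl = [], []
    for c in range(probs.shape[1]):
        cls_idx = np.where(preds == c)[0]
        top = cls_idx[np.argsort(conf[cls_idx])[::-1][:per_class]]
        idx.extend(top.tolist())
        lbl.extend([c] * len(top))
    return np.array(idx), np.array(lbl)
```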

One-Shot Adaptation of Supervised Deep Convolutional Models [article]

Judy Hoffman, Eric Tzeng, Jeff Donahue, Yangqing Jia, Kate Saenko, Trevor Darrell
2014 arXiv   pre-print
Furthermore, we propose several methods for adaptation with deep models that are able to operate with little (one example per category) or no labeled domain specific data.  ...  In general, training or fine-tuning a state-of-the-art deep model on a new domain requires a significant amount of data, which for many applications is simply not available.  ...  Background: Deep Domain Adaptation Approaches For our task we consider adapting between a large source domain and a target domain with few or no labeled examples.  ... 
arXiv:1312.6204v2 fatcat:z3zfoyzyobfpfn2cqg3raj36l4

Selecting pseudo supervision for unsupervised domain adaptive SAR target classification

Lingjun Zhao, Qishan He, Ding Ding, Siqian Zhang, Gangyao Kuang, Li Liu
2022 EURASIP Journal on Advances in Signal Processing  
In recent years, deep learning has brought significant progress for the problem of synthetic aperture radar (SAR) target classification.  ...  To address this problem, in this paper we propose an unsupervised domain adaptation method based on selective pseudo-labelling for SAR target classification.  ...  Acknowledgements The authors would like to thank the handing Associate Editor and the anonymous reviewers for their valuable comments and suggestions for this paper.  ... 
doi:10.1186/s13634-022-00906-y fatcat:evcsksx6wngt3gpvt3wwlps62m
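
The paper above selects pseudo supervision for unsupervised SAR target classification. A minimal confidence-threshold selection rule as an illustration; the threshold value and the rule itself are assumptions, and the paper's actual selection criterion may differ:

```python
import numpy as np

def select_pseudo_labels(probs, threshold=0.9):
    """Keep only target samples whose top class probability clears a
    confidence threshold; the rest remain unlabeled.

    probs: (n, k) class probabilities from the current model on unlabeled
    target data. Returns (selected indices, pseudo-labels).
    """
    conf = probs.max(axis=1)
    keep = np.where(conf >= threshold)[0]
    return keep, probs[keep].argmax(axis=1)
```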

Source Free Domain Adaptation with Image Translation [article]

Yunzhong Hou, Liang Zheng
2021 arXiv   pre-print
labels for training under realistic settings.  ...  Compared with directly classifying target images, higher accuracy is obtained with these style transferred images using the pre-trained model.  ...  As shown in Fig. 1 , aside from a pre-trained classifier on source domain, this setting allows no other cues (images and labels) from the source domain, and has an unlabeled target domain.  ... 
arXiv:2008.07514v2 fatcat:ktoa6xf74ffchmvfxhujc32db4

Adversarial Domain Adaptation Using Artificial Titles for Abstractive Title Generation

Francine Chen, Yan-Ying Chen
2019 Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics  
This paper examines techniques for adapting from a labeled source domain to an unlabeled target domain in the context of an encoder-decoder model for text generation.  ...  Evaluation on adapting to/from news articles and Stack Exchange posts indicates that the use of these techniques can boost performance for both unsupervised adaptation as well as fine-tuning with limited  ...  Domain-Adapted Title Generation Our goal is to improve performance when labeled data from one domain, the source, is used to train a model which is then applied to another domain with no or only limited  ... 
doi:10.18653/v1/p19-1211 dblp:conf/acl/ChenC19 fatcat:algknbuwlzck7igqcqec3hhvou
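
The record above uses adversarial training to adapt an encoder-decoder model across domains. A gradient-reversal layer is one standard way to implement the adversarial part; the sketch below shows that layer with assumed feature sizes and does not reproduce the paper's title-generation model:

```python
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; flips the gradient sign in backward.

    Placed between an encoder and a domain discriminator, it trains the
    encoder to produce features the discriminator cannot separate by domain.
    """
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None

def grad_reverse(x, lambd=1.0):
    return GradReverse.apply(x, lambd)

# Assumed components for the sketch: encoder features of size 256 and a
# small two-way (source vs. target) domain discriminator.
domain_discriminator = nn.Sequential(nn.Linear(256, 64), nn.ReLU(), nn.Linear(64, 2))

def domain_adversarial_loss(encoded, domain_labels):
    logits = domain_discriminator(grad_reverse(encoded))
    return nn.functional.cross_entropy(logits, domain_labels)
```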

Filling the Gap: Semi-Supervised Learning for Opinion Detection Across Domains

Ning Yu, Sandra Kübler
2011 Conference on Computational Natural Language Learning  
For domain transfer, we show that self-training gains an absolute improvement in labeling accuracy for blog data of 16% over the supervised approach with target domain training data.  ...  We investigate the use of Semi-Supervised Learning (SSL) in opinion detection both in sparse data situations and for domain adaptation.  ...  Therefore, when a limited set of labeled data is available in the target domain, using SSL with unlabeled data is expected to achieve an improvement over supervised learning.  ... 
dblp:conf/conll/YuK11 fatcat:fzcuv7o7qfabxddxe73bszqfqy
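
The snippet above reports gains from self-training for cross-domain opinion detection. A generic self-training loop with scikit-learn; the classifier, confidence threshold, and round count are illustrative assumptions, not the paper's exact SSL configuration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def self_train(X_labeled, y_labeled, X_unlabeled, rounds=5, threshold=0.8):
    """Plain self-training: train, pseudo-label the most confident unlabeled
    examples, add them to the training set, and repeat."""
    X_train, y_train, pool = X_labeled, y_labeled, X_unlabeled
    clf = LogisticRegression(max_iter=1000)
    for _ in range(rounds):
        clf.fit(X_train, y_train)
        if len(pool) == 0:
            break
        probs = clf.predict_proba(pool)
        keep = probs.max(axis=1) >= threshold
        if not keep.any():
            break
        # Add confidently pseudo-labeled samples to the training pool.
        X_train = np.vstack([X_train, pool[keep]])
        y_train = np.concatenate([y_train, clf.classes_[probs[keep].argmax(axis=1)]])
        pool = pool[~keep]
    return clf
```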

A framework for classifier adaptation and its applications in concept detection

Jun Yang, Alexander G. Hauptmann
2008 Proceeding of the 1st ACM international conference on Multimedia information retrieval - MIR '08  
It directly modifies the decision function of an existing classifier of any type into a classifier for a new domain, based on limited labeled data in the new domain and no "old data", which makes it an  ...  We then extend this framework to adapt multiple classifiers into one classifier, with the weights of existing classifiers learned automatically to reflect their utility.  ...  This work was supported in part by the National Science Foundation (NSF) under Grants No. IIS-0535056 and CNS-0751185.  ... 
doi:10.1145/1460096.1460171 dblp:conf/mir/YangH08 fatcat:jpdxiq473fhbffdu4avicwbahy
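
The abstract above modifies the decision function of an existing classifier using only limited labeled data from the new domain and no old data. A sketch of that function-level adaptation pattern, f_new(x) = f_old(x) + delta(x), with a ridge-regression correction term fit to binary labels encoded as +1/-1 scores; the choice of correction model is an assumption, not the paper's specific formulation:

```python
import numpy as np
from sklearn.linear_model import Ridge

class AdaptedClassifier:
    """f_new(x) = f_old(x) + delta(x): adapt a black-box decision function
    with a correction term fit on a few labeled new-domain examples.

    f_old is only queried for its scores, so no old training data is needed.
    """
    def __init__(self, f_old, reg=1.0):
        self.f_old = f_old                 # callable: X -> decision scores
        self.delta = Ridge(alpha=reg)      # illustrative correction model

    def fit(self, X_new, y_new):
        # y_new: binary labels encoded as +1 / -1 scores.
        residual = y_new - self.f_old(X_new)   # what the old classifier misses
        self.delta.fit(X_new, residual)
        return self

    def decision_function(self, X):
        return self.f_old(X) + self.delta.predict(X)
```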

Active Sentiment Domain Adaptation

Fangzhao Wu, Yongfeng Huang, Jun Yan
2017 Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)  
Instead of the source domain sentiment classifiers, our approach adapts the general-purpose sentiment lexicons to target domain with the help of a small number of labeled samples which are selected and  ...  A unified model is proposed to fuse different types of sentiment information and train sentiment classifier for target domain.  ...  Thus, sentiment domain adaptation, which transfers the sentiment classifier trained in a source domain with sufficient labeled data to a target domain with no or scarce labeled data, has been widely studied  ... 
doi:10.18653/v1/p17-1156 dblp:conf/acl/WuHY17 fatcat:uhezbtujcnearcashulbwcpf3i
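
The approach above labels only a small number of selected target samples. A margin-based uncertainty-sampling rule is one simple selection strategy, shown here as an assumed illustration; the paper's actual active-learning criterion may differ:

```python
import numpy as np

def select_for_annotation(probs, budget=20):
    """Margin-based uncertainty sampling: return indices of the unlabeled
    target samples the current model is least sure about.

    probs: (n, k) class probabilities on unlabeled target data. The margin
    criterion and annotation budget are illustrative assumptions.
    """
    sorted_p = np.sort(probs, axis=1)
    margin = sorted_p[:, -1] - sorted_p[:, -2]   # small margin = uncertain
    return np.argsort(margin)[:budget]
```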
Showing results 1 — 15 out of 184,359 results