32,835 Hits in 4.7 sec

On handling negative transfer and imbalanced distributions in multiple source transfer learning

Liang Ge, Jing Gao, Hung Ngo, Kang Li, Aidong Zhang
2014 Statistical Analysis and Data Mining
As there are usually multiple relevant domains where knowledge can be transferred, multiple source transfer learning (MSTL) has recently attracted much attention.  ...  In this paper, we propose a novel two-phase framework to effectively transfer knowledge from multiple sources even when there exist irrelevant sources and imbalanced class distributions.  ...  To achieve this, we first propose a Supervised Local Weight (SLW) scheme based on the following assumption: • Supervised Manifold Assumption: If predictions from a particular source domain are smooth and  ... 
doi:10.1002/sam.11217 fatcat:jm3s4q2ocralln542l3jez6lry

Learning from Multiple Sources via Multiple Domain Relationship

Zhen Liu, Junan Yang, Hui Liu, Jian Liu
2016 IEICE Transactions on Information and Systems
Then, the knowledge of the sources was transferred to the target based on the smoothness assumption, which enforced that the target classifier shares similar decision values with the relevant source classifiers  ...  Experimental results demonstrate that the proposed method can more effectively enhance the learning performance. key words: transfer learning, multiple source transfer, domain similarity, manifold assumption  ...  Then, the knowledge of the sources is transferred to the target based on the smoothness assumption.  ... 
doi:10.1587/transinf.2016edl8008 fatcat:zibjkvthd5hthjdcefqhaw77ti
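The smoothness assumption described in the entry above (the target classifier should share similar decision values with relevant source classifiers) can be illustrated as a penalty term. This is only a hypothetical sketch of the general idea, not the paper's actual formulation; the function name and the relevance weights are illustrative.

```python
# Illustrative sketch: penalize target decision values that diverge from
# the decision values of relevant source classifiers, weighted by an
# (assumed, externally estimated) source-target relevance weight.

def smoothness_penalty(target_scores, source_scores_list, relevance_weights):
    """Sum over sources of weighted squared gaps between target and
    source decision values on the same examples."""
    penalty = 0.0
    for scores, w in zip(source_scores_list, relevance_weights):
        penalty += w * sum((t - s) ** 2 for t, s in zip(target_scores, scores))
    return penalty
```

A training objective would add this penalty to the target's empirical loss, so that irrelevant sources (low weight) barely influence the target classifier.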

Transfer Adaptation Learning: A Decade Survey [article]

Lei Zhang, Xinbo Gao
2020 arXiv   pre-print
A research problem is characterized as transfer adaptation learning (TAL) when it needs knowledge correspondence between different moments/domains.  ...  Conventional machine learning aims to find a model with the minimum expected risk on test data by minimizing the regularized empirical risk on the training data, which, however, supposes that the training  ...  ACKNOWLEDGMENT The author would like to thank the pioneer researchers in transfer learning, domain adaptation and other related fields. The author would also like to thank Dr. Mingsheng Long and Dr.  ... 
arXiv:1903.04687v2 fatcat:wurprqieffalnnp6isfkhh5y5i

Learning Efficient Vision Transformers via Fine-Grained Manifold Distillation [article]

Zhiwei Hao, Jianyuan Guo, Ding Jia, Kai Han, Yehui Tang, Chao Zhang, Han Hu, Yunhe Wang
2022 arXiv   pre-print
Knowledge distillation is a widely used paradigm for compressing cumbersome architectures via transferring information to a compact student.  ...  Transfer learning results on other classification benchmarks and downstream vision tasks also demonstrate the superiority of our method over the state-of-the-art algorithms.  ...  This design is based on the assumption that a larger model can learn more knowledge than a smaller one, and models of the same family share the same knowledge pattern, i.e., a student can learn most  ... 
arXiv:2107.01378v4 fatcat:g47oykhjtbbvvhrdqsmk4tgzp4
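The teacher-to-student information transfer mentioned above is commonly implemented via a soft-target loss: cross-entropy between temperature-softened teacher and student distributions. The sketch below shows that standard ingredient only, under the assumption of plain logit distillation; it does not reproduce the paper's fine-grained manifold distillation.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-softened softmax; larger T flattens the distribution."""
    exps = [math.exp(z / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """Cross-entropy between softened teacher (target) and student
    (prediction) distributions, the classic soft-target KD term."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))
```

In practice this term is mixed with the ordinary hard-label loss, and the temperature is a tuned hyperparameter.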

Relative Transfer Function Inverse Regression from Low Dimensional Manifold [article]

Ziteng Wang, Emmanuel Vincent, Yonghong Yan
2017 arXiv   pre-print
In room acoustic environments, the Relative Transfer Functions (RTFs) are controlled by few underlying modes of variability. Accordingly, they are confined to a low-dimensional manifold.  ...  The experiments show promising results: the model achieves lower prediction error of the RTF than the free field assumption.  ...  Based on a local linearity assumption on the manifold, the method of Probabilistic Piecewise Affine Mapping (PPAM) was proposed for a 2D sound localization task.  ... 
arXiv:1710.09091v1 fatcat:gkp22f5r5fflvnndrw5iy2riwi

Alignment-based transfer learning for robot models

Botond Bocsi, Lehel Csato, Jan Peters
2013 International Joint Conference on Neural Networks (IJCNN)
Most learning approaches are formulated in a supervised learning framework and are based on clearly defined training sets.  ...  Experimental results indicate that task transfer between different robot architectures is a sound concept.  ...  We classify transfer learning methods based on the principles with which they transfer knowledge between the tasks.  ... 
doi:10.1109/ijcnn.2013.6706721 dblp:conf/ijcnn/BocsiCP13 fatcat:3k3hbc6mkva3dc2fms3w5rnjwe

Transfer Spectral Clustering [chapter]

Wenhao Jiang, Fu-lai Chung
2012 Lecture Notes in Computer Science  
Despite its superior performance, spectral clustering has not yet been incorporated with knowledge transfer or transfer learning.  ...  Furthermore, it makes use of co-clustering to achieve and control the knowledge transfer among tasks.  ...  Based on this assumption, a co-clustering objective is proposed to design a new spectral clustering algorithm called transfer spectral clustering (TSC).  ... 
doi:10.1007/978-3-642-33486-3_50 fatcat:j3neicqwxbebta7rfnghwvox6m

Methodologies for Cross-Domain Data Fusion: An Overview

Yu Zheng
2015 IEEE Transactions on Big Data  
The last category of data fusion methods is further divided into four groups: multi-view learning-based, similarity-based, probabilistic dependency-based, and transfer learning-based methods.  ...  These methods focus on knowledge fusion rather than schema mapping and data merging, significantly distinguishing between cross-domain data fusion and traditional data fusion studied in the database community  ...  Can we transfer the knowledge learned from multiple datasets of Beijing to another city?  ... 
doi:10.1109/tbdata.2015.2465959 fatcat:flm37ozmhzcrfbrzeuagxm4l6a

Geometric Regularization of Local Activations for Knowledge Transfer in Convolutional Neural Networks

Ilias Theodorakopoulos, Foteini Fotopoulou, George Economou
2021 Information  
in the features' dimensionality, thus enabling knowledge transfer between different architectures.  ...  Experimental evidence demonstrates that the proposed technique is effective in different settings, including knowledge-transfer to smaller models, transfer between different deep architectures and harnessing  ...  obtained from the official repository of each respective dataset and are available in https://www.cs.toronto.edu/~kriz/cifar.html (CIFAR10/100) and http://ufldl.stanford.edu/housenumbers (SVHN) (accessed on  ... 
doi:10.3390/info12080333 fatcat:tx257qkcrjbf5kocqzzbybkrm4

Subdomain Adaptation with Manifolds Discrepancy Alignment [article]

Pengfei Wei, Yiping Ke, Xinghua Qu, Tze-Yun Leong
2020 arXiv   pre-print
Reducing domain divergence is a key step in transfer learning problems. Existing works focus on the minimization of global domain divergence.  ...  We then propose a general framework, called Transfer with Manifolds Discrepancy Alignment (TMDA), to couple the discovery of data manifolds with the minimization of M3D.  ...  This is different from the idea of [19] that aims to transfer multiple manifolds information, although both of the two works are based on multiple manifolds assumption.  ... 
arXiv:2005.03229v1 fatcat:5ddk2qeojvhmdk5q3e7bdixwmm

Knowledge transfer via multiple model local structure mapping

Jing Gao, Wei Fan, Jing Jiang, Jiawei Han
2008 Proceedings of the 14th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD '08)
The effectiveness of knowledge transfer using classification algorithms depends on the difference between the distribution that generates the training examples and the one from which test examples are  ...  Importantly, different from many previously proposed methods, none of the base learning methods is required to be specifically designed for transfer learning.  ...  The weight approximation is based on the clustering assumption which requires that the manifold structure of the data is related to the conditional probability P (y|x).  ... 
doi:10.1145/1401890.1401928 dblp:conf/kdd/GaoFJH08 fatcat:dh3yxawvlrenbl3ixskculj3r4
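The local-structure idea in the entry above (weighting base models per test example rather than globally) can be sketched as follows. The weighting functions here are stand-ins; in the paper they are approximated from local clustering structure, which this illustrative fragment does not implement.

```python
# Hypothetical sketch of a locally weighted ensemble: each base model's
# output is weighted by a per-example weight (assumed given here; in
# practice derived from how well the model's decisions match the local
# neighborhood structure around x), then normalized.

def locally_weighted_prediction(x, models, weight_fns):
    """Combine base model outputs with example-dependent weights."""
    weights = [wf(x) for wf in weight_fns]
    z = sum(weights) or 1.0  # guard against an all-zero weight vector
    return sum(w * m(x) for w, m in zip(weights, models)) / z
```

The key contrast with a globally weighted ensemble is that `weight_fns` depend on `x`, so a model can dominate in regions where its decision boundary agrees with the local data structure and be ignored elsewhere.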

Cross-database age estimation based on transfer learning

Ya Su, Yun Fu, Qi Tian, Xinbo Gao
2010 IEEE International Conference on Acoustics, Speech and Signal Processing
The proposed framework transfers the knowledge gained from training samples to the target data and improves the performance in cross-database scenarios.  ...  Experimental results for age estimation tasks on different datasets demonstrate the effectiveness and robustness of our proposed framework.  ...  [2] constructed certain manifolds to highlight the aging patterns.  ... 
doi:10.1109/icassp.2010.5495414 dblp:conf/icassp/SuFTG10 fatcat:v33zga565jbblogps7uaewsoae

Overview of Computer Vision Supervised Learning Techniques for Low-Data Training

Alexandru Burlacu
2020 Zenodo  
The paper starts by describing various methods to ease the need for a big labeled dataset, giving some background on supervised, weakly-supervised and then self-supervised learning in general, and in  ...  The work was approved at the International Conference on Electronics, Communications and Computing, ECCO-2019.  ...  The assumption that data lie approximately on a manifold of much lower dimension than that of the input space is called the manifold assumption and is widespread in machine learning in general  ... 
doi:10.5281/zenodo.4298709 fatcat:6g4yindodzephb72ryx34re4rq
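The manifold assumption defined in the snippet above is often operationalized as graph Laplacian smoothing: build a neighborhood graph over the data and penalize predictions that differ across edges. This is a generic textbook sketch, not anything from the surveyed paper; the graph is assumed given.

```python
# Illustrative graph-smoothness penalty: small when predictions vary
# smoothly along the neighborhood graph that approximates the data
# manifold, large when neighbors receive very different predictions.

def laplacian_penalty(values, edges):
    """Sum over graph edges (i, j) of squared prediction differences."""
    return sum((values[i] - values[j]) ** 2 for i, j in edges)
```

Semi-supervised methods typically add this penalty to a supervised loss so that unlabeled points, through the graph, constrain the classifier between labeled ones.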

Multi-Domain Transfer Component Analysis for Domain Generalization

Thomas Grubinger, Adriana Birlutiu, Holger Schöner, Thomas Natschläger, Tom Heskes
2017 Neural Processing Letters  
On the climate control in residential building datasets, the Multi-SSTCA version using the manifold information (locality preserving) performed better than the Multi-SSTCA without the manifold information  ...  Conclusions In this paper we presented an extension of TCA to multiple domains and successfully applied it for domain generalization.  ...  However, this assumption is often violated when data originates from multiple domains.  ... 
doi:10.1007/s11063-017-9612-8 fatcat:6ptnzbx24rgppm5q5z3tzzzwpy
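Transfer Component Analysis and its multi-domain extensions reduce a distribution-distance statistic between domains after projection. As a drastically simplified, hypothetical illustration, the 1-D linear case of that statistic is just the squared distance between sample means; the actual methods work with kernel embeddings and learned projections, which this fragment omits.

```python
# Toy 1-D version of the mean-discrepancy statistic that TCA-style
# methods shrink: squared distance between the sample means of the
# source and target domains. Purely illustrative.

def linear_mmd(xs, ys):
    """Squared difference of sample means of two 1-D samples."""
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    return (mx - my) ** 2
```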

Unsupervised Cross-Domain Transfer in Policy Gradient Reinforcement Learning via Manifold Alignment

Haitham Bou Ammar, Eric Eaton, Paul Ruvolo, Matthew Taylor
2015 Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence and the Twenty-Eighth Innovative Applications of Artificial Intelligence Conference
In the case of multiple task domains, these algorithms require an inter-task mapping to facilitate knowledge transfer across domains.  ...  Transfer learning methods tackle this problem by reusing knowledge gleaned from solving other related tasks.  ...  ., tabula rasa) or via human demonstrations, we instead use transfer learning (TL) to initialize the policy for a new target domain based on knowledge from one or more source tasks.  ... 
doi:10.1609/aaai.v29i1.9631 fatcat:to4zghll7jeotk7ssrvx5523wi