32,358 Hits in 4.4 sec

Unsupervised Cross-Dataset Transfer Learning for Person Re-identification

Peixi Peng, Tao Xiang, Yaowei Wang, Massimiliano Pontil, Shaogang Gong, Tiejun Huang, Yonghong Tian
2016 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)  
Specifically, we present a multi-task dictionary learning method which is able to learn a dataset-shared but target-data-biased representation.  ...  To overcome this limitation, we develop a novel cross-dataset transfer learning approach to learn a discriminative representation.  ...  There are few multi-task learning methods, or unsupervised transfer learning methods in general, available for the unsupervised setting.  ... 
doi:10.1109/cvpr.2016.146 dblp:conf/cvpr/PengXWPGHT16 fatcat:u56wbkzymve27l4mmcepzcglh4

Multi-task Dictionary Learning based Convolutional Neural Network for Computer aided Diagnosis with Longitudinal Images [article]

Jie Zhang, Qingyang Li, Richard J. Caselli, Jieping Ye, Yalin Wang
2017 arXiv   pre-print
Then, we propose a novel unsupervised learning method, termed Multi-task Stochastic Coordinate Coding (MSCC), for learning different tasks by using shared and individual dictionaries and generating the  ...  To reach this goal, we develop a CNN-based deep-learning multi-task dictionary learning framework to address the above challenges.  ...  For the multi-task methods, we observed that MSCC-R has better performance than L21 and cFSGL. Compared with single-task methods, we noticed that the dictionary learning methods have better performance.  ... 
arXiv:1709.00042v1 fatcat:sciiul57vfey7k6mcyn7ilmb3i

Using Task Descriptions in Lifelong Machine Learning for Improved Performance and Zero-Shot Transfer [article]

David Isele, Mohammad Rostami, Eric Eaton
2017 arXiv   pre-print
To reduce this burden, we develop a lifelong learning method based on coupled dictionary learning that utilizes high-level task descriptions to model the inter-task relationships.  ...  Given only the descriptor for a new task, the lifelong learner is also able to accurately predict a model for the new task through zero-shot learning using the coupled dictionary, eliminating the need  ...  Introduction Transfer learning (TL) and multi-task learning (MTL) methods reduce the amount of experience needed to train individual task models by reusing knowledge from other related tasks.  ... 
arXiv:1710.03850v1 fatcat:y42nqcdh4zah5csz5fmp2ylkpe
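The zero-shot mechanism this snippet describes couples two dictionaries through a shared sparse code: one dictionary over task descriptors and one over task-model parameters, so a code recovered from a new task's descriptor alone yields a predicted model with no training data for that task. Below is a hypothetical illustration of that idea, not the authors' implementation: the dictionaries are random synthetic matrices, the sparse coding step uses plain ISTA, and all dimensions and variable names are made up.

```python
import numpy as np

rng = np.random.default_rng(1)

def soft_threshold(x, lam):
    # Proximal operator of the l1 norm.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def sparse_code(y, D, lam=0.05, n_iter=200):
    # ISTA for min_s 0.5*||y - D s||^2 + lam*||s||_1.
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    s = np.zeros(D.shape[1])
    for _ in range(n_iter):
        s = soft_threshold(s - D.T @ (D @ s - y) / L, lam / L)
    return s

# Two dictionaries sharing one sparse code: one maps codes to task
# descriptors, the other maps the same codes to model parameters.
k, d_desc, d_model = 8, 5, 12              # illustrative dimensions
D_desc = rng.standard_normal((d_desc, k))
D_model = rng.standard_normal((d_model, k))

# Ground-truth sparse code of an unseen task (only for generating data).
s_true = np.zeros(k)
s_true[[1, 4]] = [1.0, -0.5]
descriptor = D_desc @ s_true               # the only observation at test time

# Zero-shot step: recover a code from the descriptor, then predict the model.
s_hat = sparse_code(descriptor, D_desc)
theta_hat = D_model @ s_hat                # predicted task model, no task data
```

Both dictionaries would normally be learned jointly from training tasks that have both descriptors and fitted models; the sketch only shows the prediction step once such coupled dictionaries exist.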

Multi-Task Regularization with Covariance Dictionary for Linear Classifiers [article]

Fanyi Xiao, Ruikun Luo, Zhiding Yu
2013 arXiv   pre-print
D-SVM uses a dictionary of parameter covariance shared by all tasks to do multi-task knowledge transfer among different tasks.  ...  In this paper we propose a multi-task linear classifier learning problem called D-SVM (Dictionary SVM).  ...  It is possible to use a more general covariance matrix to model the learning problem as long as it is positive semi-definite, which we will explore in the future.  ... 
arXiv:1310.5393v1 fatcat:vkow3icrsvfbnbavw4xwwpeayu

Multi-task additive models with shared transfer functions based on dictionary learning [article]

Alhussein Fawzi, Mathieu Sinn, Pascal Frossard
2015 arXiv   pre-print
We establish a connection with sparse dictionary learning and propose a new efficient fitting algorithm which alternates between sparse coding and transfer function updates.  ...  We introduce a novel multi-task learning approach which provides a corpus of accurate and interpretable additive models for a large number of related forecasting tasks.  ...  Our algorithm for solving the multi-task additive model learning problem uses an intrinsic connection with sparse dictionary learning [4] , [5] , [6] .  ... 
arXiv:1505.04966v1 fatcat:rljvuv45lrcobcmubtf3v2cmge
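The fitting algorithm in this snippet alternates between sparse coding and transfer function (dictionary) updates. The following is a minimal generic sketch of such an alternating scheme, not the authors' algorithm: it uses ISTA for the sparse coding step and a least-squares update with column normalization for the dictionary step, on synthetic data; every dimension and variable name is illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def soft_threshold(x, lam):
    # Proximal operator of the l1 norm.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def sparse_code(Y, D, lam=0.1, n_iter=100):
    # ISTA for min_A 0.5*||Y - D A||_F^2 + lam*||A||_1, D fixed.
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    A = np.zeros((D.shape[1], Y.shape[1]))
    for _ in range(n_iter):
        A = soft_threshold(A - (D.T @ (D @ A - Y)) / L, lam / L)
    return A

def update_dictionary(Y, A, eps=1e-8):
    # Least-squares dictionary update with unit-norm columns, A fixed.
    D = Y @ A.T @ np.linalg.pinv(A @ A.T + eps * np.eye(A.shape[0]))
    return D / np.maximum(np.linalg.norm(D, axis=0, keepdims=True), eps)

# Tiny synthetic example: 20-dim signals, 30 atoms, 50 training samples.
Y = rng.standard_normal((20, 50))
D = rng.standard_normal((20, 30))
D /= np.linalg.norm(D, axis=0, keepdims=True)

for _ in range(20):                        # alternate the two sub-problems
    A = sparse_code(Y, D)
    D = update_dictionary(Y, A)

recon_err = np.linalg.norm(Y - D @ A) / np.linalg.norm(Y)
```

Each alternation solves one convex sub-problem while holding the other variable fixed, which is the standard structure the snippet refers to; in the paper's setting the dictionary atoms play the role of shared transfer functions.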

Multitask Additive Models With Shared Transfer Functions Based on Dictionary Learning

Alhussein Fawzi, Mathieu Sinn, Pascal Frossard
2017 IEEE Transactions on Signal Processing  
We establish a connection with sparse dictionary learning and propose a new efficient fitting algorithm which alternates between sparse coding and transfer function updates.  ...  We introduce a novel multi-task learning approach which provides a corpus of accurate and interpretable additive models for a large number of related forecasting tasks.  ...  Additive models form a widely popular class of regression  ... 
doi:10.1109/tsp.2016.2634546 fatcat:fjwzivb32rh5zkrmz37xq3kavu

A Transfer Model Based on Supervised Multi-Layer Dictionary Learning for Brain Tumor MRI Image Recognition

Yi Gu, Kang Li
2021 Frontiers in Neuroscience  
With the help of the knowledge learned from related domains, the goal of this model is to solve the task of transfer learning where the target domain has only a small number of labeled samples.  ...  To solve this problem, we propose a transfer model based on supervised multi-layer dictionary learning (TSMDL) for brain tumor MRI image recognition in this paper.  ...  Multi-Layer Dictionary Learning With the development of deep learning, researchers have found that the deeper the structure of a neural network, the better and more accurate the representation.  ... 
doi:10.3389/fnins.2021.687496 pmid:34122003 pmcid:PMC8193061 fatcat:z3njbccagnc4nhzfca3glljpsa

Using Task Descriptions in Lifelong Machine Learning for Improved Performance and Zero-Shot Transfer

Mohammad Rostami, David Isele, Eric Eaton
2020 The Journal of Artificial Intelligence Research  
To reduce this burden, we develop a lifelong learning method based on coupled dictionary learning that utilizes high-level task descriptions to model inter-task relationships.  ...  Given only the descriptor for a new task, the lifelong learner is also able to accurately predict a model for the new task through zero-shot learning using the coupled dictionary, eliminating the need  ...  Acknowledgments This research was supported by ONR grant #N00014-11-1-0139, AFRL grant #FA8750-14-1-0069, AFRL grant #FA8750-16-1-0109, and the DARPA Lifelong Learning Machines program under grant #FA8750  ... 
doi:10.1613/jair.1.11304 fatcat:gjbed6fp5jgaxpusimdxutanwi

Cross-lingual Transfer for Text Classification with Dictionary-based Heterogeneous Graph [article]

Nuttapong Chairatanakul, Noppayut Sriwatanasakdi, Nontawat Charoenphakdee, Xin Liu, Tsuyoshi Murata
2021 arXiv   pre-print
First, we construct a dictionary-based heterogeneous graph (DHG) from bilingual dictionaries. This opens the possibility to use graph neural networks for cross-lingual transfer.  ...  In cross-lingual text classification, it is required that task-specific training data in high-resource source languages are available, where the task is identical to that of a low-resource target language  ...  Related Work Transfer learning -The goal of transfer learning is to solve a target task with limited target data by incorporating source knowledge from other domains (Pan and Yang, 2009; Ruder, 2019)  ... 
arXiv:2109.04400v2 fatcat:vtugtbf2ubh6jdavktnqbjyoey

Domain Transfer Multi-Instance Dictionary Learning [article]

Ke Wang, Jiayong Liu, Daniel González
2016 arXiv   pre-print
In this paper, we investigate the domain transfer learning problem with multi-instance data.  ...  The adaptive function is a linear function based on a domain transfer multi-instance dictionary.  ...  Zhang and Si [40] proposed the problem of domain transfer learning for multi-instance data, and a novel method to solve it within the framework of multi-task learning.  ... 
arXiv:1605.08397v1 fatcat:z3axtjbtgrhs3mheewtwokq45q

Transfer Knowledge between Cities

Ying Wei, Yu Zheng, Qiang Yang
2016 Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining - KDD '16  
We evaluate the proposed method with a case study of air quality prediction.  ...  FLORAL learns semantically related dictionaries for multiple modalities from a source domain, and simultaneously transfers the dictionaries and labelled instances from the source into a target domain.  ...  Two strands of research, i.e., multi-task multi-view learning and multi-view transfer learning, enable knowledge transfer between domains with multimodal data.  ... 
doi:10.1145/2939672.2939830 dblp:conf/kdd/WeiZ016 fatcat:ax7lscs4jvgwraactqpi3pw7ci

Event Oriented Dictionary Learning for Complex Event Detection

Yan Yan, Yi Yang, Deyu Meng, Gaowen Liu, Wei Tong, Alexander G. Hauptmann, Nicu Sebe
2015 IEEE Transactions on Image Processing  
Complex event detection is a retrieval task with the goal of finding videos of a particular event in a large-scale unconstrained Internet video archive, given example videos and text descriptions  ...  Toward this goal, we leverage training images (frames) of selected concepts from the semantic indexing dataset with a pool of 346 concepts, into a novel supervised multitask ℓp-norm dictionary  ...  Multi-Task Learning Multi-task learning [22] methods aim to simultaneously learn classification/regression models for a set of related tasks.  ... 
doi:10.1109/tip.2015.2413294 pmid:25794390 fatcat:wuhftyflfjddhi3zflntqbwxnm

Learning to Represent Bilingual Dictionaries

Muhao Chen, Yingtao Tian, Haochen Chen, Kai-Wei Chang, Steven Skiena, Carlo Zaniolo
2019 Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL)  
To enhance the learning process on limited resources, our model adopts several critical learning strategies, including multi-task learning on different bridges of languages, and joint learning of the dictionary  ...  model with a bilingual word embedding model.  ...  BilDRL variants with multi-task or joint learning use both dictionaries of the same language pair.  ... 
doi:10.18653/v1/k19-1015 dblp:conf/conll/ChenTCCSZ19 fatcat:acovrv3wuzbtnnifgysnrkc63q

Lifelong Metric Learning [article]

Gan Sun, Yang Cong, Ji Liu, Xiaowei Xu
2017 arXiv   pre-print
More specifically, the proposed LML maintains a common subspace for all learned metrics, named the lifelong dictionary, transfers knowledge from the common subspace to each new metric task with task-specific  ...  In this paper, we consider the lifelong learning problem to mimic "human learning", i.e., endowing the learned metric with a new capability for a new task from new online samples and incorporating previous experiences  ...  As the new (t+1)-th task arrives, LML transfers knowledge through the shared base of the lifelong dictionary to learn the new metric model with sparsity regularization, and refines the lifelong dictionary with  ... 
arXiv:1705.01209v2 fatcat:6ecgzorfpbar3hj32kmy4dxvcu

Joint Semantic and Latent Attribute Modelling for Cross-Class Transfer Learning

Peixi Peng, Yonghong Tian, Tao Xiang, Yaowei Wang, Massimiliano Pontil, Tiejun Huang
2018 IEEE Transactions on Pattern Analysis and Machine Intelligence  
Such a joint attribute learning model is then extended by following a multi-task transfer learning framework to address a more challenging unsupervised domain adaptation problem, where annotations are  ...  A number of vision problems such as zero-shot learning and person re-identification can be considered as cross-class transfer learning problems.  ...  Among the existing cross-dataset transfer learning works, [23] adopted an SVM multi-kernel learning transfer strategy, and both [24] and [25] employed multi-task metric learning models.  ... 
doi:10.1109/tpami.2017.2723882 pmid:28692964 fatcat:3di4odmubrgablmnmy6yprjm5a
Showing results 1 — 15 out of 32,358 results