115,249 Hits in 2.1 sec

Multi-task Sparse Structure Learning

Andre R. Goncalves, Puja Das, Soumyadeep Chatterjee, Vidyashankar Sivakumar, Fernando J. Von Zuben, Arindam Banerjee
2014 Proceedings of the 23rd ACM International Conference on Information and Knowledge Management - CIKM '14  
Multi-task learning (MTL) aims to improve generalization performance by learning multiple related tasks simultaneously.  ...  The task relationship structure learning component builds on recent advances in structure learning of Gaussian graphical models based on sparse estimators of the precision (inverse covariance) matrix.  ...  Multi-task Sparse Structure Learning In this section we describe our multi-task Sparse Structure Learning (MSSL) method.  ... 
doi:10.1145/2661829.2662091 dblp:conf/cikm/GoncalvesDCSZB14 fatcat:x7xtf5hqsraljozyhrxotvg5v4

Robust Visual Tracking via Structured Multi-Task Sparse Learning

Tianzhu Zhang, Bernard Ghanem, Si Liu, Narendra Ahuja
2012 International Journal of Computer Vision  
In this paper, we formulate object tracking in a particle filter framework as a structured multi-task sparse learning problem, which we denote as Structured Multi-Task Tracking (S-MTT).  ...  Since we model particles as linear combinations of dictionary templates that are updated dynamically, learning the representation of each particle is considered a single task in Multi-Task Tracking (MTT).  ...  Multi-task Learning Multi-task learning (MTL, Chen et al. 2009 ) has recently received much attention in machine learning and computer vision.  ... 
doi:10.1007/s11263-012-0582-z fatcat:jeip233quba5zaagqk4ban2zn4

Deep sparse multi-task learning for feature selection in Alzheimer's disease diagnosis

Heung-Il Suk, Seong-Whan Lee, Dinggang Shen
2015 Brain Structure and Function  
In this regard, we use the optimal regression coefficients learned in one hierarchy as feature weighting factors in the following hierarchy, and formulate a weighted sparse multi-task learning method.  ...  To this end, we first propose a novel deep architecture to recursively discard uninformative features by performing sparse multi-task learning in a hierarchical fashion.  ...  [B0101-15-0307, Basic Software Research in Human-level Lifelong Machine Learning (Machine Learning Center)].  ... 
doi:10.1007/s00429-015-1059-y pmid:25993900 pmcid:PMC4714963 fatcat:p56y3gbo7ndubgti3caagtaenq

Integrating low-rank and group-sparse structures for robust multi-task learning

Jianhui Chen, Jiayu Zhou, Jieping Ye
2011 Proceedings of the 17th ACM SIGKDD international conference on Knowledge discovery and data mining - KDD '11  
Specifically, the proposed RMTL algorithm captures the task relationships using a low-rank structure, and simultaneously identifies the outlier tasks using a group-sparse structure.  ...  In this paper, we propose a robust multi-task learning (RMTL) algorithm which learns multiple tasks simultaneously as well as identifies the irrelevant (outlier) tasks.  ...  Learning multiple tasks under this setting is usually referred to as robust multi-task learning [36] .  ... 
doi:10.1145/2020408.2020423 dblp:conf/kdd/ChenZY11 fatcat:l6ntfkv52be5fhwooiigd2cu7y

Efficient Multi-Task Structure-Aware Sparse Bayesian Learning for Frequency-Difference Electrical Impedance Tomography

Shengheng Liu, Yongming Huang, Hancong Wu, Chao Tan, Jiabin Jia
2020 IEEE Transactions on Industrial Informatics  
Abstract—Frequency-difference  ...  Index Terms—Inverse problem, electrical impedance tomography (EIT), sparse Bayesian learning (SBL), image reconstruction, frequency difference.  ...  Section III then elaborates the proposed sequential reconstruction algorithm based on multi-task structure-aware sparse Bayesian learning (MT-SA-SBL).  ... 
doi:10.1109/tii.2020.2965202 fatcat:xnxnpuswijgrtbuwi6nw4zdfyq

Automated gene expression pattern annotation in the mouse brain

Tao Yang, Xinlin Zhao, Binbin Lin, Tao Zeng, Shuiwang Ji, Jieping Ye
2015 Pacific Symposium on Biocomputing. Pacific Symposium on Biocomputing  
In addition, we propose a novel structure-based multi-label classification approach, which makes use of label hierarchy based on brain ontology during model learning.  ...  Our approach is shown to be robust on both binary-class and multi-class tasks and even with a relatively low training ratio.  ...  Multi-task sparse logistic regression We also propose to directly solve the multi-class annotation problem via multi-task learning. Suppose there are k classes (k = 3 or 4 in our study).  ... 
pmid:25592576 pmcid:PMC4299912 fatcat:tqh2sfx2gjbdlolag7tkgtdbcu

Learning Sparse Sharing Architectures for Multiple Tasks [article]

Tianxiang Sun, Yunfan Shao, Xiaonan Li, Pengfei Liu, Hang Yan, Xipeng Qiu, Xuanjing Huang
2019 arXiv   pre-print
In this paper, we propose a novel parameter sharing mechanism, named Sparse Sharing. Given multiple tasks, our approach automatically finds a sparse sharing structure.  ...  Most existing deep multi-task learning models are based on parameter sharing, such as hard sharing, hierarchical sharing, and soft sharing.  ...  Deep Multi-Task Learning Multi-task learning (MTL) utilizes the correlation among tasks to improve performance by training tasks in parallel.  ... 
arXiv:1911.05034v2 fatcat:baoelzd6hbbfdmbqlbkivv5q3m

Learning Sparse Sharing Architectures for Multiple Tasks

Tianxiang Sun, Yunfan Shao, Xiaonan Li, Pengfei Liu, Hang Yan, Xipeng Qiu, Xuanjing Huang
2020 Proceedings of the AAAI Conference on Artificial Intelligence (AAAI-20)  
In this paper, we propose a novel parameter sharing mechanism, named Sparse Sharing. Given multiple tasks, our approach automatically finds a sparse sharing structure.  ...  Most existing deep multi-task learning models are based on parameter sharing, such as hard sharing, hierarchical sharing, and soft sharing.  ...  Deep Multi-Task Learning Multi-task learning (MTL) utilizes the correlation among tasks to improve performance by training tasks in parallel.  ... 
doi:10.1609/aaai.v34i05.6424 fatcat:jsdphvcanvfjhdklu7vqj5htae

Automated gene expression pattern annotation in the mouse brain

Tao Yang, Xinlin Zhao, Binbin Lin, Tao Zeng, Shuiwang Ji, Jieping Ye
2014 Biocomputing 2015  
In addition, we propose a novel structure-based multi-label classification approach, which makes use of label hierarchy based on brain ontology during model learning.  ...  Our approach is shown to be robust on both binary-class and multi-class tasks and even with a relatively low training ratio.  ...  Multi-task sparse logistic regression We also propose to directly solve the multi-class annotation problem via multi-task learning. Suppose there are k classes (k = 3 or 4 in our study).  ... 
doi:10.1142/9789814644730_0015 fatcat:yhvr3i6bmrcc5kr6hqh2ixscl4

Learning multiple visual tasks while discovering their structure

Carlo Ciliberto, Lorenzo Rosasco, Silvia Villa
2015 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)  
Multi-task learning is a natural approach for computer vision applications that require the simultaneous solution of several distinct but related problems, e.g. object detection, classification, tracking  ...  The key idea is that exploring task relatedness (structure) can lead to improved performances.  ...  We considered the setting of Reproducing Kernel Hilbert Spaces for vector-valued functions [20] and formulated the Sparse Kernel MTL as an output kernel learning problem where both a multi-task predictor  ... 
doi:10.1109/cvpr.2015.7298608 dblp:conf/cvpr/CilibertoRV15 fatcat:7fbrulnmibgopovzx76ocb23w4

Learning Multiple Visual Tasks while Discovering their Structure [article]

Carlo Ciliberto, Lorenzo Rosasco, Silvia Villa
2015 arXiv   pre-print
Multi-task learning is a natural approach for computer vision applications that require the simultaneous solution of several distinct but related problems, e.g. object detection, classification, tracking  ...  The key idea is that exploring task relatedness (structure) can lead to improved performances.  ...  Sparse Kernel Multi Task Learning When a-priori knowledge of the problem structure is not available, it is desirable to learn the tasks relations directly from the data.  ... 
arXiv:1504.03106v1 fatcat:q6yprek425gz5hgfat6zbl23wu

Multi-task Sparse Gaussian Processes with Improved Multi-task Sparsity Regularization [chapter]

Jiang Zhu, Shiliang Sun
2014 Communications in Computer and Information Science  
In this paper, we propose an improved multi-task sparsity regularizer which can effectively regularize the subset selection of multiple tasks for multi-task sparse Gaussian processes.  ...  Inspired by the idea of multi-task learning, we believe that simultaneously selecting subsets of multiple Gaussian processes will be more suitable than selecting them separately.  ...  Multi-task learning is an active research direction [3, 7, 8, 15] .  ... 
doi:10.1007/978-3-662-45646-0_6 fatcat:rbp7e5phffbvbcwfb5vppgxl3a

A Survey on Multi-Task Learning [article]

Yu Zhang, Qiang Yang
2018 arXiv   pre-print
Multi-Task Learning (MTL) is a learning paradigm in machine learning and its aim is to leverage useful information contained in multiple related tasks to help improve the generalization performance of  ...  learning, multi-view learning and graphical models.  ...  Even though each W k is sparse or row-sparse, the entire parameter matrix W can be non-sparse and hence this model can discover the latent sparse structure among multiple tasks.  ... 
arXiv:1707.08114v2 fatcat:6lrpe4nk45djbjyfjco7t4yfme

Encoding Tree Sparsity in Multi-Task Learning: A Probabilistic Framework

Lei Han, Yu Zhang, Guojie Song, Kunqing Xie
2014 Proceedings of the AAAI Conference on Artificial Intelligence (AAAI-14)  
Multi-task learning seeks to improve the generalization performance by sharing common information among multiple related tasks.  ...  In this paper, we propose a probabilistic tree sparsity (PTS) model to utilize the tree structure to obtain the sparse solution instead of the group structure.  ...  To capture the diverse structure among tasks, a number of models have been proposed by utilizing a decomposition structure of the model coefficients to learn more flexible sparse patterns.  ... 
doi:10.1609/aaai.v28i1.9009 fatcat:udjaywb7t5hv7i4eo7lpziunoa

Synthetic aperture radar target recognition using weighted multi-task kernel sparse representation

Chen Ning, Wenbo Liu, Gong Zhang, Xin Wang
2019 IEEE Access  
INDEX TERMS Synthetic aperture radar, sparse representation, kernel, multi-task, target recognition.  ...  Then, the proposed method provides a unified framework, named multi-task kernel sparse representation, for SAR target classification.  ...  of multi-task learning.  ... 
doi:10.1109/access.2019.2959228 fatcat:vtibvr2fg5e3jmqyajuijpmuta
Showing results 1 — 15 out of 115,249 results