32,508 Hits in 6.8 sec

Semisupervised Feature Analysis by Mining Correlations Among Multiple Tasks

Xiaojun Chang, Yi Yang
2017 IEEE Transactions on Neural Networks and Learning Systems  
In this paper, we propose a novel semi-supervised feature selection framework by mining correlations among multiple tasks and apply it to different multimedia applications.  ...  Instead of independently computing the importance of features for each task, our algorithm leverages shared knowledge from multiple related tasks, thus improving the performance of feature selection.  ...  5) Feature Selection with Shared Information among multiple tasks (FSSI): it simultaneously learns multiple feature selection functions of different tasks in a joint framework.  ... 
doi:10.1109/tnnls.2016.2582746 pmid:27411230 fatcat:65fjj3uoyjdptadc5clg5hgamy

Semi-Supervised Multiple Feature Analysis for Action Recognition

Sen Wang, Zhigang Ma, Yi Yang, Xue Li, Chaoyi Pang, Alexander G. Hauptmann
2014 IEEE transactions on multimedia  
In our multiple feature learning framework, the idea of uncovering the shared structure is applied to exploiting shared information among different features.  ...  [23] use the graph-based semi-supervised framework incorporating feature selection to learn classification information from real-world image data.  ... 
doi:10.1109/tmm.2013.2293060 fatcat:caiu5if4trf73fmpqs5jdbjudi

An overview of multi-task learning

Yu Zhang, Qiang Yang
2017 National Science Review  
As a promising area in machine learning, multi-task learning (MTL) aims to improve the performance of multiple related learning tasks by leveraging useful information among them.  ...  Then several different settings of MTL are introduced, including multi-task supervised learning, multi-task unsupervised learning, multi-task semi-supervised learning, multi-task active learning, multi-task  ...  However, unlike multitask semi-supervised learning, which exploits information contained in the unlabeled data, in multi-task active learning, each task selects informative unlabeled data to query an oracle  ... 
doi:10.1093/nsr/nwx105 fatcat:7w67kng7ufbandtcneeniropny

Multi-task Semi-supervised Semantic Feature Learning for Classification

Changying Du, Fuzhen Zhuang, Qing He, Zhongzhi Shi
2012 2012 IEEE 12th International Conference on Data Mining  
Thus, to make multiple tasks learn from each other, we wish to share the associations between categories and those common semantics among tasks.  ...  semantics shared among tasks and specific semantics exclusive to each task.  ...  With these stable semantic-category associations being exploited for the knowledge sharing among tasks, multiple tasks may learn from each other.  ... 
doi:10.1109/icdm.2012.15 dblp:conf/icdm/DuZHS12 fatcat:vfy4azkvvbcnni3mi43cq56u2e

Multi-task support vector machines for feature selection with shared knowledge discovery

Sen Wang, Xiaojun Chang, Xue Li, Quan Z. Sheng, Weitong Chen
2016 Signal Processing  
Meanwhile, information sharing across multiple tasks has also been taken into account by imposing a constraint that globally limits the combined feature selection matrices to be low-rank.  ...  Shared information among multiple tasks is rarely considered when selecting features.  ... 
doi:10.1016/j.sigpro.2014.12.012 fatcat:ccbnnl4x3bfhpb67zfa7zlehni

Graph-Based Neural Network Models with Multiple Self-Supervised Auxiliary Tasks [article]

Franco Manessi, Alessandro Rozza
2020 arXiv   pre-print
semi-supervised graph classification tasks.  ...  In this paper, we propose three novel self-supervised auxiliary tasks to train graph-based neural network models in a multi-task fashion.  ...  Among the factorization-based methods, Xu et al. (2013) present a semi-supervised factor graph model that can exploit the relationships among the nodes.  ... 
arXiv:2011.07267v2 fatcat:kibmmwy3uvaf3dai6ha7cpnocy

Semisupervised Multitask Learning

Qiuhua Liu, Xuejun Liao, Hui Li, J.R. Stack, L. Carin
2009 IEEE Transactions on Pattern Analysis and Machine Intelligence  
In this paper we integrate MTL and semi-supervised learning into a single framework, thereby exploiting two forms of contextual information.  ...  one feature vector within the context of all unlabeled feature vectors; this is referred to as semi-supervised learning.  ...  [Figure legend: average AUC over 68 tasks versus the number of labeled data for each task, comparing supervised and semisupervised STL, pooling, and MTL, with and without RBF kernel.]  ... 
doi:10.1109/tpami.2008.296 pmid:19372611 fatcat:6j5pfgkjmzbbdmwhsz4icyvjwq

A Survey on Multi-Task Learning [article]

Yu Zhang, Qiang Yang
2018 arXiv   pre-print
In order to improve the performance of learning tasks further, MTL can be combined with other learning paradigms including semi-supervised learning, active learning, unsupervised learning, reinforcement  ...  Multi-Task Learning (MTL) is a learning paradigm in machine learning and its aim is to leverage useful information contained in multiple related tasks to help improve the generalization performance of  ...  Semi-supervised learning aims to exploit geometrical information contained in the unlabeled data, while active learning selects representative unlabeled data to query an oracle with the hope to reduce  ... 
arXiv:1707.08114v2 fatcat:6lrpe4nk45djbjyfjco7t4yfme

Visual Understanding via Multi-Feature Shared Learning With Global Consistency

Lei Zhang, David Zhang
2016 IEEE transactions on multimedia  
is much improved through the semi-supervised learning with global label consistency.  ...  Image/video data is usually represented with multiple visual features. Fusion of multi-source information for establishing the attributes has been widely recognized.  ...  Additionally, we also expect that the underlying feature correlation and complementary structural information among multiple features can be exploited for simultaneously learning multiple predictors during  ... 
doi:10.1109/tmm.2015.2510509 fatcat:cp4rtxaha5dblgrg6sugmsa7sa

Semi-supervised multi-task learning for lung cancer diagnosis [article]

Naji Khosravan, Ulas Bagci
2018 arXiv   pre-print
We also showed that a semi-supervised approach can be used to overcome the limitation of lack of labeled data for the 3D segmentation task.  ...  Our results support that joint training of these two tasks through a multi-task learning approach improves system performance on both.  ...  selection of the tasks, features learned from one task can be discriminative for other tasks as well.  ... 
arXiv:1802.06181v2 fatcat:yzytrlwp4zb5rio6pm36pfbm3u

Semi-supervised Text Categorization by Considering Sufficiency and Diversity [chapter]

Shoushan Li, Sophia Yat Mei Lee, Wei Gao, Chu-Ren Huang
2013 Communications in Computer and Information Science  
This motivates semi-supervised learning for TC to improve the performance by exploring the knowledge in both labeled and unlabeled data.  ...  After carefully considering the diversity preference, we modify the traditional bootstrapping algorithm by training the involved classifiers with random feature subspaces instead of the whole feature space  ...  To overcome this difficulty, various semi-supervised learning methods have been proposed to improve the performance by exploiting unlabeled data that are readily available for most TC tasks (Blum and  ... 
doi:10.1007/978-3-642-41644-6_11 fatcat:ovcg5rk6inartgessrv6ccon7q

Multiple Graph Adversarial Learning [article]

Bo Jiang and Ziyan Zhang and Jin Tang and Bin Luo
2019 arXiv   pre-print
Based on MGAL, we then provide a unified network for the semi-supervised learning task. Promising experimental results demonstrate the effectiveness of the MGAL model.  ...  One main challenge for multi-graph representation is how to exploit both structure information of each individual graph and correlation information across multiple graphs simultaneously.  ...  The main challenge for multi-graph representation is how to exploit the information of each individual graph A^(v) while taking the correlation cues among multiple graphs into account in the final representation  ... 
arXiv:1901.07439v1 fatcat:r6ujhuegtnbotgrqnka5w2lvie

Learning from Extrinsic and Intrinsic Supervisions for Domain Generalization [article]

Shujun Wang, Lequan Yu, Caizi Li, Chi-Wing Fu, Pheng-Ann Heng
2020 arXiv   pre-print
To be specific, we formulate our framework with feature embedding using a multi-task learning paradigm.  ...  Besides conducting the common supervised recognition task, we seamlessly integrate a momentum metric learning task and a self-supervised auxiliary task to collectively utilize the extrinsic supervision  ...  The auxiliary self-supervised task is able to exploit the intrinsic semantic information within a single image to provide informative feature representations for the main task.  ... 
arXiv:2007.09316v1 fatcat:qqhd6onu55hqdlliczuns3kcpm

Deceptive Review Spam Detection via Exploiting Task Relatedness and Unlabeled Data

Zhen Hai, Peilin Zhao, Peng Cheng, Peng Yang, Xiao-Li Li, Guangxia Li
2016 Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing  
In this paper, we propose to exploit the relatedness of multiple review spam detection tasks and readily available unlabeled data to address the scarcity of labeled opinion spam data.  ...  Existing work on detecting deceptive reviews primarily focuses on feature engineering and applies off-the-shelf supervised classification algorithms to the problem.  ...  We have coped with the problem of detecting deceptive review spam.  ... 
doi:10.18653/v1/d16-1187 dblp:conf/emnlp/HaiZCYLL16 fatcat:cxhnjz2b6jafhnmxahz7s4o5h4

Beyond without Forgetting: Multi-Task Learning for Classification with Disjoint Datasets [article]

Yan Hong, Li Niu, Jianfu Zhang, Liqing Zhang
2020 arXiv   pre-print
Inspired by semi-supervised learning, we use unlabeled datasets with pseudo labels to facilitate each task.  ...  In existing methods, for each task, the unlabeled datasets are not fully exploited to facilitate this task.  ...  Semi-supervised Multi-task Learning One group of semi-supervised MTL methods [13, 14] exploit shared manifold information among multiple tasks.  ... 
arXiv:2003.06746v1 fatcat:xlxymmw4prgsbbbee6dstlq5ve
Showing results 1 — 15 out of 32,508 results