45,716 Hits in 4.2 sec

Multi-Label Transfer Learning for Multi-Relational Semantic Similarity [article]

Li Zhang, Steven R. Wilson, Rada Mihalcea
2019 arXiv   pre-print
We propose a multi-label transfer learning approach based on LSTM to make predictions for several relations simultaneously and aggregate the losses to update the parameters.  ...  Multi-relational semantic similarity datasets define the semantic relations between two short texts in multiple ways, e.g., similarity, relatedness, and so on.  ...  Acknowledgments This material is based in part upon work supported by the Michigan Institute for Data Science, by the John Templeton Foundation (grant #61156), by the National Science Foundation (grant  ... 
arXiv:1805.12501v2 fatcat:3um7lbyqfzcevnaaivp4txjb3e
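The loss-aggregation idea in this abstract can be illustrated with a toy sketch (all names and numbers below are invented for illustration; this is not the authors' code, and a simple linear map stands in for their LSTM encoder): a shared encoder feeds one scalar head per semantic relation, and the per-relation squared errors are summed into a single loss before each parameter update.

```python
# Toy multi-label regression: a shared linear "encoder" (a stand-in for the
# paper's LSTM) feeds one scalar head per semantic relation, and the
# per-relation losses are summed before updating the heads.
RELATIONS = ["similarity", "relatedness"]

shared_w = [0.1, 0.1, 0.1, 0.1]          # frozen shared encoder weights (toy)
heads = {r: 0.0 for r in RELATIONS}      # one trainable head per relation

def encode(x):
    # shared representation used by every relation head
    return sum(w * xi for w, xi in zip(shared_w, x))

def aggregate_loss(x, targets):
    """Sum of squared errors over all relations -- the aggregated loss."""
    h = encode(x)
    return sum((heads[r] * h - targets[r]) ** 2 for r in RELATIONS)

def step(x, targets, lr=0.1):
    """One gradient step on the aggregated loss (updates every head at once)."""
    h = encode(x)
    for r in RELATIONS:
        heads[r] -= lr * 2 * h * (heads[r] * h - targets[r])

x = [1.0, 0.5, -0.5, 2.0]                            # one sentence pair's features
targets = {"similarity": 0.8, "relatedness": 0.6}    # multi-relational gold scores
before = aggregate_loss(x, targets)
for _ in range(200):
    step(x, targets)
after = aggregate_loss(x, targets)
```

The key point the sketch shows is that one backward pass serves all relations at once, so the shared parameters receive a training signal from every relation label.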

Multi-Label Transfer Learning for Multi-Relational Semantic Similarity

Li Zhang, Steven Wilson, Rada Mihalcea
2019 Proceedings of the Eighth Joint Conference on Lexical and Computational Semantics (*SEM 2019)  
We propose a multi-label transfer learning approach based on LSTM to make predictions for several relations simultaneously and aggregate the losses to update the parameters.  ...  Multi-relational semantic similarity datasets define the semantic relations between two short texts in multiple ways, e.g., similarity, relatedness, and so on.  ...  Acknowledgments This material is based in part upon work supported by the Michigan Institute for Data Science, by the John Templeton Foundation (grant #61156), by the National Science Foundation (grant  ... 
doi:10.18653/v1/s19-1005 dblp:conf/starsem/ZhangWM19 fatcat:ozugcxpmmrft3hhfjsntsoqy74

Learning Semantic Similarity for Multi-label Text Categorization [chapter]

Li Li, Mengxiang Wang, Longkai Zhang, Houfeng Wang
2014 Lecture Notes in Computer Science  
doi:10.1007/978-3-319-14331-6_26 fatcat:vkka3mn4sbdyzcdlctgfaan5v4

Multi-Label Zero-Shot Learning with Transfer-Aware Label Embedding Projection [article]

Meng Ye, Yuhong Guo
2018 arXiv   pre-print
Auxiliary information can be conveniently incorporated to guide the label embedding projection to further improve label relation structures for zero-shot knowledge transfer.  ...  In this paper we propose a transfer-aware embedding projection approach to tackle multi-label zero-shot learning.  ...  These semantic embedding vectors have the nice property of catching general similarities between any pair of label phrases/words, but may not be optimal for multi-label classification and information transfer  ... 
arXiv:1808.02474v1 fatcat:dov2w7ofbvdg3kdfiprkb5sm3i
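The label-embedding transfer described in this abstract can be sketched in miniature (the vectors, label names, and scoring rule below are made up for illustration and are not the paper's method): each label, seen or unseen, lives in a shared embedding space, and a zero-shot score for an unseen label is borrowed from seen labels in proportion to their embedding similarity.

```python
import math

# Hand-made 3-d "label embeddings" (purely illustrative).
label_emb = {
    "cat":   [0.9, 0.1, 0.0],
    "tiger": [0.8, 0.2, 0.1],   # unseen at training time
    "car":   [0.0, 0.9, 0.4],
}
seen = ["cat", "car"]

def cos(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Scores a trained multi-label model might assign an image for the SEEN labels.
seen_scores = {"cat": 0.92, "car": 0.03}

def transfer_score(unseen_label):
    """Zero-shot score for an unseen label: a similarity-weighted average of
    seen-label scores in the label-embedding space (the transfer step)."""
    weights = {s: max(cos(label_emb[unseen_label], label_emb[s]), 0.0)
               for s in seen}
    z = sum(weights.values())
    return sum(weights[s] * seen_scores[s] for s in seen) / z

score = transfer_score("tiger")
```

Because "tiger" embeds close to "cat" and far from "car", the transferred score is high; this is the general similarity structure between label phrases that the abstract says the embeddings capture.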

Transductive Multi-class and Multi-label Zero-shot Learning [article]

Yanwei Fu, Yongxin Yang, Timothy M. Hospedales, Tao Xiang, Shaogang Gong
2015 arXiv   pre-print
In this paper we discuss two related lines of work improving the conventional approach: exploiting transductive learning for ZSL, and generalising ZSL to the multi-label case.  ...  , and is used to bridge between these domains for knowledge transfer.  ...  We propose a novel framework for multi-label zero-shot learning [9].  ... 
arXiv:1503.07884v1 fatcat:or4zahtjj5atpoxks5n4dilega

Graphonomy: Universal Human Parsing via Graph Transfer Learning [article]

Ke Gong, Yiming Gao, Xiaodan Liang, Xiaohui Shen, Meng Wang, Liang Lin
2019 arXiv   pre-print
This poses many fundamental learning challenges, e.g. discovering underlying semantic structures among different label granularity, performing proper transfer learning across different image domains, and identifying and utilizing label redundancies across related tasks.  ...  In contrast to the existing multi-task learning pipelines, we explicitly model the relations among different label sets and extract a unified structure for universal human parsing via graph transfer learning  ... 
arXiv:1904.04536v1 fatcat:di2yce3ytbhadml5lljt7yn66m

Graphonomy: Universal Human Parsing via Graph Transfer Learning

Ke Gong, Yiming Gao, Xiaodan Liang, Xiaohui Shen, Meng Wang, Liang Lin
2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)  
This poses many fundamental learning challenges, e.g. discovering underlying semantic structures among different label granularity, performing proper transfer learning across different image domains, and  ...  identifying and utilizing label redundancies across related tasks.  ...  In contrast to the existing multi-task learning pipelines, we explicitly model the relations among different label sets and extract a unified structure for universal human parsing via graph transfer learning  ... 
doi:10.1109/cvpr.2019.00763 dblp:conf/cvpr/Gong0LS0L19 fatcat:rv5xgqag4jcghmzffi3s67fp2u

Class label autoencoder for zero-shot learning [article]

Guangfeng Lin, Caixia Fan, Wanjun Chen, Yajun Chen, Fan Zhao
2018 arXiv   pre-print
However, the projection function cannot be used between the feature space and multi-semantic embedding spaces, which have the diversity characteristic for describing the different semantic information  ...  CLA can not only build a uniform framework for adapting to multi-semantic embedding spaces, but also construct the encoder-decoder mechanism for constraining the bidirectional projection between the feature  ...  Yongqin Xian from MPI for Informatics, who provided the data source to us.  ... 
arXiv:1801.08301v1 fatcat:l6z42l2wsne5vk64jfzw76jez4

Multi-label Zero-shot Classification by Learning to Transfer from External Knowledge [article]

He Huang, Yuanwei Chen, Wei Tang, Wenhao Zheng, Qing-Guo Chen, Yao Hu, Philip Yu
2020 arXiv   pre-print
To address these difficult issues, this paper introduces a novel multi-label zero-shot classification framework by learning to transfer from external knowledge.  ...  Multi-label zero-shot classification aims to predict multiple unseen class labels for an input image. It is more challenging than its single-label counterpart.  ...  as external knowledge and transfer it for zero-shot multi-label image classification.  ... 
arXiv:2007.15610v2 fatcat:5dqtojkavjb25g4yrp42xb4tge

Multi-Task Label Embedding for Text Classification [article]

Honglun Zhang, Liqiang Xiao, Wenqing Chen, Yongkun Wang, Yaohui Jin
2017 arXiv   pre-print
We implement unsupervised, supervised and semi-supervised models of Multi-Task Label Embedding, all utilizing semantic correlations among tasks and making it particularly convenient to scale and transfer  ...  Multi-task learning in text classification leverages implicit correlations among related tasks to extract common features and yield performance gains.  ...  In this paper, we proposed Multi-Task Label Embedding (MTLE) to map labels of each task into semantic vectors as well, similar to how Word Embedding represents the word sequences, thereby converting the  ... 
arXiv:1710.07210v1 fatcat:gotgyytrgzabjouedia4oeniam
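The core idea in this abstract, mapping labels into the same semantic vector space as the text, can be sketched as follows (the word vectors and labels below are invented for illustration, and a bag-of-vectors average stands in for the paper's learned encoders): classification becomes nearest-label-vector search, so labels from a new task can be added without retraining the text encoder, which is what makes the setup convenient to scale and transfer.

```python
import math

# Tiny hand-made word/label embeddings (purely illustrative).
word_vec = {
    "good": [0.9, 0.1], "great": [0.8, 0.2],
    "bad":  [0.1, 0.9], "awful": [0.2, 0.8],
    "positive": [0.85, 0.15], "negative": [0.15, 0.85],
}

def embed(tokens):
    # average of word vectors -- a stand-in for the learned text encoder
    vecs = [word_vec[t] for t in tokens if t in word_vec]
    return [sum(col) / len(vecs) for col in zip(*vecs)]

def cos(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) *
                  math.sqrt(sum(b * b for b in v)))

def classify(tokens, labels):
    """Pick the label whose embedding is nearest the text embedding."""
    text = embed(tokens)
    return max(labels, key=lambda lb: cos(text, word_vec[lb]))

pred = classify(["good", "great"], ["positive", "negative"])
```

A new task's label set (say, `["spam", "ham"]`) would only need label vectors in the same space; `embed` and `classify` are reused unchanged.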

Transfer Knowledge between Cities

Ying Wei, Yu Zheng, Qiang Yang
2016 Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining - KDD '16  
FLORAL learns semantically related dictionaries for multiple modalities from a source domain, and simultaneously transfers the dictionaries and labelled instances from the source into a target domain.  ...  In this paper, we propose a FLexible multimOdal tRAnsfer Learning (FLORAL) method to transfer knowledge from a city where there exist sufficient multimodal data and labels, to this kind of cities to fully  ...  Overview Learn semantically related dictionaries: To learn commensurable representations for multi-modalities, we first learn semantically related dictionaries from a source domain through a dictionary  ... 
doi:10.1145/2939672.2939830 dblp:conf/kdd/WeiZ016 fatcat:ax7lscs4jvgwraactqpi3pw7ci

Transfer Learning for Neural Semantic Parsing

Xing Fan, Emilio Monti, Lambert Mathias, Markus Dreyer
2017 Proceedings of the 2nd Workshop on Representation Learning for NLP  
In this paper, we propose using sequence-to-sequence in a multi-task setup for semantic parsing with a focus on transfer learning.  ...  Our experiments show that the multi-task setup aids transfer learning from an auxiliary task with large labeled data to a target task with smaller labeled data.  ...  Similar to this work, the authors use a neural semantic parsing model in a multi-task framework to jointly learn over multiple knowledge bases.  ... 
doi:10.18653/v1/w17-2607 dblp:conf/rep4nlp/FanMMD17 fatcat:p2cfqvaw6rearkc53xgzocfosq
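The transfer effect this abstract reports, a large auxiliary task helping a target task with little labeled data, can be shown with a deliberately tiny sketch (a 1-d quadratic loss stands in for training a parser's shared parameters; the numbers are invented): warm-starting from auxiliary-task parameters beats training the target task from scratch under the same small step budget.

```python
def sgd(w, w_true, steps, lr=0.1):
    """Gradient steps on the 1-d squared loss (w - w_true)^2 -- a stand-in
    for fitting shared model parameters to one task's data."""
    for _ in range(steps):
        w -= lr * 2 * (w - w_true)
    return w

AUX_W, TARGET_W = 2.0, 2.2   # related tasks: nearby optima (toy numbers)

w_aux = sgd(0.0, AUX_W, steps=100)        # plentiful auxiliary labeled data
w_warm = sgd(w_aux, TARGET_W, steps=3)    # few steps on the small target task
w_cold = sgd(0.0, TARGET_W, steps=3)      # same small budget, no transfer
```

The closer the two tasks' optima, the bigger the warm-start advantage, mirroring the abstract's claim that the multi-task setup aids transfer from the large-data task to the small-data one.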

Transfer Learning for Neural Semantic Parsing [article]

Xing Fan, Emilio Monti, Lambert Mathias, Markus Dreyer
2017 arXiv   pre-print
In this paper, we propose using sequence-to-sequence in a multi-task setup for semantic parsing with a focus on transfer learning.  ...  Our experiments show that the multi-task setup aids transfer learning from an auxiliary task with large labeled data to a target task with smaller labeled data.  ...  Similar to this work, the authors use a neural semantic parsing model in a multi-task framework to jointly learn over multiple knowledge bases.  ... 
arXiv:1706.04326v1 fatcat:tvobenc2rzd6ngg5r6asjtjr2m

Graphonomy: Universal Image Parsing via Graph Reasoning and Transfer [article]

Liang Lin, Yiming Gao, Ke Gong, Meng Wang, Xiaodan Liang
2021 arXiv   pre-print
different domains for bidirectional knowledge transfer.  ...  This poses many fundamental learning challenges, e.g., discovering underlying semantic structures among different label granularity or mining label correlation across relevant tasks.  ...  similarity and semantic similarity is more reliable for information transferring.  ... 
arXiv:2101.10620v1 fatcat:hnbuqiugsfhvbc7phn5htmsvcy

Sequential Sentence Classification in Research Papers using Cross-Domain Multi-Task Learning [article]

Arthur Brack, Anett Hoppe, Pascal Buschermöhle, Ralph Ewerth
2021 arXiv   pre-print
Our contributions can be summarised as follows: (1) We tailor two common transfer learning methods, sequential transfer learning and multi-task learning, and evaluate their performance for sequential sentence  ...  However, previous work has not investigated the potential of transfer learning with datasets from different scientific domains for this task yet.  ...  The multi-task learning approach sharing all possible layers is able to recognise semantically related classes (RQ5).  ... 
arXiv:2102.06008v1 fatcat:2hilmvh2efdxneabexmuxhqr5e
Showing results 1 — 15 out of 45,716 results