87,609 Hits in 6.6 sec

Representation Learning and NLP [chapter]

Zhiyuan Liu, Yankai Lin, Maosong Sun
2020 Representation Learning for Natural Language Processing  
This chapter presents a brief introduction to representation learning, including its motivation and basic idea, and also reviews its history and recent advances in both machine learning and NLP.  ...  Representation learning aims to learn representations of raw data as useful information for further classification or prediction.  ...  A typical machine learning system consists of three components [5] : Machine Learning = Representation + Objective + Optimization. (1.1) That is, to build an effective machine learning system, we first  ... 
doi:10.1007/978-981-15-5573-2_1 fatcat:4vahgg7ehffytb3qcvfgmo7wsm
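The decomposition quoted in the snippet above, Machine Learning = Representation + Objective + Optimization, can be made concrete with a minimal sketch. The logistic-regression example below is purely illustrative and is not taken from the chapter: the linear score is the representation, the cross-entropy loss is the objective, and gradient descent is the optimization.

```python
# Minimal sketch of "Machine Learning = Representation + Objective + Optimization".
# Illustrative only; not code from the chapter. Logistic regression on toy data.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))            # raw data
true_w = rng.normal(size=5)
y = (X @ true_w + 0.1 * rng.normal(size=200) > 0).astype(float)

w = np.zeros(5)                           # Representation: a linear score w.x
lr = 0.1

def objective(w):
    # Objective: negative log-likelihood (cross-entropy)
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    return -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))

for _ in range(500):                      # Optimization: batch gradient descent
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    w -= lr * (X.T @ (p - y) / len(y))

print(f"final loss: {objective(w):.3f}")
```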

Learning joint representation for community question answering with tri-modal DBM

Baolin Peng, Wenge Rong, Yuanxin Ouyang, Chao Li, Zhang Xiong
2014 Proceedings of the 23rd International Conference on World Wide Web - WWW '14 Companion  
In light of these issues, we proposed a tri-modal deep Boltzmann machine (tri-DBM) to extract a unified representation for query, question, and answer. Experiments on the Yahoo! Answers dataset reveal that using these unified representations to train a classifier judging the semantic matching level between query and question significantly outperforms models using bag-of-words or LSA representations. ... Its energy function of state v and h is computed as follows (Eq. 1), where θ = (W, b, c) are the model parameters to be learned and M is the total number of words occurring in a query or question. ...
doi:10.1145/2567948.2577341 dblp:conf/www/PengROLX14 fatcat:hzimjtp6qrfavmobn7feconszi
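Equation (1) itself is cut from the snippet above. For background only, the standard restricted Boltzmann machine energy with parameters θ = (W, b, c) takes the bipartite form below; the paper's tri-modal DBM extends this idea to three visible modalities (query, question, answer), and its exact equation is not reproduced here.

```latex
% Textbook RBM energy, shown as background; not the paper's Equation (1).
E(\mathbf{v}, \mathbf{h}; \theta) = -\mathbf{b}^{\top}\mathbf{v} - \mathbf{c}^{\top}\mathbf{h} - \mathbf{v}^{\top}\mathbf{W}\mathbf{h},
\qquad \theta = (\mathbf{W}, \mathbf{b}, \mathbf{c})
```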

Toward a unified science of machine learning

Pat Langley
1989 Machine Learning  
Such integrated approaches provide a good role model for the rest of machine learning. ... Machine learning is a diverse discipline that acts as host to a variety of research goals, learning techniques, and ...
doi:10.1007/bf00116834 fatcat:xlob4qoyffa6dpmerskxvqea5m

Combine Factual Medical Knowledge and Distributed Word Representation to Improve Clinical Named Entity Recognition

Yonghui Wu, Xi Yang, Jiang Bian, Yi Guo, Hua Xu, William Hogan
2018 AMIA Annual Symposium Proceedings  
The evaluation results showed that the RNN with medical knowledge as embedding layers achieved new state-of-the-art performance (a strict F1 score of 86.21% and a relaxed F1 score of 92.80%) on the 2010  ...  However, it is still not clear how existing medical knowledge can help deep learning models in clinical NER tasks.  ...  We would like to thank the 2010 i2b2/VA challenge organizers for the development of the corpus used in this study.  ... 
pmid:30815153 pmcid:PMC6371322 fatcat:tl6cz77hnbeyzhj6miyjs7qxu4
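One common way to use "medical knowledge as embedding layers", as the snippet describes, is to initialize an embedding layer from pretrained knowledge vectors and feed it into a recurrent tagger. The PyTorch sketch below is a hedged illustration under that assumption, not the authors' exact architecture; `knowledge_vectors` and the dimensions are placeholders.

```python
# Hedged sketch: pretrained medical-knowledge vectors as an embedding layer
# feeding a BiLSTM tagger (illustrative; not the paper's exact model).
import torch
import torch.nn as nn

vocab_size, emb_dim, hidden, num_tags = 10_000, 100, 128, 7
knowledge_vectors = torch.randn(vocab_size, emb_dim)  # placeholder for real pretrained vectors

class KnowledgeNER(nn.Module):
    def __init__(self):
        super().__init__()
        # freeze=False lets the knowledge-derived embeddings be fine-tuned on the NER task
        self.emb = nn.Embedding.from_pretrained(knowledge_vectors, freeze=False)
        self.rnn = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden, num_tags)

    def forward(self, token_ids):
        h, _ = self.rnn(self.emb(token_ids))
        return self.out(h)                # per-token tag scores

model = KnowledgeNER()
scores = model(torch.randint(0, vocab_size, (2, 20)))  # batch of 2 sentences, 20 tokens
print(scores.shape)                       # torch.Size([2, 20, 7])
```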

Intelligent Interface for Knowledge Based System

Nyoman Bogi Aditya Karna, Iping Supriana, Ulfa Maulidevi
2014 TELKOMNIKA (Telecommunication Computing Electronics and Control)  
One solution to this problem is to provide a unified model that can accept all types of knowledge, which guarantees automatic interaction between knowledge-based systems. ... It will help to accelerate the establishment of a new knowledge-based system because it does not need knowledge initialization. ... From the proposed solution's point of view, the model of knowledge can also be approached using machine learning and the expert system along with an appropriate method of reasoning. ...
doi:10.12928/telkomnika.v12i4.413 fatcat:lxjbkvwfvrg6nfxeyc75jwutey

Go From the General to the Particular: Multi-Domain Translation with Domain Transformation Networks [article]

Yong Wang, Longyue Wang, Shuming Shi, Victor O.K. Li, Zhaopeng Tu
2019 arXiv   pre-print
The key challenge of multi-domain translation lies in simultaneously encoding both the general knowledge shared across domains and the particular knowledge distinctive to each domain in a unified model. ... To guarantee the knowledge transformation, we also propose two complementary supervision signals by leveraging the power of knowledge distillation and adversarial learning. ... Towards learning a unified multi-domain translation model, several researchers have turned to augmenting the NMT model to learn domain-specific knowledge. ...
arXiv:1911.09912v1 fatcat:yofseyxf5ngjjpuhp5ndhuwb6y

Go From the General to the Particular: Multi-Domain Translation with Domain Transformation Networks

Yong Wang, Longyue Wang, Shuming Shi, Victor O.K. Li, Zhaopeng Tu
2020 PROCEEDINGS OF THE THIRTIETH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE AND THE TWENTY-EIGHTH INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE  
The key challenge of multi-domain translation lies in simultaneously encoding both the general knowledge shared across domains and the particular knowledge distinctive to each domain in a unified model. ... To guarantee the knowledge transformation, we also propose two complementary supervision signals by leveraging the power of knowledge distillation and adversarial learning. ... Towards learning a unified multi-domain translation model, several researchers have turned to augmenting the NMT model to learn domain-specific knowledge. ...
doi:10.1609/aaai.v34i05.6461 fatcat:7jjzoft3yne43mxox66veyw5oe
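The "knowledge distillation" supervision signal mentioned in both records above is commonly implemented as a KL divergence between the student's and a teacher's softened output distributions, mixed with the ordinary cross-entropy loss. The sketch below is a generic, hedged illustration of that idea, not the paper's exact loss; the temperature `T` and weight `alpha` are assumed hyper-parameters.

```python
# Hedged sketch of a knowledge-distillation loss (generic; not the paper's exact signal).
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets, T=2.0, alpha=0.5):
    # Soft targets: KL between softened teacher and student distributions.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against reference tokens.
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1 - alpha) * hard

logits_s = torch.randn(8, 32000)          # student logits over a toy vocabulary
logits_t = torch.randn(8, 32000)          # teacher (general-domain) logits
targets = torch.randint(0, 32000, (8,))
print(distillation_loss(logits_s, logits_t, targets).item())
```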

Sliced Cramer Synaptic Consolidation for Preserving Deeply Learned Representations

Soheil Kolouri, Nicholas A. Ketz, Andrea Soltoggio, Praveen K. Pilly
2020 International Conference on Learning Representations  
We then propose a fundamentally different class of preservation methods that aim at preserving the distribution of the network's output at an arbitrary layer for previous tasks while learning a new one. ... We explore such selective synaptic plasticity approaches through a unifying lens of memory replay and show the close relationship between methods like Elastic Weight Consolidation (EWC) and Memory Aware Synapses (MAS). ... Incremental learning without catastrophic forgetting is one of the core characteristics of a lifelong learning machine (L2M) and has recently gained renewed attention from the machine learning ...
dblp:conf/iclr/KolouriKSP20 fatcat:ku3hiiyenfdr5c52f7wp7sixpq
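For context, the Elastic Weight Consolidation penalty named in the snippet keeps each parameter close to its value after the previous task, weighted by an estimate of the diagonal Fisher information; the textbook form is shown below (background only, not the paper's proposed distribution-preserving method).

```latex
% Standard EWC objective (background; the paper proposes a different, output-distribution-preserving penalty).
\mathcal{L}(\theta) = \mathcal{L}_{\text{new}}(\theta)
  + \frac{\lambda}{2} \sum_i F_i \left( \theta_i - \theta_i^{*} \right)^2
```

Here θ* are the parameters learned on the previous task and F_i is the diagonal Fisher information estimated on that task's data.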

Managing Machine Learning Workflow Components [article]

Marcio Moreno, Vítor Lourenço, Sandro Rama Fiorini, Polyana Costa, Rafael Brandão, Daniel Civitarese, Renato Cerqueira
2020 arXiv   pre-print
To handle this problem, in this paper we introduce machine learning workflow management (MLWfM) as a technique to aid the development and reuse of MLWfs and their components through three aspects: representation ... Also, we consider the execution of these components within a tool. The hybrid knowledge representation, called Hyperknowledge, frames our methodology, supporting the three aspects of MLWfM. ... In this section, we describe the knowledge model elements that support our MLWfM method. ...
arXiv:1912.05665v2 fatcat:lvw6urp2tvbk3nu2zopn4ipqii

A Unified View of Relational Deep Learning for Drug Pair Scoring [article]

Benedek Rozemberczki and Stephen Bonner and Andriy Nikolov and Michael Ughetto and Sebastian Nilsson and Eliseo Papa
2021 arXiv   pre-print
Here, we present a unified theoretical view of relational machine learning models which can address these tasks.  ...  In recent years, numerous machine learning models which attempt to solve polypharmacy side effect identification, drug-drug interaction prediction and combination therapy design tasks have been proposed  ...  Stephen Bonner is a fellow of the AstraZeneca postdoctoral program.  ... 
arXiv:2111.02916v4 fatcat:wyzysblwqfefdemmk2dhwtlvxa

Knowledge Representation and Management: Interest in New Solutions for Ontology Curation

Ferdinand Dhombres, Jean Charlet, Section Editors for the IMIA Yearbook Section on Knowledge Representation and Management
2021 IMIA Yearbook of Medical Informatics  
Knowledge representations are key to advancing machine learning by providing context and to developing novel bioinformatics metrics. ... Objective: To select, present, and summarize some of the best papers in the field of Knowledge Representation and Management (KRM) published in 2020. ... in the selection process of the KRM best papers. ...
doi:10.1055/s-0041-1726508 pmid:34479390 fatcat:qkgdqbj4irhcrihnhms362igkm

Propositionalization and embeddings: two sides of the same coin

Nada Lavrač, Blaž Škrlj, Marko Robnik-Šikonja
2020 Machine Learning  
Data preprocessing is an important component of machine learning pipelines, which requires ample time and resources.  ...  This paper outlines some of the modern data processing techniques used in relational learning that enable data fusion from different input data types and formats into a single table data representation  ...  One of the main strategic problems machine learning has to solve is better integration of knowledge and models across different domains and representations.  ... 
doi:10.1007/s10994-020-05890-8 pmid:32704202 pmcid:PMC7366599 fatcat:byyvqrplkrdvbcqvfctswm3ncu
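Propositionalization, as referenced in the snippet, turns relational (multi-table) data into a single feature table by aggregating related rows per entity. The pandas sketch below is a minimal illustration with invented table and column names, not code from the paper.

```python
# Minimal propositionalization sketch: aggregate a one-to-many relation
# into per-entity features joined onto a single table (illustrative only).
import pandas as pd

patients = pd.DataFrame({"patient_id": [1, 2], "age": [63, 47]})
visits = pd.DataFrame({
    "patient_id": [1, 1, 2, 2, 2],
    "duration":   [3, 5, 1, 2, 4],
})

# Aggregate the related table into fixed-length features per patient.
visit_feats = visits.groupby("patient_id")["duration"].agg(["count", "mean", "max"])
visit_feats.columns = [f"visit_{c}" for c in visit_feats.columns]

single_table = patients.merge(visit_feats, on="patient_id", how="left")
print(single_table)
```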

Propositionalization and Embeddings: Two Sides of the Same Coin [article]

Nada Lavrač and Blaž Škrlj and Marko Robnik-Šikonja
2020 arXiv   pre-print
Data preprocessing is an important component of machine learning pipelines, which requires ample time and resources.  ...  This paper outlines some of the modern data processing techniques used in relational learning that enable data fusion from different input data types and formats into a single table data representation  ...  The work of the second author was funded by the Slovenian Research Agency through a young researcher grant.  ... 
arXiv:2006.04410v1 fatcat:idpgnam52jdnbbpv32qhm7o3im

Predicting mortality in critically ill patients with diabetes using machine learning and clinical notes

Jiancheng Ye, Liang Yao, Jiahong Shen, Rethavathi Janarthanam, Yuan Luo
2020 BMC Medical Informatics and Decision Making  
Results: The best configuration of the employed machine learning models yielded a competitive AUC of 0.97. ... Methods: We conducted a secondary analysis of Medical Information Mart for Intensive Care III (MIMIC-III) data. Different machine learning modeling and NLP approaches were applied. ... Acknowledgements: Not applicable. About this supplement: This article has been published as part of BMC Medical Informatics and Decision Making Volume 20 Supplement 11 2020: Informatics and machine learning ...
doi:10.1186/s12911-020-01318-4 pmid:33380338 fatcat:mljzj3vorrbezm2vtz6s2nanae

MolRep: A Deep Representation Learning Library for Molecular Property Prediction [article]

Jiahua Rao, Shuangjia Zheng, Ying Song, Jianwen Chen, Chengtao Li, Jiancong Xie, Hui Yang, Hongming Chen, Yuedong Yang
2021 bioRxiv   pre-print
Herein, we have developed MolRep by unifying 16 state-of-the-art models across 4 popular molecular representations for application and comparison. ... However, unified frameworks have not yet emerged for fairly measuring algorithmic progress, and experimental procedures of different representation models often lack rigor and are hardly reproducible. ... (…, 2019a), they either contained only a few deep representation learning models or required considerable effort to perform a unified evaluation process and hyper-parameter search. ...
doi:10.1101/2021.01.13.426489 fatcat:vnjctlklx5hs5ehr6egvnsxigi
Showing results 1 — 15 out of 87,609 results