Learning New Facts From Knowledge Bases With Neural Tensor Networks and Semantic Word Vectors
[article]
2013
arXiv
pre-print
This model can be improved by initializing entity representations with word vectors learned in an unsupervised fashion from text, and when doing this, existing relations can even be queried for entities ...
We introduce a neural tensor network (NTN) model which predicts new relationship entries that can be added to the database. ...
Conclusion We introduced a new model based on Neural Tensor Networks. ...
arXiv:1301.3618v2
fatcat:ngou3f56xbddxmaz62sdaltdau
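As an illustration of the scoring function this line of work describes, here is a minimal NumPy sketch of a neural tensor layer over two entity vectors; the dimensions, initialization, and variable names are illustrative, not the paper's code.

```python
import numpy as np

def ntn_score(e1, e2, W, V, b, u):
    """Score an (entity1, relation, entity2) triple with one neural tensor layer:
    g = u^T tanh(e1^T W[:, :, 1..k] e2 + V [e1; e2] + b)."""
    bilinear = np.einsum('i,ijk,j->k', e1, W, e2)  # one value per tensor slice
    linear = V @ np.concatenate([e1, e2]) + b
    return u @ np.tanh(bilinear + linear)

# Toy usage: d = 4 entity dimensions, k = 2 tensor slices.
rng = np.random.default_rng(0)
d, k = 4, 2
score = ntn_score(rng.normal(size=d), rng.normal(size=d),
                  rng.normal(size=(d, d, k)), rng.normal(size=(k, 2 * d)),
                  np.zeros(k), rng.normal(size=k))
print(score)  # higher score = the relation is predicted to hold
```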
Knowledge Reasoning Based on Neural Tensor Network
2017
ITM Web of Conferences
This paper explores the trade-off between the model complexity of the neural tensor network, an important method for knowledge reasoning, and its reasoning accuracy. ...
Knowledge bases (KBs) are an important part of applications such as Q&A systems, but they always face incompleteness and a lack of inter-entity relationships. ...
Hence, learning new facts from existing knowledge bases is an essential way to improve them. ...
doi:10.1051/itmconf/20171204004
fatcat:jnh7dlb6xfd63jdkuz6lm4euba
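The complexity side of the trade-off the abstract mentions can be made concrete: each relation in a standard NTN carries a k-slice tensor plus a standard layer, so parameters grow linearly in the slice count k and quadratically in the entity dimension d. A back-of-the-envelope sketch, assuming the standard NTN parameterization:

```python
def ntn_params_per_relation(d, k):
    """Parameter count per relation in a standard NTN:
    tensor k*d^2, linear layer k*2d, bias k, output weights k."""
    return k * d * d + k * 2 * d + 2 * k

for k in (1, 2, 4, 8):
    print(k, ntn_params_per_relation(100, k))  # 10202, 20404, 40808, 81616
```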
Reasoning With Neural Tensor Networks for Knowledge Base Completion
2013
Neural Information Processing Systems
Lastly, we demonstrate that all models improve when these word vectors are initialized with vectors learned from unsupervised large corpora. ...
In this paper we introduce an expressive neural tensor network suitable for reasoning over relationships between two entities. ...
FA8750-13-2-0040, the DARPA Deep Learning program under contract number FA8650-10-C-7020 and NSF IIS-1159679. ...
dblp:conf/nips/SocherCMN13
fatcat:nix242ktcvc5zgeje3bya6upsq
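The word-vector initialization the abstract reports can be sketched simply: represent an entity as the average of its pretrained word vectors, so entities sharing words share representation structure. A minimal sketch with hypothetical vectors:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical pretrained word vectors (in the paper, learned from unlabeled text).
word_vecs = {w: rng.normal(size=50) for w in ["african", "elephant", "asian"]}

def entity_vector(name):
    """Represent an entity as the average of its word vectors, so
    'african elephant' and 'asian elephant' share the 'elephant' component."""
    return np.mean([word_vecs[w] for w in name.split()], axis=0)

v1, v2 = entity_vector("african elephant"), entity_vector("asian elephant")
cos = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2))
print(round(float(cos), 3))  # typically > 0: related entities land near each other
```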
Knowledge-Driven Event Embedding for Stock Prediction
2016
International Conference on Computational Linguistics
On the other hand, events extracted from raw texts do not contain background knowledge about the entities and relations they mention. ...
Representing structured events as vectors in continuous space offers a new way for defining dense features for natural language processing (NLP) applications. ...
2014CB340503, the National Natural Science Foundation of China (NSFC) via Grant 61472107 and 71532004, the Singapore Ministry of Education (MOE) AcRF Tier 2 grant T2MOE201301. ...
dblp:conf/coling/DingZLD16
fatcat:cm23wzwyanhvnir3lasyym3ehe
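Tensor-based event embedding of the kind this paper builds on composes an (actor, action, object) tuple with bilinear tensors; the sketch below follows that general recipe with illustrative dimensions, not the paper's exact architecture.

```python
import numpy as np

def event_embedding(actor, action, obj, T1, T2, T3):
    """Compose an (actor, action, object) event tuple into one vector:
    combine actor with action, action with object, then merge the two."""
    r1 = np.tanh(np.einsum('i,ijk,j->k', actor, T1, action))
    r2 = np.tanh(np.einsum('i,ijk,j->k', action, T2, obj))
    return np.tanh(np.einsum('i,ijk,j->k', r1, T3, r2))

rng = np.random.default_rng(3)
d = 6
T = lambda: rng.normal(size=(d, d, d)) * 0.1
# e.g. ("Google", "acquires", "DeepMind") with hypothetical d-dim word vectors.
e = event_embedding(rng.normal(size=d), rng.normal(size=d), rng.normal(size=d),
                    T(), T(), T())
print(e.shape)  # (6,): a dense event feature for downstream prediction
```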
Building Memory with Concept Learning Capabilities from Large-Scale Knowledge Bases
2015
Neural Information Processing Systems
We present a new perspective on neural knowledge base (KB) embeddings, from which we build a framework that can model symbolic knowledge in the KB together with its learning process. ...
their embeddings from natural language descriptions, which closely resembles how humans learn semantic concepts. Concept learning in cognitive science usually refers to the cognitive process ...
When doing SGD with mini-batches, we back-propagate the error gradients into the neural network and, for the CNN, finally into the word vectors. ...
dblp:conf/nips/ShiZ15
fatcat:f4zozse33ngyddc3wnnote3fsq
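The training detail quoted above (gradients flowing through the CNN into the word vectors during mini-batch SGD) is easy to demonstrate; a minimal PyTorch sketch, with hypothetical sizes:

```python
import torch
import torch.nn as nn

# Sketch: gradients from a CNN encoder flow back into the word vectors
# during mini-batch SGD, as the abstract describes.
emb = nn.Embedding(100, 16)                      # trainable word vectors
cnn = nn.Conv1d(16, 8, kernel_size=3)
opt = torch.optim.SGD(list(emb.parameters()) + list(cnn.parameters()), lr=0.1)

tokens = torch.randint(0, 100, (4, 10))          # mini-batch of token ids
x = emb(tokens).transpose(1, 2)                  # (batch, channels, length)
loss = cnn(x).mean()                             # stand-in for the real loss
opt.zero_grad()
loss.backward()                                  # error gradients reach emb.weight
print(emb.weight.grad.abs().sum() > 0)           # tensor(True)
opt.step()
```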
Symbolic, Distributed and Distributional Representations for Natural Language Processing in the Era of Deep Learning: a Survey
[article]
2019
arXiv
pre-print
A clearer understanding of the strict link between distributed/distributional representations and symbols may certainly lead to radically new deep learning networks. ...
Recent advances in machine learning (ML) and in natural language processing (NLP) seem to contradict the above intuition: discrete symbols are fading away, erased by vectors or tensors called distributed ...
Distributed representations are vectors or tensors in metric spaces which underlie learning models such as neural networks and also some models based on kernel methods [Zanzotto and Dell'Arciprete 2012a ...
arXiv:1702.00764v2
fatcat:xmpga3xn6nhzdos7pcc25z5pde
Logic Tensor Networks: Deep Learning and Logical Reasoning from Data and Knowledge
[article]
2016
arXiv
pre-print
We propose Logic Tensor Networks: a uniform framework for integrating automatic learning and reasoning. ...
We show how Real Logic can be implemented in deep Tensor Neural Networks using Google's TensorFlow primitives. ...
Approximate satisfiability is defined as a learning task with both knowledge and data being mapped onto real-valued vectors. ...
arXiv:1606.04422v2
fatcat:jqttytjqefbdnlktw4pgjkzce4
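A rough sketch of the Real Logic idea, assuming a simplified one-slice grounding: predicates become differentiable functions from real vectors to truth degrees in [0, 1], and a conjunction's satisfiability is computed with a t-norm (product, here). This is a generic illustration, not the paper's TensorFlow implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def grounded_predicate(params):
    """Ground a predicate as a function from real vectors to a truth
    degree in [0, 1] (one tensor slice only, for brevity)."""
    W, V, b, u = params
    def truth(*args):
        x = np.concatenate(args)
        bilinear = x @ W @ x          # quadratic interaction term
        return sigmoid(u * np.tanh(bilinear + V @ x + b))
    return truth

# Approximate satisfiability of P(a, b) AND P(b, a) under the product t-norm.
rng = np.random.default_rng(1)
d = 4
P = grounded_predicate((rng.normal(size=(2 * d, 2 * d)),
                        rng.normal(size=2 * d), 0.0, 1.0))
a, b_vec = rng.normal(size=d), rng.normal(size=d)
sat = P(a, b_vec) * P(b_vec, a)
print(float(sat))  # truth degree in [0, 1]; learning maximizes it
```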
Compositional Semantics
[chapter]
2020
Representation Learning for Natural Language Processing
After that, we present various typical models for N-ary semantic composition including recurrent neural network, recursive neural network, and convolutional neural network. ...
Many important applications in NLP fields rely on understanding more complex language units such as phrases, sentences, and documents beyond words. ...
Based on this philosophy, [15] proposes a recursive neural network to model different levels of semantic units. ...
doi:10.1007/978-981-15-5573-2_3
fatcat:uu524rdsxnd7flgrvprsuxaicq
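The recursive neural network mentioned here composes two child vectors into a parent vector of the same dimension, bottom-up along a parse tree. A minimal sketch with illustrative dimensions:

```python
import numpy as np

def compose(child1, child2, W, b):
    """One recursive-NN step: merge two child vectors into a parent
    vector of the same dimension, p = tanh(W [c1; c2] + b)."""
    return np.tanh(W @ np.concatenate([child1, child2]) + b)

# Compose "very good" first, then "(very good) movie", bottom-up.
rng = np.random.default_rng(2)
d = 8
W, b = rng.normal(size=(d, 2 * d)) * 0.1, np.zeros(d)
very, good, movie = (rng.normal(size=d) for _ in range(3))
phrase = compose(compose(very, good, W, b), movie, W, b)
print(phrase.shape)  # (8,): same dimension at every tree node
```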
Symbolic, Distributed, and Distributional Representations for Natural Language Processing in the Era of Deep Learning: A Survey
2020
Frontiers in Robotics and AI
A clearer understanding of the strict link between distributed/distributional representations and symbols may certainly lead to radically new deep learning networks. ...
Recent advances in machine learning (ML) and in natural language processing (NLP) seem to contradict the above intuition: discrete symbols are fading away, erased by vectors or tensors called distributed ...
In fact, it is sufficient to generate new Gaussian vectors for new symbols when they appear. ...
doi:10.3389/frobt.2019.00153
pmid:33501168
pmcid:PMC7805717
fatcat:353mgx2tr5ftxcx2utx776isou
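The "new Gaussian vectors for new symbols" remark has a simple reading: in high dimension, independently drawn Gaussian vectors are nearly orthogonal, so fresh symbols get distinct, non-interfering representations. A small sketch:

```python
import numpy as np

rng = np.random.default_rng(5)
symbol_table = {}

def vector_for(symbol, d=64):
    """Assign each new symbol a random Gaussian vector on first sight;
    in high dimension, distinct symbols are nearly orthogonal w.h.p."""
    if symbol not in symbol_table:
        symbol_table[symbol] = rng.normal(size=d) / np.sqrt(d)
    return symbol_table[symbol]

a, b = vector_for("socrates"), vector_for("plato")
print(abs(a @ b))                  # near 0: distinct symbols barely interfere
print(a @ vector_for("socrates"))  # near 1: same symbol, same vector
```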
Building Memory with Concept Learning Capabilities from Large-scale Knowledge Base
[article]
2015
arXiv
pre-print
We present a new perspective on neural knowledge base (KB) embeddings, from which we build a framework that can model symbolic knowledge in the KB together with its learning process. ...
their embeddings from natural language descriptions, which closely resembles how humans learn semantic concepts. ...
When doing SGD with mini-batches, we back-propagate the error gradients into the neural network and, for the CNN, finally into the word vectors. ...
arXiv:1512.01173v1
fatcat:xhdhq3fwibd43jgc2qftvlwutu
A Tensor Space Model-Based Deep Neural Network for Text Classification
2021
Applied Sciences
To solve this 'loss of term senses' problem, we develop a concept-driven deep neural network based upon our semantic tensor space model. ...
This is because a textual document is essentially expressed only as a vector with word dimensions, which compromises the inherent semantic information even if the vector is appropriately transformed ...
Word2Vec, a two-layer neural network with autoencoder characteristics, was trained to learn the relationships among words in a sentence and thereby find semantic representations [6]. ...
doi:10.3390/app11209703
fatcat:s5bp54tdu5djtmuzreehowk5om
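The Word2Vec description above can be reproduced with the gensim library (assumed available; not necessarily what the paper itself used):

```python
from gensim.models import Word2Vec  # gensim >= 4.x

# Train a tiny Word2Vec model (a two-layer network, as the abstract notes)
# on toy sentences; real use would feed a large corpus.
sentences = [["knowledge", "base", "completion"],
             ["neural", "tensor", "network"],
             ["knowledge", "tensor", "reasoning"]]
model = Word2Vec(sentences, vector_size=16, window=2, min_count=1, sg=1, seed=0)
print(model.wv["tensor"][:4])                      # learned word vector
print(model.wv.similarity("knowledge", "tensor"))  # cosine similarity
```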
A Review of Inference Methods Based on Knowledge Graph
[chapter]
2020
Frontiers in Artificial Intelligence and Applications
According to the methods adopted, each type further includes reasoning based on distributed representations, reasoning based on neural networks, and mixed reasoning. ...
Unlike traditional knowledge inference methods, knowledge inference methods based on knowledge graphs are diversified, owing to their simple, intuitive, flexible, and rich knowledge representation ...
Neural Network Reasoning: In single-step reasoning, neural network-based methods use a neural network to directly model the knowledge graph's fact tuples and obtain vector representations of the facts ...
doi:10.3233/faia200727
fatcat:wnjuq5lkbbdh7gzcoukri5nciy
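One concrete instance of the distributed-representation reasoning this review categorizes is a TransE-style distance scorer, where a fact tuple (h, r, t) is plausible when h + r lands near t; this is a named example, not the review's own method:

```python
import numpy as np

def transe_score(h, r, t):
    """TransE-style plausibility for a fact tuple (h, r, t): the triple is
    plausible when h + r lands near t, so lower distance = more plausible."""
    return np.linalg.norm(h + r - t, ord=1)

rng = np.random.default_rng(6)
h, r = rng.normal(size=20), rng.normal(size=20)
t_good = h + r + 0.01 * rng.normal(size=20)  # consistent tail entity
t_bad = rng.normal(size=20)                  # random tail entity
print(transe_score(h, r, t_good) < transe_score(h, r, t_bad))  # True
```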
Neural Ranking Models for Document Retrieval
[article]
2021
arXiv
pre-print
A variety of deep learning models have been proposed, and each model presents a set of neural network components to extract features that are used for ranking. ...
Several approaches to ranking are based on traditional machine learning algorithms using a set of hand-crafted features. ...
arXiv:2102.11903v1
fatcat:zc2otf456rc2hj6b6wpcaaslsa
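The simplest neural ranking component of the kind this survey covers is a representation-based dual encoder: embed the query and documents, then rank by dot product. A hedged sketch, with stand-in random embeddings where a trained encoder would go:

```python
import numpy as np

def rank(query_vec, doc_vecs):
    """Score documents against a query by embedding dot product,
    the simplest representation-based neural ranking scheme."""
    scores = doc_vecs @ query_vec
    return np.argsort(-scores), scores

rng = np.random.default_rng(4)
q = rng.normal(size=32)             # query embedding from some encoder
docs = rng.normal(size=(5, 32))     # document embeddings
order, scores = rank(q, docs)
print(order)                        # document indices sorted by relevance
```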
Neural-Symbolic Computing: An Effective Methodology for Principled Integration of Machine Learning and Reasoning
[article]
2019
arXiv
pre-print
In spite of the recent impact of AI, several works have identified the need for principled knowledge representation and reasoning mechanisms integrated with deep learning-based systems to provide sound ...
We illustrate the effectiveness of the approach by outlining the main characteristics of the methodology: principled integration of neural learning with symbolic knowledge representation and reasoning ...
tensors represent facts (predicates of different arities) from a knowledge base and output tensors represent new facts. ...
arXiv:1905.06088v1
fatcat:gm4f3ncukrbevpd7nq5yr75ar4
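The "tensors represent facts, output tensors represent new facts" idea can be illustrated with the smallest possible case: a binary predicate as a boolean adjacency matrix, and one forward-chaining step of a composition rule as a matrix product. A minimal sketch:

```python
import numpy as np

# Facts of a binary predicate as a boolean adjacency matrix over entities:
# parent[i, j] = True means parent(i, j) holds.
n = 4
parent = np.zeros((n, n), dtype=bool)
parent[0, 1] = parent[1, 2] = parent[2, 3] = True

# One forward-chaining step of the rule
#   grandparent(x, z) :- parent(x, y), parent(y, z)
# as a matrix product: output facts derived from input facts.
grandparent = (parent.astype(int) @ parent.astype(int)) > 0
print(np.argwhere(grandparent))  # [[0 2], [1 3]]
```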
A Survey on Neural-symbolic Systems
[article]
2021
arXiv
pre-print
In this case, an ideal intelligent system, a neural-symbolic system with high perceptual and cognitive intelligence achieved through powerful learning and reasoning capabilities, is attracting growing interest in the ...
Combining the fast computation of neural systems with the powerful expressiveness of symbolic systems, neural-symbolic systems can perform effective learning and reasoning in multi-domain tasks ...
The graph is constructed from a knowledge graph, and each node is represented by a vector that encodes semantic class information (in that paper, the class's word embedding). ...
arXiv:2111.08164v1
fatcat:bc33afiitnb73bmjtrfbdgkwpy
Showing results 1 — 15 out of 4,284 results