A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2022; you can also visit the original URL.
Complementing Language Embeddings with Knowledge Bases for Specific Domains
2021
International Conference on Artificial Neural Networks
To alleviate this issue, we propose a combined approach where the embedding is seen as a model of a logical knowledge base. ...
Current embeddings encompass a large language corpus, and need to be retrained to deal with specific subdomains. ...
The pre-trained model can be used to obtain domain-specific language embeddings, and to infer KB symbols over natural language expressions by using regular regions. ...
dblp:conf/icann/TentiPP21
fatcat:g2bdcyj42reutawmt3lqydqjfu
Modeling Content and Context with Deep Relational Learning
2021
Transactions of the Association for Computational Linguistics
Neural-symbolic representations have emerged as a way to combine the reasoning capabilities of symbolic methods with the expressiveness of neural networks. ...
However, most of the existing frameworks for combining neural and symbolic representations have been designed for classic relational learning tasks that work over a universe of symbolic entities and relations ...
To demonstrate DRAIL's modeling approach, we introduce the task of open-domain stance prediction with social context, which combines social network analysis and textual inference over complex opinionated ...
doi:10.1162/tacl_a_00357
fatcat:w4clpw44fbhuxbynid2q33eyue
Modeling Content and Context with Deep Relational Learning
[article]
2020
arXiv
pre-print
Neural-symbolic representations have emerged as a way to combine the reasoning capabilities of symbolic methods with the expressiveness of neural networks. ...
However, most of the existing frameworks for combining neural and symbolic representations have been designed for classic relational learning tasks that work over a universe of symbolic entities and relations ...
To demonstrate DRAIL's modeling approach, we introduce the task of open-domain stance prediction with social context, which combines social network analysis and textual inference over complex opinionated ...
arXiv:2010.10453v1
fatcat:ruasqxgrxbgdzhd2lqepfl53nm
Predicting Strategic Behavior from Free Text (Extended Abstract)
2020
Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
Particularly, we introduce the following question: can free text expressed in natural language serve for the prediction of action selection in an economic context, modeled as a game? ...
Entities are typically embedded in some metric space, and represented through their embeddings, that is, through sub-symbols. ...
... as a soft constraint over the hypothesis space in a way that favours solutions consistent with the encoded knowledge. ...
doi:10.24963/ijcai.2020/688
dblp:conf/ijcai/RaedtDMM20
fatcat:kbp4p2slsrculnqg2ig2dvchde
Towards an Approximative Ontology-Agnostic Approach for Logic Programs
[chapter]
2014
Lecture Notes in Computer Science
... to highly heterogeneous and open domain knowledge sources. ...
This work focuses on advancing the conceptual and formal work on the interaction between distributional semantics and logic, focusing on the introduction of a distributional deductive inference model for ...
The embedding of logic programs in the distributional vector space allows the definition of a geometric interpretation for the inference process. ...
doi:10.1007/978-3-319-04939-7_21
fatcat:qko6i5nfgrglzj3r6dj7ndtfhm
Combining Representation Learning with Logic for Language Processing
[article]
2017
arXiv
pre-print
This thesis investigates different combinations of representation learning methods with logic for reducing the need for annotated training data, and for improving generalization. ...
... of symbols via gradient-based optimization. ...
Note that this does not involve any explicit logical inference. Instead, we expect the vector space of symbol embeddings to incorporate all given rules. ...
arXiv:1712.09687v1
fatcat:mfpom6sfkfaj3a3znsiaxjrrme
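The thesis excerpt above describes rules being absorbed into the vector space of symbol embeddings via gradient-based optimization, with no explicit logical inference. A minimal, invented sketch of that idea, with toy symbols and a dot-product scorer (all names and values are hypothetical, not the thesis's actual model):

```python
import math
import random

random.seed(0)

# Toy setup: every symbol (entity or relation) gets a small random vector.
DIM = 4
symbols = ["alice", "bob", "parent", "ancestor"]
emb = {s: [random.uniform(-0.1, 0.1) for _ in range(DIM)] for s in symbols}

def score(rel, head, tail):
    # Trilinear scorer squashed to (0, 1): sigmoid(sum_i e_r[i]*e_h[i]*e_t[i]).
    z = sum(er * eh * et for er, eh, et in zip(emb[rel], emb[head], emb[tail]))
    return 1.0 / (1.0 + math.exp(-z))

def rule_loss(body_rel, head_rel, pairs):
    # Soft constraint for body(X,Y) => head(X,Y): penalize any pair where
    # the body scores higher than the head, i.e. max(0, s_body - s_head).
    return sum(max(0.0, score(body_rel, h, t) - score(head_rel, h, t))
               for h, t in pairs)

# Rule parent(X,Y) => ancestor(X,Y), grounded on one pair.
loss = rule_loss("parent", "ancestor", [("alice", "bob")])
```

In a full system this rule term would be minimized jointly with a data-fit loss by gradient descent, so the embeddings come to respect the rule without ever running a theorem prover.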
Joint Learning of Words and Meaning Representations for Open-Text Semantic Parsing
2012
Journal of machine learning research
Open-text semantic parsers are designed to interpret any statement in natural language by inferring a corresponding meaning representation (MR, a formal representation of its sense). ...
... WordNet) with learning from raw text. The model jointly learns representations of words, entities and MRs via a multi-task training process operating on these diverse sources of data. ...
Framework: Our model is designed with the following key concepts: • Named symbolic entities (synsets, relation types, and lemmas) are all associated with a joint d-dimensional vector space, termed the "embedding ...
dblp:journals/jmlr/BordesGWB12
fatcat:xzeshmtg4vcwjbnlbek62bbxcy
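The key concept in the entry above is a single joint d-dimensional embedding space shared by words, entities, and relation types. A rough, invented sketch of what "one table, many symbol kinds" looks like, using a translation-style energy (the vocabulary strings and scorer are illustrative, not the paper's actual parameterization):

```python
import random

random.seed(1)

DIM = 8
# One shared table: lemmas, WordNet-style synsets, and relation types all
# live in the same d-dimensional space (symbol names are invented here).
vocab = ["_eat_VB", "synset_cat.n.01", "synset_food.n.01", "_agent"]
table = {sym: [random.gauss(0.0, 0.1) for _ in range(DIM)] for sym in vocab}

def energy(lhs, rel, rhs):
    # Lower energy = more plausible triple; here a simple translation-style
    # squared distance ||e_lhs + e_rel - e_rhs||^2 in the shared space.
    return sum((l + r - t) ** 2
               for l, r, t in zip(table[lhs], table[rel], table[rhs]))

# A KB triple and a text-derived triple are scored through the same table,
# which is what lets multi-task training share signal across data sources.
e = energy("synset_cat.n.01", "_agent", "_eat_VB")
```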
A Survey on Neural-symbolic Systems
[article]
2021
arXiv
pre-print
Combining the fast computation ability of neural systems and the powerful expression ability of symbolic systems, neural-symbolic systems can perform effective learning and reasoning in multi-domain tasks ...
This paper surveys the latest research in neural-symbolic systems along four dimensions: the necessity of combination, technical challenges, methods, and applications. ...
For non-synthetic question answering tasks of open-domain text, Gupta et al. [87] extend the neural module networks (NMN) [88] . ...
arXiv:2111.08164v1
fatcat:bc33afiitnb73bmjtrfbdgkwpy
Knowledge Graph Embeddings
[chapter]
2018
Encyclopedia of Big Data Technologies
A representative work by Wang et al. (2014a) introduces a model that combines a text corpus and a KG to align them in the same vector space, creating KG embeddings and text embeddings. ...
Finally, the alignment model is used to align the KG embedding and text embedding in the same vector space using different alignment mechanisms, such as entity name and Wikipedia anchors. ...
doi:10.1007/978-3-319-63962-8_284-1
fatcat:xwmtv26vyrayvk3uwudy32xaz4
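The alignment mechanism described above (e.g. by entity name) can be pictured as an extra loss term that pulls an entity's KG-side vector toward the text-side vector of its name. A tiny invented example, with made-up three-dimensional vectors:

```python
def align_loss(kg_vec, text_vec):
    # Alignment by entity name: squared Euclidean distance between the KG
    # embedding of an entity and the text embedding of its surface name.
    return sum((k - t) ** 2 for k, t in zip(kg_vec, text_vec))

kg_emb = {"Q42": [0.2, -0.1, 0.4]}              # hypothetical KG-side vectors
text_emb = {"Douglas Adams": [0.1, 0.0, 0.5]}   # hypothetical text-side vectors
loss = align_loss(kg_emb["Q42"], text_emb["Douglas Adams"])
```

Minimizing this term alongside the separate KG and text objectives is what places both kinds of embeddings in one shared vector space.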
Modular design patterns for hybrid learning and reasoning systems
2021
Applied intelligence (Boston)
Recent years have seen a large number of publications on such hybrid neuro-symbolic AI systems. ...
Finally, our design patterns extend and refine Kautz's earlier attempt at categorizing neuro-symbolic architectures. ...
Following our taxonomy, an infer step uses such models in combination with either data or symbols to infer conclusions. Finally, sometimes an operation on a semantic model is neither a logical induction ...
doi:10.1007/s10489-021-02394-3
fatcat:ecyruntfdncsbbtdglhllwc6vi
Thinking, Fast and Slow: Combining Vector Spaces and Knowledge Graphs
[article]
2017
arXiv
pre-print
We analogize this to thinking 'fast' in vector space along with thinking 'slow' and 'deeply' by reasoning over the knowledge graph. ...
Knowledge graphs and vector space models are robust knowledge representation techniques with individual strengths and weaknesses. ...
We argue that a query processing engine developed over the VKG structure should combine the complementary strengths of knowledge graphs and vector space models. ...
arXiv:1708.03310v2
fatcat:7mxviyv5fbes7dz6wruwg72ulu
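The fast/slow analogy above suggests a query engine that answers exactly from the knowledge graph when it can, and falls back to approximate vector-space similarity when it cannot. A minimal sketch under that assumption (the store, entities, and edges below are invented for illustration):

```python
import math

# Hypothetical VKG-style store: a symbolic edge set plus entity vectors.
kg_edges = {("Paris", "capital_of", "France")}
vectors = {"Paris": [0.9, 0.1], "Lyon": [0.8, 0.2], "Berlin": [0.1, 0.9]}

def cos(u, v):
    num = sum(a * b for a, b in zip(u, v))
    den = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return num / den

def answer(head, rel, tail):
    # "Slow" path: exact, sound lookup over the knowledge graph.
    if (head, rel, tail) in kg_edges:
        return True, "kg"
    # "Fast" path: approximate guess by substituting the most similar
    # known entity for the head and re-checking the graph.
    nearest = max((h for h in vectors if h != head),
                  key=lambda h: cos(vectors[head], vectors[h]))
    return (nearest, rel, tail) in kg_edges, "vector"
```

Note the trade-off the snippet alludes to: the vector path can overgeneralize (here it would treat "Lyon" like its neighbor "Paris"), which is exactly the fast-but-approximate behavior being analogized.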
Discrete and Continuous Representations and Processing in Deep Learning: Looking Forward
2021
AI Open
Information is compressed into dense, distributed embeddings. In stark contrast, humans use discrete symbols in their communication with language. ...
We suggest and discuss several avenues that could improve current neural networks with the inclusion of discrete elements to combine the advantages of both types of representations. ...
These embedding vectors, much like in NLP, represent visual concepts in a semantic space. ...
doi:10.1016/j.aiopen.2021.07.002
fatcat:d4kl3scabzamdl5yndhahpgyhe
Neural-Symbolic Computing: An Effective Methodology for Principled Integration of Machine Learning and Reasoning
[article]
2019
arXiv
pre-print
We illustrate the effectiveness of the approach by outlining the main characteristics of the methodology: principled integration of neural learning with symbolic knowledge representation and reasoning ...
Current advances in Artificial Intelligence and machine learning in general, and deep learning in particular have reached unprecedented impact not only across research communities, but also over popular ...
In early work, embedding techniques were proposed to transform symbolic representations into vector spaces where reasoning can be done through matrix computation [4, 47, 48, 42, 41, 6, 14, 57, 13, 39] ...
arXiv:1905.06088v1
fatcat:gm4f3ncukrbevpd7nq5yr75ar4
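The early embedding line of work mentioned above — reasoning "through matrix computation" — has a very simple core intuition: encode each relation as an adjacency matrix over entities, so that relation composition becomes a matrix product. A toy illustration with invented data:

```python
# Entities a, b, c indexed 0..2; relations as boolean adjacency matrices.
entities = ["a", "b", "c"]

def matmul_bool(A, B):
    # Boolean matrix product: (A∘B)[i][j] = OR_k (A[i][k] AND B[k][j]).
    n = len(A)
    return [[any(A[i][k] and B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# parent(a, b) and parent(b, c):
parent = [[0, 1, 0],
          [0, 0, 1],
          [0, 0, 0]]

# grandparent = parent composed with parent, derived purely by matrix math.
grandparent = matmul_bool(parent, parent)
```

Replacing the boolean matrices with dense, low-rank real-valued ones gives the learned, approximate variants surveyed in the cited works.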
Reasoning over RDF Knowledge Bases using Deep Learning
[article]
2018
arXiv
pre-print
... the drawing of logical inferences from knowledge expressed in such standards is traditionally based on logical deductive methods and algorithms which can be proven to be sound, complete, and terminating ...
In this paper, we show that it is possible to train a Deep Learning system on RDF knowledge graphs, such that it is able to perform reasoning over new RDF knowledge graphs, with high precision and recall ...
similar vectors in the real coordinate space. ...
arXiv:1811.04132v1
fatcat:e42txhb2qvdsvj4bzuegxaxzvu
Neural, Symbolic and Neural-Symbolic Reasoning on Knowledge Graphs
[article]
2021
arXiv
pre-print
Considering the advantages and disadvantages of both methodologies, recent efforts have been made on combining the two reasoning methods. ...
Since knowledge graphs can be viewed as the discrete symbolic representations of knowledge, reasoning on knowledge graphs can naturally leverage the symbolic techniques. ...
Since it is expensive to manually define these templates, especially for open-domain QA systems over large KGs, later work studies how to generate templates automatically. ...
arXiv:2010.05446v5
fatcat:tc6fowebkzbv7df3cjyhkcu6uq
Showing results 1 — 15 out of 9,133 results