Learning and analyzing vector encoding of symbolic representations
[article] · 2018 · arXiv pre-print
A sequence-to-sequence network processing this language learns to encode symbol structures and query them. ...
The learned representation (approximately) shares a simple linearity property with theoretical techniques for performing this task. ...
Conclusion: A standard bidirectional encoder-decoder model can generate vector embeddings of expressions denoting complex symbol structures and can successfully query the content of such representations ...
arXiv:1803.03834v1
fatcat:pzhvfjsgyzg3lh3uzvrpdsph74
Symbolic, Distributed and Distributional Representations for Natural Language Processing in the Era of Deep Learning: a Survey
[article] · 2019 · arXiv pre-print
A clearer understanding of the strict link between distributed/distributional representations and symbols may certainly lead to radically new deep learning networks. ...
Recent advances in machine learning (ML) and in natural language processing (NLP) seem to contradict the above intuition: discrete symbols are fading away, erased by vectors or tensors called distributed ...
When combining natural language processing and machine learning, this is a major issue: transforming symbols, sequences of symbols, or symbolic structures into vectors or tensors that can be used in learning ...
arXiv:1702.00764v2
fatcat:xmpga3xn6nhzdos7pcc25z5pde
Symbolic, Distributed, and Distributional Representations for Natural Language Processing in the Era of Deep Learning: A Survey
2020 · Frontiers in Robotics and AI
A clearer understanding of the strict link between distributed/distributional representations and symbols may certainly lead to radically new deep learning networks. ...
Recent advances in machine learning (ML) and in natural language processing (NLP) seem to contradict the above intuition: discrete symbols are fading away, erased by vectors or tensors called distributed ...
When combining natural language processing and machine learning, this is a major issue: transforming symbols, sequences of symbols, or symbolic structures into vectors or tensors that can be used in learning ...
doi:10.3389/frobt.2019.00153
pmid:33501168
pmcid:PMC7805717
fatcat:353mgx2tr5ftxcx2utx776isou
Discovering the Compositional Structure of Vector Representations with Role Learning Networks
[article] · 2020 · arXiv pre-print
This method uncovers a symbolic structure which, when properly embedded in vector space, closely approximates the encodings of a standard seq2seq network trained to perform the compositional SCAN task. ...
We verify the causal importance of the discovered symbolic structure by showing that, when we systematically manipulate hidden embeddings based on this symbolic structure, the model's output is changed ...
Could it be that NNs do learn symbolic representations, covertly embedded as vectors in their state spaces? ...
arXiv:1910.09113v3
fatcat:mf3iqtffbrcejnzh4smt7slhfy
Wave2Vec: Vectorizing Electroencephalography Bio-Signal for Prediction of Brain Disease
2018 · International Journal of Environmental Research and Public Health
... the proposed model vectorizes the symbols by learning the sequence using deep-learning-based natural language processing. ...
The models of each class can be constructed through learning from the vectorized wavelet patterns and training data. ...
doi:10.3390/ijerph15081750
pmid:30111710
fatcat:7jffzzt2rnapvo6xdqkkwdsqw4
Analyzing the Capacity of Distributed Vector Representations to Encode Spatial Information
2020 · 2020 International Joint Conference on Neural Networks (IJCNN)
Vector Symbolic Architectures belong to a family of related cognitive modeling approaches that encode symbols and structures in high-dimensional vectors. ...
... vector representations is limited and one way of modeling the numerical restrictions to cognition. ...
Such vectors are one variant of distributed representations in the sense that information is captured over all dimensions of the vector instead of in one single number, which allows the encoding of both symbol-like ...
doi:10.1109/ijcnn48605.2020.9207137
dblp:conf/ijcnn/MirusSC20a
fatcat:uzbvkiozqbfvjojsdstvvzrvee
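The binding-and-superposition scheme that Vector Symbolic Architectures use to encode structure in high-dimensional vectors can be sketched with circular convolution (the HRR flavor of VSA). A minimal NumPy sketch; the dimensionality, role names, and similarity check are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def bind(a, b):
    # Circular convolution: the classic HRR/VSA binding operator.
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def unbind(c, a):
    # Circular correlation approximately inverts binding for random vectors.
    return np.real(np.fft.ifft(np.fft.fft(c) * np.conj(np.fft.fft(a))))

rng = np.random.default_rng(0)
d = 1024

def vec():
    v = rng.standard_normal(d)
    return v / np.linalg.norm(v)

role_x, role_y = vec(), vec()   # hypothetical spatial roles
pos_a, pos_b = vec(), vec()     # fillers bound to those roles

# One high-dimensional vector holds both role-filler pairs at once.
scene = bind(role_x, pos_a) + bind(role_y, pos_b)

# Query the structure: which filler occupies role_x?
probe = unbind(scene, role_x)
best = max([("pos_a", pos_a), ("pos_b", pos_b)], key=lambda kv: probe @ kv[1])[0]
```

Retrieval is noisy rather than exact, which is exactly why the paper's question about the capacity of such vectors arises: each additional superposed pair adds crosstalk to the probe.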
Signal-domain representation of symbolic music for learning embedding spaces
[article] · 2021 · arXiv pre-print
This improvement is reflected in the metric properties and in the generation ability of the space learned from our signal-like representation according to music theory properties. ...
In this paper, we introduce a novel representation of symbolic music data, which transforms a polyphonic score into a continuous signal. ...
Symbolic music representations: The performance of machine learning techniques for symbolic generation is critically influenced by the properties of the input representation. ...
arXiv:2109.03454v1
fatcat:qmjt5lfj7jaz3hyq256shnesj4
Structural Inductive Biases in Emergent Communication
[article] · 2021 · arXiv pre-print
We investigate the impact of representation learning in artificial agents by developing graph referential games. ...
In order to communicate, humans flatten a complex representation of ideas and their attributes into a single word or a sentence. ...
The node features consist of a concatenation of the property encoding and the type encoding (represented as one-hot vectors). ...
arXiv:2002.01335v4
fatcat:qhsg5vssnvcvfmx46xlcb5iale
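The node-feature construction described in the snippet (a property encoding concatenated with a type encoding, both one-hot) can be sketched as follows; the vocabulary sizes and index choices are made-up placeholders:

```python
import numpy as np

N_PROPERTIES, N_TYPES = 5, 3   # hypothetical vocabulary sizes

def one_hot(index, size):
    v = np.zeros(size)
    v[index] = 1.0
    return v

def node_features(prop_idx, type_idx):
    # Concatenate the property encoding with the type encoding.
    return np.concatenate([one_hot(prop_idx, N_PROPERTIES),
                           one_hot(type_idx, N_TYPES)])

f = node_features(prop_idx=2, type_idx=1)   # length N_PROPERTIES + N_TYPES
```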
Signal-domain representation of symbolic music for learning embedding spaces
2020 · Zenodo
This improvement is reflected in the metric properties and in the generation ability of the space learned from our signal-like representation according to music theory properties. ...
In this paper, we introduce a novel representation of symbolic music data, which transforms a polyphonic score into a continuous signal. ...
Symbolic music representations: The performance of machine learning techniques for symbolic generation is critically influenced by the properties of the input representation. ...
doi:10.5281/zenodo.4285386
fatcat:fi7fn7obvvc3hipuuoxdxmgcyi
Mapping Natural-language Problems to Formal-language Solutions Using Structured Neural Representations
[article] · 2020 · arXiv pre-print
The encoder of TP-N2F employs TPR 'binding' to encode natural-language symbolic structure in vector space and the decoder uses TPR 'unbinding' to generate, in symbolic space, a sequential program represented ...
Analysis of the learned structures shows how TPRs enhance the interpretability of TP-N2F. ...
arXiv:1910.02339v3
fatcat:fgf2rwqzvvh2xmquxspscsh5li
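The TPR "binding" and "unbinding" operations that TP-N2F builds on can be illustrated with outer products. A minimal sketch under the simplifying assumption of orthonormal role vectors (learned roles are only approximately orthonormal, so unbinding there is approximate); the sizes are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
n_roles, d_fill = 4, 6

# Orthonormal roles make unbinding exact (rows of a square orthogonal matrix).
roles = np.linalg.qr(rng.standard_normal((n_roles, n_roles)))[0]
fillers = rng.standard_normal((n_roles, d_fill))

# Binding: the sum of outer products filler_i (x) role_i packs the whole
# structure into one tensor.
T = sum(np.outer(fillers[i], roles[i]) for i in range(n_roles))

# Unbinding: contracting the tensor with a role vector retrieves its filler.
recovered = T @ roles[2]
```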
Compositional Processing Emerges in Neural Networks Solving Math Problems
[article] · 2021 · arXiv pre-print
... auditory speech), and use this knowledge to guide the composition of simpler meanings into complex wholes. ...
A longstanding question in cognitive science concerns the learning mechanisms underlying compositionality in human cognition. ...
arXiv:2105.08961v1
fatcat:aamueaexqbaddmdby6mrmb7jha
IR2Vec: LLVM IR based Scalable Program Embeddings
[article] · 2020 · arXiv pre-print
Symbolic encodings are obtained from the seed embedding vocabulary, and Flow-Aware encodings are obtained by augmenting the Symbolic encodings with the flow information. ...
The entities of the IR are modeled as relationships, and their representations are learned to form a seed embedding vocabulary. ...
arXiv:1909.06228v3
fatcat:nmrwcya6ejfp7cj23kbtbltl5y
Large Patterns Make Great Symbols: An Example of Learning from Example
[chapter] · 2000 · Lecture Notes in Computer Science
We look at a distributed representation of structure with variable binding that is natural for neural nets and allows traditional symbolic representation and processing. ...
This is demonstrated by taking several instances of the mother-of relation implying the parent-of relation, by encoding them into a mapping vector, and by showing that the mapping vector maps new instances ...
That the averaging of vectors for structured entities should be meaningful, is a consequence of the representation used and has no counterpart in traditional symbolic representation. ...
doi:10.1007/10719871_13
fatcat:u7fiuqj5bzgtbmkwmqbd3umfgm
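The mother-of → parent-of demonstration in the snippet can be sketched with holographic reduced representations: encode each known (mother, parent) pair, average their unbindings into a single mapping vector, and apply it to an unseen instance. Everything below (the dimensionality, the flat role-based relation encoding, the similarity check) is an illustrative assumption, not the paper's exact construction:

```python
import numpy as np

rng = np.random.default_rng(2)
d = 2048

def vec():
    v = rng.standard_normal(d)
    return v / np.linalg.norm(v)

def bind(a, b):
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def unbind(c, a):
    return np.real(np.fft.ifft(np.fft.fft(c) * np.conj(np.fft.fft(a))))

def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

rel_mother, rel_parent, agent, obj = vec(), vec(), vec(), vec()

def mother_of(x, y):
    return rel_mother + bind(agent, x) + bind(obj, y)

def parent_of(x, y):
    return rel_parent + bind(agent, x) + bind(obj, y)

# Encode several instances of the implication, then average them into a
# single mapping vector -- the averaging step the snippet highlights.
people = [(vec(), vec()) for _ in range(5)]
mapping = np.mean([unbind(parent_of(x, y), mother_of(x, y))
                   for x, y in people], axis=0)

# Apply the mapping to an unseen instance of the mother-of relation.
anna, bill = vec(), vec()
mapped = bind(mapping, mother_of(anna, bill))
```

The mapped vector is only approximately `parent_of(anna, bill)`, but it is far closer to it than to a parent-of encoding with different arguments, which is the generalization the chapter demonstrates.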
Towards Graph Representation Learning in Emergent Communication
[article] · 2020 · arXiv pre-print
In order to communicate, we flatten the complex representation of entities and their attributes into a single word or a sentence. ...
We show that the emerged communication protocol is robust, that the agents uncover the true factors of variation in the game, and that they learn to generalize beyond the samples encountered during training ...
The node features consist of a concatenation of the property encoding and the type encoding (represented as one-hot vectors). ...
arXiv:2001.09063v2
fatcat:nxalwl5eefblrflps5uv47lmoi
Correlating neural and symbolic representations of language
[article] · 2019 · arXiv pre-print
Analysis methods which enable us to better understand the representations and functioning of neural models of language are increasingly needed as deep learning becomes the dominant approach in NLP. ...
Here we present two methods based on Representational Similarity Analysis (RSA) and Tree Kernels (TK) which allow us to directly quantify how strongly the information encoded in neural activation patterns ...
A vector representation for a given structured symbolic input is built based on kernel evaluations between the input and a subset of training examples known as landmarks, and the network decision is then ...
arXiv:1905.06401v2
fatcat:vnexem4cbnd5xa4hyape6kvabq
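Representational Similarity Analysis, one of the two methods named in the snippet, reduces to correlating the pairwise-similarity structures of two sets of representations. A minimal sketch; the random data, cosine similarity, and Pearson correlation are illustrative choices, not the paper's exact setup:

```python
import numpy as np

def similarity_matrix(X):
    # Cosine similarities between all item pairs.
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    return Xn @ Xn.T

def rsa_score(A, B):
    # Correlate the off-diagonal similarity structure of two representations.
    iu = np.triu_indices(A.shape[0], k=1)
    return np.corrcoef(similarity_matrix(A)[iu], similarity_matrix(B)[iu])[0, 1]

rng = np.random.default_rng(3)
neural = rng.standard_normal((20, 64))                 # stand-in activations
re_embedded = neural @ rng.standard_normal((64, 32))   # similarity-preserving projection
unrelated = rng.standard_normal((20, 32))
```

Because the score compares similarity structure rather than raw coordinates, `re_embedded` scores high against `neural` even though the two live in spaces of different dimensionality, while `unrelated` does not.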