A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2021; you can also visit the original URL.
The file type is application/pdf.
Unsupervised Learning of Neurosymbolic Encoders
[article]
2021
arXiv
pre-print
We present a framework for the unsupervised learning of neurosymbolic encoders, i.e., encoders obtained by composing neural networks with symbolic programs from a domain-specific language. ...
Such a framework can naturally incorporate symbolic expert knowledge into the learning process and lead to more interpretable and factorized latent representations than fully neural encoders. ...
... {q_φ, q_(α,ψ)}. Algorithm 2 (learning a neurosymbolic encoder with k programs): Input: program space P, program graph G, k. For i = 1..k: fix the previously learned programs {q_(α_1,ψ_1), ..., q_(α_{i−1},ψ_{i−1})}, ... (a sketch of this iterative loop follows below).
arXiv:2107.13132v1
fatcat:hkxaiu2i3nev3j2f2pc67ckbda
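The Algorithm 2 snippet above suggests an iterative scheme: learn k symbolic programs one at a time, holding the previously chosen programs fixed, and pair them with a neural encoder q_φ. Below is a minimal, hypothetical sketch of that loop; the toy DSL, the variance-based score, and the stand-in for the neural encoder are my own assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the iterative loop from the Algorithm 2 snippet:
# choose k symbolic programs one at a time, keeping earlier choices fixed.
from statistics import pvariance

# Toy "domain-specific language": each program maps a data point to a number.
PROGRAM_SPACE = [
    ("identity", lambda x: x),
    ("square", lambda x: x * x),
    ("negate", lambda x: -x),
]

def score(program, fixed_programs, data):
    """Toy score: output variance, excluding programs already chosen."""
    name, fn = program
    if any(name == chosen for chosen, _ in fixed_programs):
        return float("-inf")
    return pvariance([fn(x) for x in data])

def learn_neurosymbolic_encoder(program_space, data, k):
    programs = []  # the symbolic programs q_(alpha_i, psi_i), i = 1..k
    for _ in range(k):
        best = max(program_space, key=lambda p: score(p, programs, data))
        programs.append(best)  # fix this program for the remaining iterations
    q_phi = {"trained_on": len(data)}  # placeholder for the neural encoder q_phi
    return q_phi, programs

q_phi, programs = learn_neurosymbolic_encoder(PROGRAM_SPACE, [1.0, 2.0, 3.0], k=2)
print([name for name, _ in programs])
```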
Neuro-Symbolic VQA: A review from the perspective of AGI desiderata
[article]
2021
arXiv
pre-print
It is my hope that through this work we can temper model evaluation on benchmarks with a discussion of the properties of these systems and their potential for future extension. ...
An ultimate goal of the AI and ML fields is artificial general intelligence (AGI); although such systems remain science fiction, various models exhibit aspects of AGI. ...
done by a 2-layer MLP given the encoded question (from the encoder-decoder reasoning instruction parser), and a vector representation of the graph. ...
arXiv:2104.06365v1
fatcat:y35ddxgaujgttbeiemcanvk2fy
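The snippet above mentions that answer prediction is done by a 2-layer MLP over the encoded question and a vector representation of the graph. Below is a hedged sketch of such an answer head in PyTorch; all dimensions and names are illustrative assumptions, not taken from the reviewed system.

```python
# Illustrative 2-layer MLP answer head over a question encoding and a graph vector.
import torch
import torch.nn as nn

class AnswerHead(nn.Module):
    def __init__(self, question_dim=256, graph_dim=128, hidden_dim=512, num_answers=28):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(question_dim + graph_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_answers),  # logits over the answer vocabulary
        )

    def forward(self, question_encoding, graph_vector):
        # Concatenate the two representations and score each candidate answer.
        return self.mlp(torch.cat([question_encoding, graph_vector], dim=-1))

# Usage with random tensors standing in for real encodings.
head = AnswerHead()
logits = head(torch.randn(4, 256), torch.randn(4, 128))
print(logits.shape)  # torch.Size([4, 28])
```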
Program Synthesis and Semantic Parsing with Learned Code Idioms
[article]
2019
arXiv
pre-print
It accomplishes this by automatically mining common code idioms from a given corpus, incorporating them into the underlying language for neural synthesis, and training a tree-based neural synthesizer to ...
We evaluate PATOIS on two complex semantic parsing datasets and show that using learned code idioms improves the synthesizer's accuracy. ...
The workflow of EC² is similar to PATOIS, with three stages: (a) learn new DSL subroutines from a corpus of tasks, (b) train a recognition model that maps a task specification to a distribution over DSL ...
arXiv:1906.10816v4
fatcat:frgb473ctjgz7pklc5ut5jfvpq
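The PATOIS snippet above describes mining common code idioms from a corpus and folding them into the synthesis language. The toy miner below just counts frequent token n-grams as a stand-in; the actual system mines idioms from syntax trees with a much richer procedure, so this is illustrative only.

```python
# Toy stand-in for idiom mining: surface frequently recurring token n-grams.
from collections import Counter

def mine_idioms(corpus, n=3, min_count=2):
    """Return token n-grams that occur at least `min_count` times in the corpus."""
    counts = Counter()
    for snippet in corpus:
        tokens = snippet.split()
        for i in range(len(tokens) - n + 1):
            counts[tuple(tokens[i:i + n])] += 1
    return [gram for gram, count in counts.items() if count >= min_count]

corpus = [
    "for x in range ( n ) :",
    "for i in range ( len ( xs ) ) :",
    "if x not in seen :",
]
print(mine_idioms(corpus, n=3))  # e.g. [('in', 'range', '(')]
```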
Learning Transferable Graph Exploration
[article]
2019
arXiv
pre-print
We formulate this task as a reinforcement learning problem where the 'exploration' agent is rewarded for transitioning to previously unseen environment states and employ a graph-structured memory to encode ...
We propose a 'learning to explore' framework where we learn a policy from a distribution of environments. ...
Acknowledgments We would like to thank Hengxiang Hu, Shu-Wei Cheng and other members in the team for providing data and engineering suggestions. ...
arXiv:1910.12980v1
fatcat:4qruvw3szjfvdo2v7xmb3jv5oq
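The snippet above frames exploration as reinforcement learning in which the agent is rewarded for reaching previously unseen states. The sketch below implements only that novelty bonus on a hypothetical toy environment; the graph-structured memory and the learned policy from the paper are not modeled here.

```python
# Minimal novelty-reward loop: reward 1 only when a new state is visited.
import random

class ToyGraphEnv:
    """Hypothetical stand-in environment: states form a small ring graph."""
    def __init__(self, n_states=8):
        self.n_states = n_states
        self.state = 0

    def reset(self):
        self.state = 0
        return self.state

    def step(self, action):  # action is -1 or +1, moving around the ring
        self.state = (self.state + action) % self.n_states
        return self.state

def random_policy(state):
    return random.choice([-1, 1])

def explore_episode(env, policy, max_steps=20):
    state = env.reset()
    visited = {state}
    total_reward = 0.0
    for _ in range(max_steps):
        state = env.step(policy(state))
        total_reward += 1.0 if state not in visited else 0.0  # novelty bonus
        visited.add(state)
    return total_reward

print(explore_episode(ToyGraphEnv(), random_policy))
```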
Neuro-Symbolic Program Synthesis
[article]
2016
arXiv
pre-print
While achieving impressive results, these approaches have a number of important limitations: (a) they are computationally expensive and hard to train, (b) a model has to be trained for each task (program ...
Given a set of input-output examples, these architectures are able to learn mappings that generalize to new test inputs. ...
Given a DSL L, we learn a generative model of programs in the DSL L that is conditioned on input-output examples to efficiently search for consistent programs. ...
arXiv:1611.01855v1
fatcat:osn7netvtnehpcl7l4n2ryl6o4
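The snippet above poses the core problem: given input-output examples, find DSL programs whose behavior is consistent with all of them, with a learned generative model used to make that search efficient. The sketch below shows only the unguided baseline, exhaustive enumeration over a toy string-transformation DSL of my own choosing.

```python
# Unguided baseline: enumerate compositions of DSL primitives and keep the
# first program consistent with every input-output example.
from itertools import product

PRIMITIVES = {
    "lower": str.lower,
    "upper": str.upper,
    "strip": str.strip,
    "reverse": lambda s: s[::-1],
}

def consistent(program, examples):
    for inp, out in examples:
        value = inp
        for name in program:
            value = PRIMITIVES[name](value)
        if value != out:
            return False
    return True

def synthesize(examples, max_depth=2):
    for depth in range(1, max_depth + 1):
        for program in product(PRIMITIVES, repeat=depth):
            if consistent(program, examples):
                return program
    return None

print(synthesize([(" Hello ", "hello"), (" AbC ", "abc")]))  # e.g. ('lower', 'strip')
```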
Graph-based software knowledge: Storage and semantic querying of domain models for run-time adaptation
2016
2016 IEEE International Conference on Simulation, Modeling, and Programming for Autonomous Robots (SIMPAR)
However, these models are merely seen as a way to support humans during the robot's software design process. ...
Software development for robots is a knowledge-intensive exercise. ...
Furthermore, the authors gratefully acknowledge the on-going support of the Bonn-Aachen International Center for Information Technology. ...
doi:10.1109/simpar.2016.7862379
dblp:conf/simpar/Hochgeschwender16
fatcat:ayclhdhcufbndmyigqrdkw7lba
A domain-specific approach to heterogeneous parallelism
2011
Proceedings of the 16th ACM symposium on Principles and practice of parallel programming - PPoPP '11
To demonstrate the potential of this approach we present OptiML, a DSL for machine learning. ...
To address this concern, we introduce Delite, a system designed specifically for DSLs that is both a framework for creating an implicitly parallel DSL as well as a dynamic runtime providing automated targeting ...
Delite: A Framework for Parallelizing DSLs Constructing a new DSL is a difficult process in general. ...
doi:10.1145/1941553.1941561
dblp:conf/ppopp/ChafiSBLAO11
fatcat:2dh6myssffbvnnz6wc33elr7tm
A domain-specific approach to heterogeneous parallelism
2011
SIGPLAN notices
To demonstrate the potential of this approach we present OptiML, a DSL for machine learning. ...
To address this concern, we introduce Delite, a system designed specifically for DSLs that is both a framework for creating an implicitly parallel DSL as well as a dynamic runtime providing automated targeting ...
Delite: A Framework for Parallelizing DSLs Constructing a new DSL is a difficult process in general. ...
doi:10.1145/2038037.1941561
fatcat:gubjcwiv5radja76t5glnoudji
Learning Libraries of Subroutines for Neurally-Guided Bayesian Program Induction
2018
Neural Information Processing Systems
We contribute a program induction algorithm called EC² that learns a DSL while jointly training a neural network to efficiently search for programs in the learned DSL. ...
We use our model to synthesize functions on lists, edit text, and solve symbolic regression problems, showing how the model learns a domain-specific library of program components for expressing solutions ...
Acknowledgments We are grateful for collaborations with Eyal Dechter, whose EC algorithm directly inspired this work, and for funding from the NSF GRFP, AFOSR award FA9550-16-1-0012, the MIT-IBM Watson ...
dblp:conf/nips/EllisMSST18
fatcat:uys4e7twyzakbmp3qizfxdggcu
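The abstract above describes EC², which grows a DSL (a library of subroutines) while training a neural search model. The toy sketch below illustrates only the library-growth step, promoting a frequently recurring program fragment to a new subroutine; the joint neural training is omitted and all names are illustrative assumptions.

```python
# Toy library growth: promote the most common recurring fragment of solved
# programs (programs represented as tuples of operation names) to a subroutine.
from collections import Counter

def fragments(program, min_len=2):
    """Yield all contiguous sub-sequences of a program of length >= min_len."""
    for i in range(len(program)):
        for j in range(i + min_len, len(program) + 1):
            yield tuple(program[i:j])

def propose_subroutine(solved_programs):
    counts = Counter(f for p in solved_programs for f in fragments(p))
    fragment, count = counts.most_common(1)[0]
    return fragment if count > 1 else None

solved = [
    ("map", "add1", "filter", "even"),
    ("filter", "even", "sum"),
    ("map", "square", "filter", "even"),
]
print(propose_subroutine(solved))  # e.g. ('filter', 'even')
```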
A survey of grammatical inference in software engineering
2014
Science of Computer Programming
Unlike these other fields, which use grammars as a convenient tool to model naturally occurring patterns, software engineering treats grammars as first-class objects typically created and maintained for ...
Grammatical inference, used successfully in a variety of fields such as pattern recognition, computational biology, and natural language processing, is the process of automatically inferring a grammar by ...
This process spans two axes of complexity: the language class to be learned and the learning model employed. ...
doi:10.1016/j.scico.2014.05.008
fatcat:xwasotc745ekbhaoq2n43vrufm
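The survey abstract above defines grammatical inference as automatically inferring a grammar from data. As a toy illustration, the sketch below builds a prefix tree acceptor from positive example strings, the usual starting point before state-merging algorithms; it is not any specific method from the survey.

```python
# Build a prefix tree acceptor (a simple deterministic automaton) from
# positive example strings.
def prefix_tree_acceptor(examples):
    """States are integers; transitions[state][symbol] -> next state."""
    transitions, accepting, next_state = {0: {}}, set(), 1
    for word in examples:
        state = 0
        for symbol in word:
            if symbol not in transitions[state]:
                transitions[state][symbol] = next_state
                transitions[next_state] = {}
                next_state += 1
            state = transitions[state][symbol]
        accepting.add(state)  # the state reached by a full example accepts
    return transitions, accepting

transitions, accepting = prefix_tree_acceptor(["ab", "abb", "ba"])
print(transitions, accepting)
```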
A Flexible Approach to Automated RNN Architecture Generation
[article]
2017
arXiv
pre-print
Using two different candidate generation techniques, random search with a ranking function and reinforcement learning, we explore the novel architectures produced by the RNN DSL for language modeling and ...
We propose a domain-specific language (DSL) for use in automated architecture search which can produce novel RNNs of arbitrary depth and width. ...
For language modeling, we explore the core DSL using randomly constructed architectures (random search) directed by a learned ranking function. ...
arXiv:1712.07316v1
fatcat:3yz2abdvebgqjgc3r5oraucrbi
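The snippet above describes sampling candidate RNN cells from a DSL and directing random search with a learned ranking function. The sketch below samples toy expression trees and ranks them with a placeholder scorer; the operator set and the random ranker are my assumptions, not the paper's DSL or learned ranking function.

```python
# Random search over a tiny expression DSL for RNN-cell-like formulas.
import random

UNARY = ["tanh", "sigmoid"]
BINARY = ["add", "mul"]
LEAVES = ["x_t", "h_prev"]

def random_expression(depth=3):
    """Sample a nested tuple representing an expression over x_t and h_prev."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(LEAVES)
    if random.random() < 0.5:
        return (random.choice(UNARY), random_expression(depth - 1))
    return (random.choice(BINARY),
            random_expression(depth - 1),
            random_expression(depth - 1))

def rank(architecture):
    """Placeholder for the learned ranking function (here: random score)."""
    return random.random()

candidates = [random_expression() for _ in range(100)]
print(max(candidates, key=rank))
```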
Implementing Domain-Specific Languages for Heterogeneous Parallel Computing
2011
IEEE Micro
To provide productivity, a parallel programming model must raise the level of abstraction above that of current low-level programming models such as Pthreads and CUDA. ...
Ideally, such a programming model would also be general, allowing the application developer to express arbitrary semantics. ...
Acknowledgments We thank the anonymous reviewers for their valuable feedback. ...
doi:10.1109/mm.2011.68
fatcat:a77o6qub7fhyrlqccdiyqkbj3y
Delite
2014
ACM Transactions on Embedded Computing Systems
We present Delite DSLs for machine learning, data querying, graph analysis, and scientific computing and show that they all achieve performance competitive to or exceeding C++ code. ...
Developing high-performance software is a difficult task that requires the use of low-level, architecture-specific programming models (e.g., OpenMP for CMPs, CUDA for GPUs, MPI for clusters). ...
ACKNOWLEDGMENTS We are grateful to the anonymous reviewers for their comments and suggestions. ...
doi:10.1145/2584665
fatcat:pyjyrojtdnge3ign6e3eaxhxpq
Learning Differentiable Programs with Admissible Neural Heuristics
[article]
2021
arXiv
pre-print
We frame this optimization problem as a search in a weighted graph whose paths encode top-down derivations of program syntax. ...
Such programmatic models can offer benefits such as composability and interpretability; however, learning them requires optimizing over a combinatorial space of program "architectures". ...
We frame our problem as searching a graph, in which nodes encode program architectures with missing expressions, and paths encode top-down program derivations. ...
arXiv:2007.12101v5
fatcat:u2f6zkoclfbexbtbicoczaqfpq
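The snippet above casts program learning as search over a graph whose nodes are partial architectures with missing expressions and whose paths encode top-down derivations, guided by an admissible heuristic. The sketch below runs an A*-style search over a toy grammar, with a trivially admissible zero heuristic standing in for the neural heuristic and made-up rule costs.

```python
# A*-style search over partial programs: each node is a tuple of symbols,
# "HOLE" marks a missing expression, and edges expand one hole via a rule.
import heapq

RULES = {"HOLE": [("if_then_else", ["HOLE", "HOLE"], 2.0),
                  ("affine_map", [], 1.0),
                  ("fold", ["HOLE"], 1.5)]}

def expand(program):
    """Replace the first hole with each possible rule, yielding (child, cost)."""
    for i, symbol in enumerate(program):
        if symbol == "HOLE":
            for op, children, cost in RULES["HOLE"]:
                yield program[:i] + (op,) + tuple(children) + program[i + 1:], cost
            return

def astar(max_expansions=1000):
    frontier = [(0.0, 0.0, ("HOLE",))]  # (priority, path_cost, partial program)
    for _ in range(max_expansions):
        _, g, program = heapq.heappop(frontier)
        if "HOLE" not in program:
            return program, g  # complete architecture found
        for child, step_cost in expand(program):
            heuristic = 0.0  # placeholder admissible estimate of remaining cost
            heapq.heappush(frontier, (g + step_cost + heuristic, g + step_cost, child))
    return None, float("inf")

print(astar())
```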
Representing Data Visualization Goals and Tasks through Meta-Modeling to Tailor Information Dashboards
2020
Applied Sciences
Choosing the wrong design could compromise the entire dashboard's effectiveness, so selecting the appropriate encoding or configuration for each potential context, user, or data domain is a crucial task. ...
This work presents a dashboard meta-model that abstracts all these factors and the integration of a visualization task taxonomy to account for the different actions that can be performed with information ...
Acknowledgments: The authors would like to thank the InterAction and eLearning Research Group (GRIAL) for its support to conduct the present research https://grial.usal.es. ...
doi:10.3390/app10072306
fatcat:fudnlytqqvai3pi7xgqfmxe2mm
Showing results 1 — 15 out of 1,997 results