Planning in Factored State and Action Spaces with Learned Binarized Neural Network Transition Models
2018
Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence
In this paper, we leverage the efficiency of Binarized Neural Networks (BNNs) to learn complex state transition models of planning domains with discretized factored state and action spaces. ...
Experimentally, we show the effectiveness of learning complex transition models with BNNs, and test the runtime efficiency of both encodings on the learned factored planning problem. ...
Binarized Neural Networks (BNNs) are neural networks with binary weights and activation functions [Hubara et al., 2016]. ...
doi:10.24963/ijcai.2018/669
dblp:conf/ijcai/SayS18
fatcat:zosgoamtlzbnlamxqyp2apxfpu
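The snippet above defines BNNs as networks with binary weights and activations. As an illustration only (not the paper's encoding; the sign convention and layer shapes are assumptions), one fully connected binarized layer can be sketched as:

```python
# Minimal sketch of one binarized layer: weights and activations are
# constrained to {-1, +1}, so the matrix product reduces to counting
# sign agreements (implementable with XNOR + popcount in hardware).

def binarize(values):
    """Map real values to {-1, +1} via the sign function (0 maps to +1)."""
    return [1 if v >= 0 else -1 for v in values]

def bnn_layer(inputs, weight_rows):
    """One fully connected binarized layer with sign activation."""
    x = binarize(inputs)
    pre_activations = [sum(xi * wi for xi, wi in zip(x, binarize(row)))
                       for row in weight_rows]
    return binarize(pre_activations)

# Example: 3 inputs, 2 output units.
out = bnn_layer([0.5, -1.2, 0.3], [[0.9, -0.4, 0.1], [-0.2, -0.7, 0.8]])
```

Because every intermediate value is in {-1, +1}, such layers admit the compact logical encodings the abstract refers to.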
Compact and Efficient Encodings for Planning in Factored State and Action Spaces with Learned Binarized Neural Network Transition Models
[article]
2020
arXiv
pre-print
In this paper, we leverage the efficiency of Binarized Neural Networks (BNNs) to learn complex state transition models of planning domains with discretized factored state and action spaces. ...
transition models with BNNs. ...
As an alternative to ReLU-based DNNs, Binarized Neural Networks (BNNs) [8] have been introduced with the specific ability to learn compact models over discrete variables, providing a new formalism for ...
arXiv:1811.10433v12
fatcat:wa4httvsh5brbejmoxmbw77jfq
Asynchronous network of cellular automaton-based neurons for efficient implementation of Boltzmann machines
2018
Nonlinear Theory and Its Applications IEICE
Artificial neural networks with stochastic state transitions and calculations, such as Boltzmann machines, have excelled over other machine learning approaches in various benchmark tasks. ...
The networks often achieve better results than deterministic neural networks of similar sizes, but they require implementation of nonlinear continuous functions for probabilistic density functions, thus ...
Acknowledgments This study was partially supported by the KAKENHI (16K12487), Kayamori Foundation of Information Science Advancement, and The Nakajima Foundation. ...
doi:10.1587/nolta.9.24
fatcat:2l42p2kw25b6thujaqcw2wkcbe
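The abstract above notes that stochastic networks such as Boltzmann machines require nonlinear continuous functions for their transition probabilities. A sketch of the standard stochastic unit update (assumed textbook form, not the paper's cellular-automaton design):

```python
import math
import random

def sigmoid(x):
    """The nonlinear continuous function that is costly in hardware."""
    return 1.0 / (1.0 + math.exp(-x))

def update_unit(states, weights, bias, i, temperature, rng):
    """Gibbs update of binary unit i: it turns on with sigmoid
    probability of its temperature-scaled net input."""
    net = bias + sum(weights[i][j] * states[j]
                     for j in range(len(states)) if j != i)
    states[i] = 1 if rng.random() < sigmoid(net / temperature) else 0
    return states[i]

# Example: with a strong positive weight, unit 1 almost always follows
# unit 0, which is held at 1 here.
rng = random.Random(1)
states = [1, 0]
weights = [[0.0, 5.0], [5.0, 0.0]]
ons = sum(update_unit(states, weights, 0.0, 1, 1.0, rng)
          for _ in range(1000))
```

Approximating `sigmoid` with asynchronous discrete-state dynamics is exactly the kind of implementation burden the paper's cellular-automaton neurons aim to avoid.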
Application of Artificial Intelligence Nuclear Medicine Automated Images Based on Deep Learning in Tumor Diagnosis
2022
Journal of Healthcare Engineering
model based on boundary constraints, and proposes a superpixel boundary-aware convolution network to realize the automatic CT cutting algorithm. ...
The experimental results in this paper show that the improved algorithm in this paper is more robust than the traditional CT algorithm in terms of accuracy and sensitivity, an increase of about 12%, and ...
Deep learning is a branch of machine learning algorithms. The essence is a multilayer neural network structure. ...
doi:10.1155/2022/7247549
pmid:35140903
pmcid:PMC8820925
fatcat:cnus6hwmrbb7ph2ixho4syug3u
Integrating long-range regulatory interactions to predict gene expression using graph convolutional neural networks
[article]
2020
bioRxiv
pre-print
Here, we propose a graph convolutional neural network (GCNN) framework to integrate measurements probing spatial genomic organization and measurements of local regulatory factors, specifically histone ...
Long-range spatial interactions among genomic regions are critical for regulating gene expression and their disruption has been associated with a host of diseases. ...
Acknowledgments We are grateful to members of the COBRE-CBHD Computational Biology Core (CBC) at Brown University for helpful discussions and suggestions. ...
doi:10.1101/2020.11.23.394478
fatcat:wlcmq3temjfivid7kgodhbzxx4
A Survey on Hyperdimensional Computing aka Vector Symbolic Architectures, Part II: Applications, Cognitive Models, and Challenges
[article]
2022
arXiv
pre-print
Holographic Reduced Representations is an influential HDC/VSA model that is well-known in the machine learning domain and often used to refer to the whole family. ...
The survey is written to be useful for both newcomers and practitioners. ...
Interplay with neural networks
HVs as input to neural networks One of the most obvious ways to make an interplay between HDC/VSA and neural networks is by using HVs to represent input to neural networks ...
arXiv:2112.15424v2
fatcat:uteoq33hgna2fhs2o46rkde2iq
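The snippet above mentions feeding hypervectors (HVs) to neural networks. As a sketch (illustrative, not taken from the survey), the two core HDC/VSA operations on bipolar HVs, binding and bundling, produce fixed-width vectors suitable as network input:

```python
import random

DIM = 10_000  # hypervector dimensionality

def random_hv(rng):
    """Random bipolar hypervector in {-1, +1}^DIM."""
    return [rng.choice((-1, 1)) for _ in range(DIM)]

def bind(a, b):
    """Binding: elementwise product; result is dissimilar to both inputs
    and is its own inverse for bipolar vectors."""
    return [x * y for x, y in zip(a, b)]

def bundle(*hvs):
    """Bundling: elementwise majority vote (ties broken to +1); the
    result stays similar to each bundled input."""
    return [1 if sum(bits) >= 0 else -1 for bits in zip(*hvs)]

def similarity(a, b):
    """Normalized dot product in [-1, 1]."""
    return sum(x * y for x, y in zip(a, b)) / len(a)

rng = random.Random(0)
role, filler, other = (random_hv(rng) for _ in range(3))
record = bundle(bind(role, filler), other)
# Unbinding the record with the role recovers a noisy copy of the filler.
recovered = bind(record, role)
```

In the "HVs as input" setting the survey describes, a vector like `record` would simply be presented to an ordinary input layer.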
Deep Learning-Assisted Classification of Site-Resolved Quantum Gas Microscope Images
[article]
2019
arXiv
pre-print
We present a novel method for the analysis of quantum gas microscope images, which uses deep learning to improve the fidelity with which lattice sites can be classified as occupied or unoccupied. ...
We devise two feedforward neural network architectures which are both able to improve upon the fidelity of threshold-based methods, following training on large data sets of simulated images. ...
[16, 17, 18] and evaluating theoretical models of interactions of fermions in an optical lattice [19] . ...
arXiv:1904.08074v1
fatcat:hdznndhaqzcsho4vxpt6a476tm
Deep-learning-assisted classification of site-resolved quantum gas microscope images
2019
Measurement science and technology
We present a novel method for the analysis of quantum gas microscope images, which uses deep learning to improve the fidelity with which lattice sites can be classified as occupied or unoccupied. ...
We devise two neural network architectures which are both able to improve upon the fidelity of threshold-based methods, following training on large data sets of simulated images. ...
The neural networks presented were trained using the HPC infrastructure LEO of the University of Innsbruck. ...
doi:10.1088/1361-6501/ab44d8
fatcat:mdhjgkenbnblzdpwnm4rddu2cq
Verification for Machine Learning, Autonomy, and Neural Networks Survey
[article]
2018
arXiv
pre-print
Autonomy in CPS is enabled by recent advances in artificial intelligence (AI) and machine learning (ML) through approaches such as deep neural networks (DNNs), embedded in so-called learning-enabled components ...
Recently, the formal methods and formal verification community has developed methods to characterize behaviors in these LECs with eventual goals of formally verifying specifications for LECs, and this ...
The general neural network model is discussed along with recurrent and feed-forward neural networks. ...
arXiv:1810.01989v1
fatcat:a5ax66lsxbho3fuxuh55ypnm6m
Learning Self-Game-Play Agents for Combinatorial Optimization Problems
[article]
2019
arXiv
pre-print
The ZG also provides a specially designed neural MCTS. We use a combinatorial planning problem for which the ground-truth policy is efficiently computable to demonstrate that ZG is promising. ...
Recent progress in reinforcement learning (RL) using self-game-play has shown remarkable performance on several board games (e.g., Chess and Go) as well as video games (e.g., Atari games and Dota2). ...
Acknowledgements: We would like to thank Tal Puhov for his feedback on our paper. ...
arXiv:1903.03674v2
fatcat:a4hkhku42ff63lohosmghdcyze
Vector Space Semantic Parsing: A Framework for Compositional Vector Space Models
2013
Annual Meeting of the Association for Computational Linguistics
We present vector space semantic parsing (VSSP), a framework for learning compositional models of vector space semantics. ...
We present experiments using noun-verb-noun and adverb-adjective-noun phrases which demonstrate that VSSP can learn composition operations that RNN (Socher et al., 2011) and MV-RNN (Socher et al., 2012) ...
We thank Matt Gardner, Justin Betteridge, Brian Murphy, Partha Talukdar, Alona Fyshe and the anonymous reviewers for their helpful comments. ...
dblp:conf/acl/KrishnamurthyM13
fatcat:4iama44ad5cotcj7dkdg64gzwe
Effects of Hebbian learning on the dynamics and structure of random networks with inhibitory and excitatory neurons
[article]
2007
arXiv
pre-print
The aim of the present paper is to study the effects of Hebbian learning in random recurrent neural networks with biological connectivity, i.e. sparse connections and separate populations of excitatory ...
We show that the application of such Hebbian learning leads to drastic changes in the network dynamics and structure. ...
While the model studied in the present work is much more compatible with our knowledge of biological neural networks, it is very different from the model studied in [40] . ...
arXiv:0706.2602v1
fatcat:h7ei5nyxwveetfxydatqubk4xy
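The abstract above studies Hebbian learning in sparse recurrent networks with separate excitatory and inhibitory populations. A minimal sketch of such an update (illustrative only; the rule and sign convention are assumptions, not the paper's model):

```python
def hebbian_step(weights, rates, excitatory, learning_rate):
    """One Hebbian update on a sparse recurrent weight matrix:
    dw_ij = eta * rate_i * rate_j for existing connections j -> i,
    with the presynaptic sign constraint kept (excitatory weights
    stay >= 0, inhibitory weights stay <= 0)."""
    n = len(rates)
    for i in range(n):
        for j in range(n):
            if i == j or weights[i][j] == 0.0:  # preserve sparsity
                continue
            w = weights[i][j] + learning_rate * rates[i] * rates[j]
            weights[i][j] = max(w, 0.0) if excitatory[j] else min(w, 0.0)
    return weights

# Two co-active neurons: the excitatory connection strengthens, while
# a weak inhibitory connection is driven toward zero, not positive.
w_exc = hebbian_step([[0.0, 0.5], [0.3, 0.0]], [1.0, 1.0],
                     [True, True], 0.1)
w_inh = hebbian_step([[0.0, -0.05], [0.3, 0.0]], [1.0, 1.0],
                     [True, False], 0.1)
```

Repeated application of such a rule reshapes both the dynamics and the weight structure, which is the effect the paper quantifies.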
LC: A Flexible, Extensible Open-Source Toolkit for Model Compression
2021
Proceedings of the 30th ACM International Conference on Information & Knowledge Management
, which results in a "learning-compression" (LC) algorithm. ...
model compression will remain a critical need for the foreseeable future. ...
ACKNOWLEDGMENTS Work partially supported by NSF award IIS-1423515 and by several GPU donations from the NVIDIA Corporation. ...
doi:10.1145/3459637.3482005
fatcat:rbmlfdct75fyvm2wv6ts2h4f2a
NeuralLog: Natural Language Inference with Joint Neural and Logical Reasoning
[article]
2021
arXiv
pre-print
To merge symbolic and deep learning methods, we propose an inference framework called NeuralLog, which utilizes both a monotonicity-based logical inference engine and a neural network language model for ...
Deep learning (DL) based language models achieve high performance on various benchmarks for Natural Language Inference (NLI). ...
Acknowledgements We thank the anonymous reviewers for their insightful comments. We also thank Dr. Michael Wollowski from Rose-Hulman Institute of Technology for his helpful feedback on this paper. ...
arXiv:2105.14167v3
fatcat:clpvn6m5hnd6zemyh4k7tsxtwa
26th Annual Computational Neuroscience Meeting (CNS*2017): Part 1
2017
BMC Neuroscience
Our preliminary results show that with learning the network is reweighted into a new structure with relatively high levels of SW (Fig. 1A), but a fully connected pattern. ...
To study changes in oscillation patterns with learning, we modeled brain processing using a directed random network of phase-coupled oscillators interacting according to the Kuramoto model [1]. ...
Our model also demonstrates that such hierarchical learning and planning can be performed by an unsupervised neural network and therefore hints at a biological implementation. ...
doi:10.1186/s12868-017-0370-3
fatcat:qq2cmqlotbg7vpqlqmmcql4u5i
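The abstract above models oscillatory brain processing with phase-coupled oscillators following the Kuramoto model. A sketch of the model with simple Euler integration (assumed standard all-to-all form, not the paper's directed random network):

```python
import math

# Kuramoto model: N phase oscillators theta_i with natural frequencies
# omega_i, coupled with strength K:
#   d(theta_i)/dt = omega_i + (K / N) * sum_j sin(theta_j - theta_i)

def kuramoto_step(theta, omega, coupling, dt):
    """One Euler step of the all-to-all Kuramoto dynamics."""
    n = len(theta)
    return [
        (theta[i]
         + dt * (omega[i]
                 + coupling / n * sum(math.sin(theta[j] - theta[i])
                                      for j in range(n))))
        for i in range(n)
    ]

def order_parameter(theta):
    """r in [0, 1]: 1 means fully phase-synchronized oscillators."""
    n = len(theta)
    re = sum(math.cos(t) for t in theta) / n
    im = sum(math.sin(t) for t in theta) / n
    return math.hypot(re, im)

# Identical oscillators under strong coupling synchronize.
theta = [0.0, 2.0, 4.0, 6.0]
omega = [1.0] * 4
for _ in range(2000):
    theta = kuramoto_step(theta, omega, coupling=2.0, dt=0.01)
```

Tracking `order_parameter` over time is one way to quantify the changes in oscillation patterns with learning that the abstract describes.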
Showing results 1 — 15 out of 642 results