6,248 Hits in 4.3 sec

Techniques for Learning Binary Stochastic Feedforward Neural Networks [article]

Tapani Raiko, Mathias Berglund, Guillaume Alain, Laurent Dinh
2015 arXiv   pre-print
Stochastic binary hidden units in a multi-layer perceptron (MLP) network give at least three potential benefits when compared to deterministic MLP networks. (1) They allow to learn one-to-many type of  ...  We propose two new estimators for the training gradient and propose benchmark tests for comparing training algorithms.  ...  STOCHASTIC FEEDFORWARD NEURAL NETWORKS We study a model that maps inputs x to outputs y through stochastic binary hidden units h.  ... 
arXiv:1406.2989v3 fatcat:sfgbyogvwncszpvd2r2za2q6au
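The model in this entry maps inputs x to outputs y through stochastic binary hidden units h. A minimal sketch of what such a layer does (the logistic firing probability and all names/values below are illustrative assumptions, not the paper's exact formulation or estimators):

```python
import math
import random

def stochastic_binary_layer(x, weights, biases, rng=None):
    """Sample stochastic binary hidden units h from inputs x.

    Each unit fires (h_j = 1) with probability sigmoid(w_j . x + b_j),
    so repeated passes over the same input can yield different h --
    the one-to-many behaviour the abstract above refers to.
    """
    rng = rng or random.Random(0)
    h = []
    for w_row, b in zip(weights, biases):
        a = sum(wi * xi for wi, xi in zip(w_row, x)) + b  # pre-activation
        p = 1.0 / (1.0 + math.exp(-a))                    # firing probability
        h.append(1 if rng.random() < p else 0)
    return h

x = [0.5, -1.0]
W = [[2.0, 0.0], [0.0, 2.0], [1.0, 1.0]]
b = [0.0, 0.0, 0.0]
print(stochastic_binary_layer(x, W, b))  # a length-3 vector of 0s and 1s
```

Because sampling is non-differentiable, training such units is the hard part — which is exactly why the paper proposes new gradient estimators.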

Implementation of an incremental deep learning model for survival prediction of cardiovascular patients

Sanaa Elyassami, Achraf Ait Kaddour
2021 IAES International Journal of Artificial Intelligence (IJ-AI)  
In this paper, an incremental deep learning model was developed and trained with stochastic gradient descent using feedforward neural networks.  ...  The impact of the learning rate and the depth of neural networks on the performance were explored.  ...  In this paper, we developed a feedforward neural network model (FFNN) based on a multilayer feedforward artificial neural network.  ... 
doi:10.11591/ijai.v10.i1.pp101-109 fatcat:j642dyc3jbem3fof2smkfsvnyi
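The incremental model in this entry is trained with stochastic gradient descent; the underlying update rule is simply the following (a generic sketch, not the authors' code — the learning rate, whose impact the paper studies, is the `lr` parameter):

```python
def sgd_step(params, grads, lr=0.01):
    """One stochastic-gradient-descent update: move each parameter
    against its gradient, scaled by the learning rate."""
    return [p - lr * g for p, g in zip(params, grads)]

# A single update on two toy parameters:
print(sgd_step([1.0, 2.0], [0.5, -0.5], lr=0.1))
```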

Hybrid artificial neural network

Nadia Nedjah, Ajith Abraham, Luiza M. Mourelle
2007 Neural computing & applications (Print)  
In the fourth paper, entitled "Reconfigurable hardware for neural networks: binary vs. stochastic", N. Nedjah and L. M.  ...  Blum focus on the training of feedforward neural networks for pattern classification to test the efficiency and practicality of continuous ant colony optimization.  ... 
doi:10.1007/s00521-007-0083-0 fatcat:ktd2xql7ybbuvaccf2z42duydm

Autonomous self-configuration of artificial neural networks for data classification or system control

Wolfgang Fink, Wolfgang Fink
2009 Space Exploration Technologies II  
We report on the use of a Stochastic Optimization Framework (SOF; Fink, SPIE 2008) for the autonomous self-configuration of Artificial Neural Networks (i.e., the determination of number of hidden layers  ...  Artificial neural networks (ANNs) are powerful methods for the classification of multi-dimensional data as well as for the control of dynamic systems.  ...  For the neural couplings we considered binary couplings ±1.  ... 
doi:10.1117/12.821836 fatcat:vsdedhnjnbdtzelpx5xwtdbyp4

Toward Computation and Memory Efficient Neural Network Acoustic Models with Binary Weights and Activations [article]

Liang Lu
2017 arXiv   pre-print
This paper investigates the use of binary weights and activations for computation and memory efficient neural network acoustic models.  ...  In this paper, we study the applications of binary weights and activations for neural network acoustic modeling, reporting encouraging results on the WSJ and AMI corpora.  ...  ACKNOWLEDGEMENTS We thank the NVIDIA Corporation for the donation of a Titan X GPU used in this work, and Karen Livescu for proofreading and comments that have improved the manuscript.  ... 
arXiv:1706.09453v2 fatcat:3drgayse3raujkco3cb3mdmp2u
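A toy illustration of why binary weights and activations save computation and memory: once everything is ±1, every multiply in a dense layer reduces to a sign flip. This is a sketch of the general idea only — the paper's actual training scheme also needs real-valued shadow weights and a gradient estimator:

```python
def binarize(values):
    """Deterministic sign binarization to +/-1 (a common choice in
    binary-network work; the paper's exact scheme may differ)."""
    return [1.0 if v >= 0.0 else -1.0 for v in values]

def binary_dense(x, W):
    """Dense layer whose weights and activations are both binarized,
    so each product of weight and activation is just +/-1."""
    xb = binarize(x)
    pre = [sum(w * xi for w, xi in zip(binarize(row), xb)) for row in W]
    return binarize(pre)

print(binary_dense([0.3, -0.7, 0.1], [[0.5, -0.2, 0.9], [-0.4, 0.1, 0.0]]))
# [1.0, -1.0]
```

In hardware, the ±1 products become XNOR operations and the sums become popcounts, which is where the efficiency gain comes from.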

Binary Output Layer of Feedforward Neural Networks for Solving Multi-Class Classification Problems

Sibo Yang, Chao Zhang, Wei Wu
2019 IEEE Access  
Considered in this short note is the design of output layer nodes of feedforward neural networks for solving multiple-class classification problems with r (r ≥ 3) classes of samples.  ...  This idea of binary output is also applied for other classifiers, such as support vector machines and associative pulsing neural networks.  ...  Theorem 1: Suppose that feedforward neural networks with two hidden nodes are used for solving a four-class classification problem.  ... 
doi:10.1109/access.2018.2888852 fatcat:242sjuotkjbhdf4x6nvdvx4aou
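The design question in this note — how many output nodes are needed for r classes — has a compact answer under binary coding: ⌈log₂ r⌉ nodes instead of r one-hot nodes. A sketch (the plain base-2 code below is an assumption for illustration; the paper may use a different code):

```python
import math

def binary_code(label, r):
    """Encode a class label in 0..r-1 as ceil(log2 r) binary targets."""
    n_bits = max(1, math.ceil(math.log2(r)))
    return [(label >> i) & 1 for i in range(n_bits)]

def decode(bits):
    """Recover the class index from the binary output vector."""
    return sum(bit << i for i, bit in enumerate(bits))

# Four classes need only two output nodes:
print([binary_code(c, 4) for c in range(4)])  # [[0, 0], [1, 0], [0, 1], [1, 1]]
```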

Stochastic Neural Networks [chapter]

Xuerong Mao
2011 Stochastic Differential Equations and Applications  
We sample some basic results about neural networks as they relate to stochastic and statistical processes.  ...  Artificial neural networks are brain-like models of parallel computations and cognitive phenomena.  ...  Unsupervised learning is useful with many hidden-layer feedforward networks where backpropagation is very slow and expensive, as a follow-up to supervised learning to allow for adaptation of the network  ... 
doi:10.1533/9780857099402.351 fatcat:ahbtvskc35c2npu4utxlxjbg7q

RETRACTED: Evolutionary design of constructive multilayer feedforward neural network

Ching-Han Chen, Tun-Kai Yao, Chia-Ming Kuo, Chen-Yuan Chen
2012 Journal of Vibration and Control  
Based on the constructive representation of multilayer feedforward neural networks, we use a genetic encoding method, after which the evolution process is elaborated for designing the optimal neural network  ...  This paper proposes an evolutionary design methodology of multilayer feedforward neural networks based on constructive approach.  ...  A feedforward multi-layer neural network is composed of K neural layers connected in serials.  ... 
doi:10.1177/1077546312456726 fatcat:uuhvn5r34beijcoz4pmouef3zu

Neuron clustering for mitigating catastrophic forgetting in feedforward neural networks

Ben Goodrich, Itamar Arel
2014 2014 IEEE Symposium on Computational Intelligence in Dynamic and Uncertain Environments (CIDUE)  
In this paper, we examine a novel neural network architecture which utilizes online clustering for the selection of a subset of hidden neurons to be activated in the feedforward and back propagation passes  ...  Catastrophic forgetting is a fundamental problem with artificial neural networks (ANNs) in which learned representations are lost as new representations are acquired.  ...  ACKNOWLEDGEMENT The authors would like to thank James Bergstra, Daniel Yamins, and David D Cox for the hyperopt library for doing automated hyperparameter optimization [15] .  ... 
doi:10.1109/cidue.2014.7007868 dblp:conf/cidue/GoodrichA14 fatcat:4vcfrfld5reqlj3ejhthytymwq
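The architecture in this entry activates only a cluster-selected subset of hidden neurons per input, so new inputs overwrite fewer learned weights. A rough sketch of such a selection step (the key vectors, squared-distance measure, and top-k rule here are illustrative assumptions, not the authors' exact online clustering):

```python
def select_active_neurons(x, neuron_keys, k=2):
    """Pick the k hidden neurons whose key vectors lie closest to
    input x; only that subset takes part in the forward and
    backpropagation passes, reducing interference between inputs."""
    def sq_dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    ranked = sorted(range(len(neuron_keys)), key=lambda i: sq_dist(x, neuron_keys[i]))
    return sorted(ranked[:k])

keys = [[0.0, 0.0], [1.0, 1.0], [0.9, 1.1], [5.0, 5.0]]
print(select_active_neurons([1.0, 1.0], keys))  # [1, 2]
```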

Quantum Neural Machine Learning: Backpropagation and Dynamics

Carlos Pedro Gonçalves
2016 NeuroQuantology  
The current work addresses quantum machine learning in the context of Quantum Artificial Neural Networks such that the networks' processing is divided in two stages: the learning stage, where the network  ...  regime that intermixes patterns of dynamical stochasticity and persistent quasiperiodic dynamics, making emerge a form of noise resilient dynamical record. the feedforward direction, during the quantum  ...  Quantum Neural Machine Learning Learning and Backpropagation in Feedforward Networks In classical ANNs, a neuron with a binary firing activity can be described in terms of a binary alphabet A₂ = {0, 1}  ... 
doi:10.14704/nq.2017.15.1.1008 fatcat:k7aur5gayzgtlkktw3reyodbf4

Shortest Path Distance Approximation Using Deep Learning Techniques

Fatemeh Salehi Rizi, Joerg Schloetterer, Michael Granitzer
2018 2018 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining (ASONAM)  
We show that a feedforward neural network fed with embeddings can approximate distances with relatively low distortion error.  ...  In this paper, we utilize vector embeddings learnt by deep learning techniques to approximate the shortest paths distances in large graphs.  ...  The definitions of the binary operations are listed in the Table I . Eventually, vectors of the training set serve as input for a feedforward neural network.  ... 
doi:10.1109/asonam.2018.8508763 dblp:conf/asunam/RiziSG18 fatcat:7oxiop5lgrbu7oytwouraxdksy
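The pipeline in this entry combines the embeddings of two nodes with a binary operation and feeds the result to a feedforward network that predicts their shortest-path distance. A sketch of typical binary operators (the names below are common choices in the embedding literature, standing in for the paper's Table I, which is not reproduced here):

```python
def combine(u, v, op="average"):
    """Combine the embeddings of two nodes into one feature vector
    that a feedforward regressor can consume."""
    if op == "average":
        return [(a + b) / 2.0 for a, b in zip(u, v)]
    if op == "hadamard":
        return [a * b for a, b in zip(u, v)]
    if op == "l1":
        return [abs(a - b) for a, b in zip(u, v)]
    raise ValueError(f"unknown operator: {op}")

u, v = [1.0, -2.0], [3.0, 4.0]
print(combine(u, v, "average"))   # [2.0, 1.0]
print(combine(u, v, "hadamard"))  # [3.0, -8.0]
print(combine(u, v, "l1"))        # [2.0, 6.0]
```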

Providing understanding of the behavior of feedforward neural networks

S.H. Huang, M.R. Endsley
1997 IEEE Transactions on Systems Man and Cybernetics Part B (Cybernetics)  
Several papers which propose extracting rules from feedforward neural networks can be found in the literature, however, these approaches can only deal with networks with binary inputs.  ...  subsymbolic level and hence is very difficult for a human user to comprehend.  ...  The similar-weight approach is not always useful due to the stochastic nature of neural network learning. V.  ... 
doi:10.1109/3477.584953 pmid:18255885 fatcat:oec6m3ykz5fexcertf77pkopju

Learning Stochastic Feedforward Neural Networks

Yichuan Tang, Ruslan Salakhutdinov
2013 Neural Information Processing Systems  
In this paper, we propose a stochastic feedforward network with hidden layers composed of both deterministic and stochastic variables.  ...  Multilayer perceptrons (MLPs) or neural networks are popular models used for nonlinear regression and classification tasks.  ...  These techniques have previously been used by [22, 23] with success. Figure 1: Stochastic Feedforward Neural Networks. Left: Network diagram.  ... 
dblp:conf/nips/TangS13 fatcat:jbgbf2nvmfa4vczhjn6ekzfora

Quantum Neural Machine Learning - Backpropagation and Dynamics [article]

Carlos Pedro Gonçalves
2016 arXiv   pre-print
The current work addresses quantum machine learning in the context of Quantum Artificial Neural Networks such that the networks' processing is divided in two stages: the learning stage, where the network  ...  The results are extended to general architectures including recurrent networks that interact with an environment, coupling with it in the neural links' activation order, and self-organizing in a dynamical  ...  Quantum Neural Machine Learning Learning and Backpropagation in Feedforward Networks In classical ANNs, a neuron with a binary firing activity can be described in terms of a binary alphabet A₂ = {0, 1}  ... 
arXiv:1609.06935v1 fatcat:ettnlbe5onfrbgj5cssxlw4rli

CIS Publication Spotlight [Publication Spotlight]

Haibo He, Jon Garibaldi, Kay Chen Tan, Julian Togelius, Yaochu Jin, Yew Soon Ong
2019 IEEE Computational Intelligence Magazine  
We introduce the proposed continuous dropout to a feedforward neural network and com-  ...  Systems, Vol. 29, No. 8, August 2018, pp. 3573-3587.  ...  In this paper, we propose a costsensitive (CoSen) deep neural network, which can automatically learn robust feature representations for both the majority and minority classes.  ... 
doi:10.1109/mci.2018.2881780 fatcat:upclrns47fgjnjxbjybczxzp34
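The continuous-dropout work spotlighted above replaces the Bernoulli 0/1 dropout mask with a mask drawn from a continuous distribution. A training-time sketch (the uniform mask below is one plausible variant; treat the exact distribution and any rescaling as assumptions, not the paper's definition):

```python
import random

def continuous_dropout(x, rng=None):
    """Scale each activation by a mask drawn from U(0, 1) rather than
    zeroing it with probability p as in standard Bernoulli dropout."""
    rng = rng or random.Random(0)
    return [xi * rng.random() for xi in x]

out = continuous_dropout([1.0, 2.0, 3.0])
print(out)  # each entry lies between 0 and the original activation
```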