58 Hits in 4.1 sec

Static and dynamic attractors of autoassociative neural networks [chapter]

Dmitry O. Gorodnichy, Alexandre M. Reznik
1997 Lecture Notes in Computer Science  
Of particular interest is the pseudo-inverse network with reduced self-connection.  ...  In this paper we study the problem of the occurrence of cycles in autoassociative neural networks.  ...  In particular, it increases attraction radii and allows the network to retrieve up to 0.75N prototypes.  ... 
doi:10.1007/3-540-63508-4_128 fatcat:iowz2qakebf4nhbpiesswuzgnq
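
For orientation, a minimal sketch of the pseudo-inverse (projection) rule this entry builds on, assuming bipolar prototypes stored as the columns of X (all names here are illustrative, not the paper's); the synchronous recall loop below is exactly the setting in which the cycles studied in the paper can arise:

```python
import numpy as np

def pseudo_inverse_weights(X):
    """Projection (pseudo-inverse) rule: W = X X^+ is the orthogonal
    projector onto the span of the stored prototypes, so each bipolar
    prototype is a fixed point of the sign dynamics.
    X: (N, m) matrix whose columns are the +/-1 prototypes."""
    return X @ np.linalg.pinv(X)

def recall(W, s, steps=100):
    """Iterate the synchronous sign dynamics s <- sgn(W s) until a
    fixed point (static attractor) or the step budget runs out; under
    synchronous updates the trajectory may instead fall into a cycle
    (a dynamic attractor), the phenomenon studied in this paper."""
    for _ in range(steps):
        s_next = np.where(W @ s >= 0, 1, -1)
        if np.array_equal(s_next, s):
            break
        s = s_next
    return s
```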

Increasing the capacity of a hopfield network without sacrificing functionality [chapter]

Amos Storkey
1997 Lecture Notes in Computer Science  
This capacity can be increased to n by using the pseudo-inverse rule. However, capacity is not the only consideration.  ...  The Hebbian rule is all of these, but the pseudo-inverse is never incremental, and local only if not immediate.  ...  The Hebb rule is local, incremental and immediate, but has an absolute capacity of n/(2 ln n) [6]. To increase the capacity of the network, the pseudo-inverse learning rule can be used.  ... 
doi:10.1007/bfb0020196 fatcat:lcs5vonldrhslh4izittkxb5ga
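
As a reference point for the capacity figures quoted above, here is a minimal sketch of the incremental, local, immediate Hebbian update (my notation; the paper's own higher-capacity rule is not reproduced here):

```python
import numpy as np

def hebb_increment(W, x):
    """One Hebbian learning step: add the outer product of the new
    bipolar pattern x. The update is local (w_ij depends only on
    x_i and x_j), incremental (patterns arrive one at a time), and
    immediate, but the absolute capacity of the resulting network
    scales only as n / (2 ln n)."""
    n = len(x)
    W = W + np.outer(x, x) / n
    np.fill_diagonal(W, 0.0)   # conventionally no self-connections
    return W
```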

Face Recognition by Applying Wavelet Subband Representation and Kernel Associative Memory

B.-L. Zhang, H. Zhang, S.S. Ge
2004 IEEE Transactions on Neural Networks  
Our scheme using modular autoassociative memory for face recognition is inspired by the same motivation as using autoencoders for optical character recognition (OCR), for which the advantages have been  ...  Detailed comparisons with earlier published results are provided and our proposed scheme offers better recognition accuracy on all of the face datasets.  ...  There is another type of linear associative memory known as the pseudo-inverse or generalized-inverse memory [11], [15], [16].  ... 
doi:10.1109/tnn.2003.820673 pmid:15387257 fatcat:w72irccuifca5exuxqsn2hsg5q

Recurrent Neural Networks: Associative Memory and Optimization

K. -L. Du
2011 Journal of Information Technology & Software Engineering  
pseudo-inverse rule. 4.  ...  A learning problem in a Hopfield network with J units can be transformed into a learning problem for a perceptron of dimension J(J+1)/2 [99], and thus every learning  ...  The inverse-Hebbian rule provides ideal initial conditions for any algorithm capable of increasing the pattern stability.  ... 
doi:10.4172/2165-7866.1000104 fatcat:zqfpc2fpwzhufigwozssmylotu
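
The dimension quoted in this snippet can be reconstructed by a standard counting argument (mine, not taken from the paper itself):

```latex
% Stability of a pattern x in a Hopfield network with J units and
% symmetric weights W requires  x_i \sum_j w_{ij} x_j > 0  for every i.
% Each inequality is linear in the independent entries of W, and a
% symmetric J x J matrix has
\[
  \binom{J}{2} + J \;=\; \frac{J(J+1)}{2}
\]
% of them (upper triangle plus diagonal), so learning reduces to a
% perceptron problem in dimension J(J+1)/2.
```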

Linear and logarithmic capacities in associative neural networks

S.S. Venkatesh, D. Psaltis
1989 IEEE Transactions on Information Theory  
The storage capacity of the spectral strategy is linear in n, the dimension of the state space under consideration, while an asymptotic result of n/(4 log n) holds for the storage capacity of the outer product  ...  Comparisons are made between this spectral strategy and a prior proposed scheme which utilizes the sum of Kronecker outer products of the prescribed set of state vectors which are to function nominally  ...  the pseudo-inverse of a matrix.  ... 
doi:10.1109/18.30977 fatcat:ikin54s6ozcppkpbpqtjsen3fe
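
The outer-product scheme this paper compares against is the standard Kronecker/outer-product construction; in the usual notation (mine, not the paper's):

```latex
% Store m prescribed bipolar state vectors x^{(1)}, ..., x^{(m)} in
\[
  W \;=\; \sum_{k=1}^{m} x^{(k)} \, x^{(k)\mathsf{T}},
\]
% whose storage capacity grows only as n/(4 \log n), whereas the
% spectral strategy analysed here attains capacity linear in n.
```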

An Exponential Response Neural Net

Shlomo Geva, Joaquin Sitte
1991 Neural Computation  
Maximum storage capacity in neural networks. Europhys. Lett. 4, 481-485. Geva, S., and Sitte, J. 1990a. Increasing the storage capacity and shaping the basins of attraction of neural nets.  ...  A pseudo-inverse neural net with storage capacity exceeding N. Proceedings of the International Joint Conference on Neural Networks, San Diego, June, Vol. I, pp. 783-786.  ... 
doi:10.1162/neco.1991.3.4.623 fatcat:r6qbgeupm5a45m2lguiknvem5u

A Simple Derivation of a Bound on the Perceptron Margin Using Singular Value Decomposition

Omri Barak, Mattia Rigotti
2011 Neural Computation  
carried out in the limit of infinite input size.  ...  Here we present a short analysis of perceptron classification using singular value decomposition.  ...  increasing the size of the basin of attraction of the fixed points of autoassociative neural networks (Krauth & Mézard, 1987; Forrest, 1988; Gardner & Derrida, 1988; Kepler & Abbott, 1988) .  ... 
doi:10.1162/neco_a_00152 fatcat:n2dgrqiyxbg7ngsc2priqkral4

Autoassociative Memory and Pattern Recognition in Micromechanical Oscillator Network

Ankit Kumar, Pritiraj Mohanty
2017 Scientific Reports  
We also thank him for the experimental results demonstrating synchronization in a network of two coupled MEMS oscillators. These results served as the central motivation for the current study.  ...  We thank Diego Guerra for collaboration during the early part of the project.  ...  of attraction of the fixed phase configurations remain attractive.  ... 
doi:10.1038/s41598-017-00442-y pmid:28341856 pmcid:PMC5428492 fatcat:xh2o7iglqjafjnlq46qs62nlwu

Convergence dynamics and pseudo almost periodicity of a class of nonautonomous RFDEs with applications

Meng Fan, Dan Ye
2005 Journal of Mathematical Analysis and Applications  
networks, the hybrid network models of the cellular neural network type, and some population growth models.  ...  Sufficient criteria are established for the globally exponential stability and the existence and uniqueness of pseudo almost periodic solution.  ...  A network of the form (1.2) generalizes the single layer autoassociative Hebbian correlator network model and some unidirectional network models of Cohen and Grossberg [13].  ... 
doi:10.1016/j.jmaa.2004.10.050 fatcat:3ygpfif7bvei5p5kx32xnza3qi

The optimal value of self-connection

D.O. Gorodnichy
IJCNN'99. International Joint Conference on Neural Networks. Proceedings (Cat. No.99CH36339)  
The fact that reducing self-connections improves the performance of the autoassociative networks built by the pseudo-inverse learning rule has been known for quite a while, but has not been studied completely  ...  In particular, it is known that decreasing the self-connection increases the direct attraction radius of the network, but it is also known that it increases the number of spurious dynamic attractors.  ...  Conclusions We studied how reduction of self-connection affects the performance of pseudo-inverse neural networks.  ... 
doi:10.1109/ijcnn.1999.831579 dblp:conf/ijcnn/Gorodnichy99 fatcat:w4rdj5xbk5f2hnhdvmf2htc3xu
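
A minimal sketch of the operation this entry studies: scaling down the diagonal of a pseudo-inverse weight matrix by a reduction factor d (the parameter name is mine):

```python
import numpy as np

def reduce_self_connection(W, d):
    """Return W with its self-connections scaled down by factor d:
    d = 0 leaves W unchanged, d = 1 removes the diagonal entirely.
    Per the entry above, larger d widens the direct attraction
    radius but also admits more spurious dynamic attractors, hence
    the search for an intermediate optimal value."""
    return W - d * np.diag(np.diag(W))
```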

Reservoir Computing with Output Feedback

René Felix Reinhart
2012 Künstliche Intelligenz  
It is shown that output feedback enables the implementation of ambiguous inverse models by means of multi-stable dynamics.  ...  Forward and inverse models are trained in associative recurrent neural networks that are based on non-linear random projections.  ...  Both the left and right pseudo-inverses of A are then used for the regression of the network weights according to targeted neural activities in the next time step.  ... 
doi:10.1007/s13218-012-0187-2 fatcat:57hsx5ut25bfrmechcnjouekhi
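
A minimal sketch of the pseudo-inverse regression step mentioned in the snippet, assuming a collected matrix S of reservoir states (rows are time steps) and a matrix Y of target activities for the next step; S, Y, and train_readout are illustrative names, not the paper's:

```python
import numpy as np

def train_readout(S, Y):
    """Least-squares readout via the Moore-Penrose pseudo-inverse:
    W = S^+ Y minimizes ||S W - Y||_F, so S @ W approximates the
    targeted neural activities at the next time step."""
    return np.linalg.pinv(S) @ Y
```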

Catastrophic Forgetting and the Pseudorehearsal Solution in Hopfield-type Networks

ANTHONY ROBINS, SIMON McCALLUM
1998 Connection science  
By using the ratio of high energy to low energy parts of the network we can robustly distinguish the learnt patterns from the large number of spurious "fantasy" patterns that are common in these networks  ...  In this thesis we discuss the problem of catastrophic forgetting in Hopfield networks, and investigate various potential solutions.  ...  The pseudo-inverse (Hertz, Krogh, and Palmer, 1991; Storkey, 1997) learning rule is an example of a non-local learning procedure.  ... 
doi:10.1080/095400998116530 fatcat:4g5qyus4pjcvzbvkraujxxqxoe
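
For context, the quantity underlying the high-energy/low-energy ratio mentioned above is the standard Hopfield energy; the scoring ratio itself is the paper's and is not reproduced here:

```python
import numpy as np

def hopfield_energy(W, s):
    """Standard Hopfield energy E(s) = -1/2 s^T W s (zero thresholds).
    Learnt patterns typically sit in much deeper energy minima than
    spurious 'fantasy' states, which is what makes an energy-based
    score usable for telling them apart."""
    return -0.5 * s @ W @ s
```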

Recurrent Neural Networks [chapter]

Sajid A. Marhon, Christopher J. F. Cameron, Stefan C. Kremer
2013 Intelligent Systems Reference Library  
Learning is a critical issue and one of the primary advantages of neural networks.  ...  This book is a summary of work on recurrent neural networks and is exemplary of current research ideas and challenges in this subfield of artificial neural network research and development.  ...  Lund and Stefano Nolfi who did most of the development of the kepsim simulator, which has been used (in slightly adapted form) to implement the experiments documented in this paper.  ... 
doi:10.1007/978-3-642-36657-4_2 fatcat:jnmgv7rlifhuncqhi5kxldepdm

Associative memories based on continuous-time cellular neural networks designed using space-invariant cloning templates

Zhigang Zeng, Jun Wang
2009 Neural Networks  
The designed associative memories are robust in terms of design parameter selection. In addition, the hosting cellular neural networks are guaranteed to be globally exponentially stable.  ...  inputs instead of initial states.  ...  The design procedure herein, based on duality in Lemma 1, results in a set of inequalities instead of a set of equations solved using matrix pseudo-inverse in some existing approaches (e.g., Grassi (1997  ... 
doi:10.1016/j.neunet.2009.06.031 pmid:19604674 fatcat:inkg4kpjevdfnikzrgw3vg44oi

Designs of Second-Order Associated Memory Networks with Threshold Logics: Winner-Take-All and Selective Voting

Toshiro Kubota
2009 Journal of Computers  
Due to these properties, it is of both practical and scientific interest to investigate efficient computational mechanisms of such networks.  ...  Among higher order associative memory models (d > 1), the second order memory (d = 2) has attractive properties: a relatively small implementation cost of O(N²), a small number of spurious states, and  ...  In [5], Gorodnichy demonstrated that the size of the attraction basin could be increased with a proper amount of self-feedback in a network trained with the pseudo-inverse learning.  ... 
doi:10.4304/jcp.4.10.962-974 fatcat:sxkiy5hisrc7vimboswmskc5za
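
A minimal sketch of the model class this entry discusses, a second-order (d = 2) associative memory update; the dense tensor form below is only illustrative and does not reflect the paper's threshold-logic design:

```python
import numpy as np

def second_order_update(T, s):
    """One synchronous step of a second-order associative memory:
    s_i <- sgn( sum_{j,k} T[i,j,k] * s_j * s_k ).
    Applying a dense N x N x N interaction tensor as written costs
    O(N^3) multiplies; the entry cites an O(N^2) implementation cost
    for the second-order model via threshold logics (winner-take-all,
    selective voting), which is not reproduced here."""
    field = np.einsum('ijk,j,k->i', T, s, s)
    return np.where(field >= 0, 1, -1)
```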
Showing results 1–15 of 58