53,266 Hits in 3.1 sec

Review of Perceptrons

Stephen Grossberg
1989 The AI Magazine  
In the original edition of Perceptrons, Minsky and Papert focused their analysis on a single line of neural network research: Frank Rosenblatt's seminal book on perceptrons.  ...  Thus, substantive technical reviews or informed general assessments of the broad sweep of neural network research are most welcome to help interested scientists find their way into this rapidly evolving  ... 
doi:10.1609/aimag.v10i2.748 dblp:journals/aim/Grossberg89 fatcat:gvpvwxzj5nen3nss7gtt5pjzmu

A review of "Perceptrons: An Introduction to Computational Geometry"

H.D. Block
1970 Information and Control  
function, etc.) are nevertheless much closer to the spirit of Rosenblatt's Perceptron than is the book under review.  ...  exposition in this review, I have slightly altered Minsky and Papert's definition, which initially takes R to be an arbitrary finite set of points.  ...  ACKNOWLEDGMENTS I am grateful to several colleagues for their criticisms of earlier drafts of this review.  ... 
doi:10.1016/s0019-9958(70)90409-2 fatcat:kzfmkuiyqvcwla3hy4amj4adam

Statistical mechanics of lossy compression using multilayer perceptrons

Kazushi Mimura, Masato Okada
2006 Physical Review E  
Statistical mechanics is applied to lossy compression using multilayer perceptrons for unbiased Boolean messages.  ...  For compression using a committee tree, the lower bound on achievable distortion becomes small as the number of hidden units K increases. However, it cannot reach the Shannon bound even as K → ∞.  ...  to a monotonic perceptron, ŷ(−s) = −ŷ(s) holds.  ... 
doi:10.1103/physreve.74.026108 pmid:17025504 fatcat:5qgej6ht3na2zluwvfe3vp4wlu

Statistical mechanics of an error correcting code using monotonic and nonmonotonic treelike multilayer perceptrons

Florent Cousseau, Kazushi Mimura, Masato Okada
2010 Physical Review E  
An error correcting code using a tree-like multilayer perceptron is proposed.  ...  The influence of the monotonicity of the units on the performance is also discussed.  ...  The purpose of the present paper is to discuss the performance of the same treelike perceptron models but in the error correcting code framework, thus completing the topic of perceptron type network applications  ... 
doi:10.1103/physreve.81.021104 pmid:20365527 fatcat:gyyaw5xf4fdhjbgexxbeozcy5m

Multifractal analysis of perceptron learning with errors

M. Weigt
1998 Physical review. E, Statistical physics, plasmas, fluids, and related interdisciplinary topics  
Random input patterns induce a partition of the coupling space of a perceptron into cells labeled by their output sequences.  ...  The results also allow some conclusions on the spatial distribution of cells.  ...  of the perceptron in various supervised learning problems.  ... 
doi:10.1103/physreve.57.955 fatcat:vqeg57l45bbnpemoeg2xuagk2y

Pattern Capacity of a Perceptron for Sparse Discrimination

Vladimir Itskov, L. F. Abbott
2008 Physical Review Letters  
If q is a sublinear function of N, the number of inputs to the perceptron, these capacities are exponential in N/q.  ...  We evaluate the capacity and performance of a perceptron discriminator operating in a highly sparse regime where classic perceptron results do not apply.  ...  A simple model of neural selectivity is the single-layer perceptron [4] .  ... 
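The "single-layer perceptron" the last fragment cites is simple enough to sketch directly: a unit that fires when its weighted input sum crosses a threshold. The names below (perceptron_output, w, theta) and the sparse-pattern setup are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def perceptron_output(w, x, theta=0.0):
    """Return +1 if the weighted sum w.x exceeds the threshold theta, else -1."""
    return 1 if np.dot(w, x) > theta else -1

# A sparse input pattern in the spirit of the abstract: only q of the
# N inputs are active. The specific values of N and q are arbitrary.
N, q = 100, 5
rng = np.random.default_rng(0)
x = np.zeros(N)
x[rng.choice(N, size=q, replace=False)] = 1.0

w = np.ones(N)                               # uniform weights, for illustration
print(perceptron_output(w, x, theta=3.0))    # 5 active inputs exceed theta=3
```

In such a sparse regime the discriminator's behavior is governed by the handful of active inputs, which is what makes q (rather than N alone) the relevant scale for capacity.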
doi:10.1103/physrevlett.101.018101 pmid:18764154 fatcat:5qcqdyesmvgdhgaqzkitzopika

Statistical mechanics of lossy compression for nonmonotonic multilayer perceptrons

Florent Cousseau, Kazushi Mimura, Toshiaki Omori, Masato Okada
2008 Physical Review E  
The AT stability of the Replica Symmetric solution is analyzed, and the tuning of the non-monotonic transfer function is also discussed.  ...  Each of these architectures applies a different transformation to the codeword s. The general architecture of these perceptron-based decoders is shown in Fig. 2 .  ...  However, there is still a lot of work to be done for densely connected systems. One such system is given by using a perceptron-based decoder.  ... 
doi:10.1103/physreve.78.021124 pmid:18850803 fatcat:dajlmzjblnfpfotd5vqvfxwdha

Analysis of ensemble learning using simple perceptrons based on online learning theory

Seiji Miyoshi, Kazuyuki Hara, Masato Okada
2005 Physical Review E  
Ensemble learning of K nonlinear perceptrons, which determine their outputs by sign functions, is discussed within the framework of online learning and statistical mechanics.  ...  The concrete forms of these differential equations are derived analytically in the cases of three well-known rules: Hebbian learning, perceptron learning and AdaTron learning.  ...  ACKNOWLEDGMENT This research was partially supported by the Ministry of Education, Culture, Sports, Science and Technology, Japan, with Grant-in-Aid for Scientific Research 13780313, 14084212, 14580438  ... 
doi:10.1103/physreve.71.036116 pmid:15903502 fatcat:c7bvcowmhnfl7g7f2kp4wt5fhi

Storage capacity of correlated perceptrons

D. Malzahn, A. Engel, I. Kanter
1997 Physical review. E, Statistical physics, plasmas, fluids, and related interdisciplinary topics  
We consider an ensemble of K single-layer perceptrons exposed to random inputs and investigate the conditions under which the couplings of these perceptrons can be chosen such that prescribed correlations  ...  A general formalism is introduced using a multi-perceptron cost function that allows one to determine the maximal number of random inputs as a function of the desired values of the correlations.  ...  the outputs of the perceptrons occur.  ... 
doi:10.1103/physreve.55.7369 fatcat:3l4xfhecwvbafo6udq2dfbz6wa

Convergence of stochastic learning in perceptrons with binary synapses

Walter Senn, Stefano Fusi
2005 Physical Review E  
As an additional constraint, the synapses are only changed if the output neuron does not give the desired response, as in the case of classical perceptron learning.  ...  The finite number of synaptic states dramatically reduces the storage capacity of a network when online learning is considered (i.e., the synapses are immediately modified by each pattern): the trace of  ...  As the patterns are not linearly separable and therefore not classifiable by a single perceptron, each class is learned by a group of ten independent perceptrons.  ... 
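The constraint described in the first fragment (binary synapses, modified stochastically and only on error) can be sketched as follows. This is a minimal illustration under assumed conventions: the transition probability q, the threshold, and the potentiate/depress scheme are illustrative choices, not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(2)
N, q = 200, 0.05          # q: per-synapse transition probability on an error

def output(w, x, theta):
    """Binary classifier: fire (+1) if the summed input reaches threshold."""
    return 1 if np.dot(w, x) >= theta else -1

def learn_step(w, x, desired, theta):
    """Change binary synapses in {0, 1} only when the output is wrong."""
    if output(w, x, theta) == desired:
        return w                                   # correct response: no change
    flip = (x > 0) & (rng.random(N) < q)           # stochastic subset of active inputs
    w = w.copy()
    w[flip] = 1 if desired == 1 else 0             # potentiate or depress
    return w

w = rng.integers(0, 2, N).astype(float)            # random binary synapses
x = (rng.random(N) < 0.2).astype(float)            # a sparse binary pattern
theta = 15.0                                       # assumed threshold
for _ in range(200):
    w = learn_step(w, x, desired=1, theta=theta)
print(output(w, x, theta))
```

Because each erroneous presentation changes only a small random fraction q of the eligible synapses, learning is slow but the binary weights never leave {0, 1}, which is the source of the reduced storage capacity the abstract discusses.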
doi:10.1103/physreve.71.061907 pmid:16089765 fatcat:s3gqw4zlhjai7czaqvzwqjk7ha

Generalization ability of a perceptron with nonmonotonic transfer function

Jun-ichi Inoue, Hidetoshi Nishimori, Yoshiyuki Kabashima
1998 Physical review. E, Statistical physics, plasmas, fluids, and related interdisciplinary topics  
Under the perceptron algorithm, the generalization error is shown to decrease according to the α^(−1/3) law, similarly to the case of a simple perceptron, in a restricted range of the parameter a characterizing the non-monotonic  ...  We investigate the generalization ability of a perceptron with non-monotonic transfer function of a reversed-wedge type in on-line mode.  ...  Y.K. was partially supported by the program ''Research for the future (RFTF)'' of the Japan Society for the Promotion of Science.  ... 
doi:10.1103/physreve.58.849 fatcat:3mwhlqmrnbeexjmahuife6xdca

Efficient training of multilayer perceptrons using principal component analysis

Christoph Bunzmann, Michael Biehl, Robert Urbanczik
2005 Physical Review E  
In this context, methods from the statistical physics of disordered systems have been applied successfully in the analysis of model learning scenarios; see Refs. [1,2] for reviews.  ...  It may furthermore depend on the field of an auxiliary perceptron with weights W ∈ R^N.  ... 
doi:10.1103/physreve.72.026117 pmid:16196654 fatcat:u5mwlhgbunadrpcjaavpjuejcy

Multifractality and percolation in the coupling space of perceptrons

M. Weigt, A. Engel
1997 Physical review. E, Statistical physics, plasmas, fluids, and related interdisciplinary topics  
The coupling space of perceptrons with continuous as well as with binary weights gets partitioned into a disordered multifractal by a set of p = γN random input patterns.  ...  The storage capacity and the generalization behaviour of the perceptron are shown to be related to properties of f(α) which are correctly described within the replica symmetric ansatz.  ...  Part of these investigations are summarized in recent reviews [2,3].  ... 
doi:10.1103/physreve.55.4552 fatcat:wfhnqa2f45eidfwdox5csgzmda

Learning rate and attractor size of the single-layer perceptron

Martin S. Singleton, Alfred W. Hübler
2007 Physical Review E  
We also demonstrate that the learning rate is determined by the attractor size, and that the attractors of a single-layer perceptron with N inputs partition R^N.  ...  Based on our studies, we conclude that a single-layer perceptron with N inputs will converge in an average number of steps given by an Nth-order polynomial in t/l, where t is the threshold, and l is the  ...  The dashed lines corresponding to w_1 = t, w_2 = t, and w_2 = −w_1 + t partition the initial weight vector space into the seven regions A_1–A_7.  ... 
doi:10.1103/physreve.75.026704 pmid:17358448 fatcat:h3oyd5x4s5asbaglc6kfcxx26y

Book Review: Perceptrons, An Introduction to Computational Geometry

Jan Mycielski
1972 Bulletin of the American Mathematical Society  
The reviewer's ideology is different. He thinks that the benefits of synthetic thinking are unpredictable.  ...  This book is a very interesting and penetrating study of the power of expression of perceptrons and some other mathematical problems concerning memory and learning.  ...  Each of these books has a strong allure for the modern analyst, for Maurin has assembled unique collections of interesting topics.  ... 
doi:10.1090/s0002-9904-1972-12831-3 fatcat:d74vgzxogfavrhfazsbcs6wgh4
Showing results 1–15 of 53,266.