A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; you can also visit the original URL. The file type is `application/pdf`.

### Review of Perceptrons

1989 · *The AI Magazine*

doi:10.1609/aimag.v10i2.748
dblp:journals/aim/Grossberg89
fatcat:gvpvwxzj5nen3nss7gtt5pjzmu

In the original edition of *Perceptrons*, Minsky and Papert focused their analysis on a single line of neural network research: Frank Rosenblatt's seminal book on perceptrons. ... Thus, substantive technical reviews or informed general assessments of the broad sweep of neural network research are most welcome to help interested scientists find their way into this rapidly evolving ...

### A review of "Perceptrons: An Introduction to Computational Geometry"

1970 · *Information and Control*

doi:10.1016/s0019-9958(70)90409-2
fatcat:kzfmkuiyqvcwla3hy4amj4adam

... function, etc.) are nevertheless much closer to the spirit of Rosenblatt's Perceptron than is the book under review. ... exposition in this review, I have slightly altered Minsky and Papert's definition, which initially takes R to be an arbitrary finite set of points. ... ACKNOWLEDGMENTS I am grateful to several colleagues for their criticisms of earlier drafts of this review. ...

### Statistical mechanics of lossy compression using multilayer perceptrons

2006 · *Physical Review E*

doi:10.1103/physreve.74.026108
pmid:17025504
fatcat:5qgej6ht3na2zluwvfe3vp4wlu

Statistical mechanics is applied to lossy compression using multilayer perceptrons for unbiased Boolean messages. ... For compression using a committee tree, a lower bound of achievable distortion becomes small as the number of hidden units K increases. However, it cannot reach the Shannon bound even as K → ∞. ... to a monotonic perceptron, ŷ(−s) = −ŷ(s) holds. ...

### Statistical mechanics of an error correcting code using monotonic and nonmonotonic treelike multilayer perceptrons

2010 · *Physical Review E*

doi:10.1103/physreve.81.021104
pmid:20365527
fatcat:gyyaw5xf4fdhjbgexxbeozcy5m

An error correcting code using a tree-like multilayer perceptron is proposed. ... The influence of the monotonicity of the units on the performance is also discussed. ... The purpose of the present paper is to discuss the performance of the same treelike perceptron models but in the error correcting code framework, thus completing the topic of perceptron-type network applications ...

### Multifractal analysis of perceptron learning with errors

1998 · *Physical Review E: Statistical Physics, Plasmas, Fluids, and Related Interdisciplinary Topics*

doi:10.1103/physreve.57.955
fatcat:vqeg57l45bbnpemoeg2xuagk2y

Random input patterns induce a partition of the coupling space of a perceptron into cells labeled by their output sequences. ... The results also allow some conclusions on the spatial distribution of cells. ... of the perceptron in various supervised learning problems. ...

### Pattern Capacity of a Perceptron for Sparse Discrimination

2008 · *Physical Review Letters*

doi:10.1103/physrevlett.101.018101
pmid:18764154
fatcat:5qcqdyesmvgdhgaqzkitzopika

If q is a sublinear function of N, the number of inputs to the perceptron, these capacities are exponential in N/q. ... We evaluate the capacity and performance of a perceptron discriminator operating in a highly sparse regime where classic perceptron results do not apply. ... A simple model of neural selectivity is the single-layer perceptron [4]. ...

### Statistical mechanics of lossy compression for nonmonotonic multilayer perceptrons

2008 · *Physical Review E*

doi:10.1103/physreve.78.021124
pmid:18850803
fatcat:dajlmzjblnfpfotd5vqvfxwdha

The AT stability of the replica-symmetric solution is analyzed, and the tuning of the nonmonotonic transfer function is also discussed. ... Each of these architectures applies a different transformation to the codeword s. The general architecture of these perceptron-based decoders is shown in Fig. 2. ... However, there is still a lot of work to be done for densely connected systems. One such system is given by using a perceptron-based decoder. ...

### Analysis of ensemble learning using simple perceptrons based on online learning theory

2005 · *Physical Review E*

doi:10.1103/physreve.71.036116
pmid:15903502
fatcat:c7bvcowmhnfl7g7f2kp4wt5fhi

Ensemble learning of K nonlinear perceptrons, which determine their outputs by sign functions, is discussed within the framework of online learning and statistical mechanics. ... The concrete forms of these differential equations are derived analytically in the cases of three well-known rules: Hebbian learning, perceptron learning, and AdaTron learning. ... ACKNOWLEDGMENT This research was partially supported by the Ministry of Education, Culture, Sports, Science and Technology, Japan, with Grant-in-Aid for Scientific Research 13780313, 14084212, 14580438 ...

### Storage capacity of correlated perceptrons

1997 · *Physical Review E: Statistical Physics, Plasmas, Fluids, and Related Interdisciplinary Topics*

doi:10.1103/physreve.55.7369
fatcat:3l4xfhecwvbafo6udq2dfbz6wa

We consider an ensemble of K single-layer perceptrons exposed to random inputs and investigate the conditions under which the couplings of these perceptrons can be chosen such that prescribed correlations ... A general formalism is introduced using a multi-perceptron cost function that allows one to determine the maximal number of random inputs as a function of the desired values of the correlations. ... the outputs of the perceptrons occur. ...

### Convergence of stochastic learning in perceptrons with binary synapses

2005 · *Physical Review E*

doi:10.1103/physreve.71.061907
pmid:16089765
fatcat:s3gqw4zlhjai7czaqvzwqjk7ha

As an additional constraint, the synapses are only changed if the output neuron does not give the desired response, as in the case of classical perceptron learning. ... The finite number of synaptic states dramatically reduces the storage capacity of a network when online learning is considered (i.e., the synapses are immediately modified by each pattern): the trace of ... As the patterns are not linearly separable and therefore not classifiable by a single perceptron, each class is learned by a group of ten independent perceptrons. ...

### Generalization ability of a perceptron with nonmonotonic transfer function

1998 · *Physical Review E: Statistical Physics, Plasmas, Fluids, and Related Interdisciplinary Topics*

doi:10.1103/physreve.58.849
fatcat:3mwhlqmrnbeexjmahuife6xdca

By the perceptron algorithm, the generalization error is shown to decrease by the α^(−1/3) law, similarly to the case of a simple perceptron, in a restricted range of the parameter a characterizing the non-monotonic ... We investigate the generalization ability of a perceptron with a non-monotonic transfer function of reversed-wedge type in on-line mode. ... Y.K. was partially supported by the program "Research for the Future (RFTF)" of the Japan Society for the Promotion of Science. ...

### Efficient training of multilayer perceptrons using principal component analysis

2005 · *Physical Review E*

doi:10.1103/physreve.72.026117
pmid:16196654
fatcat:u5mwlhgbunadrpcjaavpjuejcy

In this context, methods from the statistical physics of disordered systems have been applied successfully in the analysis of model learning scenarios; see Refs. [1,2] for reviews. ... It may furthermore depend on the field of an auxiliary perceptron with weights W ∈ ℝ^N. ...

### Multifractality and percolation in the coupling space of perceptrons

1997 · *Physical Review E: Statistical Physics, Plasmas, Fluids, and Related Interdisciplinary Topics*

doi:10.1103/physreve.55.4552
fatcat:wfhnqa2f45eidfwdox5csgzmda

The coupling space of perceptrons with continuous as well as with binary weights gets partitioned into a disordered multifractal by a set of p = γN random input patterns. ... The storage capacity and the generalization behaviour of the perceptron are shown to be related to properties of f(α), which are correctly described within the replica-symmetric ansatz. ... Part of these investigations is summarized in recent reviews [2,3]. ...

### Learning rate and attractor size of the single-layer perceptron

2007 · *Physical Review E*

doi:10.1103/physreve.75.026704
pmid:17358448
fatcat:h3oyd5x4s5asbaglc6kfcxx26y

We also demonstrate that the learning rate is determined by the attractor size, and that the attractors of a single-layer perceptron with N inputs partition ℝ^N. ... Based on our studies, we conclude that a single-layer perceptron with N inputs will converge in an average number of steps given by an Nth-order polynomial in t/l, where t is the threshold and l is the ... The dashed lines corresponding to w₁ = t, w₂ = t, and w₂ = −w₁ + t partition the initial weight vector space into the seven regions A₁–A₇. ...

### Book Review: Perceptrons, An Introduction to Computational Geometry

1972 · *Bulletin of the American Mathematical Society*

doi:10.1090/s0002-9904-1972-12831-3
fatcat:d74vgzxogfavrhfazsbcs6wgh4

The reviewer's ideology is different. He thinks that the benefits of synthetic thinking are unpredictable. ... This book is a very interesting and penetrating study of the power of expression of perceptrons and some other mathematical problems concerning memory and learning. ... Each of these books has a strong allure for the modern analyst, for Maurin has assembled unique collections of interesting topics. ...
*Showing results 1 — 15 out of 53,266 results*