
Vapnik-Chervonenkis dimension of recurrent neural networks [chapter]

Pascal Koiran, Eduardo D. Sontag
1997 Lecture Notes in Computer Science  
Most of the work on the Vapnik-Chervonenkis dimension of neural networks has been focused on feedforward networks.  ...  This paper provides lower and upper bounds for the VC dimension of such networks.  ...  In particular, we use recurrent (sometimes called "feedback" or "dynamic") neural networks.  ...
doi:10.1007/3-540-62685-9_19 fatcat:matqxwvbxvbxzne6766zyw2x7m

Vapnik-Chervonenkis dimension of recurrent neural networks

Pascal Koiran, Eduardo D. Sontag
1998 Discrete Applied Mathematics  
Most of the work on the Vapnik-Chervonenkis dimension of neural networks has been focused on feedforward networks.  ...  This paper provides lower and upper bounds for the VC dimension of such networks.  ...  Acknowledgements We thank an anonymous referee for his very careful reading of the manuscript and many useful comments.  ...
doi:10.1016/s0166-218x(98)00014-6 fatcat:2u6wcl74vjgjrcf7ohplioqrfi

Special issue of DAM on the vapnik-chervonenkis dimension

John Shawe-Taylor
1998 Discrete Applied Mathematics  
In contrast, Koiran and Sontag consider the classical VC dimension but compute its value for various classes of recurrent neural networks.  ...  Special issue of DAM on the Vapnik-Chervonenkis dimension The Vapnik-Chervonenkis (VC) dimension is a combinatorial parameter of a class of binary functions or set system which has been shown to characterise the  ...
doi:10.1016/s0166-218x(98)00019-5 fatcat:pvw5pow6anamdgma34w3z3bqu4
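
For orientation, the definition behind all of the entries on this page is the standard one: a sample x_1, ..., x_m is shattered by a binary function class H when H realises all 2^m labelings of it, and the VC dimension is the size of the largest shattered sample. In LaTeX notation:

    \mathrm{VCdim}(H) \;=\; \max\bigl\{\, m : \exists\, x_1,\dots,x_m \ \text{with}\ \bigl|\{(h(x_1),\dots,h(x_m)) : h \in H\}\bigr| = 2^m \,\bigr\}.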

Page 288 of Neural Computation Vol. 5, Issue 2 [page]

1993 Neural Computation  
Including hints in training neural networks. Neural Comp. 3, 418-427. Blumer, A., Ehrenfeucht, A., Haussler, D., and Warmuth, M. 1989. Learnability and the Vapnik-Chervonenkis dimension. J.  ...  Symbolic neural systems and the use of hints for developing complex systems. Intl. J. Machine Stud. 35, 291. Vapnik, V., and Chervonenkis, A. 1971.  ...

Page 1348 of Neural Computation Vol. 8, Issue 6 [page]

1996 Neural Computation  
Urbanczik Vapnik-Chervonenkis Generalization Bounds for Real Valued Neural Networks Arne Hole The Error Surface of the Simplest XOR Network Has Only Global Minima Ida G.  ...  Article The Dynamics of Discrete-Time Computation, with Application to Recurrent Neural Networks and Finite State Machine Extraction Mike Casey Note Using Bottlenecks in Feedforward Networks as a Dimension  ...

Page 5274 of Mathematical Reviews Vol. , Issue 98H [page]

1998 Mathematical Reviews  
Summary: “Most of the work on the Vapnik-Chervonenkis dimension of neural networks has been focused on feedforward networks.  ...  {For the entire collection see MR 98f:68007.} 98h:68194 68T05 Koiran, Pascal (F-ENSLY-IP; Lyon); Sontag, Eduardo D. (1-RTG; New Brunswick, NJ) Vapnik-Chervonenkis dimension of recurrent neural networks  ...

Page 471 of Neural Computation Vol. 4, Issue 3 [page]

1992 Neural Computation  
Bayesian Framework for Backpropagation Networks References Abu-Mostafa, Y. S. 1990a. The Vapnik-Chervonenkis dimension: Information versus complexity in learning. Neural Comp. 1(3), 312-317.  ...  Recurrent back-propagation and the dynamical approach to adaptive neural computation. Neural Comp. 1, 161-172.  ... 

Page 39 of Neural Computation Vol. 8, Issue 1 [page]

1996 Neural Computation  
Advances in Neural Information Processing Systems, Vol. 7, 183-190. MIT Press, Cambridge, MA. Maass, W. 1995a. Vapnik-Chervonenkis dimension of neural nets.  ...  Computational Power of Networks of Spiking Neurons 39 Horne, B. G., and Hush, D. R. 1994. Bounds on the complexity of recurrent neural network implementations of finite state machines.  ... 

Bounding sample size with the Vapnik-Chervonenkis dimension

John Shawe-Taylor, Martin Anthony, N.L. Biggs
1993 Discrete Applied Mathematics  
A proof is given that a concept is learnable provided the Vapnik-Chervonenkis dimension is finite.  ...  The proof is more explicit than previous proofs and introduces two new parameters which allow the sample-size bounds obtained to be improved by a factor of approximately 4log₂(e).  ...  Vapnik-Chervonenkis Dimension We now consider generalising the result from finite sets of hypotheses to sets of hypotheses with finite Vapnik-Chervonenkis dimension.  ...
doi:10.1016/0166-218x(93)90179-r fatcat:rqkpkjwjh5afrejjdortokkdey
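
For scale, the classical sufficient sample size that this entry's parameters tighten is, in the form given by Blumer, Ehrenfeucht, Haussler, and Warmuth (1989, cited elsewhere on this page; quoted here from memory, the paper states the exact version it improves), for VC dimension d, accuracy ε, and confidence 1 − δ:

    m \;\ge\; \max\!\left( \frac{4}{\varepsilon}\log_2\frac{2}{\delta},\; \frac{8d}{\varepsilon}\log_2\frac{13}{\varepsilon} \right).

Shaving a factor of about 4log₂(e) ≈ 5.8 off such a bound, as the abstract claims, is a substantial constant-factor saving in practice.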

An Efficient Method for Selecting the Optimal Structure of a Fuzzy Neural Network Architecture

Bojan Novak
2001 Journal of Computing and Information Technology  
The Vapnik-Chervonenkis (VC) dimension is introduced as a measure of the capacity of the learning machine.  ...  A comparison between fuzzy neural network and the neural network ARX model is presented.  ...  The Vapnik-Chervonenkis (VC) dimension is applied as a measure of the capacity of the learning machine.  ...
doi:10.2498/cit.2001.02.02 fatcat:cqqzoyts55cj5ckhdxzir73gl4
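
Model selection by VC capacity usually means minimizing a guaranteed-risk bound rather than the empirical error alone. One common form (a sketch of Vapnik's bound; the paper may use a different variant) states that with probability at least 1 − η, a classifier f drawn from a class of VC dimension d and trained on m samples satisfies

    R(f) \;\le\; R_{\mathrm{emp}}(f) + \sqrt{ \frac{ d\,(\ln(2m/d) + 1) + \ln(4/\eta) }{ m } },

so the optimal structure is the one whose d minimizes the right-hand side, trading fit against capacity.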

Learning pattern classification-a survey

S.R. Kulkarni, G. Lugosi, S.S. Venkatesh
1998 IEEE Transactions on Information Theory  
Topics discussed include nearest neighbor, kernel, and histogram methods, Vapnik-Chervonenkis theory, and neural networks.  ...  The presentation and the large (though nonexhaustive) list of references are geared to provide a useful overview of this field for both specialists and nonspecialists.  ...
doi:10.1109/18.720536 fatcat:pboyft5ze5gwphln5bpglatbam

Computing Time Lower Bounds for Recurrent Sigmoidal Neural Networks

M. Schmitt
2001 Neural Information Processing Systems  
Recurrent neural networks of analog units are computers for real-valued functions. We study the time complexity of real computation in general recurrent neural networks.  ...  Thus, evidence is given of the computational limitations that time-bounded analog recurrent neural networks are subject to.  ...  This work was also supported in part by the ESPRIT Working Group in Neural and Computational Learning II, NeuroCOLT2, No. 27150.  ...
dblp:conf/nips/Schmitt01 fatcat:4kp7g32yffcadbsnh2if5fd46q

Page 4470 of Psychological Abstracts Vol. 79, Issue 10 [page]

1992 Psychological Abstracts  
(U Washington, Seattle) How tight are the Vapnik-Chervonenkis bounds? Neural Computation, 1992(Mar), Vol 4(2), 249-269.  ...  These experiments test the relationship between average generalization performance and the worst-case bounds obtained from formal learning theory using the Vapnik-Chervonenkis (VC) dimension (A.  ...

Page 1601 of Mathematical Reviews Vol. , Issue 92c [page]

1992 Mathematical Reviews  
) Results on learnability and the Vapnik-Chervonenkis dimension.  ...  It is known that a concept class C is PAC-learnable (by static sampling) if and only if it has a finite Vapnik-Chervonenkis (VC) dimension.  ...

A Neurodynamical System for finding a Minimal VC Dimension Classifier [article]

Jayadeva, Sumit Soman, Amit Bhaya
2015 arXiv pre-print
The recently proposed Minimal Complexity Machine (MCM) finds a hyperplane classifier by minimizing an exact bound on the Vapnik-Chervonenkis (VC) dimension.  ...  In this paper, we describe a neural network based on a linear dynamical system, that converges to the MCM solution.  ...  The complexity of learning systems, such as SVMs, can be estimated by the Vapnik-Chervonenkis (VC) dimension.  ... 
arXiv:1503.03148v1 fatcat:24w5kmuq7jhkvknbhy7wnossri
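
As a rough sketch of the optimization the MCM solution satisfies (reconstructed from the abstract alone: the hard-margin linear-programming form min h subject to h ≥ y_i(u·x_i + v) ≥ 1, with illustrative variable names and toy data of my own; the paper's actual contribution, a neurodynamical system converging to this solution, is not shown here):

    # Sketch of the hard-margin Minimal Complexity Machine (MCM) as a
    # linear program: minimize h subject to h >= y_i*(u.x_i + v) >= 1.
    # Variable names and toy data are illustrative assumptions, not from the paper.
    import numpy as np
    from scipy.optimize import linprog

    # Hypothetical linearly separable toy data.
    X = np.array([[2.0, 2.0], [3.0, 1.5], [-2.0, -1.0], [-1.5, -2.5]])
    y = np.array([1.0, 1.0, -1.0, -1.0])
    m, n = X.shape

    # Decision variables z = [u (n entries), v, h]; objective: minimize h.
    c = np.zeros(n + 2)
    c[-1] = 1.0

    yX = y[:, None] * X  # row i is y_i * x_i

    # y_i*(u.x_i + v) >= 1   ->   -y_i*x_i.u - y_i*v <= -1
    A_lower = np.hstack([-yX, -y[:, None], np.zeros((m, 1))])
    b_lower = -np.ones(m)
    # y_i*(u.x_i + v) <= h   ->   y_i*x_i.u + y_i*v - h <= 0
    A_upper = np.hstack([yX, y[:, None], -np.ones((m, 1))])
    b_upper = np.zeros(m)

    res = linprog(c,
                  A_ub=np.vstack([A_lower, A_upper]),
                  b_ub=np.concatenate([b_lower, b_upper]),
                  bounds=[(None, None)] * (n + 1) + [(1.0, None)])

    u, v, h = res.x[:n], res.x[n], res.x[n + 1]
    # h is the quantity whose minimization bounds the VC dimension in the MCM.
    print("u =", u, " v =", v, " h =", h)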