Finite size scaling of the Bayesian perceptron

Arnaud Buhot, Juan-Manuel Torres Moreno, Mirta B. Gordon
1997 Physical review. E, Statistical physics, plasmas, fluids, and related interdisciplinary topics  
We study numerically the properties of the Bayesian perceptron through a gradient descent on the optimal cost function. The theoretical distribution of stabilities is deduced. It predicts that the optimal generalizer lies close to the boundary of the space of (error-free) solutions. The numerical simulations are in good agreement with the theoretical distribution. The extrapolation of the generalization error to infinite input space size agrees with the theoretical results. Finite size corrections are negative and exhibit two different scaling regimes, depending on the training set size. The variance of the generalization error vanishes for N → ∞, confirming the property of self-averaging.
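The extrapolation described above can be illustrated with a minimal sketch: measure the generalization error at several finite input sizes N and fit against 1/N, reading off the N → ∞ value as the intercept. The data and the 1/N leading-order form here are hypothetical stand-ins, not the paper's actual simulation results or scaling ansatz.

```python
import numpy as np

# Hypothetical finite-size measurements of the generalization error eps_g(N).
# We assume a leading-order correction eps_g(N) = eps_inf + a / N for illustration.
N = np.array([50, 100, 200, 400, 800], dtype=float)
eps_inf_true, a_true = 0.12, -0.5          # assumed values, for demonstration only
eps_g = eps_inf_true + a_true / N          # synthetic "measurements"

# Linear fit of eps_g versus 1/N: the intercept is the N -> infinity extrapolation,
# and the negative slope corresponds to a negative finite-size correction.
slope, intercept = np.polyfit(1.0 / N, eps_g, 1)

print(f"extrapolated eps_g(N -> inf) = {intercept:.4f}")   # recovers 0.12
print(f"finite-size correction coefficient = {slope:.4f}") # recovers -0.5
```

In practice one would fit noisy averages over many training-set realizations and check whether a single 1/N term suffices, or whether different regimes (as the abstract reports) require separate fits.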
doi:10.1103/physreve.55.7434