Phase transitions in the generalization behaviour of multilayer perceptrons: II. The influence of noise

by B. Schottky

Released as an article.

1997  

Abstract

We extend our study of phase transitions in the generalization behaviour of multilayer perceptrons with non-overlapping receptive fields to the influence of noise, affecting, e.g., the input units and/or the couplings between the input units and the hidden units of the second layer ('input noise'), or the final output unit ('output noise'). In the absence of output noise, the output itself is given by a general, permutation-invariant Boolean function of the outputs of the hidden units. We find that the phase transitions observed in the deterministic case mostly persist in the presence of noise. The influence of the noise on the position of the phase transition, as well as on the behaviour in other regimes of the loading parameter α, can often be described by a simple rescaling of α depending on the strength and type of the noise. We then consider the problem of the optimal noise level for Gibbsian and Bayesian learning, also taking replica symmetry breaking into account. Finally, we address the question of why learning with errors is useful at all.
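As a rough illustration of the class of models described in the abstract (not the paper's exact setup), the following Python sketch builds a two-layer perceptron with non-overlapping receptive fields whose output is a permutation-invariant Boolean function of the hidden units (here, a simple majority vote), and generates training examples corrupted by input noise and output noise. The architecture sizes, the majority-vote readout, and the noise probabilities are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def tree_committee_output(x, weights):
    """Two-layer perceptron with non-overlapping receptive fields.

    Each hidden unit sees only its own block of the input; the output is a
    permutation-invariant Boolean function of the hidden units (here a
    majority vote, i.e. a committee-machine readout).
    """
    K, n = weights.shape              # K hidden units, n inputs per branch
    blocks = x.reshape(K, n)          # non-overlapping receptive fields
    hidden = np.sign(np.einsum('kn,kn->k', weights, blocks))
    return np.sign(hidden.sum())      # permutation-invariant output (K odd)

def noisy_example(teacher, N, p_input=0.1, p_output=0.05):
    """Generate one teacher-labelled example corrupted by noise.

    p_input:  probability of flipping each input component ('input noise')
    p_output: probability of flipping the teacher's label ('output noise')
    """
    x = rng.choice([-1.0, 1.0], size=N)
    y = tree_committee_output(x, teacher)
    x_noisy = x * np.where(rng.random(N) < p_input, -1.0, 1.0)
    if rng.random() < p_output:
        y = -y
    return x_noisy, y

# Toy setup: K = 3 branches of n = 5 inputs each (N = 15 inputs in total).
K, n = 3, 5
N = K * n
teacher = rng.normal(size=(K, n))
training_set = [noisy_example(teacher, N) for _ in range(100)]
```

The loading parameter α mentioned in the abstract would correspond to the ratio of the number of such examples to the number of couplings; a student network trained on `training_set` with and without the noise flips is the kind of comparison the paper analyses.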

Archived Files and Locations

application/pdf   429.3 kB
file_hkdfp3yxj5egfp3ffoq3wabz64
archive.org (archive)
web.archive.org (webarchive)
core.ac.uk (web)
Type  article
Stage   submitted
Date   1997-10-08
Version   v1
Language   en
Catalog Record
Revision: 29ea3712-9cb1-492e-8fe8-8ca0821840b1