30 Hits in 5.9 sec

CB3: An Adaptive Error Function for Backpropagation Training

Michael Rimer, Tony Martinez
2006 Neural Processing Letters  
Effective backpropagation training of multi-layer perceptrons depends on the incorporation of an appropriate error or objective function.  ...  This work presents CB3, a novel CB approach that learns the error function to be used while training.  ...  It is an adaptive error function that dynamically sets output target values during training by learning confidence margins on each pattern in the training set.  ... 
doi:10.1007/s11063-006-9014-9 fatcat:7aslkmbjszhffeaysfzlfft4na

Deep learning as optimal control problems: Models and numerical methods

Martin Benning, Elena Celledoni, Matthias J. Ehrhardt, Brynjulf Owren, Carola-Bibiane Schönlieb (School of Mathematical Sciences, Queen Mary University of London, London E1 4NS, UK; Department of Mathematical Sciences, NTNU, 7491 Trondheim, Norway; Institute for Mathematical Innovation, University of Bath, Bath BA2 7JU, UK; Department of Applied Mathematics and Theoretical Physics, University of Cambridge, Cambridge CB3 0WA, UK)
2019 Journal of Computational Dynamics  
We consider recent work of [18] and [9], where deep learning neural networks have been interpreted as discretisations of an optimal control problem subject to an ordinary differential equation constraint.  ...  This leads to a class of algorithms for solving the discrete optimal control problem which guarantee that the corresponding discrete necessary conditions for optimality are fulfilled.  ...  MB acknowledges support from the Leverhulme Trust Early Career Fellowship ECF-2016-611 'Learning from mistakes: a supervised feedback loop for imaging applications'.  ... 
doi:10.3934/jcd.2019009 fatcat:u7k3rr5qijec3nr6xbvo66vmta

Fast Learning in Networks of Locally-Tuned Processing Units

John Moody, Christian J. Darken
1989 Neural Computation  
The total squared error for an entire training set is $E = \sum_{i=1}^{N} \left[ f(\vec{x}_i) - f^*(\vec{x}_i) \right]^2$ (1.5), where $\vec{x}_i$ is the $i$-th training pattern, $f$ is the total network output, and $f^*$ is the desired result.  ...  The number of modifiable network parameters, training set error, test set error, and total computation time in Sun 3/50 CPU seconds are shown in a table with columns: Network, #Par, Time, Train Error, Test Error.  ... 
doi:10.1162/neco.1989.1.2.281 fatcat:kwc7ehh7jfcnfes4byceoxap2u
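The total-squared-error expression quoted in the snippet above (equation (1.5) of the paper) is straightforward to sketch numerically. The Gaussian units, centers, widths, and the small data set below are illustrative assumptions, not values taken from the paper:

```python
import numpy as np

def rbf_output(x, centers, widths, weights):
    """Locally-tuned processing units: a weighted sum of Gaussian responses."""
    responses = np.exp(-((x - centers) ** 2) / widths ** 2)
    return responses @ weights

def total_squared_error(xs, targets, centers, widths, weights):
    """Equation (1.5): E = sum_i (f(x_i) - f*(x_i))^2 over the training set."""
    preds = np.array([rbf_output(x, centers, widths, weights) for x in xs])
    return float(np.sum((preds - targets) ** 2))

# Hypothetical two-unit network and three training patterns.
centers = np.array([0.0, 1.0])
widths = np.array([0.5, 0.5])
weights = np.array([1.0, -1.0])
xs = np.array([0.0, 0.5, 1.0])
targets = np.array([1.0, 0.0, -1.0])
E = total_squared_error(xs, targets, centers, widths, weights)
```

Training in the paper then amounts to choosing centers, widths, and weights that drive this quantity down.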

Forecasting Oil Production Flowrate Based on an Improved Backpropagation High-Order Neural Network with Empirical Mode Decomposition

Joko Nugroho Prasetyo, Noor Akhmad Setiawan, Teguh Bharata Adji
2022 Processes  
This study proposes a machine learning system based on a hybrid empirical mode decomposition backpropagation higher-order neural network (EMD-BP-HONN) for oilfields with less frequent measurements.  ...  Developing a forecasting model for oilfield well production plays a significant role in managing mature oilfields, as it can help to identify production loss earlier.  ...  After the error backpropagation is completed, the weights for all connecting layers are then updated using the error-correction learning rule adapted to MLMVN.  ... 
doi:10.3390/pr10061137 fatcat:v6q3llcdhvfk3fvi7wvtmtc6m4

Image Restoration with Operators Modeled by Artificial Neural Networks

Ana Paula Abrantes de Castro, José Demisio Simões da Silva, Elcio Hideiti Shiguemori
2009 2009 XXII Brazilian Symposium on Computer Graphics and Image Processing  
Due to the huge amount of data generated for training the ANN, this paper uses clustering techniques to reduce the training set.  ...  The main advantage of the proposed approach is that it does not require prior knowledge of the degradation causes for each image.  ...  Once the centers of the first hidden layer are chosen, the training of the second hidden layer and the output layer employs the error backpropagation algorithm to train the MLP.  ... 
doi:10.1109/sibgrapi.2009.44 dblp:conf/sibgrapi/CastroSS09 fatcat:lcgzaedinvf7rlsspyldsppzyi

Group control and identification of residential appliances using a nonintrusive method

Sunil Semwal, Munendra Singh, Rai Sachindra Prasad
2015 Turkish Journal of Electrical Engineering and Computer Sciences  
Identifying and controlling (ON/OFF) electrical appliance(s) from a remote location is an essential part of energy management.  ...  Amplitudes of the major eight harmonics of load signatures were selected as a feature for the classification. Various classification algorithms were applied to data to check their feasibility.  ...  The working principle of the MLP-ANN is based on minimizing the error function by using a gradient backpropagation algorithm.  ... 
doi:10.3906/elk-1404-59 fatcat:l6whpzvb6ncphn2ng4c5g6vmpy
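The snippet's closing sentence describes minimizing an error function with a gradient backpropagation algorithm. As a minimal sketch (a single linear layer standing in for the MLP, with illustrative synthetic data rather than the paper's harmonic features), one gradient step looks like this:

```python
import numpy as np

def mse(w, X, y):
    """Error function: mean squared error of a linear unit."""
    return float(np.mean((X @ w - y) ** 2))

def gradient_step(w, X, y, lr=0.1):
    """One gradient-descent step on the MSE surface, the core update of
    backpropagation specialized to a single linear layer: w <- w - lr * dE/dw."""
    grad = 2.0 * X.T @ (X @ w - y) / len(y)
    return w - lr * grad

# Illustrative synthetic regression problem (not the paper's data).
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true
w = np.zeros(3)
for _ in range(500):
    w = gradient_step(w, X, y)
```

A full MLP repeats this update layer by layer, with the chain rule carrying the error gradient backwards through each hidden layer.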

Application of Intelligent Systems in Volt-VAr Centralized Control in Modern Distribution Systems of Electrical Energy

Hugo A. R. Florez, Gloria P. López, Edgar M. Carreño-Franco, Jesús M. López-Lezama, Nicolás Muñoz-Galeano
2022 Electronics  
Training was performed from available measurements and actions recorded at the system control center.  ...  control variables such as active and reactive power generation of distributed generators (DGs), modules in operation of capacitor banks, and voltage regulator taps; these with the purpose of ensuring an  ...  The main objective of SVMs is to build an optimal decision function that, from a data set with expected inputs and outputs, classifies new inputs while minimizing the classification error [23].  ... 
doi:10.3390/electronics11030446 fatcat:iujeno42jnfrvjh4wf3ycdz6ii

From Electroencephalogram to Epileptic Seizures Detection by Using Artificial Neural Networks

Gaetano Zazzaro, Angelo Martone, Roberto V. Montaquila, Luigi Pavone
2019 Zenodo  
...  developed for the massive extraction of features from EEG signals.  ...  Seizure identification on EEG signals is made manually by epileptologists, and this process is usually very long and error prone.  ...  MLP utilizes backpropagation, a supervised learning technique, for the ANN training phase.  ... 
doi:10.5281/zenodo.3455611 fatcat:xtobcnv24nbu7jpeqozw6pagne

A Survey of ReRAM-Based Architectures for Processing-In-Memory and Neural Networks

Sparsh Mittal
2018 Machine Learning and Knowledge Extraction  
In this paper, we present a survey of techniques for designing ReRAM-based PIM and NN architectures.  ...  This paper will be valuable for computer architects, chip designers and researchers in the area of machine learning.  ...  Due to this, an error in the MSB leads to a much higher penalty than an error in the LSB; hence, the training process places higher effort on reducing the MSB error to improve final accuracy.  ... 
doi:10.3390/make1010005 dblp:journals/make/Mittal19 fatcat:ti3ud2v6l5bffegfn3gzrm2lca

Learning the Sampling Pattern for MRI [article]

Ferdia Sherry, Martin Benning, Juan Carlos De los Reyes, Martin J. Graves, Georg Maierhofer, Guy Williams, Carola-Bibiane Schönlieb, Matthias J. Ehrhardt
2020 arXiv   pre-print
We demonstrate that this is indeed the case, even if the training data consist of just 7 training pairs of measurements and ground-truth images; with a training set of brain images of size 192 by 192, for instance, one of the learned patterns samples only 35% of k-space, yet results in reconstructions with mean SSIM 0.914 on a test set of similar images.  ...  On the theoretical side, the compressed sensing assumptions have been refined to derive optimal densities for variable density sampling [13] - [15], to prove bounds on reconstruction errors for variable  ... 
arXiv:1906.08754v2 fatcat:wzrm2bzgbfcupmjwt73oqkmngy

CORPORATE CONTRIBUTORS

1988 The Hastings Center Report  
The resulting transfer function scales with the noise. The backpropagation algorithm is also adjusted for this network architecture.  ...  The method used to train the network is standard backpropagation. The second model is a modified version of Zipser-Andersen network.  ...  Some have argued that this demonstrates that understanding neural computation is not relevant for understanding cognitive function (see Fodor and Pylyshyn, 1988; Jackendoff, 2002) .  ... 
doi:10.1002/j.1552-146x.1988.tb03932.x fatcat:bkotcyah2ngbfdk4kbx3xnf5v4

Neuro-wavelet based islanding detection technique

Yara Fayyad, Ahmed Osman
2010 2010 IEEE Electrical Power & Energy Conference  
Integrating distributed generators into the existing distribution network is predicted to play an important role in the near future.  ...  The Electric Power Research Institute estimates that distributed generation will account for 20% of all new generation going online in the US.  ...  The wavelet power is then compared to an adaptive threshold.  ... 
doi:10.1109/epec.2010.5697180 fatcat:nkcvdjhjvrfyrnffglcggwhvzm

Bayesian Interpolation

David J. C. MacKay
1992 Neural Computation  
Alternative regularizers (priors) and alternative basis sets are objectively compared by evaluating the evidence for them. "Occam's razor" is automatically embodied by this process.  ...  The way in which Bayes infers the values of regularizing constants and noise levels has an elegant interpretation in terms of the effective number of parameters determined by the data set.  ...  For example, p = 4 yields the cubic splines model. (d) Test error for splines. The number of data points in the test set was 90, cf. number of data points in training set = 37.  ... 
doi:10.1162/neco.1992.4.3.415 fatcat:gl2cbwdpx5dsdd67d7job3sxcu
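The "effective number of parameters determined by the data" mentioned in the abstract has, for a quadratic (ridge-type) regularizer, the closed form γ = Σ_i λ_i / (λ_i + α), where the λ_i are eigenvalues of the data term's Hessian and α is the regularization constant. The sketch below assumes that ridge-regression setting, with illustrative data:

```python
import numpy as np

def effective_parameters(X, alpha):
    """Effective number of parameters for ridge regression:
    gamma = sum_i lam_i / (lam_i + alpha), with lam_i the eigenvalues
    of the data Hessian X^T X. gamma -> k (all parameters well determined)
    as alpha -> 0, and gamma -> 0 as alpha -> infinity."""
    lam = np.linalg.eigvalsh(X.T @ X)
    return float(np.sum(lam / (lam + alpha)))

# Illustrative design matrix: 50 data points, 4 parameters.
X = np.random.default_rng(1).normal(size=(50, 4))
gamma = effective_parameters(X, alpha=10.0)
```

In the Bayesian framework of the paper, α itself is inferred from the data, and γ replaces the raw parameter count when estimating noise levels.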

Bayesian Interpolation [chapter]

David J. C. MacKay
1992 Maximum Entropy and Bayesian Methods  
Alternative regularizers (priors) and alternative basis sets are objectively compared by evaluating the evidence for them. "Occam's razor" is automatically embodied by this process.  ...  The way in which Bayes infers the values of regularizing constants and noise levels has an elegant interpretation in terms of the effective number of parameters determined by the data set.  ...  For example, p = 4 yields the cubic splines model. (d) Test error for splines. The number of data points in the test set was 90, cf. number of data points in training set = 37.  ... 
doi:10.1007/978-94-017-2219-3_3 fatcat:ojegixvnjbhr7gamd46v2vish4

A PLS-Neural Network Analysis of Motivational Orientations Leading to Facebook Engagement and the Moderating Roles of Flow and Age

Inma Rodríguez-Ardura, Antoni Meseguer-Artola
2020 Frontiers in Psychology  
Built upon the uses and gratifications theory, we develop an integrative and context-specific model that links engagement with enjoyment, self-presentation, and community belonging-identified as motivational  ...  The results provide strong support for the validity of the hypothesized causal, mediating and moderating relationships embodied in the model.  ...  error function to minimize.  ... 
doi:10.3389/fpsyg.2020.01869 pmid:32903790 pmcid:PMC7438855 fatcat:xo5ebgvlg5fqbhadd3pkzgb664
Showing results 1 — 15 out of 30 results