Power Series Expansion Neural Network
[article]
2021
arXiv
pre-print
This new set of neural networks embeds the power series expansion (PSE) into the neural network structure. ...
In this paper, we develop a new neural network family based on power series expansion, which is proved to achieve a better approximation accuracy in comparison with existing neural networks. ...
Inspired by the power series expansion for a smooth function f(x), namely f(x) ≈ Σ_{k=0}^{n} a_k x^k, we define each layer by analogy to a power series expansion. ...
arXiv:2102.13221v2
fatcat:srrfowklrfbrdc6e7ppl53gfxy
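The abstract's layer-by-analogy idea, f(x) ≈ Σ_{k=0}^{n} a_k x^k, can be sketched in a few lines; the function and names below are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

def pse_layer(x, coeffs):
    """Power-series-style layer: combine coefficients with
    element-wise powers of the input, f(x) ~ sum_k a_k * x**k."""
    # coeffs[k] weights the k-th power of the input
    return sum(a * x**k for k, a in enumerate(coeffs))

x = np.array([0.1, 0.5, 1.0])
coeffs = [1.0, 1.0, 0.5]   # truncated series for exp(x) = 1 + x + x^2/2 + ...
print(pse_layer(x, coeffs))
```

In a trainable version the coefficients would be learned parameters; here they are fixed for illustration.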
Thermoelastic steam turbine rotor control based on neural network
2015
Journal of Physics: Conference Series
Simultaneous neural networks are algorithms which can be implemented on PLC controllers. This allows for the application of neural networks to control steam turbine stress in industrial power plants. ...
This model was used to train the neural networks on the basis of steam turbine transient operating data. ...
Here the neural network training was performed using the Levenberg-Marquardt algorithm, which was possible because of the series-parallel architecture. ...
doi:10.1088/1742-6596/662/1/012021
fatcat:ienh52ne65dptmgqzbrk3lmo44
Exponential Functional Link Artificial Neural Networks for Denoising of Image Corrupted by Gaussian Noise
2009
2009 International Conference on Advanced Computer Control
With the proper choice of functional expansion in a FLANN, this network performs as well as, and in some cases even better than, the MLP structure for the problem of denoising an image corrupted with ...
In the presence of additive white Gaussian noise in the image, the performance of the proposed network is found to be superior to that of an MLP. In particular, the FLANN structure with exponential function expansion ...
Here the functional expansion block comprises sine, cosine, or power series polynomial terms. If the network has two inputs, i.e. X = [x1 x2]'. ...
doi:10.1109/icacc.2009.120
fatcat:npdiunwt4vaoln5ukgmgu2mvye
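A FLANN's functional expansion block as described in the abstract (sine, cosine, or power terms applied to each input) can be sketched as follows; the function name and the particular choice of terms are assumptions for illustration:

```python
import numpy as np

def functional_expansion(x, order=3):
    """FLANN-style expansion: map each input x_i to
    [x_i, sin(pi*x_i), cos(pi*x_i), x_i^2, ..., x_i^order]."""
    feats = [x]
    feats += [np.sin(np.pi * x), np.cos(np.pi * x)]   # trigonometric terms
    feats += [x**k for k in range(2, order + 1)]      # power series terms
    return np.concatenate(feats)

X = np.array([0.2, -0.4])          # two-input case, X = [x1 x2]'
phi = functional_expansion(X)
print(phi.shape)                   # expanded feature vector
```

The expanded features would then feed a single linear layer, which is what makes the FLANN cheaper to train than an MLP.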
Rational Approximant Architecture For Neural Networks
2015
Zenodo
We will illustrate this correspondence by applying an algorithm based on Fourier series expansion. The result will be a novel architecture for the neural network. ...
Moreover, the resulting network is efficient since it is based on Fourier expansion.
Fig. 1: Class of neural networks considered in [1]. ...
doi:10.5281/zenodo.36259
fatcat:ktu2x7ocnfefdj2gdldosyfpbu
Title:
1996
Zenodo
We will illustrate this correspondence by applying an algorithm based on Fourier series expansion. The result will be a novel architecture for the neural network. ...
Moreover, the resulting network is efficient since it is based on Fourier expansion.
Fig. 1: Class of neural networks considered in [1]. ...
doi:10.5281/zenodo.36207
fatcat:nrlsarmbsbfrvpagkvrc73kbau
Steam turbine stress control using NARX neural network
2015
Open Engineering
Additionally, the NARX neural network, which was trained based on the FE model, includes: nonlinearity of steam expansion in the turbine steam path during transients, nonlinearity of heat exchange inside the turbine ...
In this article, NARX neural network stress control is shown using the example of the HP rotor of an 18K390 turbine. ...
Designations: NN - neural network, TDL - tapped delay inputs,
Figure 3: MATLAB implementation of the NARX neural network responsible for assessment of critical point stress: (a) parallel architecture, (b) series ...
doi:10.1515/eng-2015-0043
fatcat:z4b2t4d2cnbmdebjm5d6sivaam
Taylor Series Prediction of Time Series Data with Error Propagated by Artificial Neural Network
2014
International Journal of Computer Applications
A new algorithm based on Taylor series expansion and artificial neural network is presented. Based on Taylor series algorithm and ARIMA model, the Sunspot numbers are forecasted and compared. ...
Modeling and forecasting of a time series data is an integral part of the Data Mining. Sun spot numbers observed on the sun are a good candidate for a time series. ...
Its definition was given as follows [10]: if f has a power series representation (expansion) at a point a and the radius of convergence of the power series is R > 0, that is, if f(x) = Σ_{n=0}^{∞} c_n (x − a)^n for |x − a| < R ...
doi:10.5120/15470-4112
fatcat:d7u7zapmzbcitdfyeawpemhtzi
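The Taylor series definition cited above can be checked numerically; this minimal sketch sums partial Taylor terms for exp about a = 0 (an illustration of the expansion itself, not the paper's forecasting algorithm):

```python
import math

def taylor_exp(x, n_terms=10, a=0.0):
    """Partial Taylor sum for exp about a: sum_n f^(n)(a)/n! * (x - a)^n.
    Every derivative of exp at a equals exp(a)."""
    return sum(math.exp(a) * (x - a)**n / math.factorial(n)
               for n in range(n_terms))

print(taylor_exp(1.0, 10))   # close to e = 2.71828...
```

Adding terms shrinks the error roughly like the first omitted term, which is why truncated Taylor sums are usable as local predictors.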
A New Technique for Unbalance Current and Voltage Estimation With Neural Networks
2005
IEEE Transactions on Power Systems
In this paper, a new measurement procedure based on neural networks for the estimation of harmonic powers and current/voltage-symmetrical components is presented. ...
A third block is another feedforward neural network that obtains symmetrical components of current/voltage harmonics and harmonic active/reactive powers. ...
It comes from the Fourier series expansion of (14). Equation (14) is equal to (13). ...
doi:10.1109/tpwrs.2005.846051
fatcat:kc7n43bg65dbtnsprcfbuppqqy
The Power of Extra Analog Neuron
[chapter]
2014
Lecture Notes in Computer Science
Do neural networks with one extra analog unit accept at most context-sensitive languages? ...
Series: a power series Σ_{k=0}^{∞} b_k a^k is eventually quasi-periodic if there is a real number P and an increasing infinite sequence of indices 0 ≤ k_1 < k_2 < k_3 < ⋯ such that m_i = k_{i+1} − k_i is bounded ...
Open problems: • complete the analysis for |a| > 1 • a necessary condition for L(R) to be regular (Rényi, 1957) • a power series c = Σ_{k=1}^{∞} b_k a^k can be interpreted as a so-called β-expansion of c ∈ ...
doi:10.1007/978-3-319-13749-0_21
fatcat:2cu4sojsubgtbcshg6n3e2p7ue
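The β-expansion mentioned in the open problems can be illustrated with the standard greedy digit algorithm (a generic sketch, not the chapter's construction; the function name is hypothetical):

```python
def greedy_beta_digits(c, beta, n):
    """Greedy beta-expansion: digits b_k with c = sum_k b_k * beta**(-k),
    for 0 <= c < 1 and beta > 1; each digit is the integer part of the
    rescaled remainder."""
    digits = []
    r = c
    for _ in range(n):
        r *= beta
        d = int(r)
        digits.append(d)
        r -= d
    return digits

# binary case beta = 2: 0.625 = 0.101 in base 2
print(greedy_beta_digits(0.625, 2, 3))
```

For non-integer β the same loop produces the β-expansion digits whose eventual quasi-periodicity the chapter relates to the power of the extra analog neuron.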
ChebNet: Efficient and Stable Constructions of Deep Neural Networks with Rectified Power Units using Chebyshev Approximations
[article]
2019
arXiv
pre-print
... by converting polynomial approximations given in power series into deep neural networks with optimal complexity and no approximation error. ...
However, in practice, power series are not easy to compute. In this paper, we propose a new and more stable way to construct deep RePU neural networks based on Chebyshev polynomial approximations. ...
The structures of deep RePU networks constructed by using hierarchical Chebyshev polynomial expansion (i.e. ChebNet) and those constructed by using power series expansions (i.e. ...
arXiv:1911.05467v2
fatcat:bb6znbtfuvbnpfzwzgd5fgctmq
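The contrast the abstract draws between power series and Chebyshev approximations can be seen with NumPy's Chebyshev module, which fits stably on [-1, 1] (a generic illustration of Chebyshev approximation, not ChebNet itself):

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Fit a degree-10 Chebyshev approximation to a smooth function on [-1, 1]
x = np.linspace(-1, 1, 200)
f = np.exp(x)
coeffs = C.chebfit(x, f, deg=10)    # least-squares fit in Chebyshev basis
approx = C.chebval(x, coeffs)       # evaluate the Chebyshev series
print(np.max(np.abs(approx - f)))   # small uniform error
```

Chebyshev coefficients of a smooth function decay rapidly, which is the numerical-stability advantage the paper exploits over raw power series.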
Polynomial Based Functional Link Artificial Recurrent Neural Network adaptive System for predicting Indian Stocks
2015
International Journal of Computational Intelligence Systems
Power series polynomial function with range of network weights: 1) −1.0 to +1.0, 2) −2.0 to +2.0, 3) −3.0 to +3.0 ...
A low complexity Polynomial Functional link Artificial Recurrent Neural Network (PFLARNN) has been proposed for the prediction of financial time series data. ...
Effect of varying network weights on the forecasting performance using Power series Polynomial function. ...
doi:10.1080/18756891.2015.1099910
fatcat:lrh5jyc4dbcvrbrxdr3zktkhwq
Time Series Forecasting Based on Cloud Process Neural Network
2015
International Journal of Computational Intelligence Systems
In recent years, a large literature has evolved on the use of artificial neural networks (ANN) in time series forecasting. ...
... (e.g., randomness, fuzziness) hidden in time series. Thus a cloud process neural network (CPNN) model is put forward in the paper for time series forecasting. ...
doi:10.1080/18756891.2015.1099905
fatcat:7wpsiyvtwnckjh6vuc6yivcg3a
Electricity Load Forecasting based on Framelet Neural Network Technique
2009
American Journal of Applied Sciences
It enhances the energy-efficient and reliable operation of a power system. This study shows electricity load forecasting modeling based on the Framelet Neural Network technique (FNN) for Baghdad City. ...
The framelet technique is applied to the time series data, decomposing the data into a number of framelet coefficient signals. The decomposed signals are then fed into a neural network for training. ...
A multi-layer perceptron neural network is applied to each decomposed series to predict the time series. ...
doi:10.3844/ajas.2009.970.973
fatcat:kedurt3hbzahthadd2f7h4nt3u
Exploring Transfer Function Nonlinearity in Echo State Networks
[article]
2015
arXiv
pre-print
ESN is a simple neural network architecture in which a fixed recurrent network is driven with an input signal, and the output is generated by a readout layer from the measurements of the network states ...
How, why, and to what degree the transfer function nonlinearity helps biologically inspired neural network models is not fully understood. ...
Owing to its fixed recurrent connections, training an ESN is much more efficient than ordinary recurrent neural networks (RNN), making it feasible to use its power in practical applications. ...
arXiv:1502.04423v2
fatcat:3xe4hv2hpbaevizutp5wbthr34
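The ESN description above (a fixed random recurrent reservoir driven by the input, with only a linear readout trained) can be sketched minimally; the sizes, spectral radius, and least-squares readout are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n_res, n_in = 100, 1

# Fixed random reservoir and input weights (never trained)
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1
W_in = rng.normal(size=(n_res, n_in))

def run_reservoir(u_seq):
    """Drive the fixed reservoir with an input sequence; collect states."""
    x = np.zeros(n_res)
    states = []
    for u in u_seq:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u))  # nonlinear transfer
        states.append(x.copy())
    return np.array(states)

# Only the linear readout is trained (here: ordinary least squares)
u = np.sin(np.linspace(0, 8 * np.pi, 400))
X = run_reservoir(u[:-1])
y = u[1:]                                   # one-step-ahead target
W_out = np.linalg.lstsq(X, y, rcond=None)[0]
pred = X @ W_out
print(np.mean((pred - y) ** 2))             # small training error
```

Because only `W_out` is fitted, training reduces to a linear regression, which is the efficiency advantage over fully trained RNNs that the abstract notes.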