Convergence of Gradient Descent Algorithm with Penalty Term For Recurrent Neural Networks

Xiaoshuai Ding, Kuaini Wang
2014 International Journal of Multimedia and Ubiquitous Engineering  
This paper investigates a gradient descent algorithm with a penalty term for a recurrent neural network. The penalty considered here is a term proportional to the norm of the weights; its primary role in the method is to control the magnitude of the weights. After proving that all of the weights remain automatically bounded during the iteration process, we also present some deterministic convergence results for this learning method, indicating that the gradient of the error function goes to zero (weak convergence) and the weight sequence goes to a fixed point (strong convergence), respectively. A numerical example is provided to support the theoretical analysis. The analysis indicates the convergence of E(w^m) and E_w(w^m), which is called weak convergence; the strong convergence of the weight sequence {w^m} itself is guaranteed in Conclusion (d).
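For reference, a penalized gradient method of the kind described typically minimizes the error function plus a quadratic weight penalty. The paper's exact constants and notation may differ, so the following is a sketch reconstructed from the abstract's notation, with penalty coefficient λ > 0 and learning rate η > 0 as assumed symbols:

```latex
\tilde{E}(w) = E(w) + \frac{\lambda}{2}\,\|w\|^2,
\qquad
w^{m+1} = w^m - \eta\,\nabla\tilde{E}(w^m)
        = w^m - \eta\left(E_w(w^m) + \lambda\,w^m\right),
\quad m = 0, 1, 2, \ldots
```

The extra term λ w^m in the update shrinks the weights toward the origin at every iteration; this shrinkage is the mechanism behind the automatic boundedness of the weights claimed in the abstract.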
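A minimal numerical sketch of gradient descent with such a penalty, assuming a small Elman-style recurrent network trained on a toy sine-prediction task. The architecture, the step size eta, and the penalty coefficient lam are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sequence task: predict the next value of a sine wave.
T = 50
xs = np.sin(np.linspace(0, 4 * np.pi, T + 1))
inputs, targets = xs[:-1], xs[1:]

n_hidden = 8
W_in = rng.normal(scale=0.3, size=(n_hidden, 1))          # input -> hidden
W_rec = rng.normal(scale=0.3, size=(n_hidden, n_hidden))  # hidden -> hidden
W_out = rng.normal(scale=0.3, size=(1, n_hidden))         # hidden -> output

def loss_and_grads(W_in, W_rec, W_out):
    """Forward pass through time, then backpropagation through time."""
    h = np.zeros((n_hidden, 1))
    hs, ys = [h], []
    err = 0.0
    for t in range(T):
        h = np.tanh(W_in * inputs[t] + W_rec @ h)
        hs.append(h)
        y = (W_out @ h).item()
        ys.append(y)
        err += 0.5 * (y - targets[t]) ** 2
    gW_in = np.zeros_like(W_in)
    gW_rec = np.zeros_like(W_rec)
    gW_out = np.zeros_like(W_out)
    dh_next = np.zeros((n_hidden, 1))
    for t in reversed(range(T)):
        dy = ys[t] - targets[t]
        gW_out += dy * hs[t + 1].T
        dh = W_out.T * dy + dh_next
        da = dh * (1 - hs[t + 1] ** 2)   # derivative of tanh
        gW_in += da * inputs[t]
        gW_rec += da @ hs[t].T
        dh_next = W_rec.T @ da
    return err, gW_in, gW_rec, gW_out

eta, lam = 0.01, 1e-3   # learning rate and penalty coefficient (assumed values)
for m in range(2001):
    err, gW_in, gW_rec, gW_out = loss_and_grads(W_in, W_rec, W_out)
    # Penalized update: descend on E(w) + (lam/2) * ||w||^2.
    W_in -= eta * (gW_in + lam * W_in)
    W_rec -= eta * (gW_rec + lam * W_rec)
    W_out -= eta * (gW_out + lam * W_out)
    if m % 500 == 0:
        gnorm = np.sqrt((gW_in**2).sum() + (gW_rec**2).sum() + (gW_out**2).sum())
        wnorm = np.sqrt((W_in**2).sum() + (W_rec**2).sum() + (W_out**2).sum())
        print(f"iter {m}: error={err:.4f}  ||grad||={gnorm:.4f}  ||w||={wnorm:.4f}")
```

On this toy problem the printed weight norm ||w|| should stay bounded while the gradient norm ||grad|| decays, mirroring the boundedness and weak-convergence claims; the weights settling toward fixed values illustrates the strong-convergence claim.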
doi:10.14257/ijmue.2014.9.9.17