EigenNet: Towards Fast and Structural Learning of Deep Neural Networks
Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
Deep Neural Networks (DNNs) are difficult to train and prone to overfitting. We address these two issues by introducing EigenNet, an architecture that not only accelerates training but also adjusts the number of hidden neurons to reduce overfitting. These goals are achieved by whitening the information flows of DNNs and removing those eigenvectors that may capture noise. The former improves the conditioning of the Fisher information matrix, whilst the latter increases generalization capability.
doi:10.24963/ijcai.2017/338 dblp:conf/ijcai/Luo17
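The two ideas in the abstract (whitening a layer's information flow, then discarding low-variance eigen-directions) can be illustrated with a small numpy sketch. This is a hypothetical illustration of PCA-style whitening and eigenvector pruning, not the paper's actual EigenNet layer; the function name, the `energy` threshold, and the epsilon are all assumptions.

```python
import numpy as np

def whiten_and_prune(acts, energy=0.99):
    """Whiten a batch of layer activations and drop low-variance
    eigen-directions. Illustrative sketch only, not the paper's method.

    acts:   (batch, features) array of activations.
    energy: keep the leading eigenvectors whose eigenvalues account
            for this fraction of the total variance; the rest are
            treated as noise directions and removed.
    """
    centered = acts - acts.mean(axis=0, keepdims=True)
    cov = centered.T @ centered / max(len(acts) - 1, 1)
    eigvals, eigvecs = np.linalg.eigh(cov)            # ascending order
    order = np.argsort(eigvals)[::-1]                 # sort descending
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    # Smallest leading set of eigenvectors covering `energy` variance.
    keep = int(np.searchsorted(np.cumsum(eigvals) / eigvals.sum(), energy)) + 1
    V, lam = eigvecs[:, :keep], eigvals[:keep]
    # Project onto retained eigenvectors and rescale each direction to
    # unit variance: the whitened output has (near-)identity covariance.
    return centered @ V / np.sqrt(lam + 1e-8)

rng = np.random.default_rng(0)
# Correlated features, as real layer activations tend to be.
x = rng.normal(size=(512, 16)) @ rng.normal(size=(16, 16))
z = whiten_and_prune(x)
```

After whitening, the sample covariance of `z` is close to the identity, which is the better-conditioned geometry the abstract attributes to improved Fisher information conditioning; the pruning step may also shrink the feature dimension.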