Exploring active subspaces in neural network cost functions
[article]
2017
Neural networks are a popular method in supervised machine learning for solving classification problems. To solve a classification problem, a neural network must first be trained on data; training minimizes a measure of error, called a cost function, via a heuristic method such as gradient descent. However, neural networks commonly seek to classify high-dimensional data, which requires that many parameters be learned. This process is computationally expensive, and gradient descent is often …
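The training loop the abstract describes can be sketched in a few lines. The example below is not from the thesis: the synthetic data, the tiny 10-5-1 architecture, and the learning rate are assumptions chosen only to illustrate minimizing a cross-entropy cost function by plain gradient descent.

```python
# A minimal sketch (not from the thesis) of the training loop the abstract
# describes: a small one-hidden-layer classifier whose cross-entropy cost
# is minimized by plain gradient descent. Data, architecture, and learning
# rate below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 2-class data in 10 dimensions (assumed, for illustration only).
X = rng.normal(size=(200, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# Parameters of a 10 -> 5 -> 1 network; even this toy model has many weights.
W1 = rng.normal(scale=0.1, size=(10, 5))
b1 = np.zeros(5)
W2 = rng.normal(scale=0.1, size=(5, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_and_grads(W1, b1, W2, b2):
    """Cross-entropy cost and its gradients w.r.t. every parameter."""
    h = np.tanh(X @ W1 + b1)                 # hidden activations
    p = sigmoid(h @ W2 + b2).ravel()         # predicted class probabilities
    eps = 1e-12
    cost = -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))
    # Backpropagation, written out by hand to keep the sketch dependency-free.
    dz2 = (p - y)[:, None] / len(y)          # d cost / d pre-sigmoid output
    dW2 = h.T @ dz2
    db2 = dz2.sum(axis=0)
    dh = dz2 @ W2.T
    dz1 = dh * (1 - h**2)                    # tanh derivative
    dW1 = X.T @ dz1
    db1 = dz1.sum(axis=0)
    return cost, (dW1, db1, dW2, db2)

lr = 0.5  # assumed step size
for step in range(500):
    cost, (dW1, db1, dW2, db2) = cost_and_grads(W1, b1, W2, b2)
    # Plain gradient descent step on every parameter.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
    if step % 100 == 0:
        print(f"step {step:3d}  cost {cost:.4f}")
```

In the active-subspace analysis the title refers to, per-parameter gradients like the ones computed above are the raw ingredient: sampled over many parameter values, their averaged outer product C = E[∇J ∇Jᵀ] is eigendecomposed, and the dominant eigenvectors span the few directions along which the cost varies most.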
doi:10.25676/11124/170925
fatcat:uuclvytzczehbodem3dnolbacm