Exploring active subspaces in neural network cost functions [article]

Jonathan W. Helland, Paul G. Constantine (Colorado School of Mines)
2017
Neural networks are a popular supervised machine learning method for solving classification problems. Before a neural network can classify data, it must be trained: a measure of error called a cost function is minimized via a heuristic method such as gradient descent. However, neural networks commonly seek to classify high-dimensional data, which requires that many parameters be learned. This process is computationally expensive, and gradient descent is often beset by local minima in the network's cost function that may be far from the global minimum. It is desirable, then, to work with an exploitable low-dimensional subspace of the high-dimensional parameter space. We seek this dimensionality reduction by searching for active subspaces in neural network cost functions.
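The standard active subspace construction (as developed in Constantine's work) estimates the directions along which a function varies most, on average, by eigendecomposing a Monte Carlo estimate of the matrix C = E[∇f(x) ∇f(x)ᵀ]. The sketch below illustrates that construction in NumPy; it is a minimal example, not the paper's implementation, and the function names and the toy quadratic cost are hypothetical.

```python
import numpy as np

def estimate_active_subspace(grad_f, samples, k):
    """Estimate a k-dimensional active subspace of a scalar cost function.

    grad_f:  callable mapping a parameter vector to the cost gradient.
    samples: (M, m) array of parameter samples drawn from some density.
    k:       number of active directions to retain.
    """
    # Monte Carlo estimate of C = E[grad f(x) grad f(x)^T].
    grads = np.array([grad_f(x) for x in samples])   # shape (M, m)
    C = grads.T @ grads / grads.shape[0]

    # Eigenvectors with the largest eigenvalues span the directions
    # along which the cost varies most on average.
    eigvals, eigvecs = np.linalg.eigh(C)
    order = np.argsort(eigvals)[::-1]
    return eigvals[order], eigvecs[:, order[:k]]

# Toy example (hypothetical): a quadratic cost f(x) = x^T A x with one
# dominant direction, so the 1-D active subspace should align with it.
rng = np.random.default_rng(0)
A = np.diag([10.0, 1.0, 0.01])
grad = lambda x: 2.0 * A @ x
X = rng.standard_normal((500, 3))
vals, W = estimate_active_subspace(grad, X, k=1)
```

A gap between consecutive eigenvalues in `vals` suggests a low-dimensional subspace (spanned by the columns of `W`) that captures most of the cost function's variation; in the neural network setting, `grad_f` would be the gradient of the training cost with respect to the network's parameters.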
doi:10.25676/11124/170925