A copy of this work (application/pdf) was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2022; you can also visit the original URL.
High-performing neural network models of visual cortex benefit from high latent dimensionality
[article] · 2022 · bioRxiv pre-print
Geometric descriptions of deep neural networks (DNNs) have the potential to uncover core principles of computational models in neuroscience, while abstracting over the details of model architectures and training paradigms. Here we examined the geometry of DNN models of visual cortex by quantifying the latent dimensionality of their natural image representations. The prevailing view holds that optimal DNNs compress their representations onto low-dimensional manifolds to achieve invariance and […]
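The abstract describes quantifying the latent dimensionality of a network's image representations. One common way to do this (a hedged sketch, not necessarily the exact metric used in the paper) is the participation ratio of the covariance eigenspectrum, often called effective dimensionality; the function name and example data below are illustrative:

```python
import numpy as np

def effective_dimensionality(features: np.ndarray) -> float:
    """Participation ratio of the covariance eigenspectrum:
    ED = (sum(lam))^2 / sum(lam^2). Values near 1 mean variance
    is concentrated on a single axis; values near the feature
    count mean variance is spread across many latent dimensions.

    `features` has shape (n_stimuli, n_units), e.g. one row of
    unit activations per natural image.
    """
    centered = features - features.mean(axis=0, keepdims=True)
    cov = centered.T @ centered / (features.shape[0] - 1)
    lam = np.linalg.eigvalsh(cov)
    lam = np.clip(lam, 0.0, None)  # guard against tiny negative eigenvalues
    return lam.sum() ** 2 / np.square(lam).sum()

# Hypothetical example: 1000 "images" x 50 units, but the data is
# generated from only 5 latent factors, so ED cannot exceed 5.
rng = np.random.default_rng(0)
latent = rng.normal(size=(1000, 5)) @ rng.normal(size=(5, 50))
print(effective_dimensionality(latent))
```

Because the example data has rank 5, the printed value is bounded above by 5, landing somewhat below it since the nonzero eigenvalues are unequal.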
doi:10.1101/2022.07.13.499969
fatcat:hyb2biojrbgfjf675y4vaguhgm