A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; you can also visit the original URL.
The file type is application/pdf.
Tensorized Embedding Layers
2020
Findings of the Association for Computational Linguistics: EMNLP 2020
unpublished
Embedding layers, which transform input words into real-valued vectors, are key components of the deep neural networks used in natural language processing. However, when the vocabulary is large, the corresponding weight matrices can be enormous, which precludes their deployment in a limited-resource setting. We introduce a novel way of parameterizing embedding layers based on the Tensor Train decomposition, which allows compressing the model significantly at the cost of a negligible drop or even a […]
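To make the idea concrete, here is a minimal NumPy sketch of a Tensor-Train-parameterized embedding lookup. The factorization sizes, TT-rank, and function names below are illustrative assumptions, not the paper's actual implementation: the vocabulary size V and embedding dimension D are each factored into two smaller numbers, and two small TT cores replace the dense V × D weight matrix.

```python
import numpy as np

# Illustrative sketch of a TT-parameterized embedding layer.
# All shapes, the rank, and the names here are assumptions for
# demonstration, not the paper's exact configuration.

rng = np.random.default_rng(0)

v1, v2 = 100, 100   # vocabulary factored as V = v1 * v2 = 10,000
d1, d2 = 16, 16     # embedding dim factored as D = d1 * d2 = 256
r = 8               # TT-rank, controlling the compression level

# Two TT cores stand in for the dense (V, D) embedding matrix.
G1 = rng.standard_normal((v1, d1, r)) * 0.02   # core 1: (v1, d1, r)
G2 = rng.standard_normal((r, v2, d2)) * 0.02   # core 2: (r, v2, d2)

def tt_embed(word_id):
    """Reconstruct a single embedding row from the TT cores."""
    i1, i2 = divmod(word_id, v2)                       # factor the row index
    block = np.einsum('ar,rb->ab', G1[i1], G2[:, i2])  # contract over rank r
    return block.reshape(d1 * d2)                      # flatten to length D

vec = tt_embed(4242)
dense_params = (v1 * v2) * (d1 * d2)   # parameters in the dense matrix
tt_params = G1.size + G2.size          # parameters in the TT cores
print(vec.shape, dense_params, tt_params)
```

With these (assumed) sizes, the two cores hold 25,600 parameters versus 2,560,000 for the dense matrix, a 100× compression; the TT-rank `r` trades compression against expressiveness.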
doi:10.18653/v1/2020.findings-emnlp.436
fatcat:kl6zzwuhavha7iaa5gyhkawy2e