Tensorized Embedding Layers

Oleksii Hrinchuk, Valentin Khrulkov, Leyla Mirvakhabova, Elena Orlova, Ivan Oseledets
Findings of the Association for Computational Linguistics: EMNLP 2020
The embedding layers that transform input words into real-valued vectors are key components of deep neural networks used in natural language processing. However, when the vocabulary is large, the corresponding weight matrices can be enormous, which precludes their deployment in resource-limited settings. We introduce a novel way of parameterizing embedding layers based on the Tensor Train decomposition, which allows compressing the model significantly at the cost of a negligible drop, or even a slight gain, in performance. We evaluate our method on a wide range of benchmarks in natural language processing and analyze the trade-off between performance and compression ratio for a wide range of architectures, from MLPs to LSTMs and Transformers.
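The core idea of a Tensor Train (TT) parameterization of an embedding matrix can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the factor shapes, TT-ranks, and the `tt_embedding` helper are assumptions chosen for readability. A vocabulary of size N = n1·n2·n3 and embedding dimension D = m1·m2·m3 let the N×D weight matrix be stored as three small TT cores, and a single row is recovered by contracting one slice of each core.

```python
import numpy as np

# Illustrative factorization: vocabulary 8*10*10 = 800 words,
# embedding dimension 4*4*4 = 64. Ranks are an arbitrary choice.
vocab_factors = [8, 10, 10]
dim_factors = [4, 4, 4]
ranks = [1, 8, 8, 1]  # TT-ranks; boundary ranks are always 1

rng = np.random.default_rng(0)
# Core k has shape (r_{k-1}, n_k, m_k, r_k).
cores = [
    rng.standard_normal(
        (ranks[k], vocab_factors[k], dim_factors[k], ranks[k + 1])
    ) * 0.1
    for k in range(len(vocab_factors))
]
# Storage: 256 + 2560 + 320 = 3136 parameters instead of
# 800 * 64 = 51200 for the dense embedding matrix.

def tt_embedding(word_id):
    """Look up one embedding row without materializing the full matrix."""
    # Factor the flat word id into a mixed-radix multi-index (i1, i2, i3).
    idx = []
    for n in reversed(vocab_factors):
        idx.append(word_id % n)
        word_id //= n
    idx.reverse()
    # Contract the selected slices left to right.
    out = cores[0][:, idx[0]]            # shape (1, m1, r1)
    for k in range(1, len(cores)):
        nxt = cores[k][:, idx[k]]        # shape (r_{k-1}, m_k, r_k)
        out = np.einsum('aib,bjc->aijc', out, nxt)
        out = out.reshape(1, -1, nxt.shape[-1])
    return out.reshape(-1)               # length-64 embedding vector
```

Contracting all three cores reproduces the full 800×64 matrix exactly, so the lookup above agrees row-by-row with a dense embedding while training only the compact cores.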
doi:10.18653/v1/2020.findings-emnlp.436