SHAPE: Shifted Absolute Position Embedding for Transformers

Shun Kiyono, Sosuke Kobayashi, Jun Suzuki, Kentaro Inui
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021)
Position representation is crucial for building position-aware representations in Transformers. Existing position representations suffer from a lack of generalization to test data with unseen lengths or from high computational cost. We investigate shifted absolute position embedding (SHAPE) to address both issues. The basic idea of SHAPE is to achieve shift invariance, which is a key property of recent successful position representations, by randomly shifting absolute positions during training. We demonstrate that SHAPE is empirically comparable to its counterpart while being simpler and faster.
doi:10.18653/v1/2021.emnlp-main.266
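
The idea described in the abstract can be sketched in a few lines. Below is a minimal illustration, not the authors' implementation: it assumes sinusoidal absolute position embeddings and a single random offset k drawn uniformly from {0, ..., K} per sequence during training, with k = 0 at inference. The function names and the `max_shift` hyperparameter are illustrative.

```python
import math
import torch

def sinusoidal_embedding(positions: torch.Tensor, d_model: int) -> torch.Tensor:
    """Standard sinusoidal absolute position embedding, evaluated at
    arbitrary (possibly shifted) integer positions of shape (batch, seq_len)."""
    inv_freq = torch.exp(
        -math.log(10000.0)
        * torch.arange(0, d_model, 2, dtype=torch.float32) / d_model
    )
    angles = positions.unsqueeze(-1).float() * inv_freq   # (batch, seq_len, d_model/2)
    emb = torch.zeros(*positions.shape, d_model)
    emb[..., 0::2] = torch.sin(angles)                    # even dimensions
    emb[..., 1::2] = torch.cos(angles)                    # odd dimensions
    return emb

def shape_positions(batch_size: int, seq_len: int,
                    max_shift: int, training: bool) -> torch.Tensor:
    """SHAPE (sketch): during training, add one random offset per sequence
    to all of its absolute positions; at inference, use unshifted positions."""
    base = torch.arange(seq_len).unsqueeze(0).expand(batch_size, seq_len)
    if training:
        # One offset per sequence, drawn from {0, ..., max_shift} (assumed range).
        k = torch.randint(0, max_shift + 1, (batch_size, 1))
        return base + k
    return base

# Usage: replace the usual position indices with shifted ones.
positions = shape_positions(batch_size=2, seq_len=8, max_shift=100, training=True)
pos_emb = sinusoidal_embedding(positions, d_model=16)  # added to token embeddings
```

Because every position in a sequence is shifted by the same k, relative distances between tokens are unchanged; exposing the model to many shifted views of the same sequence is what encourages the shift-invariant representations the abstract refers to.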