Char2Subword: Extending the Subword Embedding Space Using Robust Character Compositionality

Gustavo Aguilar, Bryan McCann, Tong Niu, Nazneen Rajani, Nitish Shirish Keskar, Thamar Solorio
Findings of the Association for Computational Linguistics: EMNLP 2021
Byte-pair encoding (BPE) is a ubiquitous algorithm in the subword tokenization process of language models as it provides multiple benefits. However, this process is solely based on pre-training data statistics, making it hard for the tokenizer to handle infrequent spellings. On the other hand, though robust to misspellings, pure character-level models often lead to unreasonably long sequences and make it harder for the model to learn meaningful words. To alleviate these challenges, we propose a character-based subword module (char2subword) that learns the subword embedding table in pre-trained models like BERT. Our char2subword module builds representations from characters out of the subword vocabulary, and it can be used as a drop-in replacement of the subword embedding table. The module is robust to character-level alterations such as misspellings, word inflection, casing, and punctuation. We integrate it further with BERT through pre-training while keeping the BERT transformer parameters fixed, thus providing a practical method. Finally, we show that incorporating our module into mBERT significantly improves performance on the social media linguistic code-switching evaluation (LinCE) benchmark.
* Work performed as a summer intern at Salesforce.
** Work performed as a manager while at Salesforce.
doi:10.18653/v1/2021.findings-emnlp.141
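The abstract only sketches the architecture at a high level. As a rough illustration of the idea it describes, the PyTorch module below encodes a subword's character sequence into a single vector of BERT's hidden size, so it could stand in for a row of the subword embedding table. This is a minimal sketch, not the authors' implementation: the class name, layer sizes, character-transformer depth, and the mean-pooling choice are all assumptions for illustration.

```python
import torch
import torch.nn as nn

class Char2SubwordSketch(nn.Module):
    """Illustrative module: maps a subword's character IDs to one
    embedding vector, mimicking a row of the subword embedding table.
    All hyperparameters below are assumptions, not the paper's values."""

    def __init__(self, num_chars=1000, char_dim=128, hidden_dim=768,
                 num_layers=2, num_heads=4, max_chars=16):
        super().__init__()
        self.char_emb = nn.Embedding(num_chars, char_dim, padding_idx=0)
        self.pos_emb = nn.Embedding(max_chars, char_dim)
        layer = nn.TransformerEncoderLayer(
            d_model=char_dim, nhead=num_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        # Project to the pre-trained model's hidden size (768 for BERT-base)
        self.proj = nn.Linear(char_dim, hidden_dim)

    def forward(self, char_ids):
        # char_ids: (batch, max_chars); 0 marks padding positions
        positions = torch.arange(char_ids.size(1), device=char_ids.device)
        x = self.char_emb(char_ids) + self.pos_emb(positions)
        pad_mask = char_ids.eq(0)
        h = self.encoder(x, src_key_padding_mask=pad_mask)
        # Mean-pool over non-padding characters -> one vector per subword
        keep = (~pad_mask).unsqueeze(-1).float()
        pooled = (h * keep).sum(dim=1) / keep.sum(dim=1).clamp(min=1.0)
        return self.proj(pooled)  # (batch, hidden_dim)
```

Consistent with the abstract's description, a module like this could first be trained to reproduce the frozen pre-trained embedding rows (e.g., with a regression loss against them) and then integrated through further pre-training while the BERT transformer parameters stay fixed; the specific loss here is an assumption, not stated in the abstract.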