CascadeBERT: Accelerating Inference of Pre-trained Language Models via Calibrated Complete Models Cascade

Lei Li, Yankai Lin, Deli Chen, Shuhuai Ren, Peng Li, Jie Zhou, Xu Sun
Findings of the Association for Computational Linguistics: EMNLP 2021
Dynamic early exiting aims to accelerate the inference of pre-trained language models (PLMs) by emitting predictions in internal layers without passing through the entire model. In this paper, we empirically analyze the working mechanism of dynamic early exiting and find that it faces a performance bottleneck under high speed-up ratios. On the one hand, the PLMs' representations in shallow layers lack high-level semantic information and thus are not sufficient for accurate predictions. On the other hand, the exiting decisions made by internal classifiers are unreliable, leading to wrongly emitted early predictions. We instead propose a new framework for accelerating the inference of PLMs, CascadeBERT, which dynamically selects proper-sized, complete models in a cascading manner, providing comprehensive representations for prediction. We further devise a difficulty-aware objective, encouraging the model to output class probabilities that reflect the real difficulty of each instance, for a more reliable cascading mechanism. Experimental results show that CascadeBERT achieves an overall 15% improvement under 4x speed-up compared with existing dynamic early exiting methods on six classification tasks, yielding more calibrated and accurate predictions.
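The cascading mechanism described above can be realized with a simple confidence threshold: each complete model in the cascade, from smallest to largest, emits a prediction only if its maximum class probability clears the threshold; otherwise the instance falls through to the next, larger model. The sketch below illustrates this idea under that assumption; the function names, threshold value, and interface are illustrative, not the paper's actual implementation.

```python
import numpy as np

def cascade_predict(logits_fns, threshold=0.9):
    """Confidence-based cascade over complete models.

    `logits_fns` is a list of callables, ordered from smallest (cheapest)
    to largest model, each returning class logits for the current input.
    Returns (predicted_class, confidence, index_of_model_used).
    """
    for i, fn in enumerate(logits_fns):
        logits = np.asarray(fn(), dtype=float)
        # Numerically stable softmax to obtain class probabilities.
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()
        # Emit the prediction if this model is confident enough,
        # or if it is the last (largest) model in the cascade.
        if probs.max() >= threshold or i == len(logits_fns) - 1:
            return int(probs.argmax()), float(probs.max()), i

# Hypothetical two-model cascade on a single instance:
small = lambda: [0.1, 0.2]   # low-confidence small model
large = lambda: [5.0, 0.0]   # confident large model
pred, conf, used = cascade_predict([small, large])
```

In this toy run the small model's softmax peaks near 0.52, below the threshold, so the instance falls through to the large model, which predicts class 0 with high confidence. The difficulty-aware objective in the paper serves exactly to make such confidence scores trustworthy signals for this fall-through decision.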
doi:10.18653/v1/2021.findings-emnlp.43