TA-MAMC at SemEval-2021 Task 4: Task-adaptive Pretraining and Multi-head Attention for Abstract Meaning Reading Comprehension
2021
Proceedings of the 15th International Workshop on Semantic Evaluation (SemEval-2021)
This paper describes our system for SemEval-2021 Task 4, Reading Comprehension of Abstract Meaning, which achieved 1st place on subtask 1 and 2nd place on subtask 2 on the leaderboard. We propose an ensemble of ELECTRA-based models with task-adaptive pretraining and a multi-head attention multiple-choice classifier on top of the pretrained model. The main contributions of our system are 1) revealing the performance discrepancy of different transformer-based pretraining models on the downstream task, 2) …
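To illustrate the classifier the abstract describes, here is a minimal PyTorch sketch, not the authors' released code: it encodes each (passage, question, option) sequence with a pretrained ELECTRA model, re-pools the token representations with a multi-head attention layer, and scores the options jointly. The class name, the [CLS]-query pooling, and num_heads=8 are illustrative assumptions.

```python
import torch
import torch.nn as nn
from transformers import ElectraModel

class MultiHeadAttnMultipleChoiceHead(nn.Module):
    """Hypothetical multiple-choice head: multi-head attention pooling
    over a pretrained ELECTRA encoder (a sketch, not the paper's code)."""

    def __init__(self, model_name="google/electra-large-discriminator",
                 num_heads=8):
        super().__init__()
        self.encoder = ElectraModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # Multi-head attention that re-pools each option's token states.
        self.attn = nn.MultiheadAttention(hidden, num_heads, batch_first=True)
        self.scorer = nn.Linear(hidden, 1)  # one logit per option

    def forward(self, input_ids, attention_mask):
        # input_ids / attention_mask: (batch, num_options, seq_len).
        b, c, s = input_ids.shape
        flat_ids = input_ids.view(b * c, s)
        flat_mask = attention_mask.view(b * c, s)
        tokens = self.encoder(flat_ids,
                              attention_mask=flat_mask).last_hidden_state
        # Use the [CLS] position as the query attending over the sequence;
        # padding positions are masked out of the attention.
        query = tokens[:, :1, :]
        pooled, _ = self.attn(query, tokens, tokens,
                              key_padding_mask=~flat_mask.bool())
        logits = self.scorer(pooled.squeeze(1)).view(b, c)
        return logits  # train with nn.CrossEntropyLoss over the option axis
```

Flattening the option axis into the batch before encoding is the standard multiple-choice setup (as in the Hugging Face `*ForMultipleChoice` heads); the ensembling and task-adaptive pretraining stages mentioned in the abstract are orthogonal to this head.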
doi:10.18653/v1/2021.semeval-1.5