On the Dangers of Stochastic Parrots: Can Language Models Be Too Big?
2021
Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency
The past three years of work in NLP have been characterized by the development and deployment of ever larger language models, especially for English. BERT, its variants, GPT-2/3, and others, most recently Switch-C, have pushed the boundaries of the possible both through architectural innovations and through sheer size. Using these pretrained models and the methodology of fine-tuning them for specific tasks, researchers have extended the state of the art on a wide array of tasks as measured by leaderboards on specific benchmarks for English.
doi:10.1145/3442188.3445922