Table Pretraining: A Survey on Model Architectures, Pretraining Objectives, and Downstream Tasks
[article]
2022
Since a vast number of tables can be easily collected from web pages, spreadsheets, PDFs, and various other document types, a flurry of table pretraining frameworks have been proposed following the success of pretraining on text and images, and they have achieved new state-of-the-art results on various tasks such as table question answering, table type recognition, column relation classification, table search, and formula prediction. To fully use the supervision signals in unlabeled tables, a variety of pretraining objectives have been designed and evaluated.
doi:10.48550/arxiv.2201.09745
fatcat:xx3r3o5qgjc3nizxhbt7x5ytf4