43 Hits in 4.3 sec

TAPAS: Weakly Supervised Table Parsing via Pre-training [article]

Jonathan Herzig, Paweł Krzysztof Nowak, Thomas Müller, Francesco Piccinno, Julian Martin Eisenschlos
2020 arXiv   pre-print
TAPAS extends BERT's architecture to encode tables as input, initializes from an effective joint pre-training of text segments and tables crawled from Wikipedia, and is trained end-to-end.  ...  TAPAS trains from weak supervision, and predicts the denotation by selecting table cells and optionally applying a corresponding aggregation operator to such selection.  ...  Fine-tuning Overview We formally define table parsing in a weakly supervised setup as follows.  ... 
arXiv:2004.02349v2 fatcat:tbcfct2wgvcdvdeaqfu6iuiuoy
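
The snippet above describes the core TAPAS mechanism: a BERT-style encoder over the question plus a flattened table, trained from weak denotation supervision, predicting an answer by selecting cells and optionally an aggregation operator. A minimal inference sketch using the publicly released checkpoint on the HuggingFace Hub (the transformers API shown follows the library docs; exact argument names can vary across versions):

import pandas as pd
from transformers import TapasTokenizer, TapasForQuestionAnswering

# WTQ-finetuned checkpoint released with the paper; table cells must be strings.
name = "google/tapas-base-finetuned-wtq"
tokenizer = TapasTokenizer.from_pretrained(name)
model = TapasForQuestionAnswering.from_pretrained(name)

table = pd.DataFrame({"City": ["Paris", "Berlin"],
                      "Population": ["2148000", "3645000"]})
inputs = tokenizer(table=table, queries=["What is the population of Berlin?"],
                   padding="max_length", return_tensors="pt")
outputs = model(**inputs)

# TAPAS predicts (a) a score per table cell and (b) an aggregation operator
# (NONE, SUM, COUNT, AVERAGE) applied to the selected cells.
coords, agg = tokenizer.convert_logits_to_predictions(
    inputs, outputs.logits.detach(), outputs.logits_aggregation.detach())
print(coords, agg)  # e.g. [[(1, 1)]] and [0]: the cell at row 1 / col 1, no aggregation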

TaPas: Weakly Supervised Table Parsing via Pre-training

Jonathan Herzig, Pawel Krzysztof Nowak, Thomas Müller, Francesco Piccinno, Julian Eisenschlos
2020 Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics   unpublished
TAPAS extends BERT's architecture to encode tables as input, initializes from an effective joint pre-training of text segments and tables crawled from Wikipedia, and is trained end-to-end.  ...  TAPAS trains from weak supervision, and predicts the denotation by selecting table cells and optionally applying a corresponding aggregation operator to such selection.  ...  Fine-tuning Overview We formally define table parsing in a weakly supervised setup as follows.  ... 
doi:10.18653/v1/2020.acl-main.398 fatcat:z64zdhwjnba7tf5s6fjzkgx3ge

GraPPa: Grammar-Augmented Pre-Training for Table Semantic Parsing [article]

Tao Yu and Chien-Sheng Wu and Xi Victoria Lin and Bailin Wang and Yi Chern Tan and Xinyi Yang and Dragomir Radev and Richard Socher and Caiming Xiong
2021 arXiv   pre-print
On four popular fully supervised and weakly supervised table semantic parsing benchmarks, GraPPa significantly outperforms RoBERTa-large as the feature representation layers and establishes new state-of-the-art  ...  We present GraPPa, an effective pre-training approach for table semantic parsing that learns a compositional inductive bias in the joint representations of textual and tabular data.  ...  WEAKLY-SUPERVISED SEMANTIC PARSING We also consider weakly-supervised semantic parsing tasks, which are very different from SQLguided learning in pre-training.  ... 
arXiv:2009.13845v2 fatcat:dzy5bu2cbna2lg7m6zun4lavhm
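
To make the "grammar-augmented" idea concrete, here is an illustrative sketch (not GraPPa's released code) of generating synthetic utterance/SQL pre-training pairs from a tiny synchronous grammar over a table schema; the productions, schema, and names below are invented:

import random

# Aligned (question template, SQL template) productions; {table}/{col}/{val}
# are nonterminals filled from the schema.
PRODUCTIONS = [
    ("show the {col} of all {table}",
     "SELECT {col} FROM {table}"),
    ("how many {table} have {col} equal to {val}",
     "SELECT COUNT(*) FROM {table} WHERE {col} = '{val}'"),
]

def sample_pair(schema):
    """Sample one synthetic (utterance, SQL) pre-training pair."""
    table = random.choice(list(schema))
    col = random.choice(schema[table]["columns"])
    val = random.choice(schema[table]["values"][col])
    question, sql = random.choice(PRODUCTIONS)
    slots = {"table": table, "col": col, "val": val}
    return question.format(**slots), sql.format(**slots)

schema = {"singers": {"columns": ["name", "country"],
                      "values": {"name": ["Adele"], "country": ["UK"]}}}
print(sample_pair(schema))
# GraPPa pre-trains on such pairs with an SQL-grounded objective alongside MLM.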

Exploring Decomposition for Table-based Fact Verification [article]

Xiaoyu Yang, Xiaodan Zhu
2021 arXiv   pre-print
Leveraging the programs synthesized by a weakly supervised semantic parser, we propose a program-guided approach to constructing a pseudo dataset for decomposition model training.  ...  Although pre-trained language models have demonstrated a strong capability in verifying simple statements, they struggle with complex statements that involve multiple operations.  ...  TaPas: Weakly supervised table parsing via pre-training. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 4320-4333, Online.  ... 
arXiv:2109.11020v1 fatcat:ssdxrjlp7jdx5mb4bh7t7m2ega
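
The program-guided pseudo-data idea above can be conveyed with a toy sketch: split a parser-produced program at its top-level operation so each operand yields a simpler sub-statement to verify. The program format and templates below are invented for illustration:

def decompose(program):
    """Split a (nested) program into simpler sub-statements to verify."""
    op, *args = program
    if op == "and":                      # verify each conjunct separately
        subs = []
        for arg in args:
            subs.extend(decompose(arg))
        return subs
    if op == "eq":
        return ["check that {} equals {}".format(*args)]
    return ["check {}({})".format(op, ", ".join(map(str, args)))]

program = ("and",
           ("eq", "count(rows where medal = gold)", "3"),
           ("eq", "max(year)", "2016"))
for sub in decompose(program):
    print(sub)   # two simpler sub-statements, each checked independently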

On the Potential of Lexico-logical Alignments for Semantic Parsing to SQL Queries [article]

Tianze Shi, Chen Zhao, Jordan Boyd-Graber, Hal Daumé III, Lillian Lee
2020 arXiv   pre-print
Large-scale semantic parsing datasets annotated with logical forms have enabled major advances in supervised approaches. But can richer supervision help even more?  ...  We propose and test two methods: (1) supervised attention; (2) adopting an auxiliary objective of disambiguating references in the input queries to table columns.  ...  TaPas: Weakly supervised table parsing via pre-training. In Proceedings of the Association for Computational Linguistics, pages 4320-4333. Sepp Hochreiter and Jürgen Schmidhuber. 1997.  ... 
arXiv:2010.11246v1 fatcat:wpmjjsbhffhxpf4ugtfkjjfspq
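
The second method mentioned in the snippet, an auxiliary column-disambiguation objective, can be sketched as a small classification head over encoder token states trained with gold lexical alignments. Everything below (class name, loss weighting, shapes) is hypothetical, not the paper's code:

import torch
import torch.nn as nn

class ColumnLinkingHead(nn.Module):
    """Scores each question token against the table's columns (hypothetical)."""
    def __init__(self, hidden_size: int, num_columns: int):
        super().__init__()
        self.proj = nn.Linear(hidden_size, num_columns)

    def forward(self, token_states: torch.Tensor) -> torch.Tensor:
        return self.proj(token_states)  # (batch, seq_len, num_columns)

# gold_links[b, t] = index of the column token t refers to, or -100 when the
# token has no gold alignment (ignored by the loss).
head = ColumnLinkingHead(hidden_size=768, num_columns=12)
token_states = torch.randn(2, 16, 768)   # stand-in for encoder output
gold_links = torch.full((2, 16), -100)
gold_links[0, 3] = 5                     # token 3 refers to column 5
aux_loss = nn.CrossEntropyLoss(ignore_index=-100)(
    head(token_states).flatten(0, 1), gold_links.flatten())
# total_loss = parsing_loss + weight * aux_loss   (the weighting is a guess;
# the paper's exact formulation may differ)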

PLOG: Table-to-Logic Pretraining for Logical Table-to-Text Generation [article]

Ao Liu, Haoyu Dong, Naoaki Okazaki, Shi Han, Dongmei Zhang
2022 arXiv   pre-print
Hence even large-scale pre-trained language models exhibit low logical fidelity on logical table-to-text generation.  ...  Logical table-to-text generation is a task that involves generating logically faithful sentences from tables, which requires models to derive logical-level facts from table records via logical inference  ...  For SP-Acc, a sentence is first parsed into a logical program based on a weakly-supervised semantic parsing algorithm. Then, its correctness is verified by executing the logical program on the table.  ... 
arXiv:2205.12697v1 fatcat:rpt7e6d55zar3cqldvnl5xcb6m
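
The SP-Acc metric described in the snippet hinges on executing a parsed logical program against the table. A toy executor with an invented operator set conveys the execution step (the parsing step is assumed already done):

import pandas as pd

OPS = {
    "max":    lambda df, col: df[col].max(),
    "count":  lambda df, col: df[col].count(),
    "argmax": lambda df, col, out: df.loc[df[col].idxmax(), out],
}

def execute(program, table):
    """Run a (operator, *arguments) program against a DataFrame."""
    op, *args = program
    return OPS[op](table, *args)

table = pd.DataFrame({"player": ["A", "B"], "goals": [3, 7]})
# "B scored the most goals"  ->  (argmax goals player) == "B"
print(execute(("argmax", "goals", "player"), table) == "B")  # True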

TAPEX: Table Pre-training via Learning a Neural SQL Executor [article]

Qian Liu and Bei Chen and Jiaqi Guo and Morteza Ziyadi and Zeqi Lin and Weizhu Chen and Jian-Guang Lou
2022 arXiv   pre-print
To our knowledge, this is the first work to exploit table pre-training via synthetic executable programs and to achieve new state-of-the-art results on various downstream tasks.  ...  Recent progress in language model pre-training has achieved great success by leveraging large-scale unstructured textual data.  ...  TABLE PRE-TRAINING VIA EXECUTION As mentioned in § 1, TAPEX achieves efficient table pre-training by training LMs to mimic the behavior of a SQL execution engine.  ... 
arXiv:2107.07653v3 fatcat:qdou2xqzezarbbou7p664oo3cu
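
TAPEX's pre-training objective is exactly what its released checkpoints expose: feed a SQL query plus a flattened table to a BART encoder-decoder and generate the execution result. A minimal sketch against the checkpoint published on the HuggingFace Hub (API per the transformers docs; argument names may vary by version):

import pandas as pd
from transformers import TapexTokenizer, BartForConditionalGeneration

# Pre-trained "neural SQL executor" checkpoint released with the paper.
tokenizer = TapexTokenizer.from_pretrained("microsoft/tapex-base")
model = BartForConditionalGeneration.from_pretrained("microsoft/tapex-base")

table = pd.DataFrame({"year": [1896, 2012], "city": ["athens", "london"]})
query = "select year where city = london"   # SQL in, execution result out

encoding = tokenizer(table=table, query=query, return_tensors="pt")
outputs = model.generate(**encoding)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))  # ['2012']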

Structure-aware Pre-training for Table Understanding with Tree-based Transformers [article]

Zhiruo Wang, Haoyu Dong, Ran Jia, Jia Li, Zhiyi Fu, Shi Han, Dongmei Zhang
2020 arXiv   pre-print
In this paper, we propose TUTA, a unified pre-training architecture for understanding generally structured tables.  ...  TUTA pre-trains on a wide range of unlabeled tables and fine-tunes on a critical task in the field of table structure understanding, i.e. cell type classification.  ...  TAPAS and TaBERT target question answering over relational tables via joint pre-training of tables and their text [19, 42] .  ... 
arXiv:2010.12537v2 fatcat:kli75htw6rbprecahx5zavk544
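
The "structure-aware" ingredient the snippet alludes to is TUTA's tree-based positional encoding: every cell gets coordinates on a bi-dimensional header tree, so cells under the same parent header receive close positions. An illustrative sketch (not TUTA's code; depth limit and padding value are invented):

def tree_position(path, max_depth=4, pad=-1):
    """Pad a root-to-cell path of child indices to a fixed depth."""
    return list(path) + [pad] * (max_depth - len(path))

# A two-level column header:  "2020" -> ("Q1", "Q2");  "2021" -> ("Q1", "Q2")
print(tree_position([0, 1]))  # cell under 2020/Q2 -> [0, 1, -1, -1]
print(tree_position([1, 0]))  # cell under 2021/Q1 -> [1, 0, -1, -1]
# Each coordinate indexes a learned embedding table; summing the per-level
# embeddings yields a structure-aware position vector for the cell.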

Bridging Textual and Tabular Data for Cross-Domain Text-to-SQL Semantic Parsing [article]

Xi Victoria Lin and Richard Socher and Caiming Xiong
2020 arXiv   pre-print
We present BRIDGE, a powerful sequential architecture for modeling dependencies between natural language questions and relational databases in cross-DB semantic parsing.  ...  The hybrid sequence is encoded by BERT with minimal subsequent layers and the text-DB contextualization is realized via the fine-tuned deep attention in BERT.  ...  Furthermore, anchor texts provide more focused signals that link the text and the DB schema.  ...  a pre-trained text-table LM that supports arithmetic operations for weakly supervised table QA.  ... 
arXiv:2012.12627v2 fatcat:rfqrrkkuynbhxcthoda2af53qq
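
The hybrid sequence the snippet describes concatenates the question with a serialized schema, appending "anchor texts" (cell values matched in the question) to their columns before BERT encodes everything jointly. A minimal serialization sketch; the separator tokens and helper below are illustrative, not BRIDGE's exact format:

def serialize(question, schema, anchors):
    """Build a BRIDGE-style question + schema + anchor-text sequence."""
    parts = [question]
    for table, columns in schema.items():
        parts.append(f"[T] {table}")
        for col in columns:
            parts.append(f"[C] {col}")
            for value in anchors.get((table, col), []):
                parts.append(f"[V] {value}")   # anchor text links text <-> DB
    return " ".join(parts)

schema = {"city": ["name", "population"]}
anchors = {("city", "name"): ["Berlin"]}  # cell value matched in the question
print(serialize("population of Berlin ?", schema, anchors))
# -> "population of Berlin ? [T] city [C] name [V] Berlin [C] population"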

A Graph Representation of Semi-structured Data for Web Question Answering [article]

Xingyao Zhang, Linjun Shou, Jian Pei, Ming Gong, Lijie Wen, Daxin Jiang
2020 arXiv   pre-print
We also develop pre-training and reasoning techniques on the graph model for the QA task.  ...  Different from plain text passages in Web documents, Web tables and lists have inherent structures, which carry semantic correlations among various elements in tables and lists.  ...  Herzig et al. (2020) propose TAPAS, a weakly supervised table parsing method. TAPAS models the structure information of tables by explicitly encoding rows and columns.  ... 
arXiv:2010.06801v1 fatcat:4677kcbx3jgclggpdthv2vvwvy
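
The snippet's central claim is that tables and lists carry structure worth modeling explicitly as a graph. An illustrative sketch (the paper's node and edge types differ; this only shows the construction idea) that builds cell and header nodes with same-row and cell-header edges:

from collections import defaultdict

def table_to_graph(header, rows):
    """Adjacency sets over ('cell', row, col, text) / ('header', col, text) nodes."""
    edges = defaultdict(set)
    for r, row in enumerate(rows):
        for c, cell in enumerate(row):
            node = ("cell", r, c, cell)
            edges[node].add(("header", c, header[c]))       # cell <-> header
            for c2, other in enumerate(row):
                if c2 != c:
                    edges[node].add(("cell", r, c2, other))  # same-row edge
    return edges

g = table_to_graph(["city", "pop"], [["Berlin", "3.6M"], ["Paris", "2.1M"]])
print(len(g))  # 4 cell nodes; a graph reasoner / pre-trained model runs on top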

Learning from Explanations with Neural Execution Tree [article]

Ziqi Wang, Yujia Qin, Wenxuan Zhou, Jun Yan, Qinyuan Ye, Leonardo Neves, Zhiyuan Liu, Xiang Ren
2020 arXiv   pre-print
In this paper, we propose a novel Neural Execution Tree (NExT) framework to augment training data for text classification using NL explanations.  ...  Natural language (NL) explanations have been demonstrated to be very useful additional supervision, which can provide sufficient domain knowledge for generating more labeled data over new instances, while the  ... 
arXiv:1911.01352v3 fatcat:fqbntowaj5addj4h3o6fasbykm

Topic Transferable Table Question Answering [article]

Saneem Ahmed Chemmengath, Vishwajeet Kumar, Samarth Bharadwaj, Jaydeep Sen, Mustafa Canim, Soumen Chakrabarti, Alfio Gliozzo, Karthik Sankaranarayanan
2021 arXiv   pre-print
Weakly-supervised table question-answering (TableQA) models have achieved state-of-the-art performance by using a pre-trained BERT transformer to jointly encode a question and a table to produce structured  ...  We empirically show that, despite pre-training on large open-domain text, performance of models degrades significantly when they are evaluated on unseen topics.  ...  Figure 2: Overview of the proposed T3QA framework for weakly-supervised TableQA.  ... 
arXiv:2109.07377v1 fatcat:4y4y5gj2tfdz7iysutdl2coo4m

Incorporating External Knowledge to Enhance Tabular Reasoning [article]

J. Neeraja, Vivek Gupta, Vivek Srikumar
2021 arXiv   pre-print
Reasoning about tabular information presents unique challenges to modern NLP approaches which largely rely on pre-trained contextualized embeddings of text.  ...  We show via systematic experiments that these strategies substantially improve tabular inference performance.  ...  TaPas: Weakly supervised table parsing via pre-training. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 4320-4333, Online.  ... 
arXiv:2104.04243v1 fatcat:ccmp772mindobirnpnoyyvplmi

Hey AI, Can You Solve Complex Tasks by Talking to Agents? [article]

Tushar Khot and Kyle Richardson and Daniel Khashabi and Ashish Sabharwal
2022 arXiv   pre-print
Training giant models from scratch for each complex task is resource- and data-inefficient.  ...  For instance, using text and table QA agents to answer questions such as "Who had the longest javelin throw from USA?".  ...  TaPas: Weakly supervised table parsing via pre-training. In ACL. Xanh Ho, A. Nguyen, Saku Sugawara, and Akiko Aizawa. 2020.  ... 
arXiv:2110.08542v2 fatcat:xuxwyqraybfezo4645p5k4odmu
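
The snippet's example question ("Who had the longest javelin throw from USA?") implies a controller that decomposes a complex question and routes sub-questions to specialized agents. A toy sketch of that dispatch pattern; the agents here are stubs and the hard-coded decomposition stands in for what the paper learns:

def text_qa(question: str) -> str:
    return "USA"        # stub for a text QA agent

def table_qa(question: str) -> str:
    return "Athlete B"  # stub for a table QA agent (e.g. a TAPAS-style model)

def controller(question: str) -> str:
    """Hypothetical two-step decomposition routed across the two agents."""
    country = text_qa("Which country is being asked about?")
    return table_qa(f"longest javelin throw by an athlete from {country}")

print(controller("Who had the longest javelin throw from USA?"))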

Logic-level Evidence Retrieval and Graph-based Verification Network for Table-based Fact Verification [article]

Qi Shi, Yu Zhang, Qingyu Yin, Ting Liu
2021 arXiv   pre-print
The table-based fact verification task aims to verify whether the given statement is supported by the given semi-structured table.  ...  However, due to the lack of fully supervised signals in the program generation process, spurious programs can be derived and employed, which leads to the inability of the model to catch helpful logical  ...  Because our approach is centered around evidence rather than the table, the pre-trained model with evidence-statement pairs as input is not adapted to the TAPAS model.  ... 
arXiv:2109.06480v1 fatcat:wjrsp2u23natzn3fdgst2irm2y
Showing results 1 — 15 out of 43 results