87 Hits in 3.3 sec

AMR Parsing as Sequence-to-Graph Transduction [article]

Sheng Zhang and Xutai Ma and Kevin Duh and Benjamin Van Durme
2019 arXiv   pre-print
We propose an attention-based model that treats AMR parsing as sequence-to-graph transduction.  ...  of labeled AMR data.  ... 
arXiv:1905.08704v2 fatcat:w2tdvlo6hvc65phutke32t7j4a
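
To make the snippet's idea concrete, here is a minimal Python sketch of sequence-to-graph transduction as incremental node-and-edge prediction: the decoder emits one concept per step and attaches it to an earlier node by a labeled edge. The predict_node/predict_head stubs are hypothetical stand-ins for the paper's attention-based scorers, not its actual interface.

    # Sketch: build a graph one node per step; each new node attaches to a
    # previously generated node via a labeled edge (root has no incoming edge).
    def parse_to_graph(tokens, predict_node, predict_head):
        nodes, edges = [], []
        for t in tokens:
            concept = predict_node(t, nodes)
            if concept is None:              # function words may yield no concept
                continue
            if nodes:
                head, label = predict_head(concept, nodes)
                edges.append((head, label, concept))
            nodes.append(concept)
        return nodes, edges

    # Toy stand-ins so the sketch runs end to end (labels are illustrative).
    nodes, edges = parse_to_graph(
        ["the", "boy", "wants", "to", "go"],
        predict_node=lambda tok, ns: {"boy": "boy", "wants": "want-01", "go": "go-02"}.get(tok),
        predict_head=lambda c, ns: (ns[-1], ":ARG0"),
    )
    print(nodes, edges)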

Broad-Coverage Semantic Parsing as Transduction [article]

Sheng Zhang and Xutai Ma and Kevin Duh and Benjamin Van Durme
2019 arXiv   pre-print
We unify different broad-coverage semantic parsing tasks under a transduction paradigm, and propose an attention-based neural framework that incrementally builds a meaning representation via a sequence  ...  Experiments conducted on three separate broad-coverage semantic parsing tasks -- AMR, SDP and UCCA -- demonstrate that our attention-based neural transducer improves the state of the art on both AMR and  ... 
arXiv:1909.02607v2 fatcat:enhzznavwfeztdzm4ddpxxx7my

AMR Parsing via Graph-Sequence Iterative Inference [article]

Deng Cai, Wai Lam
2020 arXiv   pre-print
We propose a new end-to-end model that treats AMR parsing as a series of dual decisions on the input sequence and the incrementally constructed graph.  ...  in the output graph to construct the new concept.  ...  Seq2seq-based parsing (Barzdins and Gosko, 2016; Konstas et al., 2017; van Noord and Bos, 2017; Peng et al., 2018) views parsing as sequence-to-sequence transduction by some linearization of the AMR  ... 
arXiv:2004.05572v2 fatcat:levfg6nhzvgydbghluqaqzo2fa
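
A rough sketch of the dual-decision loop described in the snippet, assuming a simple alternating scheme: pick a concept conditioned on the current attachment decision, then an attachment conditioned on the chosen concept, and iterate until a fixed number of rounds. The scoring functions are toy stubs, not the model's components.

    # Schematic iterative inference over one parsing step: the sequence side
    # answers "which input span to abstract", the graph side answers "where
    # in the partial graph to attach", each conditioned on the other.
    def dual_decision(seq_items, graph_nodes, score_concept, score_edge, n_iters=3):
        concept, attach = None, None
        for _ in range(n_iters):
            concept = max(seq_items, key=lambda s: score_concept(s, attach))
            if graph_nodes:
                attach = max(graph_nodes, key=lambda g: score_edge(g, concept))
        return concept, attach

    # Toy usage: stub scorers just prefer longer strings, to make it runnable.
    print(dual_decision(["boy", "want-01"], ["go-02"],
                        score_concept=lambda s, a: len(s),
                        score_edge=lambda g, c: 1.0))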

Broad-Coverage Semantic Parsing as Transduction

Sheng Zhang, Xutai Ma, Kevin Duh, Benjamin Van Durme
2019 Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)  
We unify different broad-coverage semantic parsing tasks under a transduction paradigm, and propose an attention-based neural framework that incrementally builds a meaning representation via a sequence  ...  Experiments conducted on three separate broad-coverage semantic parsing tasks -- AMR, SDP and UCCA -- demonstrate that our attention-based neural transducer improves the state of the art on both AMR and UCCA  ... 
doi:10.18653/v1/d19-1392 dblp:conf/emnlp/ZhangMDD19 fatcat:gzndqwnxxfbtdellsdjnyozya4

Levi Graph AMR Parser using Heterogeneous Attention [article]

Han He, Jinho D. Choi
2021 arXiv   pre-print
Coupled with biaffine decoders, transformers have been effectively adapted to text-to-graph transduction and achieved state-of-the-art performance on AMR parsing.  ...  This paper presents a novel approach to AMR parsing by combining heterogeneous data (tokens, concepts, labels) as one input to a transformer to learn attention, and use only attention matrices from the  ...  Text-to-Graph Transducer: Figure 1 shows the overview of our Text-to-Graph Transduction model.  ... 
arXiv:2107.04152v1 fatcat:245h5dumffbvdbjd37q7swo2cq
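
The Levi graph construction in the title can be sketched as follows: each labeled edge becomes a node of its own, so concepts and relation labels live in one homogeneous node set that a single transformer can attend over. The node-naming scheme below is illustrative.

    # Levi graph: turn each labeled arc (u, label, v) into two unlabeled
    # arcs u -> label-node -> v, with one fresh label node per edge occurrence.
    def to_levi(nodes, labeled_edges):
        levi_nodes = list(nodes)
        levi_edges = []
        for u, label, v in labeled_edges:
            lab = f"{label}#{len(levi_nodes)}"   # unique name per occurrence
            levi_nodes.append(lab)
            levi_edges.append((u, lab))
            levi_edges.append((lab, v))
        return levi_nodes, levi_edges

    print(to_levi(["want-01", "boy", "go-02"],
                  [("want-01", ":ARG0", "boy"), ("want-01", ":ARG1", "go-02")]))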

One SPRING to Rule Them Both: Symmetric AMR Semantic Parsing and Generation without a Complex Pipeline

Michele Bevilacqua, Rexhina Blloshmi, Roberto Navigli
2021 Zenodo  
In contrast, state-of-the-art AMR-to-Text generation, which can be seen as the inverse to parsing, is based on simpler seq2seq.  ...  In this paper, we cast Text-to-AMR and AMR-to-Text as a symmetric transduction task and show that by devising a careful graph linearization and extending a pretrained encoder-decoder model, it is possible  ...  Text-to-AMR Parsing, Pure seq2seq: Seq2seq approaches model Text-to-AMR parsing as a transduction of the sentence into a linearization of the AMR graph.  ... 
doi:10.5281/zenodo.5543380 fatcat:j7chtyqzn5hdledgcwpnnogdda
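
A hedged sketch of what a "careful graph linearization" can look like: a DFS traversal that flattens the AMR into a token sequence a seq2seq model can emit, with a back-reference token for re-entrant nodes. SPRING uses special pointer tokens for this purpose; the exact token vocabulary below is made up for illustration.

    # DFS linearization of an AMR graph; re-entrant nodes are emitted once
    # and referenced thereafter with a back-pointer token.
    def linearize(graph, root, visited=None):
        visited = visited if visited is not None else {}
        if root in visited:                      # re-entrancy: back-reference
            return [f"<ptr:{visited[root]}>"]
        visited[root] = len(visited)
        out = ["(", root]
        for label, child in graph.get(root, []):
            out += [label] + linearize(graph, child, visited)
        out.append(")")
        return out

    amr = {"want-01": [(":ARG0", "boy"), (":ARG1", "go-02")],
           "go-02":   [(":ARG0", "boy")]}        # "boy" is re-entrant
    print(" ".join(linearize(amr, "want-01")))
    # -> ( want-01 :ARG0 ( boy ) :ARG1 ( go-02 :ARG0 <ptr:1> ) )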

Making Better Use of Bilingual Information for Cross-Lingual AMR Parsing [article]

Yitao Cai, Zhe Lin, Xiaojun Wan
2021 arXiv   pre-print
However, they find that concepts in their predicted AMR graphs are less specific. We argue that the misprediction of concepts is due to the high relevance between English tokens and AMR concepts.  ...  In this work, we introduce bilingual input, namely the translated texts as well as non-English texts, in order to enable the model to predict more accurate concepts.  ...  we treat cross-lingual AMR parsing as a sequence-to-sequence transduction problem and improve seq2seq models with bilingual input and auxiliary task.  ... 
arXiv:2106.04814v1 fatcat:jppfrhdupnhdfmrlrgqvrile2m
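
At its simplest, the bilingual-input idea reduces to concatenating the non-English sentence with an (automatic) English translation as one encoder input, so concept prediction can lean on English tokens. The separator token below is an assumed convention, not the paper's exact preprocessing.

    # Compose one encoder input from the source sentence and its translation.
    def make_bilingual_input(src_sentence, en_translation, sep="</s>"):
        return f"{src_sentence} {sep} {en_translation}"

    print(make_bilingual_input("Der Junge will gehen.", "The boy wants to go."))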

Graph Pre-training for AMR Parsing and Generation [article]

Xuefeng Bai, Yulong Chen, Yue Zhang
2022 arXiv   pre-print
Recently, pre-trained language models (PLMs) have advanced tasks of AMR parsing and AMR-to-text generation, respectively.  ...  We further design a unified framework to bridge the gap between pre-training and fine-tuning tasks. Experiments on both AMR parsing and AMR-to-text generation show the superiority of our model.  ... 
arXiv:2203.07836v4 fatcat:e5gq5h4jyfdgxdmz77czf7hhlq
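
As a sketch of the flavor of graph pre-training, assuming a BART-style denoising setup over linearized AMRs: corrupt the graph sequence by masking and train a seq2seq model to reconstruct the original. The masking scheme is a generic stand-in for the paper's actual pre-training tasks.

    # Denoising-style corruption: mask a fraction of linearized-graph tokens;
    # the pair (corrupted, original) is one pre-training example.
    import random

    def mask_linearized_graph(tokens, mask_rate=0.15, mask_token="<mask>", seed=0):
        rng = random.Random(seed)
        corrupted = [mask_token if rng.random() < mask_rate else t for t in tokens]
        return corrupted, tokens        # (model input, reconstruction target)

    src, tgt = mask_linearized_graph("( want-01 :ARG0 ( boy ) :ARG1 ( go-02 ) )".split())
    print(src)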

Universal Decompositional Semantic Parsing [article]

Elias Stengel-Eskin, Aaron Steven White, Sheng Zhang, Benjamin Van Durme
2020 arXiv   pre-print
We introduce a transductive model for parsing into Universal Decompositional Semantics (UDS) representations, which jointly learns to map natural language utterances into UDS graph structures and annotate  ...  We also introduce a strong pipeline model for parsing into the UDS graph structure, and show that our transductive parser performs comparably while additionally performing attribute prediction.  ... 
arXiv:1910.10138v3 fatcat:z7vava7ljfhudmejey4ann762m
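
Schematically, the transductive UDS parser's output couples a graph with per-node continuous attributes. The toy sketch below only shows that output shape; the attribute model, names, and values are hypothetical.

    # Joint structure + attributes: decorate each node of a predicted graph
    # with continuous attribute scores (e.g. factuality for predicates).
    def decorate_with_attributes(nodes, edges, attr_model):
        return {
            "edges": edges,
            "attributes": {n: attr_model(n) for n in nodes},  # node -> {attr: value}
        }

    toy = decorate_with_attributes(
        ["leave-01", "boy"],
        [("leave-01", ":ARG0", "boy")],
        attr_model=lambda n: {"factuality": 0.9 if n.endswith("-01") else None},
    )
    print(toy)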

Neural AMR: Sequence-to-Sequence Models for Parsing and Generation [article]

Ioannis Konstas, Srinivasan Iyer, Mark Yatskar, Yejin Choi, Luke Zettlemoyer
2017 arXiv   pre-print
We present extensive ablative and qualitative analysis including strong evidence that sequence-based AMR models are robust against ordering variations of graph-to-sequence conversions.  ...  of the AMR graphs.  ...  We define a linearization order for an AMR graph as any sequence of its nodes and edges.  ... 
arXiv:1704.08381v3 fatcat:oupyiqgmwzbbtnvw64tpuhgsuu

Neural AMR: Sequence-to-Sequence Models for Parsing and Generation

Ioannis Konstas, Srinivasan Iyer, Mark Yatskar, Yejin Choi, Luke Zettlemoyer
2017 Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)  
of the AMR graphs.  ...  We present extensive ablative and qualitative analysis including strong evidence that sequence-based AMR models are robust against ordering variations of graph-to-sequence conversions.  ...  Conclusions: We applied sequence-to-sequence models to the tasks of AMR parsing and AMR generation, by carefully preprocessing the graph representation and scaling our models via pretraining on millions  ... 
doi:10.18653/v1/p17-1014 dblp:conf/acl/KonstasIYCZ17 fatcat:l6kw32l2evg6nl55obver3s6zq

AMR Parsing using Stack-LSTMs [article]

Miguel Ballesteros, Yaser Al-Onaizan
2017 arXiv   pre-print
We present a transition-based AMR parser that directly generates AMR parses from plain text. We use Stack-LSTMs to represent our parser state and make decisions greedily.  ...  Adding additional information, such as POS tags and dependency trees, improves the results further.  ...  We implemented an oracle that produces the sequence of actions that leads to the gold (or close to gold) AMR graph.  ... 
arXiv:1707.07755v2 fatcat:y7zx5kb3ljgsxikkd3xaajds6i
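
A toy transition system in the spirit of the parser above: a buffer of candidate concepts and a stack, with actions chosen greedily. Here a stub policy stands in for the Stack-LSTM scorer, the action inventory is deliberately simplified, and concept identification is skipped (the buffer already holds concepts).

    # Greedy transition loop: SHIFT moves a buffer item to the stack, REDUCE
    # pops it, ARC:<label> attaches the stack top as dependent of the node
    # below it and pops the dependent.
    def run_transitions(buffer, policy):
        stack, edges = [], []
        while buffer or len(stack) > 1:
            action = policy(stack, buffer)
            if action == "SHIFT":
                stack.append(buffer.pop(0))
            elif action == "REDUCE":
                stack.pop()
            elif action.startswith("ARC:"):
                edges.append((stack[-2], action[4:], stack[-1]))
                stack.pop()
        return edges

    def stub_policy(stack, buffer):     # stands in for Stack-LSTM scoring
        if buffer:
            return "SHIFT"
        return "ARC::ARG0" if len(stack) > 1 else "REDUCE"

    print(run_transitions(["want-01", "boy"], stub_policy))
    # -> [('want-01', ':ARG0', 'boy')]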

AMR Parsing using Stack-LSTMs

Miguel Ballesteros, Yaser Al-Onaizan
2017 Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing  
We present a transition-based AMR parser that directly generates AMR parses from plain text. We use Stack-LSTMs to represent our parser state and make decisions greedily.  ...  Adding additional information, such as POS tags and dependency trees, improves the results further.  ...  We implemented an oracle that produces the sequence of actions that leads to the gold (or close to gold) AMR graph.  ... 
doi:10.18653/v1/d17-1130 dblp:conf/emnlp/BallesterosA17 fatcat:ybe32z5ebzbzrk2isnq7ibdzpm

ShanghaiTech at MRP 2019: Sequence-to-Graph Transduction with Second-Order Edge Inference for Cross-Framework Meaning Representation Parsing [article]

Xinyu Wang, Yixian Liu, Zixia Jia, Chengyue Jiang, Kewei Tu
2020 arXiv   pre-print
This paper presents the system used in our submission to the CoNLL 2019 shared task: Cross-Framework Meaning Representation Parsing.  ...  Our system is a graph-based parser which combines an extended pointer-generator network that generates nodes and a second-order mean field variational inference module that predicts edges.  ...  Conclusion: In this paper, we present our graph-based parsing system for MRP 2019, which combines two state-of-the-art methods for sequence to graph node generation and second-order  ... 
arXiv:2004.03849v1 fatcat:653vdmfi35dhnmx2ujaibc6odu
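
The second-order edge inference can be sketched as a few mean-field iterations that mix unary edge scores with pairwise (e.g. sibling) scores weighted by the current edge posteriors. The shapes and the single pairwise term below are simplifications of the full module, not its exact parameterization.

    # Mean-field refinement of edge posteriors Q over n nodes:
    # Q <- sigmoid(unary + sum_j pairwise[h,i,j] * Q[h,j]).
    import numpy as np

    def mean_field_edges(unary, pairwise, n_iters=3):
        """unary: (n, n) edge scores; pairwise: (n, n, n) sibling scores
        s[h, i, j] for edges (h, i) and (h, j) co-occurring."""
        q = 1 / (1 + np.exp(-unary))             # initialize from unary scores
        for _ in range(n_iters):
            second_order = np.einsum("hij,hj->hi", pairwise, q)
            q = 1 / (1 + np.exp(-(unary + second_order)))
        return q

    rng = np.random.default_rng(0)
    print(mean_field_edges(rng.normal(size=(3, 3)),
                           rng.normal(size=(3, 3, 3))).round(2))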

Ensembling Graph Predictions for AMR Parsing [article]

Hoang Thanh Lam, Gabriele Picco, Yufang Hou, Young-Suk Lee, Lam M. Nguyen, Dzung T. Phan, Vanessa López, Ramon Fernandez Astudillo
2022 arXiv   pre-print
For example, in natural language processing, it is very common to parse texts into dependency trees or abstract meaning representation (AMR) graphs.  ...  As the problem is NP-Hard, we propose an efficient heuristic algorithm to approximate the optimal solution. To validate our approach, we carried out experiments in AMR parsing problems.  ...  Cai&Lam The model proposed in [Cai and Lam, 2020b] treats AMR parsing as a series of dual decisions (i.e., which parts of the sequence to abstract, and where in the graph to construct) on the input sequence  ... 
arXiv:2110.09131v2 fatcat:5jhi7fvasrgrpf226vwf6743mi
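
One simple heuristic in the spirit of graph ensembling is support voting: score every candidate edge by how many of the input graphs contain it and keep the majority-supported ones. The paper formulates a harder (NP-hard) optimization and proposes a more careful approximation algorithm; the sketch below only captures the voting intuition.

    # Majority-vote ensembling over edge sets produced by m parsers.
    from collections import Counter

    def ensemble_graphs(graphs, min_support=None):
        """graphs: list of edge sets; returns edges backed by enough parsers."""
        min_support = min_support or (len(graphs) // 2 + 1)
        votes = Counter(e for g in graphs for e in g)
        return {e for e, c in votes.items() if c >= min_support}

    g1 = {("want-01", ":ARG0", "boy"), ("want-01", ":ARG1", "go-02")}
    g2 = {("want-01", ":ARG0", "boy")}
    g3 = {("want-01", ":ARG0", "boy"), ("want-01", ":ARG1", "leave-01")}
    print(ensemble_graphs([g1, g2, g3]))   # only the edge all/most parsers agree on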
Showing results 1–15 of 87.