
Combinators for bidirectional tree transformations

J. Nathan Foster, Michael B. Greenwald, Jonathan T. Moore, Benjamin C. Pierce, Alan Schmitt
2007 ACM Transactions on Programming Languages and Systems  
We propose a novel approach to the view update problem for tree-structured data: a domain-specific programming language in which all expressions denote bi-directional transformations on trees.  ...  We then instantiate this semantic framework in the form of a collection of lens combinators that can be assembled to describe bi-directional transformations on trees.  ...  Trevor Jim provided the initial push to start the project by observing that the next step beyond the Unison file synchronizer (of which Trevor was a co-designer) would be synchronizing XML.  ...
doi:10.1145/1232420.1232424 fatcat:gfoort2hszgrfi5q67jdootst4
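
The "lens" vocabulary in this abstract is easy to make concrete. Below is a minimal Python sketch of a lens as a get/put pair together with the round-trip laws such combinators are designed to preserve; the `Lens` class and `key_lens` helper are invented for illustration and are not the paper's combinator library.

```python
# A lens pairs a forward "get" (source -> view) with a backward "put"
# (updated view + original source -> updated source).
class Lens:
    def __init__(self, get, put):
        self.get = get          # get: source -> view
        self.put = put          # put: (view, source) -> source

# Hypothetical combinator: a lens focusing on one key of a dict-encoded tree.
def key_lens(k):
    return Lens(
        get=lambda s: s[k],
        put=lambda v, s: {**s, k: v},
    )

name = key_lens("name")
src = {"name": "Pat", "age": 41}

# Round-trip ("well-behavedness") laws:
assert name.get(name.put("Sam", src)) == "Sam"   # PutGet
assert name.put(name.get(src), src) == src       # GetPut
```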

A Java Library for Bidirectional XML Transformation

Dongxi Liu, Zhenjiang Hu, Masato Takeichi, Kazuhiko Kakehi, Hao Wang
2007 Information and Media Technologies  
We propose BiXJ, a Java library for bidirectional XML transformation.  ...  A bidirectional transformation generates target XML documents from source XML documents in forward transformations, and updates source documents in backward transformations by reflecting back modifications  ...  Combinators for bi-directional tree transformations: a linguistic approach to the view update problem.  ...
doi:10.11185/imt.2.748 fatcat:3pdbogv6pre6tjbafus7dhmfuu
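
A rough picture of what the forward and backward directions do, sketched in Python rather than Java (BiXJ itself is a Java library; the `forward`/`backward` functions below are invented for this example and are not BiXJ's API):

```python
import xml.etree.ElementTree as ET

SRC = "<books><book><title>TAPL</title><year>2002</year></book></books>"

def forward(src_xml):
    """Forward transformation: source document -> view (title strings)."""
    return [t.text for t in ET.fromstring(src_xml).iter("title")]

def backward(view, src_xml):
    """Backward transformation: reflect edited view back into the source."""
    root = ET.fromstring(src_xml)
    for elem, new_text in zip(root.iter("title"), view):
        elem.text = new_text
    return ET.tostring(root, encoding="unicode")

titles = forward(SRC)               # ['TAPL']
titles[0] = "Types and PL"
print(backward(titles, SRC))        # title updated, <year> preserved
```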

Code generation by model transformation: a case study in transformation modularity

Zef Hemel, Lennart C. L. Kats, Danny M. Groenewegen, Eelco Visser
2009 Journal of Software and Systems Modeling  
This paper refines our earlier description of code generation by model transformation with an improved architecture for the composition of model-to-model normalization rules, solving the problem of combining  ...  The technique can also be applied to 'internal code generation' for the translation of high-level extensions of a DSL to lower-level constructs within the same DSL using model-to-model transformations.  ...  Acknowledgments We would like to thank the anonymous reviewers of ICMT 2008 and SOSYM for their comments on earlier versions of this paper.  ... 
doi:10.1007/s10270-009-0136-1 fatcat:pyapirp4rfh3tcr3iwli7pfufe
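
The model-to-model normalization rules mentioned in the snippet can be pictured as rewrite rules applied until a fixpoint. A toy Python sketch, with a hypothetical `desugar_foreach` rule standing in for a real normalization rule:

```python
def normalize(term, rules):
    """Apply rewrite rules repeatedly until no rule changes the term."""
    changed = True
    while changed:
        changed = False
        for rule in rules:
            new = rule(term)
            if new != term:
                term, changed = new, True
    return term

# Hypothetical rule: lower a high-level construct to lower-level ones
# within the same language ("internal code generation").
def desugar_foreach(t):
    if isinstance(t, tuple) and t[0] == "foreach":
        _, xs, body = t
        return ("while", ("hasNext", xs), ("seq", ("next", xs), body))
    return t

print(normalize(("foreach", "items", ("print", "x")), [desugar_foreach]))
```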

SG-Net: Syntax Guided Transformer for Language Representation [article]

Zhuosheng Zhang, Yuwei Wu, Junru Zhou, Sufeng Duan, Hai Zhao, Rui Wang
2021 arXiv   pre-print
Syntax-guided network (SG-Net) is then composed of this extra SDOI-SAN and the SAN from the original Transformer encoder through a dual contextual architecture for better linguistics-inspired representation  ...  The proposed SG-Net is applied to typical Transformer encoders.  ...  The idea of updating the representation of a word with information from its neighbors in the dependency tree, which benefits from explicit syntactic constraints, is well linguistically motivated.  ...
arXiv:2012.13915v2 fatcat:2zyyd4s6ibcuvjal3k7t2e4v44
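
The SDOI-SAN restricts self-attention along the dependency parse. A simplified numpy sketch, assuming "syntactic dependency of interest" means each token attends to itself and its ancestors in the tree (the `sdoi_mask` helper is illustrative, not SG-Net's code):

```python
import numpy as np

def sdoi_mask(heads):
    """heads[i] = dependency head of token i (-1 for the root)."""
    n = len(heads)
    mask = np.eye(n, dtype=bool)       # every token attends to itself
    for i in range(n):
        j = heads[i]
        while j != -1:                 # walk up to the root
            mask[i, j] = True
            j = heads[j]
    return mask

def masked_attention(Q, K, V, mask):
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    scores = np.where(mask, scores, -1e9)            # block other tokens
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    return (w / w.sum(axis=-1, keepdims=True)) @ V

heads = [1, -1, 1, 2]                  # a tiny 4-token parse
rng = np.random.default_rng(0)
Q = K = V = rng.normal(size=(4, 8))
print(masked_attention(Q, K, V, sdoi_mask(heads)).shape)   # (4, 8)
```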

Developing a fake news identification model with advanced deep language transformers for Turkish covid-19 misinformation data

2021 Turkish Journal of Electrical Engineering and Computer Sciences  
Bi-directional Encoder Representations from Transformers and its variants, to improve the efficiency of the proposed approach.  ...  As a next step, we used deep learning algorithms such as Long Short-Term Memory, Bi-directional Long Short-Term Memory, Convolutional Neural Networks, Gated Recurrent Unit and Bi-directional Gated  ...  This suggests a new direction for researchers.  ...
doi:10.3906/elk-2106-55 fatcat:y4ncge7c4zazvjztsrd5mrvgvm
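
As a picture of one baseline family the paper benchmarks, here is a generic bidirectional-LSTM text classifier in PyTorch; vocabulary size, dimensions, and the class count are placeholder assumptions, not the authors' configuration:

```python
import torch
import torch.nn as nn

class BiLSTMClassifier(nn.Module):
    def __init__(self, vocab_size=30000, emb=128, hidden=64, n_classes=2):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb)
        self.lstm = nn.LSTM(emb, hidden, batch_first=True,
                            bidirectional=True)
        self.fc = nn.Linear(2 * hidden, n_classes)  # 2x: both directions

    def forward(self, token_ids):
        out, _ = self.lstm(self.emb(token_ids))
        return self.fc(out[:, -1, :])   # logits from the last time step

model = BiLSTMClassifier()
logits = model(torch.randint(0, 30000, (4, 50)))   # batch of 4 sequences
print(logits.shape)                                # torch.Size([4, 2])
```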

A survey and comparison of transformation tools based on the transformation tool contest

Edgar Jakumeit, Sebastian Buchwald, Dennis Wagelaar, Li Dan, Ábel Hegedüs, Markus Herrmannsdörfer, Tassilo Horn, Elina Kalnina, Christian Krause, Kevin Lano, Markus Lepper, Arend Rensink (+3 others)
2014 Science of Computer Programming  
Acknowledgements We want to thank Stephan Hildebrandt for his contributions to this article, the organizers of the Transformation Tool Contest for making this comparison possible, and the SHARE maintainers  ...  Furthermore, we want to thank the reviewers for their valuable comments.  ...  +B: The tool supports traversing edges in both directions (bi-directional navigability).  ...
doi:10.1016/j.scico.2013.10.009 fatcat:rft2mq6z6zhulc6eetgbxuuuii
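
The "+B" criterion quoted from the comparison table is simple to make concrete: keep both an outgoing and an incoming edge index so an edge can be traversed from either endpoint. A tiny illustrative Python sketch:

```python
from collections import defaultdict

class BiGraph:
    """A graph store with bi-directional edge navigation."""
    def __init__(self):
        self.out_edges = defaultdict(set)   # node -> successors
        self.in_edges = defaultdict(set)    # node -> predecessors

    def add_edge(self, src, dst):
        self.out_edges[src].add(dst)
        self.in_edges[dst].add(src)

g = BiGraph()
g.add_edge("Class", "Attribute")
print(g.out_edges["Class"])       # forward:  {'Attribute'}
print(g.in_edges["Attribute"])    # backward: {'Class'}
```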

Taming Pretrained Transformers for Extreme Multi-label Text Classification [article]

Wei-Cheng Chang, Hsiang-Fu Yu, Kai Zhong, Yiming Yang, Inderjit Dhillon
2020 arXiv   pre-print
In this paper, we propose X-Transformer, the first scalable approach to fine-tuning deep transformer models for the XMC problem.  ...  However, naively applying deep transformer models to the XMC problem leads to sub-optimal performance due to the large output space and the label sparsity issue.  ...  For instance, ELMo uses a bi-directional LSTM model pretrained on large unlabeled text data to obtain contextualized word embeddings.  ...
arXiv:1905.02331v4 fatcat:wm3x3jwpnngvfgjpr3ljos3srq
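
The overall shape of the approach, clustering the label space, matching an input to a few clusters, then ranking only the labels inside them, can be sketched with stand-ins; below, k-means plays the label indexer and a plain dot product plays the fine-tuned transformer matcher, both simplifications of the actual system:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
L, d = 10000, 64                           # 10k labels, toy embeddings
label_emb = rng.normal(size=(L, d))

km = KMeans(n_clusters=100, n_init=4, random_state=0).fit(label_emb)

def predict(x, top_clusters=3, top_labels=10):
    # 1) match: shortlist a few label clusters for input x
    best_c = np.argsort(-(km.cluster_centers_ @ x))[:top_clusters]
    # 2) rank: score only the labels inside the shortlisted clusters
    cand = np.flatnonzero(np.isin(km.labels_, best_c))
    return cand[np.argsort(-(label_emb[cand] @ x))][:top_labels]

print(predict(rng.normal(size=d)))         # top-10 label ids
```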

Pretrained Transformers for Text Ranking: BERT and Beyond [article]

Jimmy Lin, Rodrigo Nogueira, Andrew Yates
2021 arXiv   pre-print
In this survey, we provide a synthesis of existing work as a single point of entry for practitioners who wish to gain a better understanding of how to apply transformers to text ranking problems and researchers  ...  The combination of transformers and self-supervised pretraining has been responsible for a paradigm shift in natural language processing (NLP), information retrieval (IR), and beyond.  ...  In addition, we would like to thank the TPU Research Cloud for resources used to obtain new results in this work.  ... 
arXiv:2010.06467v3 fatcat:obla6reejzemvlqhvgvj77fgoy
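
The recipe at the heart of much of the surveyed work is cross-encoder reranking: feed query and document jointly through a pretrained transformer and sort by the resulting score. A short sketch using the sentence-transformers package; the checkpoint name is one public MS MARCO cross-encoder chosen for illustration:

```python
from sentence_transformers import CrossEncoder

model = CrossEncoder("cross-encoder/ms-marco-MiniLM-L-6-v2")

query = "what causes tides"
docs = [
    "Tides are caused by the gravitational pull of the moon and sun.",
    "The stock market rose sharply on Tuesday.",
]
# Score each (query, document) pair jointly, then sort descending.
scores = model.predict([(query, d) for d in docs])
for s, d in sorted(zip(scores, docs), reverse=True):
    print(f"{s:.2f}  {d}")
```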

Model transformation intents and their properties

Levi Lúcio, Moussa Amrani, Juergen Dingel, Leen Lambers, Rick Salay, Gehan M. K. Selim, Eugene Syriani, Manuel Wimmer
2014 Journal of Software and Systems Modeling  
In this paper, a framework for the description of model transformation intents is defined which includes, for instance, a description of properties a model transformation has to satisfy to qualify as a  ...  The notion of model transformation intent is proposed to capture the purpose of a transformation.  ...  Furthermore, bi-directional bridges are usually required for enabling round-trips between technical spaces.  ... 
doi:10.1007/s10270-014-0429-x fatcat:d2etvwpprjacxiioalp27z4eni

Pretrained Transformers for Text Ranking: BERT and Beyond

Andrew Yates, Rodrigo Nogueira, Jimmy Lin
2021 Proceedings of the 14th ACM International Conference on Web Search and Data Mining  
The goal of text ranking is to generate an ordered list of texts retrieved from a corpus in response to a query for a particular task.  ...  The combination of transformers and self-supervised pretraining has, without exaggeration, revolutionized the fields of natural language processing (NLP), information retrieval (IR), and beyond.  ...  However, there remain many open research questions, and thus in addition to laying out the foundations of pretrained transformers for text ranking, this survey also attempts to prognosticate where the  ... 
doi:10.1145/3437963.3441667 fatcat:6teqmlndtrgfvk5mneq5l7ecvq

A Survey to View Update Problem

Haitao Chen, Husheng Liao
2011 International Journal of Computer Theory and Engineering  
The view update problem has long been an open question in the database community. With the development of various data models, the corresponding view update problem has been widely researched.  ...  In this paper, we introduce the concept of the view update problem. We survey and compare previous approaches. In particular, we emphasize the role of semantics.  ...  Bi-directional Transformation Greenwald et al. [35] introduce the idea of bi-directional tree transformations, which is implemented in a universal synchronization framework for tree-structured data.  ...
doi:10.7763/ijcte.2011.v3.278 fatcat:fjsooybx55h53jtiaerqadpzs4
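
The classic relational flavor of the problem fits in a few lines: a projection view hides a column, and an update translator must push view edits back without touching the hidden data. A Python sketch; the update-in-place policy below is only one of several possible translations, which is exactly the semantic ambiguity the survey emphasizes:

```python
BASE = [{"id": 1, "name": "Ada", "salary": 100},
        {"id": 2, "name": "Bob", "salary": 90}]

def view(base):
    """Projection view: drop the salary column."""
    return [{"id": r["id"], "name": r["name"]} for r in base]

def put(view_rows, base):
    """Translate view edits back to the base (updates only; inserts
    and deletes would need an explicit policy)."""
    by_id = {r["id"]: r for r in view_rows}
    return [{**r, **by_id.get(r["id"], {})} for r in base]

v = view(BASE)
v[0]["name"] = "Ada L."
print(put(v, BASE))    # the hidden salary column survives the round trip
```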

Well-Behaved Model Transformations with Model Subtyping [article]

Artur Boronat
2017 arXiv   pre-print
a domain-specific language for representing the action part of model transformations.  ...  It is, thus, crucially important to provide pragmatic, reliable methods to verify that model transformations guarantee the correctness of generated models in order to ensure the quality of the final end  ...  Acknowledgements The author thanks the anonymous referees of SLE'15 and SLE'16 for their helpful comments on a previous draft of this document.  ... 
arXiv:1703.08113v1 fatcat:b3q55g2bczhi5gpjuzg2y7awj4

Transformer visualization via dictionary learning: contextualized embedding as a linear superposition of transformer factors [article]

Zeyu Yun, Yubei Chen, Bruno A Olshausen, Yann LeCun
2021 arXiv   pre-print
Though great effort has been made to explain the representations in transformers, it is widely recognized that our understanding is not sufficient.  ...  While some of these patterns confirm conventional prior linguistic knowledge, the rest are relatively unexpected and may provide new insights.  ...  High-level transformer factors correspond to linguistic patterns that span a long range in the text.  ...
arXiv:2103.15949v1 fatcat:gjvuf6q2vjastbeq4co3jwjw3a
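
The analysis technique itself, dictionary learning over contextualized embeddings, runs in a few lines with scikit-learn; random vectors stand in for real transformer hidden states here, and all sizes are arbitrary:

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))       # 200 token embeddings, dimension 64

dl = DictionaryLearning(n_components=32, transform_algorithm="lasso_lars",
                        transform_alpha=1.0, max_iter=20, random_state=0)
codes = dl.fit_transform(X)          # sparse coefficients per embedding

print(dl.components_.shape)          # (32, 64): the "transformer factors"
print((np.abs(codes) > 1e-8).mean()) # fraction of active factors
```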

Aspect-oriented model-driven skeleton code generation: A graph-based transformation approach

Jeannette Bennett, Kendra Cooper, Lirong Dai
2010 Science of Computer Programming  
Here, a model-driven code generation approach based on graph transformations for aspect-oriented development is proposed. The approach has two main transformation activities.  ...  XML is the target notation for this step; the transformation uses the XML meta-model to ensure that the output complies with the language.  ...  and operational approaches seem to be a natural fit for the model-to-code problem in general.  ... 
doi:10.1016/j.scico.2009.05.005 fatcat:le5wufik7fbbzowfxjbptm3jty
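
The final code-emission step can be pictured as a template instantiated over a class model. A toy Python sketch (the dict-based model and `emit_skeleton` helper are invented for illustration; the paper works on graphs of UML-style aspect models, with XML as an intermediate notation):

```python
# A miniature "model" for one class, and a skeleton-code emitter for it.
MODEL = {"name": "Account",
         "attrs": [("owner", "String"), ("balance", "double")]}

def emit_skeleton(model):
    lines = [f"public class {model['name']} {{"]
    for attr, typ in model["attrs"]:
        lines.append(f"    private {typ} {attr};")
    lines.append("}")
    return "\n".join(lines)

print(emit_skeleton(MODEL))
```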

A Review of Bangla Natural Language Processing Tasks and the Utility of Transformer Models [article]

Firoj Alam, Arid Hasan, Tanvirul Alam, Akib Khan, Janntatul Tajrin, Naira Khan, Shammur Absar Chowdhury
2021 arXiv   pre-print
In this study, we first provide a review of Bangla NLP tasks, resources, and tools available to the research community; we benchmark datasets collected from various platforms for nine NLP tasks using current  ...  We hope that such a comprehensive survey will motivate the community to build on and further advance the research on Bangla NLP.  ...  [63] in which rules from source sentences were extracted using a parse tree, with the parse tree then transferred to the target sentence rules.  ... 
arXiv:2107.03844v3 fatcat:hermrinleneercodguko6kwxhu
Showing results 1–15 of 5,561.