141,532 Hits in 3.3 sec

Learning Tag Dependencies for Sequence Tagging

Yuan Zhang, Hongshen Chen, Yihong Zhao, Qun Liu, Dawei Yin
2018 Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence  
Despite successes in learning long-term token sequence dependencies with neural networks, tag dependencies have rarely been considered previously.  ...  Sequence tagging is the basis for multiple applications in natural language processing.  ...  We thank the anonymous reviewers for their constructive comments.  ...
doi:10.24963/ijcai.2018/637 dblp:conf/ijcai/ZhangCZLY18 fatcat:tpzgokv5ereavio4u57tftsc2i

Exploring the Statistical Derivation of Transformational Rule Sequences for Part-of-Speech Tagging [article]

Lance A. Ramshaw, Mitchell P. Marcus
1994 arXiv   pre-print
A fast, incremental implementation and a mechanism for recording the dependencies that underlie the resulting rule sequence are also described.  ...  The method learns a series of symbolic transformational rules, which can then be applied in sequence to a test corpus to produce predictions.  ...  We point out how it manages to largely avoid difficulties with overtraining, and show a way of recording the dependencies between rules in the learned sequence.  ...
arXiv:cmp-lg/9406011v1 fatcat:zxwnqubbgreubkzgxdydydssb4
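
The Ramshaw and Marcus entry above describes learning an ordered list of symbolic transformational rules that are then applied in sequence to an initially tagged corpus. The minimal Python sketch below illustrates only that application step, using a single hypothetical rule template (retag a word when the previous tag matches a trigger); the paper's actual rule templates and the statistical learning of the rule order are not shown.

    # A minimal sketch of applying learned transformational rules in order,
    # in the spirit of the rule-sequence tagging described above. The rule
    # format (old_tag, new_tag, prev_tag trigger) is a simplifying assumption,
    # not the paper's exact template set.

    def apply_rules(tokens, initial_tags, rules):
        """Apply each (old_tag, new_tag, prev_tag) rule over the whole
        sequence, in the order the rules were learned."""
        tags = list(initial_tags)
        for old_tag, new_tag, prev_tag in rules:
            for i in range(1, len(tags)):
                if tags[i] == old_tag and tags[i - 1] == prev_tag:
                    tags[i] = new_tag
        return tags

    tokens = ["the", "can", "rusted"]
    initial = ["DT", "MD", "VBD"]        # most-frequent-tag baseline guesses
    rules = [("MD", "NN", "DT")]         # "MD -> NN when the previous tag is DT"
    print(apply_rules(tokens, initial, rules))   # ['DT', 'NN', 'VBD']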

Easy-First POS Tagging and Dependency Parsing with Beam Search

Ji Ma, Jingbo Zhu, Tong Xiao, Nan Yang
2013 Annual Meeting of the Association for Computational Linguistics  
In this paper, we combine easy-first dependency parsing and POS tagging algorithms with beam search and structured perceptron.  ...  On CTB, we achieve 94.01% tagging accuracy and 86.33% unlabeled attachment score with a relatively small beam width. On PTB, we also achieve state-of-the-art performance.  ...  Research Funds for the Central Universities (N100204002).  ... 
dblp:conf/acl/MaZXY13 fatcat:rpoeg25xtzbbtdvv5jgrqbrkaq
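
The Ma et al. entry combines easy-first processing with beam search and a structured perceptron. The sketch below, assuming a user-supplied scoring function (a toy lexical score here instead of learned perceptron features), shows only the search side: each beam state may tag any still-untagged position, and the k highest-scoring partial states survive each step.

    # A minimal sketch of easy-first tagging with beam search. The scoring
    # function is a stand-in assumption; the real system scores actions with
    # structured-perceptron-trained features.

    import heapq

    def easy_first_beam(words, tags, score, beam_size=4):
        # A state is (total_score, assignment) where assignment maps position -> tag.
        beam = [(0.0, {})]
        for _ in range(len(words)):
            candidates = []
            for total, assign in beam:
                for i in range(len(words)):
                    if i in assign:
                        continue
                    for t in tags:
                        candidates.append((total + score(words, assign, i, t),
                                           {**assign, i: t}))
            beam = heapq.nlargest(beam_size, candidates, key=lambda c: c[0])
        best_score, best_assign = max(beam, key=lambda c: c[0])
        return [best_assign[i] for i in range(len(words))]

    # Toy score: reward tags that match a tiny hand-written lexicon.
    def toy_score(words, assign, i, t):
        lexicon = {"the": "DT", "dog": "NN", "barks": "VBZ"}
        return 1.0 if lexicon.get(words[i]) == t else 0.0

    print(easy_first_beam(["the", "dog", "barks"], ["DT", "NN", "VBZ", "VB"], toy_score))
    # ['DT', 'NN', 'VBZ']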

Investigating NP-Chunking with Universal Dependencies for English

Ophélie Lacroix
2018 Proceedings of the Second Workshop on Universal Dependencies (UDW 2018)  
We then demonstrate how the task of NP-chunking can benefit PoS-tagging in a multi-task learning setting (comparing two different strategies) and how it can be used as a feature for dependency parsing  ...  in order to learn enriched models.  ...  Acknowledgment: The author would like to thank the anonymous reviewers for their comments and suggestions, as well as her colleagues from the Data Science team at Siteimprove.  ...
doi:10.18653/v1/w18-6010 dblp:conf/acludw/Lacroix18 fatcat:bniccpxqurdvtlocasiei3lmqy
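
The Lacroix entry mentions using NP-chunk information as a feature for dependency parsing. One simple way to realize that, sketched below under assumed dimensions and a B-NP/I-NP/O tag set, is to concatenate an embedding of each word's predicted chunk tag to its word embedding before feeding the parser; this is an illustration, not necessarily the paper's exact feature scheme.

    # A small sketch of the "chunk tag as a feature" idea: the parser's input
    # representation for each word is the concatenation of a word embedding
    # and an embedding of its predicted NP-chunk tag. Dimensions and the tag
    # inventory here are illustrative assumptions.

    import torch
    import torch.nn as nn

    word_emb = nn.Embedding(10_000, 100)   # word vocabulary
    chunk_emb = nn.Embedding(3, 20)        # e.g. B-NP / I-NP / O

    word_ids = torch.tensor([[12, 845, 3]])    # one sentence, 3 tokens
    chunk_ids = torch.tensor([[0, 1, 2]])      # predicted chunk tags for the same tokens

    # Enriched per-token input for the downstream dependency parser.
    features = torch.cat([word_emb(word_ids), chunk_emb(chunk_ids)], dim=-1)
    print(features.shape)   # torch.Size([1, 3, 120])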

Neural sequence labeling for Vietnamese POS Tagging and NER [article]

Duong Nguyen Anh, Hieu Nguyen Kiem, Vi Ngo Van
2018 arXiv   pre-print
This paper presents a neural architecture for Vietnamese sequence labeling tasks including part-of-speech (POS) tagging and named entity recognition (NER).  ...  Experiments on benchmark datasets show that this work achieves state-of-the-art performance on both tasks: 93.52% accuracy for POS tagging and 94.88% F1 for NER. Our source code is available here.  ...  We would like to thank the NNVLP [22] team for publishing the pre-trained word embedding set that we used during the training and evaluation stages of our model.  ...
arXiv:1811.03754v2 fatcat:a6c6cdosyjhdbhqpwhpu73242q

Sequence Tagging for Fast Dependency Parsing

Michalina Strzyz, David Vilares, Carlos Gómez-Rodríguez
2019 Proceedings (MDPI)  
In this study we adopt a radically different approach and cast full dependency parsing as a pure sequence tagging task.  ...  In particular, we apply a linearization function to the tree that results in an output label for each token that conveys information about the word's dependency relations.  ...  In a similar fashion, we propose to apply sequence tagging models for dependency parsing [8] , using NCRF++ [9] as our sequence tagging framework.  ... 
doi:10.3390/proceedings2019021049 fatcat:l5e4ohej7vasflamvnjl2l6zku
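
The Strzyz et al. entry above casts dependency parsing as pure sequence tagging by linearizing the tree into one label per token. The sketch below uses a simple relative head-offset encoding to make the idea concrete; the paper's own encodings (for example, relative PoS-based positions) may differ.

    # A minimal sketch of casting a dependency tree as per-token labels via a
    # relative-offset encoding: each word's label is "(head_offset, deprel)".
    # One simple linearization for illustration only.

    def encode(heads, deprels):
        """heads[i] is the 1-based head index of word i+1 (0 = root)."""
        labels = []
        for i, (h, rel) in enumerate(zip(heads, deprels), start=1):
            offset = h - i if h != 0 else 0        # 0 reserved for the root
            labels.append((offset, rel))
        return labels

    def decode(labels):
        heads = []
        for i, (offset, _rel) in enumerate(labels, start=1):
            heads.append(0 if offset == 0 else i + offset)
        return heads

    # "She reads books": heads = [2, 0, 2]  ("reads" is the root)
    heads = [2, 0, 2]
    deprels = ["nsubj", "root", "obj"]
    labels = encode(heads, deprels)
    print(labels)                   # [(1, 'nsubj'), (0, 'root'), (-1, 'obj')]
    print(decode(labels) == heads)  # True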

Natural Language Processing Tools for Tamil Grammar Learning and Teaching

V Dhanalakshmi, S Rajendran
2010 International Journal of Computer Applications  
Tools like a Character Analyzer for analyzing characters, a Morphological Analyzer and Generator and a Verb Conjugator for word-level analysis, and a Parts-of-Speech Tagger, Chunker and Dependency Parser for  ...  In this paper we present grammar teaching tools for analyzing and learning the characters, words and sentences of the Tamil language.  ...  CRFs are used for sequence tagging tasks where a sequence of words must be annotated with a sequence of labels, one per word.  ...
doi:10.5120/1314-1790 fatcat:oupplfz5ubc65fuilgqr2bnmo4
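
The snippet above notes that CRFs assign one label per word over a whole sequence. The following sketch shows linear-chain Viterbi decoding over hand-written emission and transition scores; in a trained CRF these scores would come from learned feature weights, and the toy tag set and scores here are assumptions for illustration.

    # A minimal sketch of linear-chain CRF decoding (Viterbi) for the
    # one-label-per-word setting. Emission and transition scores are plain
    # dicts here rather than learned parameters.

    def viterbi(words, tags, emission, transition):
        # best[t] = (score of best path ending in tag t, that path)
        best = {t: (emission[(words[0], t)], [t]) for t in tags}
        for w in words[1:]:
            new_best = {}
            for t in tags:
                prev_t, (score, path) = max(
                    ((pt, best[pt]) for pt in tags),
                    key=lambda item: item[1][0] + transition[(item[0], t)])
                new_best[t] = (score + transition[(prev_t, t)] + emission[(w, t)],
                               path + [t])
            best = new_best
        return max(best.values(), key=lambda sp: sp[0])[1]

    tags = ["DT", "NN"]
    emission = {("the", "DT"): 2.0, ("the", "NN"): 0.1,
                ("dog", "DT"): 0.1, ("dog", "NN"): 2.0}
    transition = {("DT", "NN"): 1.0, ("DT", "DT"): -1.0,
                  ("NN", "DT"): 0.0, ("NN", "NN"): 0.0}
    print(viterbi(["the", "dog"], tags, emission, transition))  # ['DT', 'NN']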

Deep Learning for Character-Based Information Extraction [chapter]

Yanjun Qi, Sujatha G. Das, Ronan Collobert, Jason Weston
2014 Lecture Notes in Computer Science  
characters; (2) abundant online sequences (unlabeled) are utilized to improve the vector representation through semi-supervised learning; and (3) the constraints of spatial dependency among output labels  ...  on protein sequences.  ...  Improving Representation Learning with Unlabeled Sequences: Manually labeling character-based sequences, i.e. obtaining a tag label for each character, could be quite time-consuming, since it requires very  ...
doi:10.1007/978-3-319-06028-6_74 fatcat:ubzh4ytyrngapdfnijcizhysmi

Exploring Cross-Lingual Transfer of Morphological Knowledge In Sequence-to-Sequence Models

Huiming Jin, Katharina Kann
2017 Proceedings of the First Workshop on Subword and Character Level Models in NLP  
It has recently been applied to cross-lingual transfer learning for paradigm completion (the task of producing inflected forms of lemmata) with sequence-to-sequence networks.  ...  To investigate this, we propose a set of data-dependent experiments using an existing encoder-decoder recurrent neural network for the task.  ...  Acknowledgments: We would like to thank Hinrich Schütze and the anonymous reviewers for their helpful comments.  ...
doi:10.18653/v1/w17-4110 dblp:conf/emnlp/JinK17 fatcat:vi2as53gkrb45hut7jbwh2tip4

Embedded-State Latent Conditional Random Fields for Sequence Labeling

Dung Thai, Sree Harsha Ramesh, Shikhar Murty, Luke Vilnis, Andrew McCallum
2018 Proceedings of the 22nd Conference on Computational Natural Language Learning  
While RNNs have provided increasingly powerful context-aware local features for sequence tagging, they have yet to be integrated with a global graphical model of similar expressivity in the output distribution  ...  an embedding space for hidden states.  ...  In this work, while using state-of-the-art sequence tagging baselines for input representation learning, we concern ourselves with learning the global structure of the output space of label sequences,  ... 
doi:10.18653/v1/k18-1001 dblp:conf/conll/ThaiRMVM18 fatcat:nv2jel7skfbrvmm4s7qr6jevum

Embedded-State Latent Conditional Random Fields for Sequence Labeling [article]

Dung Thai, Sree Harsha Ramesh, Shikhar Murty, Luke Vilnis, Andrew McCallum
2018 arXiv   pre-print
While RNNs have provided increasingly powerful context-aware local features for sequence tagging, they have yet to be integrated with a global graphical model of similar expressivity in the output distribution  ...  an embedding space for hidden states.  ...  In this work, while using state-of-the-art sequence tagging baselines for input representation learning, we concern ourselves with learning the global structure of the output space of label sequences,  ... 
arXiv:1809.10835v1 fatcat:s6tb4cd2ujaidlgfgtjbjmur2m
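
The two Embedded-State Latent CRF entries above pair RNN features with a more expressive global model of the label sequence via an embedding space for hidden states. The sketch below shows only the general flavor of an embedding-parameterized transition score (a low-rank factorization of the transition table) combined with per-token emission scores; it is not the paper's exact latent-variable model.

    # A sketch of embedding-parameterized transitions: instead of a full KxK
    # transition table, each state gets embeddings and the transition score is
    # a dot product (O(K*d) parameters instead of O(K^2)). Illustration only.

    import numpy as np

    rng = np.random.default_rng(0)
    num_states, dim = 6, 3

    out_emb = rng.normal(size=(num_states, dim))   # "outgoing" state embeddings
    in_emb = rng.normal(size=(num_states, dim))    # "incoming" state embeddings

    transition_scores = out_emb @ in_emb.T
    print(transition_scores.shape)   # (6, 6)

    def sequence_score(emissions, tag_seq):
        """Per-position emission scores plus embedded transition scores."""
        score = emissions[np.arange(len(tag_seq)), tag_seq].sum()
        score += sum(out_emb[a] @ in_emb[b]
                     for a, b in zip(tag_seq[:-1], tag_seq[1:]))
        return score

    emissions = rng.normal(size=(4, num_states))   # e.g. BiLSTM outputs, 4 tokens
    print(sequence_score(emissions, [0, 2, 2, 5]))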

Multi-Label Community-Based Question Classification via Personalized Sequence Memory Network Learning

Xinyu Duan, Shengyu Zhang, Zhou Zhao, Fei Wu, Yueting Zhuang
2018 Proceedings of the AAAI Conference on Artificial Intelligence  
capture the high-order tag dependency.  ...  We introduce the personalized sequence memory network that leverages not only the semantics of questions but also the personalized information of askers to provide the sequence tag learning function to  ...  Acknowledgement: This work was supported by NSFC (U1611461), Key Program of Zhejiang Province (2015C01027), China Knowledge Center for Engineering and Microsoft Research Asia.  ...
doi:10.1609/aaai.v32i1.12171 fatcat:cl6kewoyrfbxtffud6rkkhfez4

Discriminative Training of Sequence Taggers via Local Feature Matching

Minyoung Kim
2014 International Journal of Fuzzy Logic and Intelligent Systems  
Sequence tagging is the task of predicting frame-wise labels for a given input sequence and has important applications to diverse domains.  ...  For several real-world sequence tagging problems, we empirically demonstrate that the proposed learning algorithm achieves significantly more accurate prediction performance than standard estimators.  ...  Conclusions In this paper, we proposed a novel parameter learning method for CRFs to tackle the sequence tagging problem.  ... 
doi:10.5391/ijfis.2014.14.3.209 fatcat:koeopsvynve7hizcx74ebwbk7i

Viable Dependency Parsing as Sequence Labeling

Michalina Strzyz, David Vilares, Carlos Gómez-Rodríguez
2019 Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics (NAACL-HLT)  
We use parsing as sequence labeling as a common framework to learn across constituency and dependency syntactic abstractions. To do so, we cast the problem as multitask learning (MTL).  ...  The results across the board show that on average MTL models with auxiliary losses for constituency parsing outperform single-task ones by 1.14 F1 points, and for dependency parsing by 0.62 UAS points.  ...  We gratefully acknowledge NVIDIA Corporation for the donation of a GTX Titan X GPU.  ...  amount of non-projectivity (BIST is a projective parser).  ...
doi:10.18653/v1/n19-1077 dblp:conf/naacl/StrzyzVG19 fatcat:hnfrw22lhjggjjymceno4cat2e
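
The Strzyz et al. NAACL entry reports gains from multi-task learning with auxiliary parsing losses when parsing is cast as sequence labeling. The PyTorch sketch below shows the basic pattern, a shared encoder with one linear head per tagging task and a weighted sum of cross-entropy losses; the layer sizes, label counts, and auxiliary-loss weight are illustrative assumptions rather than the paper's configuration.

    # A minimal multi-task tagging sketch: shared BiLSTM encoder, two tagging
    # heads, summed losses. Toy batch and hyperparameters are assumptions.

    import torch
    import torch.nn as nn

    class MultiTaskTagger(nn.Module):
        def __init__(self, vocab=1000, emb=64, hidden=128, n_dep=40, n_const=30):
            super().__init__()
            self.embed = nn.Embedding(vocab, emb)
            self.encoder = nn.LSTM(emb, hidden, batch_first=True, bidirectional=True)
            self.dep_head = nn.Linear(2 * hidden, n_dep)      # main task labels
            self.const_head = nn.Linear(2 * hidden, n_const)  # auxiliary task labels

        def forward(self, token_ids):
            states, _ = self.encoder(self.embed(token_ids))
            return self.dep_head(states), self.const_head(states)

    model = MultiTaskTagger()
    loss_fn = nn.CrossEntropyLoss()
    tokens = torch.randint(0, 1000, (2, 7))        # batch of 2 sentences, 7 tokens
    dep_gold = torch.randint(0, 40, (2, 7))
    const_gold = torch.randint(0, 30, (2, 7))

    dep_logits, const_logits = model(tokens)
    loss = (loss_fn(dep_logits.reshape(-1, 40), dep_gold.reshape(-1))
            + 0.5 * loss_fn(const_logits.reshape(-1, 30), const_gold.reshape(-1)))
    loss.backward()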

Weakly Supervised Sequence Tagging from Noisy Rules

Esteban Safranchik, Shiying Luo, Stephen Bach
2020 Proceedings of the AAAI Conference on Artificial Intelligence  
We propose a framework for training sequence tagging models with weak supervision consisting of multiple heuristic rules of unknown accuracy.  ...  In addition to supporting rules that vote on tags in the output sequence, we introduce a new type of weak supervision, called linking rules, that vote on how sequence elements should be grouped into spans  ...  Acknowledgements: We thank Xufan Zhang for helpful discussions on identifiability in Bayesian networks.  ...
doi:10.1609/aaai.v34i04.6009 fatcat:xxrrl2xnrrgtddlfihp6hce4ra
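
The Safranchik et al. entry trains taggers from heuristic rules of unknown accuracy, including linking rules that vote on how tokens group into spans. As a starting point, the sketch below aggregates tagging rules that may abstain by simple majority vote; the paper's generative model of rule accuracies and its linking rules are not reproduced here.

    # A minimal sketch of combining heuristic tagging rules by majority vote.
    # Each rule sees the token sequence and either votes a tag or abstains
    # (returns None). This toy aggregator is only the naive starting point.

    from collections import Counter

    def lexicon_rule(lexicon, tag):
        return lambda tokens, i: tag if tokens[i].lower() in lexicon else None

    def capitalized_rule(tokens, i):
        return "PER" if i > 0 and tokens[i][0].isupper() else None

    def aggregate(tokens, rules, default="O"):
        tags = []
        for i in range(len(tokens)):
            votes = Counter(v for rule in rules
                            if (v := rule(tokens, i)) is not None)
            tags.append(votes.most_common(1)[0][0] if votes else default)
        return tags

    rules = [lexicon_rule({"london", "paris"}, "LOC"), capitalized_rule]
    print(aggregate(["Alice", "visited", "Paris"], rules))
    # ['O', 'O', 'LOC']  (both rules vote on "Paris"; ties keep the first-seen vote)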
Showing results 1 - 15 out of 141,532 results