
CAN-NER: Convolutional Attention Network for Chinese Named Entity Recognition [article]

Yuying Zhu, Guoxin Wang, Börje F. Karlsson
2020 arXiv   pre-print
Named entity recognition (NER) in Chinese is essential but difficult because of the lack of natural delimiters.  ...  In this paper, we investigate a Convolutional Attention Network called CAN for Chinese NER, which consists of a character-based convolutional neural network (CNN) with a local-attention layer and a gated  ...  Named Entity Recognition (NER) aims at identifying text spans which are associated with a specific semantic entity type, such as person (PER), organization (ORG), location (LOC), and geopolitical  ... 
arXiv:1904.02141v3 fatcat:qxiwyexmpzeh7cliven2zxoeem

Dual Neural Network Fusion Model for Chinese Named Entity Recognition

Dandan Zhao, Jingxiang Cao, Degen Huang, Jiana Meng, Pan Zhang
2020 International Journal of Computational Intelligence Systems  
Keywords: Chinese named entity recognition; dual neural network fusion; bi-directional long short-term memory; self-attention mechanism; dilated convolutional neural network. Abstract: Chinese named entity  ...  network to extract implicit context representation information in Chinese NER. (3) Dilated convolutions are used to make a tradeoff between performance and executing speed.  ...  [43]) and CAN-NER (Zhu and Wang [42]) in 2019 are also carried out on the MSRA dataset. These show that the proposed techniques are very useful for Chinese NER.  ... 
doi:10.2991/ijcis.d.201216.001 fatcat:4pagsq4fgbdm3asty2xi2yivjm

Porous Lattice-based Transformer Encoder for Chinese NER [article]

Xue Mengge, Yu Bowen, Liu Tingwen, Zhang Yue, Meng Erli, Wang Bin
2020 arXiv   pre-print
Incorporating lattices into character-level Chinese named entity recognition is an effective method to exploit explicit word information.  ...  In this paper, we propose a porous lattice-based transformer encoder for Chinese named entity recognition, which is capable of better exploiting GPU parallelism and batching the computation owing to the  ...  There are multiple venues for future work, where one promising direction is to apply our model to the pre-training procedure of Chinese Transformer language models.  ... 
arXiv:1911.02733v3 fatcat:wtntgq4cqjgb5nhgar36f5vbwm

CWPC_BiAtt: Character–Word–Position Combined BiLSTM-Attention for Chinese Named Entity Recognition

Johnson, Shen, Liu
2020 Information  
Usually taken as a linguistic feature alongside Part-Of-Speech (POS) tagging, Named Entity Recognition (NER) is a major task in Natural Language Processing (NLP).  ...  , based on which we propose a new Character–Word–Position Combined BiLSTM-Attention (CWPC_BiAtt) model for the Chinese NER task.  ...  I also want to express my heartfelt thanks to my junior fellow apprentices for their help, and to Haiqing Zhang, Xingwang Zhuang, Lingze Qin and Meixia Shan for their valuable advice.  ... 
doi:10.3390/info11010045 fatcat:j5q4indkvbgpbg3fa6dqr7yv5m

MECT: Multi-Metadata Embedding based Cross-Transformer for Chinese Named Entity Recognition

Shuang Wu, Xiaoning Song, Zhenhua Feng
2021 Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Recently, word enhancement has become very popular for Chinese Named Entity Recognition (NER), reducing segmentation errors and increasing the semantic and boundary information of Chinese words.  ...  With the structural characteristics of Chinese characters, MECT can better capture the semantic information of Chinese characters for NER.  ...  We also thank Xiaotong Xiang and Jun Quan for their help on editing the manuscript.  ... 
doi:10.18653/v1/2021.acl-long.121 fatcat:rc6bipuo45dijewezcw62fajsy

MECT: Multi-Metadata Embedding based Cross-Transformer for Chinese Named Entity Recognition [article]

Shuang Wu, Xiaoning Song, Zhenhua Feng
2021
Recently, word enhancement has become very popular for Chinese Named Entity Recognition (NER), reducing segmentation errors and increasing the semantic and boundary information of Chinese words.  ...  With the structural characteristics of Chinese characters, MECT can better capture the semantic information of Chinese characters for NER.  ...  We also thank Xiaotong Xiang and Jun Quan for their help on editing the manuscript.  ... 
doi:10.48550/arxiv.2107.05418 fatcat:rtviekzuk5antind3qem24rbfa

Porous Lattice Transformer Encoder for Chinese NER

Xue Mengge, Bowen Yu, Tingwen Liu, Yue Zhang, Erli Meng, Bin Wang
2020 Proceedings of the 28th International Conference on Computational Linguistics
In this paper, we propose PLTE, an extension of the transformer encoder tailored for Chinese NER, which models all the characters and matched lexical words in parallel with batch processing.  ...  PLTE augments self-attention with positional relation representations to incorporate lattice structure.  ...  Acknowledgements We would like to thank the anonymous reviewers for their insightful comments and suggestions.  ... 
doi:10.18653/v1/2020.coling-main.340 fatcat:yu5ngipqyfbudot6dvqeepos2a

Pretrained Transformers for Text Ranking: BERT and Beyond [article]

Jimmy Lin, Rodrigo Nogueira, Andrew Yates
2021 arXiv   pre-print
This survey provides an overview of text ranking with neural network architectures known as transformers, of which BERT is the best-known example.  ...  There are two themes that pervade our survey: techniques for handling long documents, beyond typical sentence-by-sentence processing in NLP, and techniques for addressing the tradeoff between effectiveness  ...  Special thanks goes out to two anonymous reviewers for their insightful comments and helpful feedback.  ... 
arXiv:2010.06467v3 fatcat:obla6reejzemvlqhvgvj77fgoy

Generating an RDF dataset from Twitter data: A Study Using Machine Learning

Saad Alajlan
2021
named entity recognition within Twitter data.  ...  and relations can NER and RE be applied using the standard supervised learning tools and techniques available for natural language processing?  ...  • Are the generated relationship names appropriate?  ...  It is widely acknowledged that social media features a wealth of user-contributed content of all kinds.  ... 
doi:10.17638/03127589 fatcat:c3n52ocsibfe3jjavsxti7h43u