Research on Long Text Classification Model Based on Multi-Feature Weighted Fusion
2022
Applied Sciences
combine attention mechanisms to obtain weighted local features, fuse global contextual features with weighted local features, and obtain classification results by equal-length convolutional pooling. ...
The BERT model is used to obtain feature representations containing the global semantic and contextual information of the text, and convolutional neural networks are used to obtain features at different levels and ...
The word vector matrix is passed through the Self-attention layer by the BERT model, which enables the Encoder to learn the contextual information of the text while encoding. ...
doi:10.3390/app12136556
fatcat:wlmbriaxbzhw5hme5qz4qtj6nu
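The snippet above describes passing the word vector matrix through a self-attention layer so the encoder learns contextual information while encoding. As a rough illustration only, here is a minimal NumPy sketch of scaled dot-product self-attention; the dimensions, weight matrices, and function names are assumptions for the example, not code from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """X: (seq_len, d_model) word vector matrix; returns context-aware encodings."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])    # pairwise relevance between positions
    weights = softmax(scores, axis=-1)         # attention distribution per token
    return weights @ V                         # each token mixes in its context

rng = np.random.default_rng(0)
seq_len, d = 6, 16                             # toy sizes (assumed)
X = rng.normal(size=(seq_len, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) * 0.1 for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)     # (6, 16)
```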
Extensive Pyramid Networks for Text Classification
2019
Australian Journal of Intelligent Information Processing Systems
The attention mechanism, especially self-attention, has also gained great performance on many NLP tasks. This paper describes an extensive pyramid network for text classification. ...
In addition, recurrent layers are used to obtain the order information and convolutional layers are applied to get the local contextual information. ...
Acknowledgements This research work has been funded by the National Natural Science Foundation of China (Grant No. 61772337, U1736207) and the National Key Research and Development Program of China No. ...
dblp:journals/ajiips/ShiYL19
fatcat:zqt6eu6mxfaepjitd2qlvoetja
Named Entity Recognition of Medical Text Based on the Deep Neural Network
2022
Journal of Healthcare Engineering
This paper proposes a hybrid neural network medical text named entity recognition model. First, a coding method based on a fully self-attentive mechanism is proposed. ...
It determines the weight distribution by scoring the characters or words in all positions and obtains the position information in the sentence that needs the most attention. ...
According to the characteristics of word ambiguity and structural complexity of medical text, a fully self-attentive coding mechanism is designed, which integrates contextual information into the coding ...
doi:10.1155/2022/3990563
pmid:35295179
pmcid:PMC8920682
fatcat:u2jzhj4jhncxfbulsfunwuxdbq
VGCN-BERT: Augmenting BERT with Graph Embedding for Text Classification
[chapter]
2020
Lecture Notes in Computer Science
Much progress has been made recently on text classification with methods based on neural networks. ...
In particular, models using an attention mechanism, such as BERT, have been shown to have the capability of capturing the contextual information within a sentence or document. ...
Related Work
Self-attention and BERT As aforementioned, attention mechanisms [28, 31] based on various deep neural networks, in particular the self-attention mechanism proposed by Vaswani et al. ...
doi:10.1007/978-3-030-45439-5_25
fatcat:6kwtm2vov5dpzbfjfpspri2h5i
VGCN-BERT: Augmenting BERT with Graph Embedding for Text Classification
[article]
2020
arXiv
pre-print
Much progress has been made recently on text classification with methods based on neural networks. ...
In particular, models using an attention mechanism, such as BERT, have been shown to have the capability of capturing the contextual information within a sentence or document. ...
Related Work
Self-Attention and BERT As aforementioned, attention mechanisms [31, 28] based on various deep neural networks, in particular the self-attention mechanism proposed by Vaswani et al. ...
arXiv:2004.05707v1
fatcat:s3jtioywffcj3itewjhn2rncw4
A Hybrid Bidirectional Recurrent Convolutional Neural Network Attention-Based Model for Text Classification
2019
IEEE Access
The model combines the bidirectional long short-term memory and the convolutional neural network with the attention mechanism and word2vec to achieve the fine-grained text classification task. ...
We also employ a maximum pool layer of convolutional neural network that judges which words play an essential role in text classification, and use the attention mechanism to give them higher weights to ...
Based on this motivation, we consider using the attention mechanism network that captures more valuable information in the text for classification. ...
doi:10.1109/access.2019.2932619
fatcat:5il6p3bcifbv5hzizvzzomxsp4
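To illustrate the idea sketched in the abstract above, where the attention mechanism gives essential words higher weights before a max-pooling layer selects the strongest features, here is a hedged NumPy sketch; the scoring vector `u` and all shapes are assumptions for the example, not the authors' implementation.

```python
import numpy as np

def attention_weighted_maxpool(H, u):
    """H: (seq_len, d) hidden states (e.g. from a Bi-LSTM); u: (d,) scoring vector."""
    scores = H @ u                              # one relevance score per word
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()                        # attention weights over words
    weighted = H * alpha[:, None]               # emphasise the important words
    return weighted.max(axis=0)                 # max-pool over the sequence

rng = np.random.default_rng(1)
H = rng.normal(size=(10, 32))                   # toy Bi-LSTM outputs (assumed)
u = rng.normal(size=32)
print(attention_weighted_maxpool(H, u).shape)   # (32,)
```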
Short Text Sentiment Analysis Based on Multi-Channel CNN With Multi-head Attention Mechanism
2021
IEEE Access
neural network, as well as integrates the multi-head attention mechanism to more fully learn the sentiment information in the text. ...
A novel sentiment analysis model based on multi-channel convolutional neural network with multi-head attention mechanism (MCNN-MA) is proposed. ...
This paper takes the convolutional neural network as the core model.
C. ATTENTION MECHANISM Combining the attention mechanism with a neural network can often achieve better classification results. ...
doi:10.1109/access.2021.3054521
fatcat:juhpkfiinvgcpo4rmnh3ht7xnq
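The MCNN-MA abstract above centres on the multi-head attention mechanism. A minimal NumPy sketch of multi-head attention follows, with each head attending to the text in parallel and the outputs concatenated; the head count, sizes, and weights are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(X, heads):
    """X: (seq_len, d); heads: list of (Wq, Wk, Wv) projection triples."""
    outs = []
    for Wq, Wk, Wv in heads:
        Q, K, V = X @ Wq, X @ Wk, X @ Wv
        A = softmax(Q @ K.T / np.sqrt(K.shape[-1]), axis=-1)
        outs.append(A @ V)                       # one view of the text per head
    return np.concatenate(outs, axis=-1)         # heads concatenated

rng = np.random.default_rng(2)
d, dk, n_heads = 32, 8, 4                        # toy sizes (assumed)
X = rng.normal(size=(10, d))
heads = [tuple(rng.normal(size=(d, dk)) * 0.1 for _ in range(3))
         for _ in range(n_heads)]
print(multi_head_attention(X, heads).shape)      # (10, 32)
```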
A multi-label text classification model based on ELMo and attention
2020
MATEC Web of Conferences
We proposed a multi-label text classification model based on ELMo and an attention mechanism, which helps solve the problem in the sentiment classification task that power-supply-related text follows no grammar or writing convention and that sentiment-related information is dispersed throughout the text. ...
This research was financially supported by the Self-financing for Guangdong Power Grid Co., Ltd. informatization project under Grants 037800HK42180056. ...
doi:10.1051/matecconf/202030903015
fatcat:x4qwbe22nfemjnh2iq4slpgowa
Enterprise Strategic Management From the Perspective of Business Ecosystem Construction Based on Multimodal Emotion Recognition
2022
Frontiers in Psychology
Then, two datasets, CMU-MOSI and CMU-MOSEI, are selected to design the scheme for multimodal ER based on the self-attention mechanism. ...
Through the comparative analysis of the accuracy of single-modal and multi-modal ER, the self-attention mechanism is applied in the experiment. ...
FUNDING This work was supported by the Shandong Social Science Planning and Research Project (No. 20CPYJ29). ...
doi:10.3389/fpsyg.2022.857891
pmid:35310264
pmcid:PMC8927019
doaj:82cf2c71b7bf4e4f9bdeda763b6e1939
fatcat:hssh4dpwzbahvpv5vyupuuoxuu
GCNN with Self-Attention Is Better than GRU with Self-Attention for Sentiment Analysis
2022
ICIC Express Letters
not only prior but also posterior information in the text should be employed, and that such information should not be drawn from too far away. ...
The results show that GCNN with Self-Attention is consistently better than GRU with Self-Attention by approximately 1% or more in accuracy, let alone GCNN without Self-Attention, and this may well indicate that ...
Parameter                            Value
Word embedding size                  256
Neural network for Self-Attention    256-128-1
Neural network for Classification    256-128-1
Stride (only for GCNNs)              1
Padding (only for GCNNs)             No
Minimum word occurrence              ...
doi:10.24507/icicel.16.05.497
fatcat:w467h4j2avegpminuzoia4t4jy
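The GCNN in the paper above is a gated convolutional network. Assuming Dauphin-style gating (A * sigmoid(B)), which may differ from the authors' exact architecture, a minimal sketch with stride 1 and no padding (as in the parameter table) looks like this:

```python
import numpy as np

def gated_conv1d(X, Wa, Wb, k):
    """X: (seq_len, d_in); Wa, Wb: (k*d_in, d_out); stride 1, no padding."""
    seq_len, d_in = X.shape
    windows = np.stack([X[i:i + k].reshape(-1)          # unfold k-grams
                        for i in range(seq_len - k + 1)])
    A, B = windows @ Wa, windows @ Wb
    return A * (1.0 / (1.0 + np.exp(-B)))               # gate controls what passes

rng = np.random.default_rng(3)
X = rng.normal(size=(12, 256))                          # embedding size 256 (from the table)
k, d_out = 3, 128                                       # kernel width and output size (assumed)
Wa = rng.normal(size=(k * 256, d_out)) * 0.05
Wb = rng.normal(size=(k * 256, d_out)) * 0.05
print(gated_conv1d(X, Wa, Wb, k).shape)                 # (10, 128)
```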
Self Multi-Head Attention-based Convolutional Neural Networks for fake news detection
2019
PLoS ONE
neural networks and self multi-head attention mechanism. ...
In the paper, we built a model named SMHA-CNN (Self Multi-Head Attention-based Convolutional Neural Networks) that can judge the authenticity of news with high accuracy based only on content by using convolutional ...
The paper's contributions can be summarized as follows: • We built a model to detect fake news by combining the advantages of convolutional neural networks and the self multi-head attention mechanism ...
doi:10.1371/journal.pone.0222713
pmid:31557213
pmcid:PMC6762082
fatcat:a52q4nrwgzhqjmofkpqbwqdaee
An Integration model based on Graph Convolutional Network for Text Classification
2020
IEEE Access
Graph Convolutional Network (GCN) is extensively used in text classification tasks and performs well in processing non-Euclidean structured data. ...
INDEX TERMS Bidirectional long short-term memory network, dependency relationship, graph convolutional network, part-of-speech information, text classification. ...
Network (BRNN) [9], Long Short-Term Memory (LSTM) Network [10], Gated Recurrent Unit (GRU) [11], Recurrent Convolutional Neural Network (RCNN) [12], Convolutional Recurrent Neural Network (CRNN ...
doi:10.1109/access.2020.3015770
fatcat:2pz7g7dh3jdmdnr2qaaebzwl5e
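For the GCN-based model above, a single graph convolution layer aggregates neighbour features through a symmetrically normalised adjacency matrix. The sketch below uses a toy random graph and sizes; it illustrates the standard Kipf-Welling layer, not the paper's specific integration model.

```python
import numpy as np

def gcn_layer(A, H, W):
    """A: (n, n) adjacency; H: (n, d_in) node features; W: (d_in, d_out)."""
    A_hat = A + np.eye(A.shape[0])                       # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))               # symmetric normalisation
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)  # ReLU

rng = np.random.default_rng(4)
n, d_in, d_out = 5, 16, 8                                # toy sizes (assumed)
A = (rng.random((n, n)) > 0.6).astype(float)
A = np.maximum(A, A.T)                                   # undirected text graph (assumed)
H = rng.normal(size=(n, d_in))
W = rng.normal(size=(d_in, d_out)) * 0.1
print(gcn_layer(A, H, W).shape)                          # (5, 8)
```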
Targeted Sentiment Classification Based on Attentional Encoding and Graph Convolutional Networks
2020
Applied Sciences
(e.g., recurrent neural networks and convolutional neural networks combined with an attention mechanism) are not able to fully capture the semantic information of the context and they also lack a mechanism ...
graph convolutional network (AEGCN) model. ...
This has inspired many NLP scholars to explore the application of GCNs in their own research. Some researchers have explored the use of graph neural networks in text classification. Peng et al. ...
doi:10.3390/app10030957
fatcat:khzu7aqyqzh5rnyn4625tibuzi
Sentiment analysis using multi-head attention capsules with multi-channel CNN and bidirectional GRU
2021
IEEE Access
The self-attention proposed by Lin et al. [34] can extract key information in sentences. Jia et al. ...
CONCLUSION This paper proposes a multi-head attention capsule model combining convolutional neural network and bidirectional GRU for text sentiment classification tasks. ...
doi:10.1109/access.2021.3073988
fatcat:2fbkolkidrdb5howsuhjw6zacy
Bi-LSTM Model to Increase Accuracy in Text Classification: Combining Word2vec CNN and Attention Mechanism
2020
Applied Sciences
The long short-term memory (LSTM) model and the convolutional neural network for sentence classification produce accurate results and have recently been used in various natural-language processing (NLP ...
between word sequences and hence are better suited for text classification. ...
Figure 2. Structure of the 1D convolutional neural network (CNN) for text classification. ...
doi:10.3390/app10175841
fatcat:hmue2fbsfjamlie452vmkxbloi
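Figure 2 of the paper above shows a 1D text CNN. As a rough, self-contained NumPy sketch: convolve filters over word embeddings, apply max-over-time pooling, and feed the result to a linear classifier head; the filter count, embedding size, and weights here are assumptions for illustration, not the paper's settings.

```python
import numpy as np

def text_cnn_features(E, filters, k):
    """E: (seq_len, d) word embeddings; filters: (n_filters, k*d)."""
    seq_len, d = E.shape
    windows = np.stack([E[i:i + k].reshape(-1)           # unfold k-gram windows
                        for i in range(seq_len - k + 1)])
    fmap = np.maximum(windows @ filters.T, 0.0)          # ReLU feature maps
    return fmap.max(axis=0)                              # max-over-time pooling

rng = np.random.default_rng(5)
E = rng.normal(size=(20, 64))                            # e.g. word2vec embeddings (assumed)
filters = rng.normal(size=(100, 3 * 64)) * 0.05          # 100 trigram filters (assumed)
feats = text_cnn_features(E, filters, k=3)
logits = feats @ (rng.normal(size=(100, 2)) * 0.1)       # toy binary classifier head
print(feats.shape, logits.shape)                         # (100,) (2,)
```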