A Recursive Recurrent Neural Network for Statistical Machine Translation
2014
Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
... like recurrent neural networks, so that the language model and the translation model can be integrated naturally; (2) a tree structure can be built, as in recursive neural networks, so as to generate the translation ...
In this paper, we propose a novel recursive recurrent neural network (R²NN) to model the end-to-end decoding process for statistical machine translation. ...
Auli et al. (2013) propose a joint language and translation model, based on a recurrent neural network. ...
doi:10.3115/v1/p14-1140
dblp:conf/acl/LiuYLZ14
fatcat:jo6oeaohlbhkbb7hls3wdbwhnm
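The combination the snippets describe, recurrent decoding plus recursive tree composition, rests on a simple building block: a combiner that merges two child phrase vectors into a parent vector. A minimal PyTorch sketch of that recursive step (all names and dimensions are illustrative, not the paper's actual R²NN):

    import torch
    import torch.nn as nn

    class RecursiveCombiner(nn.Module):
        # Merges two child phrase representations into a parent one,
        # as in recursive neural networks over a binary derivation tree.
        def __init__(self, dim):
            super().__init__()
            self.proj = nn.Linear(2 * dim, dim)

        def forward(self, left, right):
            return torch.tanh(self.proj(torch.cat([left, right], dim=-1)))

    combiner = RecursiveCombiner(dim=64)
    left = torch.randn(64)   # representation of the left sub-translation
    right = torch.randn(64)  # representation of the right sub-translation
    parent = combiner(left, right)  # representation of the combined span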
Deep Learning based, end-to-end metaphor detection in Greek language with Recurrent and Convolutional Neural Networks
[article]
2020
arXiv
pre-print
We combine Convolutional Neural Networks and Recurrent Neural Networks with representation learning to bear on the metaphor detection problem for the Greek language. ...
This paper presents and benchmarks a number of end-to-end Deep Learning based models for metaphor detection in Greek. ...
Similar to word2vec [19], it produces word embeddings by training a neural language model that tries to predict words given context (CBOW architecture) or context given words (SkipGram architecture ...
arXiv:2007.11949v1
fatcat:vfllqijgbvcobete7fldsuldba
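The CBOW-versus-SkipGram distinction mentioned in the snippet maps directly onto gensim's sg flag; a minimal sketch with a toy corpus (the corpus and hyperparameters are illustrative, not the paper's setup):

    from gensim.models import Word2Vec

    sentences = [["the", "cat", "sat"], ["the", "dog", "ran"]]  # toy corpus

    # sg=0: CBOW, predict a word from its surrounding context
    cbow = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=0)

    # sg=1: skip-gram, predict the context given a word
    skipgram = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

    print(cbow.wv["cat"].shape)  # (50,): one embedding per vocabulary word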
A Term Weighted Neural Language Model and Stacked Bidirectional LSTM Based Framework for Sarcasm Identification
2021
IEEE Access
To represent text documents, we introduce an inverse gravity moment based term-weighted word embedding model with trigrams. ...
The purpose of our research is to present an effective sarcasm identification framework on social media data by pursuing the paradigms of neural language models and deep neural networks. ...
The rest of this section briefly presents the neural language models employed in the empirical analysis.
1) Word2vec: The word2vec model is an artificial neural network based word embedding scheme, which ...
doi:10.1109/access.2021.3049734
fatcat:kzobtjmgp5b2bcsufcmmuoufza
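A stacked bidirectional LSTM classifier of the kind the framework name suggests can be sketched in PyTorch (vocabulary size, dimensions, and the two-layer stack are assumptions, not the paper's exact configuration):

    import torch
    import torch.nn as nn

    class StackedBiLSTMClassifier(nn.Module):
        def __init__(self, vocab_size, embed_dim=100, hidden=128):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            # num_layers=2 stacks two LSTM layers; bidirectional=True
            # reads each sequence in both directions.
            self.lstm = nn.LSTM(embed_dim, hidden, num_layers=2,
                                bidirectional=True, batch_first=True)
            self.out = nn.Linear(2 * hidden, 1)  # binary: sarcastic or not

        def forward(self, token_ids):
            h, _ = self.lstm(self.embed(token_ids))
            return torch.sigmoid(self.out(h[:, -1]))  # last time step

    model = StackedBiLSTMClassifier(vocab_size=10000)
    probs = model(torch.randint(0, 10000, (4, 20)))  # batch of 4 documents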
Recurrent Neural Network Techniques: Emphasis on Use in Neural Machine Translation
2021
Informatica (Ljubljana, Tiskana izd.)
The techniques are divided into three categories: recurrent neural networks, recurrent neural networks with phrase-based models, and recurrent neural techniques with graph-based models. ...
In this paper, machine translation techniques based on using recurrent neural networks are analyzed and discussed. ...
This paper provides a comprehensive analysis of recent neural machine translation techniques based on recurrent neural networks. ...
doi:10.31449/inf.v45i7.3743
fatcat:yytqgxixkrgijhoglpfyqz6a4i
Efficient processing of GRU based on word embedding for text classification
2019
JOIV: International Journal on Informatics Visualization
GRU is a well-known type of recurrent neural network (RNN), which is able to process sequential data through its recurrent architecture. ...
To overcome this weakness, in this paper we propose a unified structure to investigate the effects of word embedding and the Gated Recurrent Unit (GRU) for text classification on two benchmark datasets ...
[17] applied a neural network language model (NNLM) to learn word embeddings based on the preceding contexts of each word. ...
doi:10.30630/joiv.3.4.289
fatcat:3ggn2hmr4veclgja3k4yotn76q
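The proposed combination of pretrained word embeddings with a GRU can be sketched as follows (the frozen-embedding choice and all sizes are assumptions):

    import torch
    import torch.nn as nn

    class GRUClassifier(nn.Module):
        def __init__(self, embeddings, hidden=64, num_classes=2):
            super().__init__()
            # embeddings: a (vocab_size, embed_dim) tensor of pretrained
            # word vectors; freeze=True keeps them fixed during training.
            self.embed = nn.Embedding.from_pretrained(embeddings, freeze=True)
            self.gru = nn.GRU(embeddings.size(1), hidden, batch_first=True)
            self.out = nn.Linear(hidden, num_classes)

        def forward(self, token_ids):
            _, h_n = self.gru(self.embed(token_ids))  # final hidden state
            return self.out(h_n[-1])

    model = GRUClassifier(torch.randn(5000, 100))
    scores = model(torch.randint(0, 5000, (8, 30)))  # 8 documents, 30 tokens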
Natural Language Processing with Improved Deep Learning Neural Networks
2022
Scientific Programming
This paper proposes a dependency syntactic analysis model based on a long short-term memory neural network. ...
This model is based on the feed-forward neural network model described above and will be used as a feature extractor. ...
[19] proposed the use of a recurrent neural network to build a language model. The model uses the recurrent neural network to learn a distributed representation for each word while also modeling the word ...
doi:10.1155/2022/6028693
fatcat:6cyjzepiajfpbnw63jo6vyldga
Character-Level Neural Language Modelling in the Clinical Domain
2020
Studies in Health Technology and Informatics
More recently, character-level neural language models, exploiting recurrent neural networks, have again received attention, because they achieved comparable performance on various NLP benchmarks. ...
Word embeddings have become the predominant representation scheme on a token-level for various clinical natural language processing (NLP) tasks. ...
The idea that character-level neural language models (CNLM) alone, based on recurrent neural networks (RNN) and more specifically long short-term memories (LSTM), are able to sufficiently capture the information ...
doi:10.3233/shti200127
pmid:32570351
fatcat:xhsr2prk3bhoxemgqkyfan2d7m
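A character-level neural language model of the kind these snippets discuss differs from a word-level one mainly in its vocabulary: an embedding over characters rather than tokens. A minimal LSTM sketch (alphabet size and dimensions are assumptions):

    import torch
    import torch.nn as nn

    class CharLM(nn.Module):
        # Predicts the next character given the characters seen so far.
        def __init__(self, n_chars=128, embed_dim=32, hidden=256):
            super().__init__()
            self.embed = nn.Embedding(n_chars, embed_dim)
            self.lstm = nn.LSTM(embed_dim, hidden, batch_first=True)
            self.out = nn.Linear(hidden, n_chars)

        def forward(self, char_ids):
            h, _ = self.lstm(self.embed(char_ids))
            return self.out(h)  # next-character logits at every position

    model = CharLM()
    logits = model(torch.randint(0, 128, (2, 50)))  # (2, 50, 128)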
Neural Networks for Text Correction and Completion in Keyboard Decoding
[article]
2017
arXiv
pre-print
... Recurrent Neural Networks (RNN) and Convolutional Neural Networks (CNN) for natural language understanding. ...
The memory footprint of our learnt model for inference and prediction is also an order of magnitude smaller than that of conventional language-model-based text decoders. ...
Word-Level Decoder (Implicit Language Model): The decoder is also a GRU recurrent network that performs word-based processing. ...
arXiv:1709.06429v1
fatcat:zxtcemr76jb33f5axqffe6dhqi
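The word-level GRU decoder described in the last snippet acts as an implicit language model because each emitted word is fed back as the next input. A greedy decoding loop sketch (the token ids, sizes, and the greedy choice are illustrative assumptions):

    import torch
    import torch.nn as nn

    embed = nn.Embedding(10000, 64)
    gru = nn.GRUCell(64, 128)
    out = nn.Linear(128, 10000)

    h = torch.zeros(1, 128)   # decoder state, e.g. initialized by an encoder
    word = torch.tensor([1])  # assumed <bos> token id
    for _ in range(10):       # emit up to 10 words
        h = gru(embed(word), h)
        word = out(h).argmax(dim=-1)  # greedy: most probable next word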
Scope and Challenges in Conversational AI using Transformer Models
2021
International Journal of Scientific Research in Computer Science Engineering and Information Technology
This paper discusses various trends and advancements in the field of natural language processing and conversational AI, such as RNNs and RNN-based architectures like LSTMs, Sequence-to-Sequence models, ...
The authors have given a comparison between the various models discussed in terms of efficiency/accuracy and also discussed the scope and challenges in Transformer models. ...
Recurrent Neural Networks (RNN): A recurrent neural network (RNN) is a type of artificial neural network which uses sequential or time-series data. ...
doi:10.32628/cseit217696
fatcat:2fau4up5tzc3xgeo6gw25tjgni
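The RNN definition in the snippet boils down to one recurrence: a hidden state updated at each time step from the current input and the previous state. A from-scratch NumPy sketch (dimensions are arbitrary):

    import numpy as np

    def rnn_step(x_t, h_prev, W_x, W_h, b):
        # h_t = tanh(W_x x_t + W_h h_{t-1} + b): the hidden state carries
        # information from earlier steps forward through the sequence.
        return np.tanh(W_x @ x_t + W_h @ h_prev + b)

    rng = np.random.default_rng(0)
    W_x = rng.normal(size=(8, 4))   # input-to-hidden weights
    W_h = rng.normal(size=(8, 8))   # hidden-to-hidden (recurrent) weights
    b = np.zeros(8)

    h = np.zeros(8)
    for x_t in rng.normal(size=(5, 4)):  # a sequence of 5 input vectors
        h = rnn_step(x_t, h, W_x, W_h, b)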
Towards NLP with Deep Learning: Convolutional Neural Networks and Recurrent Neural Networks for Offensive Language Identification in Social Media
[article]
2019
arXiv
pre-print
This short paper presents the design decisions taken and challenges encountered in completing SemEval Task 6, which poses the problem of identifying and categorizing offensive language in tweets. ...
Recurrent Neural Network, Long Short-Term Memory (LSTM): Starting from the fact that language is sequential and maintains this property even after preprocessing, we decided to explore Recurrent Neural Networks ...
In such a case, words should be represented by their context, which can be achieved by training a neural network in a skip-gram setting and pulling out the trained weights, which act as the embeddings.
arXiv:1903.00665v2
fatcat:otfkvqpwsfefnmpc5j3gfocjci
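"Pulling out the trained weights, which act as the embeddings" corresponds, in gensim, to reading vectors out of a trained skip-gram model; a toy sketch (corpus and parameters assumed):

    from gensim.models import Word2Vec

    tweets = [["this", "is", "offensive"], ["this", "is", "fine"]]  # toy data
    model = Word2Vec(tweets, vector_size=50, sg=1, min_count=1)  # skip-gram

    # The learned input weights are the word embeddings:
    vector = model.wv["offensive"]  # one 50-dimensional vector
    matrix = model.wv.vectors       # the full (vocab, 50) embedding matrix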
A Hierarchical Approach to Neural Context-Aware Modeling
[article]
2018
arXiv
pre-print
We present a new recurrent neural network topology to enhance state-of-the-art machine learning systems by incorporating a broader context. ...
To show the potential of the newly introduced topology, we compare the approach against a context-agnostic set-up including a standard neural language model and a supervised binary classification network ...
Model Topology: Our newly proposed neural network model topology introduces a context component, based on the recurrent encoder-decoder neural network design described by Mikolov et al. (2013). ...
arXiv:1807.11582v2
fatcat:jdxdppe4qrfpnidwhpkddc44qa
Morphological Analysis for Unsegmented Languages using Recurrent Neural Network Language Model
2015
Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing
We present a new morphological analysis model that considers semantic plausibility of word sequences by using a recurrent neural network language model (RNNLM). ...
To solve this problem, we do not use language models based on raw word sequences but use a semantically generalized language model, RNNLM, in morphological analysis. ...
Recurrent Neural Network Language Model: RNNLM is a recurrent neural network language model (Mikolov et al., 2010), which outputs a probability distribution over the next word, given the embedding of the ...
doi:10.18653/v1/d15-1276
dblp:conf/emnlp/MoritaKK15
fatcat:zyshrvbbkbfdbkbkbr3ad4swuu
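The behaviour the last snippet describes, a distribution over the next word given the embedded history, reduces to embedding, recurrence, projection, and softmax. A minimal PyTorch sketch (all sizes are illustrative, not the RNNLM toolkit's defaults):

    import torch
    import torch.nn as nn

    class RNNLM(nn.Module):
        def __init__(self, vocab_size=5000, embed_dim=64, hidden=128):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.rnn = nn.RNN(embed_dim, hidden, batch_first=True)
            self.out = nn.Linear(hidden, vocab_size)

        def forward(self, word_ids):
            h, _ = self.rnn(self.embed(word_ids))
            # Softmax over the vocabulary: P(next word | history) per step.
            return torch.softmax(self.out(h), dim=-1)

    lm = RNNLM()
    probs = lm(torch.randint(0, 5000, (1, 6)))  # (1, 6, 5000)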
CX-ST-RNM at SemEval-2019 Task 3: Fusion of Recurrent Neural Networks Based on Contextualized and Static Word Representations for Contextual Emotion Detection
2019
Proceedings of the 13th International Workshop on Semantic Evaluation
The model is based on two Recurrent Neural Networks: the first one is fed with a state-of-the-art ELMo deep contextualized word representation and the second one is fed with a static Word2Vec embedding ...
The proposed model is compared with two baseline models based on a static word representation and a contextualized word representation, separately. ...
These derivative datasets are used to learn a two-stage model based on two Recurrent Neural Networks as described in Subsection 2.4. ...
doi:10.18653/v1/s19-2028
dblp:conf/semeval/Perelkiewicz19
fatcat:hke2bsnuyfgttpq6zvwincfi7a
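The two-branch design, one RNN over contextualized vectors and one over static vectors, fused before classification, can be sketched by concatenating the two final hidden states (the GRU branches, sizes, and concatenation fusion are assumptions; the ELMo- and Word2Vec-style inputs appear here as precomputed tensors):

    import torch
    import torch.nn as nn

    class TwoBranchFusion(nn.Module):
        def __init__(self, ctx_dim=1024, static_dim=300, hidden=128, classes=4):
            super().__init__()
            self.ctx_rnn = nn.GRU(ctx_dim, hidden, batch_first=True)
            self.static_rnn = nn.GRU(static_dim, hidden, batch_first=True)
            self.out = nn.Linear(2 * hidden, classes)

        def forward(self, ctx_vecs, static_vecs):
            # ctx_vecs: contextualized (ELMo-style) word vectors;
            # static_vecs: static (Word2Vec-style) word vectors.
            _, h1 = self.ctx_rnn(ctx_vecs)
            _, h2 = self.static_rnn(static_vecs)
            return self.out(torch.cat([h1[-1], h2[-1]], dim=-1))

    model = TwoBranchFusion()
    logits = model(torch.randn(2, 12, 1024), torch.randn(2, 12, 300))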
A Novel Approach for Named Entity Recognition on Hindi Language Using Residual Bilstm Network
2020
International Journal on Natural Language Computing
In this paper we devise a novel architecture based on a residual network architecture for a Bidirectional Long Short-Term Memory (BiLSTM) with fastText word embedding layers. ...
Deep learning based algorithms are nowadays being developed at large scale as an innovative approach for advanced NER models that yield the best results. ...
... BiLSTM neural network. fastText provides word embeddings for Hindi (and 157 other languages) and is based on the CBOW (Continuous Bag-of-Words) model. ...
doi:10.5121/ijnlc.2020.9201
fatcat:zveoyqxetbgaperpvt7w62zuxq
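The CBOW-based Hindi embeddings mentioned above can be loaded with gensim and fed into a BiLSTM tagger; a loading sketch (the file name follows fastText's published cc.LANG.300.bin convention; because lookups are built from subword n-grams, out-of-vocabulary words still receive vectors):

    from gensim.models.fasttext import load_facebook_vectors

    # Pretrained Hindi vectors distributed by fastText (CBOW, 300-dim).
    wv = load_facebook_vectors("cc.hi.300.bin")

    vec = wv["भारत"]  # 300-dimensional vector for a known word
    # Any string gets a vector via its character n-grams, so rare or
    # unseen Hindi word forms are still representable.
    print(vec.shape)  # (300,)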
Sequential Recurrent Neural Networks for Language Modeling
[article]
2017
arXiv
pre-print
Feedforward Neural Network (FNN)-based language models estimate the probability of the next word based on the history of the last N words, whereas Recurrent Neural Networks (RNN) perform the same task based only on the last word and some context information that cycles in the network. ...
Neural Network Language Models: The goal of a language model is to estimate the probability distribution p(w_1^T) of word sequences w_1^T = w_1, ..., w_T. ...
arXiv:1703.08068v1
fatcat:v4v5rwldgzfqfmqlecei5lwyia
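The distribution p(w_1^T) factorizes by the chain rule as p(w_1^T) = ∏_{t=1}^{T} p(w_t | w_1^{t-1}); an FNN-based model truncates the history to the last N words, while an RNN carries it in its recurrent state. A toy sketch of scoring a sentence under that factorization (cond_prob is a hypothetical stand-in for any trained model):

    import math

    def sentence_log_prob(words, cond_prob):
        # Chain rule: log p(w_1..w_T) = sum_t log P(w_t | w_1..w_{t-1}).
        total, history = 0.0, []
        for w in words:
            total += math.log(cond_prob(w, tuple(history)))
            history.append(w)
        return total

    uniform = lambda w, h: 1.0 / 10000  # toy model: uniform over 10k words
    print(sentence_log_prob(["the", "cat", "sat"], uniform))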