1,466 Hits in 3.5 sec

Predictive Precompute with Recurrent Neural Networks [article]

Hanson Wang, Zehui Wang, Yuanyuan Ma
2020 arXiv   pre-print
In this paper, we describe the novel application of recurrent neural networks (RNNs) for predictive precompute.  ...  It is therefore important to accurately predict per-user application usage in order to minimize wasted precomputation ("predictive precompute").  ...  Modeling sequences of access logs with recurrent neural networks.  ... 
arXiv:1912.06779v2 fatcat:wqgq4kih5neg7kyjpt4mrdyzie

Show and Recall: Learning What Makes Videos Memorable

Sumit Shekhar, Dhruv Singal, Harvineet Singh, Manav Kedia, Akhil Shetty
2017 2017 IEEE International Conference on Computer Vision Workshops (ICCVW)  
This approach utilizes the scene semantics derived from the titles of the videos using natural language processing (NLP) techniques and a recurrent neural network (RNN).  ...  The performance of the semantic-based methods is compared with that of the aesthetic feature-based methods using support vector regression (ϵ-SVR) and artificial neural network (ANN) models, and the  ...  Figure 1: Semantic-based models: the recurrent neural network model and the ϵ-SVR model correspond to runs 4 and 5, respectively.  ... 
doi:10.1109/iccvw.2017.321 dblp:conf/iccvw/ShekharSSKS17 fatcat:5lodigjkpjaw7djfyakajtl7mq

GLA in MediaEval 2018 Emotional Impact of Movies Task [article]

Jennifer J. Sun, Ting Liu, Gautam Prasad
2019 arXiv   pre-print
These features were computed over time and modeled using a gated recurrent unit (GRU) based network followed by a mixture of experts model to compute multiclass predictions.  ...  Our approach leverages image, audio, and face based features computed using pre-trained neural networks.  ...  Temporal Models To model the temporal dynamics of the emotion in the videos, we used recurrent neural networks.  ... 
arXiv:1911.12361v1 fatcat:hoa36iliujh2vfr74ucttp3ssm
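The snippet above describes a GRU over per-frame features followed by a mixture-of-experts head for multiclass prediction. The sketch below illustrates that pipeline shape in plain NumPy; all dimensions, weight scales, and the number of experts/classes are illustrative assumptions, not the authors' actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

def gru_step(x, h, Wz, Uz, Wr, Ur, Wn, Un):
    """One GRU update: gates decide how much of the old state to keep."""
    z = sigmoid(Wz @ x + Uz @ h)        # update gate
    r = sigmoid(Wr @ x + Ur @ h)        # reset gate
    n = np.tanh(Wn @ x + Un @ (r * h))  # candidate state
    return (1 - z) * h + z * n

def mixture_of_experts(h, gate_W, expert_Ws):
    """A softmax gate weights the per-class predictions of each expert."""
    gate = softmax(gate_W @ h)                                # (E,)
    expert_preds = np.stack([softmax(W @ h) for W in expert_Ws])  # (E, C)
    return gate @ expert_preds                                # (C,) mixed probs

# Illustrative sizes: 16-d per-frame features, 8-d GRU state,
# 3 experts, 4 emotion classes.
d_in, d_h, n_experts, n_classes = 16, 8, 3, 4
params = [rng.normal(scale=0.1, size=s)
          for s in [(d_h, d_in), (d_h, d_h)] * 3]

h = np.zeros(d_h)
for t in range(10):                     # 10 time steps of video features
    h = gru_step(rng.normal(size=d_in), h, *params)

gate_W = rng.normal(scale=0.1, size=(n_experts, d_h))
expert_Ws = rng.normal(scale=0.1, size=(n_experts, n_classes, d_h))
probs = mixture_of_experts(h, gate_W, expert_Ws)
```

The gate's convex combination of expert distributions is itself a valid distribution, which is what makes the mixture head convenient for multiclass outputs.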

Page 6619 of Mathematical Reviews Vol. , Issue 2001I [page]

2001 Mathematical Reviews  
The second chapter introduces artificial neural networks. At 38 pages, this chapter covers only neural network basics such as the perceptron, delta learning and back-propagation.  ...  ISBN 3-7908-1256-0 This book provides a general overview of fuzzy sets and systems, neural networks and fuzzy-neural hybrids.  ... 

Link Prediction with Mutual Attention for Text-Attributed Networks

Robin Brochier, Adrien Guille, Julien Velcin
2019 Companion Proceedings of The 2019 World Wide Web Conference on - WWW '19  
We provide preliminary experiment results with a citation dataset on two prediction tasks, demonstrating the capacity of our model to learn a meaningful textual similarity.  ...  To train its parameters, we use the network links as supervision.  ...  Attention Mechanisms for NLP The Transformer [9] is a novel neural architecture that outperforms state-of-the-art methods in neural machine translation (NMT) without the use of convolution nor recurrent  ... 
doi:10.1145/3308560.3316587 dblp:conf/www/BrochierGV19a fatcat:pjy7a5jikneulh5jef2tg7m7cq

Modeling Labial Coarticulation with Bidirectional Gated Recurrent Networks and Transfer Learning

Théo Biasutto--Lervat, Sara Dahmani, Slim Ouni
2019 Interspeech 2019  
To do so, we experiment with a sequential deep learning model, bidirectional gated recurrent networks, which have achieved good results on the articulatory inversion problem and so should be able to  ...  We have trained and evaluated the model with a corpus consisting of 4 hours of French speech, and we obtained an average RMSE close to 1.3 mm.  ...  Thus, we have used a well-known sequential model: the recurrent neural network.  ... 
doi:10.21437/interspeech.2019-2097 dblp:conf/interspeech/Biasutto-Lervat19 fatcat:tcawihzh5jec3pmydesctgatsu

Assisting Discussion Forum Users using Deep Recurrent Neural Networks

Jacob Hagstedt P Suorra, Olof Mogren
2016 Proceedings of the 1st Workshop on Representation Learning for NLP  
We present a discussion forum assistant based on deep recurrent neural networks (RNNs). The assistant is trained to perform three different tasks when faced with a question from a user.  ...  Our recurrent forum assistant is evaluated experimentally by prediction accuracy for the end-to-end trainable parts, as well as by performing an end-user study.  ...  We use a deep recurrent neural network with LSTM cells. The depth of the network is 2, and we use 650 hidden units in the LSTM cells.  ... 
doi:10.18653/v1/w16-1606 dblp:conf/rep4nlp/SuorraM16 fatcat:o3lgavqwofhprammyojnlqwx5y
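The snippet above gives concrete architecture details: a depth-2 recurrent network with 650 hidden units per LSTM cell. A minimal NumPy sketch of such a stacked LSTM follows; the input dimension, weight initialization, and sequence length are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM update; W maps the input, U maps the previous hidden
    state. Rows of the stacked matrices correspond to the i, f, o, g gates."""
    gates = W @ x + U @ h + b
    i, f, o, g = np.split(gates, 4)
    c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
    return sigmoid(o) * np.tanh(c), c

# Mirroring the snippet: 2 stacked layers, 650 hidden units each.
d_in, d_h, depth = 128, 650, 2
layers = []
for l in range(depth):
    d = d_in if l == 0 else d_h       # layer 0 reads embeddings, layer 1 reads h
    layers.append((rng.normal(scale=0.01, size=(4 * d_h, d)),
                   rng.normal(scale=0.01, size=(4 * d_h, d_h)),
                   np.zeros(4 * d_h)))

h = [np.zeros(d_h) for _ in range(depth)]
c = [np.zeros(d_h) for _ in range(depth)]
for t in range(5):                    # feed 5 token embeddings
    x = rng.normal(size=d_in)
    for l, (W, U, b) in enumerate(layers):
        h[l], c[l] = lstm_step(x, h[l], c[l], W, U, b)
        x = h[l]                      # layer l's output feeds layer l+1
```

Stacking works by treating each layer's hidden-state sequence as the input sequence of the layer above, which is exactly what the inner loop does.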

Mixed Integer Neural Inverse Design [article]

Navid Ansari, Hans-Peter Seidel, Vahid Babaei
2022 arXiv   pre-print
In computational design and fabrication, neural networks are becoming important surrogates for bulky forward simulations.  ...  Here, we show that the piecewise linear property, very common in everyday neural networks, allows for an inverse design formulation based on mixed-integer linear programming.  ...  A BOUND PRECOMPUTATION ALGORITHM The extended bound precomputation algorithm calculates the upper and lower bound of each node in the neural network.  ... 
arXiv:2109.12888v2 fatcat:5sd3x3qhnveupiyrpzus5fr56u
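The snippet mentions precomputing upper and lower bounds for every node in the network, which MILP encodings of ReLU networks need for their big-M constraints. One standard way to obtain such bounds is interval arithmetic over the layers; the sketch below shows that idea, not the paper's extended algorithm, and the network sizes are arbitrary.

```python
import numpy as np

def interval_bounds(lowers, uppers, weights, biases):
    """Propagate elementwise input intervals [l, u] through a ReLU net,
    returning per-node lower/upper bounds at every layer. Interval
    arithmetic: positive weights pair lower-with-lower, negative
    weights pair lower-with-upper."""
    per_layer = []
    l, u = np.asarray(lowers, float), np.asarray(uppers, float)
    for k, (W, b) in enumerate(zip(weights, biases)):
        Wp, Wn = np.maximum(W, 0), np.minimum(W, 0)
        l, u = Wp @ l + Wn @ u + b, Wp @ u + Wn @ l + b
        per_layer.append((l.copy(), u.copy()))     # pre-activation bounds
        if k < len(weights) - 1:                   # ReLU on hidden layers
            l, u = np.maximum(l, 0), np.maximum(u, 0)
    return per_layer

rng = np.random.default_rng(2)
# A toy 3 -> 5 -> 2 ReLU network with inputs constrained to [-1, 1]^3.
weights = [rng.normal(size=(5, 3)), rng.normal(size=(2, 5))]
biases = [rng.normal(size=5), rng.normal(size=2)]
bounds = interval_bounds([-1, -1, -1], [1, 1, 1], weights, biases)
```

These bounds are sound but can be loose; tighter variants (such as solving small LPs per node) trade precompute time for smaller big-M constants in the MILP.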

An advanced spatio-temporal convolutional recurrent neural network for storm surge predictions [article]

Ehsan Adeli, Luning Sun, Jianxun Wang, Alexandros A. Taflanidis
2022 arXiv   pre-print
The developed neural network model is a time-series model, a long short-term memory (LSTM) network, a variant of the recurrent neural network, which is enriched with convolutional neural networks.  ...  The neural network model is trained with the storm track parameters used to drive the CFD solvers, and the output of the model is the time-series evolution of the predicted storm surge across multiple  ...  To accommodate this substantial extension, a time-series recurrent neural network (RNN) model is developed to predict the storm's behavior.  ... 
arXiv:2204.09501v1 fatcat:e4cndux3brdlve5kvmyfbl4dhe
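The snippet describes an LSTM enriched with convolutional layers: spatial features are extracted per time step and the recurrence models their evolution. The sketch below shows that coupling in miniature; the frame size, filter count, pooling, and the final scalar readout are all illustrative assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(3)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def conv2d_valid(img, kernel):
    """Naive 'valid' 2-D convolution (cross-correlation, no flip)."""
    kh, kw = kernel.shape
    H, W = img.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def lstm_step(x, h, c, W, U, b):
    gates = W @ x + U @ h + b
    i, f, o, g = np.split(gates, 4)
    c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
    return sigmoid(o) * np.tanh(c), c

# Illustrative sizes: 8x8 spatial frames, 4 conv filters, 16 LSTM units.
d_h, n_filters = 16, 4
kernels = rng.normal(size=(n_filters, 3, 3))
W = rng.normal(scale=0.1, size=(4 * d_h, n_filters))
U = rng.normal(scale=0.1, size=(4 * d_h, d_h))
b = np.zeros(4 * d_h)

h, c = np.zeros(d_h), np.zeros(d_h)
surge_series = []
for t in range(6):                      # 6 frames of the storm
    frame = rng.normal(size=(8, 8))
    feats = np.array([np.maximum(conv2d_valid(frame, k), 0).mean()
                      for k in kernels])        # ReLU + global pooling
    h, c = lstm_step(feats, h, c, W, U, b)
    surge_series.append(h.mean())       # stand-in scalar surge readout
```

The CNN handles the spatial structure of each snapshot, while the LSTM carries the storm's temporal state between snapshots.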

NAS-Bench-NLP: Neural Architecture Search Benchmark for Natural Language Processing [article]

Nikita Klyuchnikov, Ilya Trofimov, Ekaterina Artemova, Mikhail Salnikov, Maxim Fedorov, Evgeny Burnaev
2020 arXiv   pre-print
A few benchmarks with precomputed neural architectures performances have been recently introduced to overcome this problem and ensure more reproducible experiments.  ...  Our main contribution is as follows: we have provided search space of recurrent neural networks on the text datasets and trained 14k architectures within it; we have conducted both intrinsic and extrinsic  ...  Acknowledgements This work was done during the cooperation project with Huawei Noah's Ark Lab.  ... 
arXiv:2006.07116v1 fatcat:ct5ibdzgevajbc5hy6rvlpflrq

Comparison of feedforward and recurrent neural network language models

M. Sundermeyer, I. Oparin, J.-L. Gauvain, B. Freiberg, R. Schluter, H. Ney
2013 2013 IEEE International Conference on Acoustics, Speech and Signal Processing  
Index Terms-Automatic speech recognition, feedforward neural networks, recurrent neural networks  ...  Two competing concepts have been developed: on the one hand, feedforward neural networks representing an n-gram approach; on the other hand, recurrent neural networks that may learn context dependencies  ...  When a recurrent neural network is used, the full sequence of predecessor words w_1^{i-1} is considered for predicting w_i, see [5].  ... 
doi:10.1109/icassp.2013.6639310 dblp:conf/icassp/SundermeyerOGFSN13 fatcat:enuwtrs5qjcrtg6rvoiao6cauq
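The contrast in the snippet is that a feedforward language model sees only the last n-1 predecessor words, while a recurrent one folds the full history w_1^{i-1} into its hidden state. The toy comparison below makes that difference observable; vocabulary size, dimensions, and weights are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
V, d_e, d_h, n = 20, 8, 12, 3     # vocab, embedding, hidden, n-gram order

E = rng.normal(scale=0.5, size=(V, d_e))            # word embeddings
W_ff = rng.normal(scale=0.3, size=(d_h, (n - 1) * d_e))
W_in = rng.normal(scale=0.3, size=(d_h, d_e))
W_rec = rng.normal(scale=0.3, size=(d_h, d_h))
W_out = rng.normal(scale=0.5, size=(V, d_h))

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

def feedforward_lm(history):
    """n-gram style: only the last n-1 words enter the prediction."""
    x = np.concatenate([E[w] for w in history[-(n - 1):]])
    return softmax(W_out @ np.tanh(W_ff @ x))

def recurrent_lm(history):
    """RNN style: the full history w_1^{i-1} is folded into h."""
    h = np.zeros(d_h)
    for w in history:
        h = np.tanh(W_in @ E[w] + W_rec @ h)
    return softmax(W_out @ h)

history = [3, 7, 1, 4, 9]
p_ff = feedforward_lm(history)
p_rnn = recurrent_lm(history)
# Change a word OUTSIDE the n-gram window: the feedforward model's
# prediction is unchanged, the recurrent model's is not.
altered = [5, 7, 1, 4, 9]
```

With n = 3 the feedforward model conditions only on the last two words, so editing position 0 cannot affect it; the RNN's hidden state, by contrast, depends on every word it has read.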

Video Surveillance of Highway Traffic Events by Deep Learning Architectures [chapter]

Matteo Tiezzi, Stefano Melacci, Marco Maggini, Angelo Frosini
2018 Lecture Notes in Computer Science  
We compare different approaches that exploit architectures based on Recurrent Neural Networks (RNNs) and Convolutional Neural Networks (CNNs).  ...  The other approaches are based directly on the sequence of frames, optionally enriched with pixel-wise motion information.  ...  This is implemented with a Recurrent Neural Network (RNN), where s is the hidden state of the RNN.  ... 
doi:10.1007/978-3-030-01424-7_57 fatcat:kah5yv4qt5elbk6lu44w572orm

Learning Spatiotemporal Occupancy Grid Maps for Lifelong Navigation in Dynamic Scenes [article]

Hugues Thomas, Matthieu Gallet de Saint Aurin, Jian Zhang, Timothy D. Barfoot
2021 arXiv   pre-print
We provide both quantitative and qualitative insights into the predictions and validate our choices of network design with a comparison to the state of the art and ablation studies.  ...  The network is composed of a 3D back-end that extracts rich features and enables the semantic segmentation of the lidar frames, and a 2D front-end that predicts the future information embedded in the SOGMs  ...  Following the success of recurrent neural networks (RNNs), and in particular long short-term memory (LSTM) networks, for trajectory prediction [9, 10], the idea of isolating each obstacle as a distinct object  ... 
arXiv:2108.10585v2 fatcat:pufqd46ydrhvfbin7qdvp3fqdy

Aspect-Based Sentiment Analysis Using a Two-Step Neural Network Architecture [chapter]

Soufian Jebbara, Philipp Cimiano
2016 Communications in Computer and Information Science  
In a second step, a recurrent network processes each extracted aspect with respect to its context and predicts a sentiment label.  ...  As a first step, a recurrent neural network is used to extract aspects from a text by framing the problem as a sequence labeling task.  ...  By using a recurrent neural network, we present a novel neural network based approach to tackle aspect extraction as a sequence labeling task.  ... 
doi:10.1007/978-3-319-46565-4_12 fatcat:75zwqxozprcmzammws2kze33zm
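The snippet describes a two-step pipeline: a recurrent network first tags aspect spans as a sequence-labeling task, then a second recurrent network predicts a sentiment label for each extracted aspect in context. The sketch below mirrors that pipeline with untrained toy networks; the tag set, sizes, and the choice to score the state at the span's end are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
V, d_e, d_h = 30, 8, 10
TAGS = ["O", "B-ASP", "I-ASP"]          # IOB scheme for aspect spans
SENTS = ["neg", "neu", "pos"]

E = rng.normal(scale=0.1, size=(V, d_e))
W1 = rng.normal(scale=0.3, size=(d_h, d_e))
U1 = rng.normal(scale=0.3, size=(d_h, d_h))
W_tag = rng.normal(size=(len(TAGS), d_h))
W2 = rng.normal(scale=0.3, size=(d_h, d_e))
U2 = rng.normal(scale=0.3, size=(d_h, d_h))
W_sent = rng.normal(size=(len(SENTS), d_h))

def rnn_states(tokens, W, U):
    h, states = np.zeros(d_h), []
    for w in tokens:
        h = np.tanh(W @ E[w] + U @ h)
        states.append(h)
    return states

def extract_aspects(tokens):
    """Step 1: sequence labeling; decode IOB tags into (start, end) spans."""
    tags = [TAGS[int(np.argmax(W_tag @ h))]
            for h in rnn_states(tokens, W1, U1)]
    spans, start = [], None
    for i, t in enumerate(tags + ["O"]):        # sentinel closes last span
        if t == "B-ASP":
            if start is not None:
                spans.append((start, i))
            start = i
        elif t != "I-ASP" and start is not None:
            spans.append((start, i))
            start = None
    return spans

def classify_aspect(tokens, span):
    """Step 2: a second RNN reads the sentence; the state at the end of
    the aspect span is scored for sentiment."""
    h = rnn_states(tokens, W2, U2)[span[1] - 1]
    return SENTS[int(np.argmax(W_sent @ h))]

sentence = list(rng.integers(0, V, size=12))    # 12 toy word ids
results = [(s, classify_aspect(sentence, s)) for s in extract_aspects(sentence)]
```

Framing extraction as tagging lets one network emit variable numbers of aspects, and the second pass lets the sentiment decision condition on each aspect's surrounding context.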

Gating recurrent mixture density networks for acoustic modeling in statistical parametric speech synthesis

Wenfu Wang, Shuang Xu, Bo Xu
2016 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)  
This paper proposes a gating recurrent mixture density network (GRMDN) architecture to jointly address these two problems in neural network based SPSS.  ...  Though recurrent neural networks (RNNs) using long short-term memory (LSTM) units can address the issue of long-span dependencies across the linguistic inputs and have achieved state-of-the-art performance  ...  Gating Recurrent Mixture Density Networks based SPSS A GRMDN combines a gating recurrent network with a mixture density model (e.g. a GMM).  ... 
doi:10.1109/icassp.2016.7472733 dblp:conf/icassp/WangXX16 fatcat:6c4bms5oavcvfcf6ibvukhx7ti
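The snippet's key idea is pairing a gated recurrent network with a mixture density output: instead of a point estimate, each frame's recurrent state is mapped to the parameters of a Gaussian mixture, trained by negative log-likelihood. The sketch below shows that output head on a plain tanh recurrence; component count, dimensions, and the diagonal-covariance choice are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)
d_in, d_h, K, d_out = 12, 10, 3, 2   # K mixture components, 2-d acoustic target

W_in = rng.normal(scale=0.2, size=(d_h, d_in))
W_rec = rng.normal(scale=0.2, size=(d_h, d_h))
W_mdn = rng.normal(scale=0.2, size=(K * (1 + 2 * d_out), d_h))

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

def mdn_params(h):
    """Map the recurrent state to GMM parameters: mixture weights,
    per-component means, and (diagonal) standard deviations."""
    out = W_mdn @ h
    pi = softmax(out[:K])                                  # weights sum to 1
    mu = out[K:K + K * d_out].reshape(K, d_out)
    sigma = np.exp(out[K + K * d_out:]).reshape(K, d_out)  # exp keeps > 0
    return pi, mu, sigma

def gmm_nll(y, pi, mu, sigma):
    """Negative log-likelihood of target y under the predicted mixture."""
    comp = (-0.5 * np.sum(((y - mu) / sigma) ** 2, axis=1)
            - np.sum(np.log(sigma), axis=1)
            - 0.5 * d_out * np.log(2 * np.pi))
    return -np.log(np.sum(pi * np.exp(comp)) + 1e-12)

h = np.zeros(d_h)
losses = []
for t in range(5):                    # 5 frames of linguistic features
    h = np.tanh(W_in @ rng.normal(size=d_in) + W_rec @ h)
    pi, mu, sigma = mdn_params(h)
    losses.append(gmm_nll(rng.normal(size=d_out), pi, mu, sigma))
```

The softmax and exp reparameterizations keep the mixture weights on the simplex and the standard deviations positive, so any unconstrained network output yields a valid density.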