On Mining Conditions using Encoder-decoder Networks
2019
Proceedings of the 11th International Conference on Agents and Artificial Intelligence
We propose a taxonomy classifier based on sequence-to-sequence neural networks, which are widely used in machine translation and automatic document summarization, by treating taxonomy classification as ...
This paper describes our taxonomy classifier for SIGIR eCom Rakuten Data Challenge. ...
Also, we would like to show our gratitude to Kento Nozawa and Taro Tezuka for comments that greatly improved the manuscript. ...
doi:10.5220/0007379506240630
dblp:conf/icaart/GallegoC19
fatcat:hkywltc6wbc7bocdqskt7typj4
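The abstract above frames taxonomy classification as sequence generation. A minimal sketch of that framing, assuming PyTorch (the class and its sizes are illustrative, not the authors' code): the product text is encoded, and the category path is decoded one hierarchy level at a time, like a translation target.

```python
import torch
import torch.nn as nn

class Seq2SeqTaxonomyClassifier(nn.Module):
    """Hypothetical sketch: category paths decoded like translated sentences."""
    def __init__(self, src_vocab, path_vocab, hidden=256):
        super().__init__()
        self.embed_src = nn.Embedding(src_vocab, hidden)
        self.embed_tgt = nn.Embedding(path_vocab, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, path_vocab)  # scores over category labels

    def forward(self, src_ids, path_ids):
        _, h = self.encoder(self.embed_src(src_ids))            # encode the text
        dec_out, _ = self.decoder(self.embed_tgt(path_ids), h)  # decode the path
        return self.out(dec_out)  # one label distribution per hierarchy level
```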
Atlas: A Dataset and Benchmark for E-commerce Clothing Product Categorization
[article]
2019
arXiv
pre-print
Further, we establish the benchmark by comparing image classification and Attention based Sequence models for predicting the category path. ...
Product categorization is a large scale classification task that assigns a category path to a particular product. ...
Encoder and Decoder: In the encoder, we use a Convolutional Neural Network (CNN) to produce fixed-size vectors. ...
arXiv:1908.08984v1
fatcat:n2hbzhbynnexjov7jxqxbnhhzq
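The snippet describes a CNN encoder producing fixed-size vectors on which an attention-based sequence decoder conditions to emit the category path. A hedged sketch of such an encoder in PyTorch (layer sizes and the CNNEncoder name are assumptions, not the paper's architecture):

```python
import torch
import torch.nn as nn

class CNNEncoder(nn.Module):
    """Illustrative encoder: product image -> fixed-size vector."""
    def __init__(self, out_dim=256):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),               # collapse spatial dims
        )
        self.proj = nn.Linear(64, out_dim)         # fixed-size product vector

    def forward(self, images):                     # images: (B, 3, H, W)
        return self.proj(self.features(images).flatten(1))

vec = CNNEncoder()(torch.randn(2, 3, 224, 224))    # -> (2, 256)
```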
Finding Reusable Machine Learning Components to Build Programming Language Processing Pipelines
[article]
2022
arXiv
pre-print
However, it is challenging for new researchers and developers to find the right components to construct their own machine learning pipelines, given the diverse PLP tasks to be solved, the large number ...
Chami et al. also present a taxonomy, but with a focus on graph representation learning (GRL) [6]. The taxonomy includes GRL from network embedding, graph regularization, and graph neural networks. ...
This categorization is common for a general-purpose presentation of neural networks. We refer the reader to [15, 28] and [6] for taxonomies of NLP models and graph models, respectively. ...
arXiv:2208.05596v1
fatcat:4pp7zsxvynf35fraa2wz65epd4
Network Traffic Feature Engineering Based on Deep Learning
2018
Journal of Physics, Conference Series
A network traffic feature extraction method based on autoencoder model is proposed. ...
A multilevel taxonomy and requirements for an optimal traffic-classification model 2014 Int. ...
doi:10.1088/1742-6596/1069/1/012115
fatcat:nuwqqtiviraw7ea7pmgboiofrm
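The proposed method extracts traffic features with an autoencoder. A minimal sketch under assumptions (flattened per-flow statistics as input; all dimensions illustrative): the bottleneck code is the learned feature, trained with a reconstruction loss.

```python
import torch
import torch.nn as nn

class TrafficAutoencoder(nn.Module):
    def __init__(self, in_dim=784, code_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(),
                                     nn.Linear(128, code_dim))
        self.decoder = nn.Sequential(nn.Linear(code_dim, 128), nn.ReLU(),
                                     nn.Linear(128, in_dim))

    def forward(self, x):
        code = self.encoder(x)              # compressed traffic feature
        return self.decoder(code), code

x = torch.randn(16, 784)                    # stand-in for flow vectors
recon, feats = TrafficAutoencoder()(x)
loss = nn.functional.mse_loss(recon, x)     # reconstruction objective
```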
Efficient strategies for hierarchical text classification: External knowledge and auxiliary tasks
[article]
2020
arXiv
pre-print
Most of the studies have focused on developing novel neural network architectures to deal with the hierarchical structure, but we prefer to look for efficient ways to strengthen a baseline model. ...
In hierarchical text classification, we perform a sequence of inference steps to predict the category of a document from top to bottom of a given class taxonomy. ...
Finally, we acknowledge the support of NVIDIA Corporation with the donation of a Titan Xp GPU used for the study. ...
arXiv:2005.02473v2
fatcat:wowklmteubeavds2su6btmiokq
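The snippet describes predicting categories top-down through the taxonomy. An illustrative sketch of that inference loop (the toy taxonomy and scorer are mine, not the paper's): at each level, only the children of the node chosen so far are scored.

```python
# Hypothetical two-level taxonomy for illustration.
taxonomy = {
    "root": ["News", "Sports"],
    "News": ["Politics", "Tech"],
    "Sports": ["Soccer", "Tennis"],
}

def classify_top_down(doc_scores, node="root"):
    """doc_scores: assumed callable mapping a candidate label to a score."""
    path = []
    while node in taxonomy:                      # stop once a leaf is reached
        node = max(taxonomy[node], key=doc_scores)
        path.append(node)
    return path

print(classify_top_down(lambda label: len(label)))  # toy scorer -> ['Sports', 'Soccer']
```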
Machine Learning on Graphs: A Model and Comprehensive Taxonomy
[article]
2022
arXiv
pre-print
Specifically, we propose a Graph Encoder Decoder Model (GRAPHEDM), which generalizes popular algorithms for semi-supervised learning on graphs (e.g. ...
The second, graph regularized neural networks, leverages graphs to augment neural network losses with a regularization objective for semi-supervised learning. ...
Government is authorized to reproduce and distribute reprints for Governmental purposes notwithstanding any copyright notation thereon. ...
arXiv:2005.03675v3
fatcat:6eoicgprdvfbze732nsmpaumqe
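For the "graph regularized neural networks" family the snippet mentions, a minimal sketch of the idea (not GRAPHEDM itself): the usual supervised loss on labeled nodes plus a Laplacian-style smoothness penalty that encourages adjacent nodes to receive similar predictions.

```python
import torch
import torch.nn.functional as F

def graph_regularized_loss(logits, labels, labeled_mask, adj, lam=0.1):
    # Supervised term on the labeled nodes only (semi-supervised setting).
    sup = F.cross_entropy(logits[labeled_mask], labels[labeled_mask])
    # Regularizer: sum_ij A_ij * ||f(x_i) - f(x_j)||^2 over all node pairs.
    diff = logits.unsqueeze(0) - logits.unsqueeze(1)       # (N, N, C)
    reg = (adj * diff.pow(2).sum(-1)).sum() / adj.sum().clamp(min=1)
    return sup + lam * reg
```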
Transformers in Time Series: A Survey
[article]
2022
arXiv
pre-print
In this paper, we systematically review transformer schemes for time series modeling by highlighting their strengths as well as limitations through a new taxonomy to summarize existing time series transformers ...
From the perspective of network modifications, we summarize the module-level and architecture-level adaptations of time series transformers. ...
The Transformer [Vaswani et al., 2017] follows most competitive neural sequence models with an encoder-decoder structure. Both encoder and decoder are composed of multiple identical blocks. ...
arXiv:2202.07125v3
fatcat:q6vmkszqavbgrdtrbpsgaco7u4
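The snippet recalls the vanilla encoder-decoder Transformer built from stacks of identical blocks. A usage sketch with PyTorch's built-in module (the hyperparameters and the forecasting framing are illustrative):

```python
import torch
import torch.nn as nn

d_model, n_blocks = 64, 2
model = nn.Transformer(d_model=d_model, nhead=4,
                       num_encoder_layers=n_blocks,   # identical encoder blocks
                       num_decoder_layers=n_blocks,   # identical decoder blocks
                       batch_first=True)

src = torch.randn(8, 96, d_model)   # e.g. an embedded time series window
tgt = torch.randn(8, 24, d_model)   # e.g. the horizon to predict
out = model(src, tgt)               # -> (8, 24, 64)
```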
Don't Classify, Translate: Multi-Level E-Commerce Product Categorization Via Machine Translation
[article]
2018
arXiv
pre-print
Conventional methods for product categorization are typically based on machine learning classification algorithms. ...
In our experiments on two large real-world datasets, we show that our approach achieves better predictive accuracy than a state-of-the-art classification system for product categorization. ...
We thank RIT for the collaboration, support and computation resources for our experiments. We also thank Ali Cevahir for his advice on the CUDeep-related experiments. ...
arXiv:1812.05774v1
fatcat:ki545ztt4rcynhnzg4gyg3pwz4
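The paper's idea is to translate product text into a category path rather than classify it. An illustrative sketch of how such "translation pairs" could be constructed for a standard NMT model (the pair format is my assumption, not the paper's exact preprocessing):

```python
def make_translation_pair(title, category_path):
    # Source "sentence": the tokenized product title.
    src = title.lower().split()
    # Target "sentence": the category path, decoded left to right.
    tgt = ["<sos>"] + category_path + ["<eos>"]
    return src, tgt

src, tgt = make_translation_pair(
    "Wireless Noise Cancelling Headphones",
    ["Electronics", "Audio", "Headphones"],
)
# src: ['wireless', 'noise', 'cancelling', 'headphones']
# tgt: ['<sos>', 'Electronics', 'Audio', 'Headphones', '<eos>']
```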
A Comparative Study of Real-Time Semantic Segmentation for Autonomous Driving
2018
2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)
The framework supports the addition of new encoders and decoders developed in the community for other vision tasks. ...
We performed a detailed experimental analysis on the Cityscapes dataset for various combinations of encoder and decoder. ...
We implemented a generic framework through a decoupled encoder-decoder design, which allows extensibility to further encoding and decoding methods. ...
doi:10.1109/cvprw.2018.00101
dblp:conf/cvpr/SiamGAYJ018
fatcat:4dq7v7fo55elppdw2iv3akwzou
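The snippet highlights a decoupled encoder-decoder design where new components can be plugged in. A hedged sketch of one way to implement that extensibility in PyTorch (the registry scheme and all names are hypothetical, not the paper's code):

```python
import torch.nn as nn

ENCODERS, DECODERS = {}, {}

def register(table, name):
    def deco(cls):
        table[name] = cls
        return cls
    return deco

@register(ENCODERS, "mobilenet")
class MobileNetEncoder(nn.Module):
    ...  # placeholder: any feature extractor exposing forward()

@register(DECODERS, "skipnet")
class SkipDecoder(nn.Module):
    ...  # placeholder: any decoder consuming encoder features

def build_segmenter(enc_name, dec_name):
    # Any registered encoder can be paired with any registered decoder,
    # matching the combinatorial experiments the abstract describes.
    return nn.Sequential(ENCODERS[enc_name](), DECODERS[dec_name]())
```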
An Attentive Survey of Attention Models
[article]
2021
arXiv
pre-print
The attention model has become an important concept in neural networks and has been researched within diverse application domains. ...
We also describe how attention has been used to improve the interpretability of neural networks. Finally, we discuss some future research directions in attention. ...
One could exploit this benefit to introduce hybrid encoder-decoders, the most popular being a Convolutional Neural Network (CNN) as the encoder and an RNN or Long Short-Term Memory (LSTM) network as the decoder. ...
arXiv:1904.02874v3
fatcat:fyqgqn7sxzdy3efib3rrqexs74
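The survey names the CNN-encoder/LSTM-decoder pairing as the most popular hybrid. A minimal sketch of that combination (an image-captioning-style model; all names and sizes are illustrative):

```python
import torch
import torch.nn as nn

class CNNLSTMCaptioner(nn.Module):
    def __init__(self, vocab=1000, hidden=256):
        super().__init__()
        self.cnn = nn.Sequential(nn.Conv2d(3, 16, 3, stride=2, padding=1),
                                 nn.ReLU(), nn.AdaptiveAvgPool2d(1))
        self.init_h = nn.Linear(16, hidden)     # image vector -> LSTM state
        self.embed = nn.Embedding(vocab, hidden)
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab)

    def forward(self, images, tokens):
        v = self.cnn(images).flatten(1)                    # (B, 16)
        h0 = self.init_h(v).unsqueeze(0)                   # (1, B, hidden)
        dec, _ = self.lstm(self.embed(tokens), (h0, torch.zeros_like(h0)))
        return self.out(dec)                               # next-token scores
```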
Knowledge-aware Document Summarization: A Survey of Knowledge, Embedding Methods and Architectures
[article]
2022
arXiv
pre-print
Particularly, we propose novel taxonomies to recapitulate knowledge and knowledge embeddings from the document summarization perspective. ...
This paper aims to present the first systematic survey of the state-of-the-art methodologies that embed knowledge into document summarizers. ...
Encoder-Decoder-based Approaches: In this category, the encoder-decoder architecture is adopted for document summarization. ...
arXiv:2204.11190v2
fatcat:rx7x6l47xzaa3gubjfvyaexpem
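The "encoder-decoder based approaches" category corresponds to ordinary abstractive summarizers. A short usage sketch with the Hugging Face pipeline API (whichever default seq2seq model the library ships is loaded; the document string is a placeholder):

```python
from transformers import pipeline

summarizer = pipeline("summarization")   # loads a default encoder-decoder model
doc = "Long source document text goes here ..."
print(summarizer(doc, max_length=60, min_length=10)[0]["summary_text"])
```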
A Variational Graph Autoencoder for Manipulation Action Recognition and Prediction
[article]
2021
arXiv
pre-print
Our network has a variational autoencoder structure with two branches: one for identifying the input graph type and one for predicting the future graphs. ...
The current research trend heavily relies on advanced convolutional neural networks to process the structured Euclidean data, such as RGB camera images. ...
Our proposed graph network has an encoder-decoder structure, where the encoder involves multiple graph convolution [14] layers and the decoder has two branches: one for the action recognition task and the ...
arXiv:2110.13280v1
fatcat:6ywfdrextrc3bd32jjvsdhyqbm
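The snippet describes a variational encoder with a two-branch decoder: one branch recognizes the input graph's action, the other predicts the future graph. A simplified sketch (plain linear layers stand in for the paper's graph convolutions; everything here is illustrative):

```python
import torch
import torch.nn as nn

class TwoBranchGraphVAE(nn.Module):
    def __init__(self, in_dim=32, latent=16, n_actions=10):
        super().__init__()
        self.enc_mu = nn.Linear(in_dim, latent)
        self.enc_logvar = nn.Linear(in_dim, latent)
        self.recognize = nn.Linear(latent, n_actions)   # branch 1: action label
        self.predict = nn.Linear(latent, in_dim)        # branch 2: future graph

    def forward(self, node_feats):                      # (N, in_dim)
        mu, logvar = self.enc_mu(node_feats), self.enc_logvar(node_feats)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterize
        return self.recognize(z.mean(0)), self.predict(z)
```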
Neural Attention Models in Deep Learning: Survey and Taxonomy
[article]
2021
arXiv
pre-print
For decades, concepts and functions of attention have been studied in philosophy, psychology, neuroscience, and computing. Currently, this property has been widely explored in deep neural networks. ...
Here we propose a taxonomy that aligns with theoretical aspects of attention that predate deep learning. ...
Firstly, the encoder/decoder are recurrent neural networks; in RNNSearch [34], the decoder calculates from a context vector ...
arXiv:2112.05909v1
fatcat:qk2gljrl2rdyfbxw62n5cu6hzu
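The snippet refers to the additive attention of RNNSearch, where the decoder forms a context vector over the encoder states at every step. A minimal sketch of that computation (a generic Bahdanau-style module, not the survey's code; shapes illustrative):

```python
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.w_q, self.w_k = nn.Linear(dim, dim), nn.Linear(dim, dim)
        self.v = nn.Linear(dim, 1)

    def forward(self, dec_state, enc_states):      # (B, D), (B, T, D)
        scores = self.v(torch.tanh(self.w_q(dec_state).unsqueeze(1)
                                   + self.w_k(enc_states)))  # (B, T, 1)
        alpha = scores.softmax(dim=1)               # attention weights over T
        return (alpha * enc_states).sum(dim=1)      # context vector (B, D)
```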
Empirical Evaluation of the Effectiveness of Variational Autoencoders on Data Augmentation for the Image Classification Problem
2020
International Journal of Intelligent Systems and Applications in Engineering
On the other hand, generative models such as generative adversarial networks and autoencoders, after being trained on a set of images, learn to generate synthetic data. ...
We evaluate the classification performance using various sized datasets and compare the performance on four datasets: the dataset without augmentation, the dataset augmented with VAE, and two datasets ...
Image Classification: A deep Convolutional Neural Network (CNN) architecture is used to classify the images. ...
doi:10.18201/ijisae.2020261593
fatcat:lkhczvixwbeuvn6nc5dshaiyam
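The augmentation step the abstract evaluates amounts to sampling the trained VAE's decoder. A short sketch, assuming a trained model exposing a `decoder` module (the helper name is hypothetical):

```python
import torch

def augment_with_vae(vae, n_samples, latent_dim):
    """Decode latent codes sampled from the prior into synthetic images."""
    with torch.no_grad():
        z = torch.randn(n_samples, latent_dim)   # sample the standard prior
        return vae.decoder(z)                    # synthetic training images
```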
End-to-End Trainable Attentive Decoder for Hierarchical Entity Classification
2017
Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 2, Short Papers
We address fine-grained entity classification and propose a novel attention-based recurrent neural network (RNN) encoder-decoder that generates paths in the type hierarchy and can be trained end-to-end. ...
We show that our model performs better on fine-grained entity classification than prior work that relies on flat or local classifiers that do not directly model hierarchical structure. ...
We thank Stephan Baier, Siemens CT members and the anonymous reviewers for valuable feedback. This research was supported by Bundeswirtschaftsministerium (bmwi. de), grant 01MD15010A (Smart Data Web). ...
doi:10.18653/v1/e17-2119
dblp:conf/eacl/SchutzeWK17
fatcat:frulzitxgrdsvbaohokyxwtxf4
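The paper's decoder emits paths in the type hierarchy. One hedged way to keep generated paths valid is to mask the decoder's logits to the children of the previously emitted type; the `children` table and masking scheme below are my illustration, not the authors' method.

```python
import torch

def masked_step(logits, prev_type, children):
    """children: assumed dict mapping a type id to the ids of its subtypes."""
    mask = torch.full_like(logits, float("-inf"))
    allowed = children.get(prev_type, [])   # leaf types have no children
    mask[allowed] = 0.0                     # keep only valid continuations
    return (logits + mask).argmax().item()
```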