Powerset Convolutional Neural Networks
[article]
2020
arXiv
pre-print
We present a novel class of convolutional neural networks (CNNs) for set functions, i.e., data indexed with the powerset of a finite set. ...
The convolutions are derived as linear, shift-equivariant functions for various notions of shifts on set functions. ...
Conclusion: We introduced a convolutional neural network architecture for powerset data. We did so by utilizing novel powerset convolutions and introducing powerset pooling layers. ...
arXiv:1909.02253v4
fatcat:5ouueacsufaz7ozx47qgin4k7y
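For readers skimming this entry and the set-function DSP entry further down, here is a minimal sketch of one powerset convolution of the kind these abstracts describe, assuming the shift (T_Q s)(A) = s(A ∪ Q); the papers define several shift notions, and this is only one illustrative choice, not their exact construction. Subsets of N = {0, ..., n-1} are encoded as bitmasks, so set union is bitwise OR.

```python
import numpy as np

def powerset_convolve(h, s, n):
    """Convolution (h * s)(A) = sum over Q of h(Q) * s(A | Q), Q ranging over all subsets of N.

    h, s: arrays of length 2**n indexed by subset bitmask.
    This convolution is equivariant to the shift (T_R s)(A) = s(A | R):
    shifting s by R and convolving equals convolving and then shifting by R.
    """
    out = np.zeros(2 ** n)
    for A in range(2 ** n):
        for Q in range(2 ** n):
            out[A] += h[Q] * s[A | Q]
    return out

# Tiny check of shift-equivariance for a random filter/signal on n = 3 elements.
n = 3
rng = np.random.default_rng(0)
h, s = rng.standard_normal(2 ** n), rng.standard_normal(2 ** n)
R = 0b101  # shift by the subset {0, 2}
shift = lambda f: np.array([f[A | R] for A in range(2 ** n)])
assert np.allclose(shift(powerset_convolve(h, s, n)),
                   powerset_convolve(h, shift(s), n))
```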
Convolutional-neural-network-based Multilabel Text Classification for Automatic Discrimination of Legal Documents
2020
Sensors and materials
We propose a multilabel text classification model based on multilabel text convolutional neural network (MLTCNN). ...
(17) The deep learning model trains text classification models such as convolutional neural networks for sentence classification (TextCNN) (18) or recurrent neural network for text classification ...
Many deep learning algorithms, such as the convolutional neural network (CNN), are highly applicable to images and can be migrated to the text domain easily. ...
doi:10.18494/sam.2020.2794
fatcat:jxqetl45mfeffn6fxyl52n5b4u
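As a rough illustration of the TextCNN-style model named in the snippet (not the authors' MLTCNN itself), a minimal multi-label text CNN in PyTorch might look as follows; vocabulary size, filter widths, and label count are invented placeholders.

```python
import torch
import torch.nn as nn

class TinyTextCNN(nn.Module):
    """Embedding -> parallel 1-D convolutions -> max-pool -> one logit per label."""
    def __init__(self, vocab_size=10000, embed_dim=128, num_labels=20,
                 filter_widths=(3, 4, 5), num_filters=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, num_filters, kernel_size=w) for w in filter_widths)
        self.out = nn.Linear(num_filters * len(filter_widths), num_labels)

    def forward(self, token_ids):                      # (batch, seq_len)
        x = self.embed(token_ids).transpose(1, 2)      # (batch, embed_dim, seq_len)
        pooled = [conv(x).relu().max(dim=2).values for conv in self.convs]
        return self.out(torch.cat(pooled, dim=1))      # raw logits, one per label

# Multi-label training would pair these logits with nn.BCEWithLogitsLoss.
logits = TinyTextCNN()(torch.randint(0, 10000, (8, 50)))
print(logits.shape)  # torch.Size([8, 20])
```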
A Novel Deep Neural Network Model for Multi-Label Chronic Disease Prediction
2019
Frontiers in Genetics
In this work, we use problem transform methods to convert the chronic diseases prediction into a multi-label classification problem and propose a novel convolutional neural network (CNN) architecture named ...
Binary Relevance (BR) and Label Powerset (LP) methods are adopted to transform multiple chronic disease labels. ...
METHODS
Group Convolution Strategy: To improve the performance of a convolutional neural network (CNN) architecture ...
doi:10.3389/fgene.2019.00351
pmid:31068968
pmcid:PMC6491565
fatcat:pc7pfxw3uvgg5ddldwyje4dkdy
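Binary Relevance and Label Powerset are generic problem-transformation methods, so a small illustration of both (independent of the paper's CNN architecture) may help; the toy label matrix below is invented.

```python
import numpy as np

# Toy multi-label target matrix: 4 samples x 3 disease labels (1 = present).
Y = np.array([[1, 0, 1],
              [0, 1, 0],
              [1, 0, 1],
              [0, 0, 0]])

# Binary Relevance: one independent binary classification problem per label (column).
br_targets = [Y[:, j] for j in range(Y.shape[1])]

# Label Powerset: each distinct label combination becomes one multiclass label.
combos, lp_targets = np.unique(Y, axis=0, return_inverse=True)
print(lp_targets)   # [2 1 2 0]: samples 0 and 2 share the same label combination
```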
Discrete Signal Processing with Set Functions
[article]
2020
arXiv
pre-print
Set functions are functions (or signals) indexed by the powerset (set of all subsets) of a finite set N. ...
For each shift it provides associated notions of shift-invariant filters, convolution, Fourier transform, and frequency response. ...
One prominent example is neural networks using graph convolution [10]. ...
arXiv:2001.10290v2
fatcat:c25s625dyzaupkkmulokfevf4i
Optimization and Abstraction: A Synergistic Approach for Analyzing Neural Network Robustness
[article]
2019
arXiv
pre-print
In this paper, we present a novel algorithm for verifying robustness properties of neural networks. ...
In recent years, the notion of local robustness (or robustness for short) has emerged as a desirable property of deep neural networks. ...
To answer these research questions, we collected a benchmark suite of 602 verification problems across 7 deep neural networks, including one convolutional network and several fully connected networks. ...
arXiv:1904.09959v1
fatcat:7khytmrwprfvlppxugkrf3drae
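Local robustness, as usually defined in this line of work, requires every input in a small ℓ∞ ball around x to receive the same label as x. The sketch below is only a naive sampling check that can falsify, but never prove, the property; the paper's algorithm provides sound verification. `model` and `x` are placeholders.

```python
import numpy as np

def sampled_robustness_check(model, x, eps, num_samples=1000, rng=None):
    """Heuristically test local robustness: does every sampled point in the
    l-infinity ball of radius eps around x keep the predicted class of x?
    Sampling can only find counterexamples; it never certifies robustness.
    """
    rng = rng or np.random.default_rng()
    base_label = np.argmax(model(x))
    for _ in range(num_samples):
        perturbed = x + rng.uniform(-eps, eps, size=x.shape)
        if np.argmax(model(perturbed)) != base_label:
            return False  # counterexample found
    return True  # no counterexample among the sampled points
```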
Instrument Role Classification: Auto-tagging for Loop Based Music
2020
Zenodo
We introduce a new dataset for this task, the Freesound Loop Dataset, and benchmark the performance of both neural network and non-neural network based multi-label classification models for six instrument ...
In this work, Convolutional neural networks are shown to perform well in distinguishing between compatible loops and noncompatible loops. ...
(Choi, Fazekas, & Sandler, 2016) introduced a deep fully convolutional neural network (FCN) and showed that deep FCN with 2D convolutions can be effectively used for automatic music tagging and classification ...
doi:10.5281/zenodo.4285366
fatcat:hewqm246sfatzds6rp7k7l3mf4
Multi-Scale Label Relation Learning for Multi-Label Classification Using 1-Dimensional Convolutional Neural Networks
[article]
2021
arXiv
pre-print
The proposed method uses the 1-dimensional convolutional neural network (1D-CNN) to serve the same purpose in a more efficient manner. ...
Modern multi-label classifiers have been adopting recurrent neural networks (RNNs) as a memory structure to capture and exploit label dependency relations. ...
In the next section, we present our model that uses a 1-Dimensional Convolutional Neural Network (1D-CNN) to learn the label relationships. ...
arXiv:2107.05941v1
fatcat:mabcqxdnyvfgjcgg3qebx6qtya
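As a rough sketch of the general idea named in the snippet, a 1-D convolution over a vector of per-label scores can model relations among neighbouring labels; the layer sizes below are invented and this is not the authors' architecture.

```python
import torch
import torch.nn as nn

class LabelRelation1DCNN(nn.Module):
    """Refine per-label scores with a 1-D convolution over the label axis."""
    def __init__(self, num_labels=16, hidden=32, kernel_size=3):
        super().__init__()
        pad = kernel_size // 2            # keep the label dimension unchanged
        self.refine = nn.Sequential(
            nn.Conv1d(1, hidden, kernel_size, padding=pad), nn.ReLU(),
            nn.Conv1d(hidden, 1, kernel_size, padding=pad))

    def forward(self, label_scores):      # (batch, num_labels)
        x = label_scores.unsqueeze(1)     # (batch, 1, num_labels): labels as the "time" axis
        return self.refine(x).squeeze(1)  # refined logits, same shape as the input

refined = LabelRelation1DCNN()(torch.randn(4, 16))
print(refined.shape)  # torch.Size([4, 16])
```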
Multi-label classification of line chart images using convolutional neural networks
2020
SN Applied Sciences
In this paper, we propose a new convolutional neural network (CNN) architecture to build a multi-label classifier that categorizes line chart images according to their characteristics. ...
In addition, two different multi-label solution approaches are compared: label powerset (LP) and binary relevance (BR) methods. ...
Convolutional neural network: Convolutional Neural Networks (CNNs) are a special type of deep neural network, and image classification is one of the most common applications of this method [26]. ...
doi:10.1007/s42452-020-3055-y
fatcat:f73se2jz3jcp7e6dnpblqrmgvy
Multilabel Classification Based on Graph Neural Networks
[chapter]
2021
Artificial Intelligence
By using trace minimization techniques, the topology of the Laplacian graph can be learned from input data, subsequently creating robust Laplacian embedding and influencing graph convolutional networks ...
Convolutional neural networks (CNNs) [17] have been successfully developed in the field of computer vision [18, 19] but are unable to process graph-structured data [20]. ...
The method used in this paper is called a graph convolutional network (GCN). ...
doi:10.5772/intechopen.99681
fatcat:i2ttb6nf3jgmpdpeh3n6alyhee
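The snippet names graph convolutional networks; the widely used Kipf–Welling propagation rule can be sketched in a few lines of numpy. This is the standard layer, not necessarily the exact variant used in the chapter.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution layer: H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W).

    A: (n, n) adjacency matrix, H: (n, d_in) node features, W: (d_in, d_out) weights.
    """
    A_hat = A + np.eye(A.shape[0])                 # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))         # symmetric normalization
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)

# Tiny example: a path graph on 3 nodes with 2-dimensional features.
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
H = np.random.default_rng(1).standard_normal((3, 2))
print(gcn_layer(A, H, np.ones((2, 4))).shape)      # (3, 4)
```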
An Approach Based on Multilevel Convolution for Sentence-Level Element Extraction of Legal Text
2021
Wireless Communications and Mobile Computing
The encoder applies multilevel convolutional neural networks (CNN) and Long Short-Term Memory (LSTM) as feature extraction networks to extract local neighborhood and context information from legal text ...
[8] proposes a model that applies convolutional neural networks and recurrent neural networks. ...
doi:10.1155/2021/1043872
fatcat:tvqftw2d25dfxf6vt6v3sfsjf4
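The abstract combines a CNN for local neighborhood features with an LSTM for context. A generic encoder of that shape (not the paper's multilevel design; all dimensions are invented) could be sketched as:

```python
import torch
import torch.nn as nn

class CNNLSTMEncoder(nn.Module):
    """Local n-gram features from a 1-D convolution, then context via an LSTM."""
    def __init__(self, vocab_size=8000, embed_dim=100, conv_channels=64, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.conv = nn.Conv1d(embed_dim, conv_channels, kernel_size=3, padding=1)
        self.lstm = nn.LSTM(conv_channels, hidden, batch_first=True)

    def forward(self, token_ids):                     # (batch, seq_len)
        x = self.embed(token_ids).transpose(1, 2)     # (batch, embed_dim, seq_len)
        x = self.conv(x).relu().transpose(1, 2)       # (batch, seq_len, conv_channels)
        _, (h_n, _) = self.lstm(x)
        return h_n[-1]                                # (batch, hidden) sentence vector

print(CNNLSTMEncoder()(torch.randint(0, 8000, (2, 40))).shape)  # torch.Size([2, 128])
```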
Uncertainty Flow Facilitates Zero-Shot Multi-Label Learning in Affective Facial Analysis
2018
Applied Sciences
multi-label annotated FER2013, our proposed Uncertainty Flow in multi-label facial expression analysis exhibited superiority to conventional multi-label learning algorithms and multi-label compatible neural networks. ...
To further elevate the discriminative performance in multi-label prediction, we additionally frame a convolutional neural network under Bayesian learning, producing a Bayesian convolutional neural network ...
doi:10.3390/app8020300
fatcat:q7nuexgum5aljbtb75yw26bxbe
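The snippet mentions framing a CNN under Bayesian learning. One common lightweight approximation, shown here purely for illustration and not necessarily the paper's method, is Monte Carlo dropout: keep dropout active at prediction time and average several stochastic forward passes to obtain a predictive mean and an uncertainty estimate.

```python
import torch

def mc_dropout_predict(model, x, num_passes=20):
    """Approximate Bayesian prediction by averaging stochastic forward passes.

    Requires the model to contain dropout layers; model.train() keeps them active.
    Returns per-label mean probability and its standard deviation (uncertainty).
    """
    model.train()  # keep dropout on; in a real run, freeze batch-norm layers separately
    with torch.no_grad():
        probs = torch.stack([torch.sigmoid(model(x)) for _ in range(num_passes)])
    return probs.mean(dim=0), probs.std(dim=0)
```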
A novel approach for multi-label Chest X-Ray classification of common Thorax Diseases
2019
IEEE Access
neural network for thorax disease classification. ...
Convolutional Neural Networks (CNNs) are among the most popular deep learning models for the task of image classification, since they provide high accuracy and impressive results compared with ...
doi:10.1109/access.2019.2916849
fatcat:s2cyuqj2ffeebh3hdhswuprraq
Deep Online Convex Optimization by Putting Forecaster to Sleep
[article]
2016
arXiv
pre-print
The main result is that error backpropagation on a convolutional network is equivalent to playing out a circadian game. ...
This paper develops the first rigorous link between online convex optimization and error backpropagation on convolutional networks. ...
Convolutional Networks: This section sketches how maxout units, convolutional neural networks (LeCun et al., 1989, 1998), dropout (Srivastava et al., 2014), and dropconnect (Wan et al ...
arXiv:1509.01851v2
fatcat:c7vzvwe66reu5pvzacvfaortnu
Editorial: Deep Learning for Toxicity and Disease Prediction
2020
Frontiers in Genetics
For instance, deep convolutional neural networks (CNNs) have brought about breakthroughs in computer vision and pattern recognition (Krizhevsky et al., 2012) , whereas recurrent neural networks have shed ...
The history of DL can be traced back to the 1940s when the first neural network model was developed (McCulloch and Pitts, 1943) . ...
doi:10.3389/fgene.2020.00175
pmid:32174981
pmcid:PMC7055598
fatcat:t2ujnq6mrjdeti3xcng2zh3tou
Analyzing Deep Neural Networks with Symbolic Propagation: Towards Higher Precision and Faster Verification
[article]
2019
arXiv
pre-print
Deep neural networks (DNNs) have been shown to lack robustness, as their classifications are vulnerable to small perturbations of the inputs. ...
Deep neural networks: We work with deep feedforward neural networks, or DNNs, which can be represented as a function f : R^m → R^n, mapping an input x̄ ∈ R^m to its output ȳ = f(x̄) ∈ R^n. ...
Related Work: Verification of neural networks can be traced back to [19], where the network is encoded after approximating every sigmoid activation function with a set of piecewise linear constraints ...
arXiv:1902.09866v1
fatcat:asa27grlwzcchgtrorl35fsybu
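Both verification entries in this list rely on propagating an over-approximation of an input region through the network. A minimal interval (box) propagation through fully connected ReLU layers looks like the sketch below; it illustrates the general abstraction idea, not the paper's symbolic-propagation method.

```python
import numpy as np

def interval_affine(lo, hi, W, b):
    """Tight interval bounds for W @ x + b when x lies in the box [lo, hi]."""
    W_pos, W_neg = np.maximum(W, 0.0), np.minimum(W, 0.0)
    return W_pos @ lo + W_neg @ hi + b, W_pos @ hi + W_neg @ lo + b

def interval_relu_network(layers, lo, hi):
    """Propagate an input box through affine+ReLU layers; returns output bounds.

    layers: list of (W, b) pairs. The result soundly over-approximates the set
    of reachable outputs, so it can prove (but never refute) robustness.
    """
    for i, (W, b) in enumerate(layers):
        lo, hi = interval_affine(lo, hi, W, b)
        if i < len(layers) - 1:            # ReLU on all hidden layers
            lo, hi = np.maximum(lo, 0.0), np.maximum(hi, 0.0)
    return lo, hi
```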
Showing results 1 — 15 out of 207 results