Deep Node Ranking for Neuro-symbolic Structural Node Embedding and Classification
[article]
2021
arXiv
pre-print
This paper contributes a novel approach to learning network node embeddings and direct node classification using a node ranking scheme coupled with an autoencoder-based neural network architecture. ...
The main advantages of the proposed Deep Node Ranking (DNR) algorithm are competitive or better classification performance, significantly higher learning speed and lower space requirements when compared ...
Deep Node Ranking This section presents the Deep Node Ranking (DNR) algorithm for neurosymbolic structural network node embedding and end-to-end node classification (overview shown in Figure 1). ...
arXiv:1902.03964v6
fatcat:zlwkh66cqrclpiydbsr2ckdcf4
Deep Feature Learning of Multi-Network Topology for Node Classification
[article]
2018
arXiv
pre-print
Network embedding, aiming to learn non-linear and low-dimensional feature representations based on network topology, has proved helpful in network analysis tasks, especially node classification ...
However, existing network embedding methods mainly focus on single network embedding and neglect the information shared among different networks. ...
Current network embedding approaches mainly focus on single-network embedding and utilize topological structure information to represent nodes. ...
arXiv:1809.02394v1
fatcat:2egjudinvncfdgfz5su7rjxmza
PINE: Universal Deep Embedding for Graph Nodes via Partial Permutation Invariant Set Functions
[article]
2019
arXiv
pre-print
., node classification, recommendation, community detection). The key problem in graph node embedding lies in how to define the dependence to neighbors. ...
Our method 1) can learn an arbitrary form of the representation function from the neighborhood, without losing any potential dependence structures, and 2) is applicable to both homogeneous and heterogeneous ...
The numerical optimization algorithm and complexity are similar to those for standard deep neural networks. ...
arXiv:1909.12903v1
fatcat:2b6hv6qiiffvzjgx3iu3kyhm6i
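PINE's partial permutation invariant set functions generalize this idea, but the core requirement named in the abstract above, that a node's embedding must not depend on the order in which its neighbors are listed, can be illustrated with a simple DeepSets-style sum aggregator. All names, shapes, and the random feature map below are illustrative, not PINE's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)
W_phi = rng.normal(size=(8, 4))  # illustrative per-neighbor feature-map weights

def phi(x):
    # Per-neighbor feature map (here: a fixed random projection + ReLU).
    return np.maximum(W_phi @ x, 0.0)

def embed(neighbors):
    """Permutation-invariant embedding of a node's neighbor set:
    summing per-neighbor features makes the result independent of
    the order in which neighbors are listed."""
    return sum(phi(x) for x in neighbors)

neighbors = [rng.normal(size=4) for _ in range(5)]
e1 = embed(neighbors)        # original order
e2 = embed(neighbors[::-1])  # same set, reversed order -> same embedding
```

Any order-independent reduction (sum, mean, max) gives this invariance; learning the function around the reduction is where approaches like PINE differ.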
Learning Edge Representations via Low-Rank Asymmetric Projections
2017
Proceedings of the 2017 ACM on Conference on Information and Knowledge Management - CIKM '17
Learning such continuous-space vector representations (or embeddings) of nodes in a graph is an important first step for using network information (from social networks, user-item graphs, knowledge bases ...
Unlike previous work, we (1) explicitly model an edge as a function of node embeddings, and we (2) propose a novel objective, the "graph likelihood", which contrasts information from sampled random walks ...
We access the trainable embeddings Y_u and Y_v for the nodes and feed them as input to a Deep Neural Network (DNN) f. ...
doi:10.1145/3132847.3132959
dblp:conf/cikm/Abu-El-HaijaPA17
fatcat:4lj3jh7wjrhf5akj2ylqjk6dzy
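The low-rank asymmetric projection named in the title can be sketched as a bilinear edge score Y_u^T (L R^T) Y_v, where two d x r factors keep the interaction matrix low-rank and L != R makes the score direction-aware. The DNN f from the abstract is omitted here, and all shapes and values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
d, r = 16, 4                   # embedding dimension, low rank r << d

Y = rng.normal(size=(100, d))  # trainable node embeddings (illustrative)
L = rng.normal(size=(d, r))    # low-rank projection factors
R = rng.normal(size=(d, r))

def edge_score(u, v):
    """Asymmetric bilinear edge score Y_u^T (L R^T) Y_v. The d x d
    interaction matrix is never materialized: the two d x r factors
    keep it low-rank, and L != R makes the score direction-aware."""
    return (Y[u] @ L) @ (R.T @ Y[v])

s_uv = edge_score(3, 7)
s_vu = edge_score(7, 3)  # generally differs: the edge function is asymmetric
```

Factoring the interaction matrix as L R^T cuts its parameter count from d^2 to 2dr, which is the point of the low-rank projection.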
Automatic Node Selection for Deep Neural Networks using Group Lasso Regularization
[article]
2016
arXiv
pre-print
We examine the effect of the Group Lasso (gLasso) regularizer in selecting the salient nodes of Deep Neural Network (DNN) hidden layers by applying a DNN-HMM hybrid speech recognizer to TED Talks speech ...
We test two types of gLasso regularization, one for outgoing weight vectors and another for incoming weight vectors, as well as two sizes of DNNs: 2048 hidden layer nodes and 4096 nodes. ...
To meet this requirement for finding a small, necessary, and sufficient DNN structure, several approaches have reshaped the network structure [3, 4, 5] or pruned the network nodes [6] . ...
arXiv:1611.05527v1
fatcat:ungst42ctfhybja4qm5so7yvgm
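The two gLasso variants tested above group either a node's outgoing weights (rows of a layer's weight matrix, under one common convention) or its incoming weights (columns). The penalty sums the unsquared L2 norms of the groups, which drives entire groups, and hence entire nodes, to zero. A minimal sketch with illustrative values:

```python
import numpy as np

def group_lasso_penalty(W, axis):
    """Sum of L2 norms of weight groups: axis=1 makes each row a group,
    axis=0 each column. Unlike squared-L2 weight decay, the unsquared
    norm zeroes out entire groups, so whole nodes become prunable."""
    return np.linalg.norm(W, axis=axis).sum()

# A layer where the second node's outgoing weights are already near zero.
W = np.array([[0.8, -0.5, 0.3],
              [1e-4, 2e-4, -1e-4],
              [0.6, 0.1, -0.7]])
penalty = group_lasso_penalty(W, axis=1)  # per-row (outgoing) groups
```

In training, this penalty is added to the task loss with a coefficient that controls how aggressively nodes are pruned.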
PINE: Universal Deep Embedding for Graph Nodes via Partial Permutation Invariant Set Functions
2021
IEEE Transactions on Pattern Analysis and Machine Intelligence
., node classification, recommendation, community detection). The key problem in graph node embedding lies in how to define the dependence to neighbors. ...
Graph node embedding aims at learning a vector representation for all nodes given a graph. ...
The numerical optimization algorithm and complexity are similar to those for standard deep neural networks. ...
doi:10.1109/tpami.2021.3061162
fatcat:jgmhyduzvfel3ljpkdigldwzdq
An Improved DeepNN with Feature Ranking for Covid-19 Detection
2022
Computers Materials & Continua
The symptoms of COVID-19 include muscle pains, loss of taste and smell, coughs, fever, and sore throat, which can lead to severe cases of breathing difficulties, organ failure, and death. ...
Additionally, it outperforms the other models in classification results as well as time. ...
And there are non-financial competing interests. ...
doi:10.32604/cmc.2022.022673
fatcat:vqrlsla2kjealidyct4n36aflu
Ranking to Learn: Feature Ranking and Selection via Eigenvector Centrality
[article]
2017
arXiv
pre-print
Ranking central nodes identifies candidate features that turn out to be effective from a classification point of view, as demonstrated by a thorough experimental section. ...
In an era where accumulating data is easy and storing it inexpensive, feature selection plays a central role in helping to reduce the high-dimensionality of huge amounts of otherwise meaningless data. ...
Compute eigenvalues {Λ} and eigenvectors {V} of A; λ0 = max_{λ∈Λ}(|λ|); return v0, the eigenvector associated with λ0. ...
Table 4. The tables show results obtained on the ...
arXiv:1704.05409v1
fatcat:e3jioz76r5hrhnbxkvmqpi2zjy
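The truncated pseudocode above computes v0, the eigenvector of A for the largest-magnitude eigenvalue λ0, and ranks features by their centrality scores in v0. A power-iteration sketch (the small adjacency matrix is invented for illustration; for a symmetric non-negative A, Perron-Frobenius guarantees a non-negative dominant eigenvector):

```python
import numpy as np

def leading_eigenvector(A, iters=1000, tol=1e-10):
    """Power iteration: the unit eigenvector of A associated with the
    largest-magnitude eigenvalue (eigenvector centrality scores)."""
    v = np.ones(A.shape[0]) / np.sqrt(A.shape[0])
    for _ in range(iters):
        w = A @ v
        w /= np.linalg.norm(w)
        if np.linalg.norm(w - v) < tol:
            return w
        v = w
    return v

# Hypothetical feature-similarity adjacency matrix.
A = np.array([[0.0, 0.9, 0.2],
              [0.9, 0.0, 0.4],
              [0.2, 0.4, 0.0]])
v0 = leading_eigenvector(A)
ranking = np.argsort(-v0)  # most central feature first
```

Selecting the top-ranked features by centrality score is then the feature-selection step the abstract describes.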
Pairwise Ranking Network for Affect Recognition
2021
Zenodo
We take a different approach and use a deep network architecture that performs joint training on the tasks of classification/regression of samples and ordinal ranking between pairs of samples. ...
We show that the approach proposed in this work leads to consistent performance gains when incorporated in classification/regression networks. ...
The processing pipeline for classification/regression remains intact and the total architecture is trained in an end-to-end manner. ...
doi:10.5281/zenodo.5550448
fatcat:my3zdn5dabar7asj7hn4r35aje
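The joint training described above adds an ordinal term on pairs of samples to the usual classification/regression loss. A standard margin ranking loss is one such term; the snippet does not specify the paper's exact pairwise loss, so treat this as a generic form:

```python
def pairwise_ranking_loss(s_hi, s_lo, margin=1.0):
    """Margin ranking loss for one pair of predicted scores: zero when
    the sample that should rank higher (s_hi) beats the other (s_lo)
    by at least `margin`, linear in the violation otherwise."""
    return max(0.0, margin - (s_hi - s_lo))

correct_pair = pairwise_ranking_loss(2.5, 0.3)   # 2.2 >= margin -> 0.0
violated_pair = pairwise_ranking_loss(0.2, 0.9)  # violated by 0.7 -> 1.7
```

Summing this term over sampled pairs and adding it to the main task loss keeps the classification/regression pipeline intact, as the snippet notes.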
Multi-Level Network Embedding with Boosted Low-Rank Matrix Approximation
[article]
2018
arXiv
pre-print
Nonetheless, the global low-rank assumption does not necessarily hold especially when the factorized matrix encodes complex node interactions, and the resultant single low-rank embedding matrix is insufficient ...
related to implicit matrix factorization, with the fundamental assumption that the factorized node connectivity matrix is low-rank. ...
algorithms; end for; return the final embedding as U = [U_1, ..., U_k] ...
arXiv:1808.08627v1
fatcat:2vtmmbcwwnbfzmqpxb6h66ibk4
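The returned embedding U = [U_1, ..., U_k] in the snippet above concatenates k partial embeddings, each learned from the residual the previous levels left unexplained. A generic residual-factorization sketch using truncated SVD; the paper's actual factorization and weighting may differ:

```python
import numpy as np

def boosted_factorization(M, k, rank):
    """Concatenate k partial embeddings, each a truncated-SVD factor of
    the residual left by the previous levels: U = [U_1, ..., U_k]."""
    residual = M.astype(float)
    parts = []
    for _ in range(k):
        U, s, Vt = np.linalg.svd(residual, full_matrices=False)
        Ui = U[:, :rank] * s[:rank]           # this level's embedding
        parts.append(Ui)
        residual = residual - Ui @ Vt[:rank]  # subtract the explained part
    return np.concatenate(parts, axis=1)

M = np.random.default_rng(2).normal(size=(6, 5))
E = boosted_factorization(M, k=3, rank=2)  # shape (6, 3 * 2)
```

This sidesteps the single global low-rank assumption the abstract criticizes: later levels capture structure the first low-rank factor misses.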
Low-Rank Projections of GCNs Laplacian
[article]
2021
arXiv
pre-print
Through various ablation experiments, we evaluate the impact of bandpass filtering on the performance of a GCN: we empirically show that most of the necessary information used for node classification is contained in the low-frequency domain, and thus, contrary to images, high frequencies are less crucial to community detection. ...
ACKNOWLEDGEMENTS EO was granted access to the HPC resources of IDRIS under the allocation 2020-[AD011011216R1] made by GENCI and this work was partially supported by ANR-19-CHIA "SCAI". ...
arXiv:2106.07360v1
fatcat:nqdldv3ag5fjtmgmd6mpbn4c64
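The claim above, that most of the information used for node classification lives in the low frequencies, can be illustrated by projecting node features onto the low-eigenvalue eigenvectors of the graph Laplacian (a generic spectral low-pass filter, not the paper's exact construction):

```python
import numpy as np

def low_pass_filter(X, A, k):
    """Keep only the k lowest graph frequencies of node features X by
    projecting onto the eigenvectors of the combinatorial Laplacian
    L = D - A with the k smallest eigenvalues."""
    Lap = np.diag(A.sum(axis=1)) - A
    w, V = np.linalg.eigh(Lap)  # eigenvalues in ascending order
    Vk = V[:, :k]               # low-frequency basis
    return Vk @ (Vk.T @ X)

# On a connected graph, frequency 0 is the constant signal, so a
# constant feature passes through a k=1 low-pass filter unchanged.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)  # 4-cycle
X = np.ones((4, 1))
X_low = low_pass_filter(X, A, k=1)
```

Smooth signals (similar values on connected nodes, the typical community-detection regime) concentrate in exactly this low-eigenvalue subspace.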
Ranking to Learn and Learning to Rank: On the Role of Ranking in Pattern Recognition Applications
[article]
2017
arXiv
pre-print
Through these advancements, variable ranking has emerged as an active and growing research area and it is now beginning to be applied to many new problems. ...
For example, in pattern classification tasks, different data representations can complicate and hide the different explanatory factors of variation behind the data. ...
This last operation resembles the works on graph centrality [23] (see [279] for an example), whose goal was to rank nodes in social networks that would be visited the most, along whatever path in the structure ...
arXiv:1706.05933v1
fatcat:oc4xtmyqkvf4njpqsojewv75qu
Robustness via Deep Low-Rank Representations
[article]
2020
arXiv
pre-print
Algorithmically, the LR is scalable, generic, and straightforward to implement into existing deep learning frameworks. ...
To achieve low dimensionality of learned representations, we propose an easy-to-use, end-to-end trainable, low-rank regularizer (LR) that can be applied to any intermediate layer representation of a DNN ...
Structure in Linear Transformations: Added to these arguments, it must be noted that widely used transformations in deep networks, like convolution layers, are highly structured, and introducing low structure ...
arXiv:1804.07090v5
fatcat:qq4gnlix6rbqha5hz6f2eykd2e
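The abstract above does not specify the exact form of the LR regularizer; one common low-rank surrogate is the nuclear norm (sum of singular values) of a batch of intermediate representations, sketched here as an assumption rather than the paper's definition:

```python
import numpy as np

def low_rank_penalty(H):
    """Nuclear-norm penalty on a batch of representations H (batch x dim):
    the sum of singular values, which is small when H is near low-rank."""
    return np.linalg.svd(H, compute_uv=False).sum()

# A rank-1 matrix has a single singular value ||u|| * ||v||.
H = np.outer([1.0, 2.0], [3.0, 4.0])
p = low_rank_penalty(H)  # = sqrt(5) * 5
```

Added to the training loss of any intermediate layer, such a term pushes learned representations toward a low-dimensional subspace, matching the "applied to any intermediate layer" claim.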
Large Margin Low Rank Tensor Analysis
[article]
2013
arXiv
pre-print
In this paper, we present a supervised model to learn the intrinsic structure of the tensors embedded in a high dimensional Euclidean space. ...
Experiments on applications for object recognition and face recognition demonstrate the superiority of our proposed model over state-of-the-art approaches. ...
Acknowledgments We thank the Social Sciences and Humanities Research Council of Canada (SSHRC) as well as the Natural Sciences and Engineering Research Council of Canada (NSERC) for their financial support ...
arXiv:1306.2663v1
fatcat:vlh7ofz2ujae7nqqq455zm47dm
The Deep Learning Solutions on Lossless Compression Methods for Alleviating Data Load on IoT Nodes in Smart Cities
2021
Sensors
Increasing IoT nodes leads to increasing data flow, which is a potential source of failure for IoT networks. ...
Networking is crucial for smart city projects nowadays, as it offers an environment where people and things are connected. ...
IoT nodes can be connected to an IoT gateway forming a local network. The gateway is connected to the internet which allows end-users to access (monitor or control) things. ...
doi:10.3390/s21124223
fatcat:xbd2o35mhvcjlaerwbjczeaote
Showing results 1 — 15 out of 14,355 results