A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; you can also visit the original URL.
The file type is application/pdf.
Deep Component Analysis via Alternating Direction Neural Networks
[article]
2018
arXiv
pre-print
By interpreting feed-forward networks as single-iteration approximations of inference in our model, we provide both a novel theoretical perspective for understanding them and a practical technique for ...
On the other hand, shallow representation learning with component analysis is associated with rich intuition and theory, but smaller capacity often limits its usefulness. ...
Generalization of Feed-Forward Networks Given proper initialization of the variables, a single iteration of this algorithm is identical to a pass through a feed-forward network. ...
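The equivalence claimed in the snippet can be sketched in a few lines: with the latent code initialized to zero, one update of a proximal alternating-minimization step for nonnegative inference reduces to an ordinary ReLU layer. This is a minimal numpy sketch under that assumption; `W`, `b`, and the choice of proximal operator are illustrative, not the paper's exact algorithm.

```python
import numpy as np

def relu_layer(W, b, x):
    return np.maximum(0.0, W @ x + b)

def one_inference_iteration(W, b, x):
    # One alternating-minimization update starting from z = 0: the
    # nonnegativity prox applied to the affine prediction W x + b.
    pre_activation = W @ x + b
    return np.maximum(0.0, pre_activation)  # prox of the z >= 0 constraint

rng = np.random.default_rng(0)
W, b, x = rng.standard_normal((4, 3)), rng.standard_normal(4), rng.standard_normal(3)
# A single iteration is identical to a pass through a feed-forward ReLU layer.
assert np.allclose(one_inference_iteration(W, b, x), relu_layer(W, b, x))
```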
arXiv:1803.06407v1
fatcat:tfivbuxbvbfc5lepgeglb5gpru
A Deep Relevance Matching Model for Ad-hoc Retrieval
2016
Proceedings of the 25th ACM International on Conference on Information and Knowledge Management - CIKM '16
By using matching histogram mapping, a feed forward matching network, and a term gating network, we can effectively deal with the three relevance matching factors mentioned above. ...
Successful relevance matching requires proper handling of the exact matching signals, query term importance, and diverse matching requirements. ...
Feed-forward Matching Network (FFMN): extracts hierarchical matching patterns from different levels of interaction signals, taking its input from the Matching Histogram Mapping stage. ...
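The matching histogram mapping can be sketched directly: cosine similarities between a query term embedding and every document term embedding are binned over [-1, 1], and the log-count (LCH) variant takes a log of the bin counts. The embedding dimension and bin count below are illustrative; the paper fixes its own binning scheme, including a dedicated exact-match bin at similarity 1.

```python
import numpy as np

def matching_histogram(q_vec, doc_vecs, bins=5):
    # Cosine similarity between one query term and every document term,
    # binned over [-1, 1]; log1p of the counts gives the LCH variant.
    sims = doc_vecs @ q_vec / (
        np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q_vec) + 1e-9)
    edges = np.linspace(-1.0, 1.0, bins + 1)
    counts, _ = np.histogram(np.clip(sims, -1.0, 1.0), bins=edges)
    return np.log1p(counts)

rng = np.random.default_rng(1)
q = rng.standard_normal(8)
doc = rng.standard_normal((20, 8))   # 20 document terms, 8-dim embeddings
h = matching_histogram(q, doc)
assert h.shape == (5,)
```

The fixed-length histogram is what lets a feed-forward network consume documents of arbitrary length.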
doi:10.1145/2983323.2983769
dblp:conf/cikm/GuoFAC16
fatcat:m2zmgubzh5eztj6l6dxg4obhx4
Reframing Neural Networks: Deep Structure in Overcomplete Representations
[article]
2022
arXiv
pre-print
While exact inference requires iterative optimization, it may be approximated by the operations of a feed-forward deep neural network. ...
We also demonstrate how recurrent networks implementing iterative optimization algorithms can achieve performance comparable to their feed-forward approximations while improving adversarial robustness. ...
While [9] only considered feed-forward approximations, we propose an iterative algorithm for exact inference extending our work in [8] . ...
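The feed-forward-versus-iterative contrast can be illustrated with ISTA for sparse coding: from a zero initialization, the first iteration is exactly a one-shot soft-thresholded linear encoder, while further iterations refine the code. The dictionary `D`, step size, and L1 weight here are illustrative assumptions; the paper's exact inference problem differs in detail.

```python
import numpy as np

def soft_threshold(v, lam):
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def ista(D, x, lam, steps, step_size):
    # Iterative shrinkage-thresholding for min_z 0.5||x - Dz||^2 + lam||z||_1.
    z = np.zeros(D.shape[1])
    for _ in range(steps):
        z = soft_threshold(z - step_size * D.T @ (D @ z - x), step_size * lam)
    return z

rng = np.random.default_rng(2)
D, x = rng.standard_normal((10, 6)), rng.standard_normal(10)
lam, eta = 0.1, 0.02
# One step from z = 0 is the feed-forward approximation: a thresholded linear encoder.
feed_forward = soft_threshold(eta * D.T @ x, eta * lam)
assert np.allclose(ista(D, x, lam, 1, eta), feed_forward)
```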
arXiv:2103.05804v2
fatcat:jpdvnlfhkngcniodyzyipzbfgi
Signature Authentication using Deep Learning
2019
Volume 8, Issue 10, August 2019, Regular Issue
Convolutional neural networks are implemented to parse signatures, and feed-forward neural networks are implemented to analyze the characteristics of the signature. ...
SYSTEM ARCHITECTURE
(Figures: the ReLU activation function; representation of a feed-forward neural network; representation of gradient descent.) The client uploads the filtered ...
This procedure can be repeated many times to improve accuracy. The output of the last convolutional layer is then fed into a feed-forward neural network. ...
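The pipeline described here, convolutional features flattened and passed into a feed-forward layer, looks roughly like this minimal sketch; the shapes and weights are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

def cnn_to_ffn(feature_map, W, b):
    # The last convolutional feature map is flattened into a vector and
    # fed into a feed-forward (fully connected) layer with ReLU.
    x = feature_map.reshape(-1)
    return np.maximum(0.0, W @ x + b)

fmap = np.ones((2, 3, 3))             # hypothetical conv output: 2 channels, 3x3
W, b = np.full((4, 18), 0.1), np.zeros(4)
out = cnn_to_ffn(fmap, W, b)
assert out.shape == (4,)
```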
doi:10.35940/ijitee.i3103.0789s319
fatcat:dyyltmau65bxlovm7xmjsckpfi
Use of a Feed-Forward Back Propagation Network for the Prediction of Small for Gestational Age Newborns in a Cohort of Pregnant Patients with Thrombophilia
2022
Diagnostics
The aim of this study was to evaluate the predictive performance of a Feed-Forward Back Propagation Network (FFBPN) for the prediction of small for gestational age (SGA) newborns in a cohort of pregnant ...
Figures 2 and 3. Graphic representation of the area under the curve (AUC) and receiver operating characteristic (ROC) curve for our Feed-Forward Back Propagation Network (FFBPN). ...
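The AUC reported via the ROC curve can be computed from the network's predicted scores as a rank statistic. This is a generic Mann-Whitney implementation offered as a sketch, not the paper's software.

```python
import numpy as np

def roc_auc(y_true, scores):
    # AUC equals the probability that a random positive outranks a random
    # negative (Mann-Whitney U); ties are counted as one half.
    pos = scores[y_true == 1]
    neg = scores[y_true == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    equal = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * equal) / (len(pos) * len(neg))

y = np.array([0, 0, 1, 1])
s = np.array([0.1, 0.4, 0.35, 0.8])
assert abs(roc_auc(y, s) - 0.75) < 1e-9
```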
doi:10.3390/diagnostics12041009
pmid:35454057
pmcid:PMC9025417
fatcat:6wulvviqvndqnfxv774argw7ry
ETH Zurich at TREC Precision Medicine 2017
2017
Text Retrieval Conference
second, subsequent stage, we re-rank the most relevant results based on a range of deep neural gene embeddings that project literal genetic expressions into a semantics-preserving vector space using feed-forward networks trained on PubMed and NCBI information but also relying on generative adversarial methods to determine the likelihood of co-occurrence of various mutations within the same patient/article. ...
We employ a single-layer feed-forward neural network to embed the genes into a Euclidean space. The input of this network is a one-hot vector that uniquely represents a gene. ...
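With a one-hot input, a single linear layer reduces to a row lookup in its weight matrix, which is what makes this gene embedding cheap at inference time. The sizes and weights below are illustrative assumptions.

```python
import numpy as np

n_genes, dim = 1000, 32
rng = np.random.default_rng(3)
W = rng.standard_normal((n_genes, dim))  # hypothetical learned weight matrix

def embed(gene_id):
    # A one-hot vector times the weight matrix just selects one row of W,
    # so the single-layer network is equivalent to an embedding table.
    one_hot = np.zeros(n_genes)
    one_hot[gene_id] = 1.0
    return one_hot @ W

assert np.allclose(embed(42), W[42])
```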
dblp:conf/trec/EghlidiGMWE17
fatcat:geq52v55e5fpvbptjwfw7ksuay
A Computational Model of Event Segmentation From Perceptual Prediction
2007
Cognitive Science
update such representations in a self-organizing manner. ...
The current set of simulations investigated whether this statistical structure within events can be used 1) to develop stable internal representations that facilitate perception and 2) to learn when to ...
This research was supported in part by a grant from the NSF (BCS-0353942), and a NDSEG graduate fellowship. ...
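The core segmentation idea, splitting the perceptual stream where prediction error spikes, can be sketched as a simple thresholding rule. The threshold and error values are illustrative; the model in the paper learns its gating rather than using a fixed cutoff.

```python
def segment_by_prediction_error(errors, threshold):
    # Mark an event boundary at every time-step where the perceptual
    # prediction error exceeds the threshold.
    return [t for t, e in enumerate(errors) if e > threshold]

boundaries = segment_by_prediction_error([0.1, 0.2, 0.9, 0.1, 0.8], 0.5)
assert boundaries == [2, 4]
```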
doi:10.1080/15326900701399913
pmid:21635310
fatcat:hxcivkzjrjfi5fhfcrto63wige
Case Study: Safety Verification of an Unmanned Underwater Vehicle
2020
2020 IEEE Security and Privacy Workshops (SPW)
The star-set is a computationally efficient set representation adept at characterizing large input spaces. ...
To achieve this, we utilize methods that can determine the exact output reachable set of all the UUV's components through the use of star-sets. ...
We will refer to this model as our Verifiable Model and it is composed of the four parts shown in Figure 4 , Feed Forward Neural Network Sensor, Feed Forward Neural Network Controller, Feed Forward Neural ...
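Set propagation through a feed-forward network can be illustrated with a much simpler representation than star sets: axis-aligned interval bounds pushed through affine and ReLU layers. This is a coarse over-approximation shown only to convey the idea; star sets compute the exact reachable set, and the weights below are illustrative.

```python
import numpy as np

def interval_forward(layers, lo, hi):
    # Propagate an axis-aligned input box through affine + ReLU layers.
    # Positive weights map lower bounds to lower bounds; negative weights
    # swap them. (Coarser than star sets, which are exact.)
    for W, b in layers:
        Wp, Wn = np.maximum(W, 0.0), np.minimum(W, 0.0)
        lo, hi = Wp @ lo + Wn @ hi + b, Wp @ hi + Wn @ lo + b
        lo, hi = np.maximum(lo, 0.0), np.maximum(hi, 0.0)  # ReLU is monotone
    return lo, hi

W1, b1 = np.array([[1.0, -1.0], [2.0, 0.5]]), np.zeros(2)
lo, hi = interval_forward([(W1, b1)], np.array([0.0, 0.0]), np.array([1.0, 1.0]))
assert np.allclose(lo, [0.0, 0.0]) and np.allclose(hi, [1.0, 2.5])
```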
doi:10.1109/spw50608.2020.00047
fatcat:5lgcitmylfgvre7kogwzeg5p3q
A Self-Attention Network for Hierarchical Data Structures with an Application to Claims Management
[article]
2018
arXiv
pre-print
We propose one model based on piecewise feed forward neural networks (deep learning) and another model based on self-attention neural networks for the task of claim management. ...
We show that the proposed methods outperform bag-of-words based models, hand designed features, and models based on convolutional neural networks, on a data set of two million health care claims. ...
The main difference is that the piecewise feed forward network forms a context independent representation, while the representation formed by self-attention can incorporate the context. ...
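The context distinction in this snippet can be demonstrated numerically: perturbing one line of a claim leaves the piecewise feed-forward representation of the other lines unchanged, while the self-attention representation shifts. The dimensions and weights are illustrative, not the paper's models.

```python
import numpy as np

def piecewise_ffn(X, W):
    # The same feed-forward map applied to each claim line independently:
    # each line's representation ignores the other lines (no context).
    return np.maximum(0.0, X @ W)

def self_attention(X):
    # Each output row is a softmax-weighted mix of all rows, so every
    # representation depends on the other lines (context-dependent).
    scores = X @ X.T / np.sqrt(X.shape[1])
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ X

rng = np.random.default_rng(4)
X = rng.standard_normal((3, 5))
W = rng.standard_normal((5, 5))
X2 = X.copy()
X2[2] += 1.0  # perturb one line only
assert np.allclose(piecewise_ffn(X, W)[0], piecewise_ffn(X2, W)[0])   # unchanged
assert not np.allclose(self_attention(X)[0], self_attention(X2)[0])   # shifted
```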
arXiv:1808.10543v1
fatcat:skidccc4ezc5tbjnnmrkktr7lq
Backpropagation Through Time For Networks With Long-Term Dependencies
[article]
2021
arXiv
pre-print
We propose using the 'discrete forward sensitivity equation' and a variant of it for single and multiple interacting recurrent loops respectively. ...
This solution is exact and also allows the network's parameters to vary between each subsequent step, however it does require the computation of a Jacobian. ...
Introduction Recurrent neural networks (RNNs) form an iterative process by feeding information forward from one state of the network into the next time-step. ...
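The forward sensitivity idea, propagating dh_t/dtheta alongside the state instead of unrolling the computation backwards, can be sketched for a scalar RNN. The tanh cell and parameter values are illustrative; the paper's discrete forward sensitivity equation handles the general Jacobian case.

```python
import numpy as np

def forward_sensitivity(w, u, xs, h0=0.0):
    # Scalar RNN h_t = tanh(w*h_{t-1} + u*x_t). Propagate s_t = dh_t/dw
    # forward with the state: s_t = (1 - h_t^2) * (h_{t-1} + w * s_{t-1}).
    h, s = h0, 0.0
    for x in xs:
        h_new = np.tanh(w * h + u * x)
        s = (1.0 - h_new ** 2) * (h + w * s)
        h = h_new
    return h, s

w, u = 0.5, 1.0
xs = [0.3, -0.2, 0.7]
h, s = forward_sensitivity(w, u, xs)
# Check the propagated sensitivity against a central finite difference.
eps = 1e-6
h_plus, _ = forward_sensitivity(w + eps, u, xs)
h_minus, _ = forward_sensitivity(w - eps, u, xs)
assert abs(s - (h_plus - h_minus) / (2 * eps)) < 1e-6
```

Unlike backpropagation through time, this keeps memory constant in sequence length, at the cost of carrying sensitivities (a Jacobian, in the vector case) forward.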
arXiv:2103.15589v2
fatcat:q74zgi4d7feklh6souoqh4iima
3D object reconstruction and representation using neural networks
2004
Proceedings of the 2nd international conference on Computer graphics and interactive techniques in Australasia and South East Asia - GRAPHITE '04
The results show that neural networks are a promising approach for the reconstruction and representation of 3D objects. ...
Neural networks' capability in representing most classes of 3D objects used in computer graphics is also proven. ...
Backpropagation method is used for training multilayer feed-forward neural networks. ...
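A minimal backpropagation update for a two-layer feed-forward network under squared error looks like the sketch below; the architecture, data, and learning rate are illustrative assumptions, not the paper's reconstruction network.

```python
import numpy as np

def forward(params, x):
    W1, b1, W2, b2 = params
    h = np.maximum(0.0, W1 @ x + b1)   # hidden ReLU layer
    return W2 @ h + b2, h

def backprop_step(params, x, y, lr=1e-3):
    # One gradient-descent update for loss L = 0.5 * ||y_hat - y||^2,
    # with gradients derived by the chain rule (backpropagation).
    W1, b1, W2, b2 = params
    y_hat, h = forward(params, x)
    d_out = y_hat - y                       # dL/dy_hat
    dW2, db2 = np.outer(d_out, h), d_out
    d_h = (W2.T @ d_out) * (h > 0)          # ReLU gradient mask
    dW1, db1 = np.outer(d_h, x), d_h
    return (W1 - lr * dW1, b1 - lr * db1, W2 - lr * dW2, b2 - lr * db2)

rng = np.random.default_rng(5)
params = (rng.standard_normal((4, 3)), np.zeros(4),
          rng.standard_normal((2, 4)), np.zeros(2))
x, y = rng.standard_normal(3), rng.standard_normal(2)

def loss(p):
    return 0.5 * np.sum((forward(p, x)[0] - y) ** 2)

assert loss(backprop_step(params, x, y)) < loss(params)  # one step reduces the loss
```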
doi:10.1145/988834.988859
dblp:conf/graphite/PengS04
fatcat:koyu7njqbbb3lgavbrjzmgvuvu
Solution of Dual Fuzzy Equations Using a New Iterative Method
[chapter]
2018
Lecture Notes in Computer Science
The output from this neural network, which is also a fuzzy number, is numerically compared with the target output. ...
The comparison of the feed-back FNN method with the feed-forward FNN method shows that less error is observed with the feed-back FNN method. ...
However, feed-back neural networks have impressive representation abilities, so they can successfully overcome the limitations of feed-forward neural networks. ...
doi:10.1007/978-3-319-75420-8_23
fatcat:bhc4kfd3kzbvvnxqrkta72jlai
Training a Ranking Function for Open-Domain Question Answering
2018
Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Student Research Workshop
In machine reading, the machine reader has to extract the answer from the given ground truth paragraph. ...
In this study, we propose two neural network rankers that assign scores to different passages based on their likelihood of containing the answer to a given question. ...
This project has benefited from financial support to SB by Google, Tencent Holdings, and Samsung Research. KC thanks support by AdeptMind, eBay, TenCent, NVIDIA and CIFAR. ...
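The ranking setup can be sketched as scoring each candidate passage and sorting best-first; the linear scorer below is a hypothetical stand-in for the neural rankers the paper proposes.

```python
import numpy as np

def rank_passages(passage_feats, w, b):
    # Assign each passage a scalar relevance score with a (hypothetical)
    # linear scorer, then return indices sorted from most to least likely
    # to contain the answer.
    scores = passage_feats @ w + b
    return np.argsort(-scores), scores

feats = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
w, b = np.array([0.2, 0.8]), 0.0
order, scores = rank_passages(feats, w, b)
assert list(order) == [2, 1, 0]  # highest-scoring passage first
```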
doi:10.18653/v1/n18-4017
dblp:conf/naacl/HtutBC18
fatcat:ch5rnw7stjahvheaysebrl7kn4
Training a Ranking Function for Open-Domain Question Answering
[article]
2018
arXiv
pre-print
In machine reading, the machine reader has to extract the answer from the given ground truth paragraph. ...
In this study, we propose two neural network rankers that assign scores to different passages based on their likelihood of containing the answer to a given question. ...
This project has benefited from financial support to SB by Google, Tencent Holdings, and Samsung Research. KC thanks support by AdeptMind, eBay, TenCent, NVIDIA and CIFAR. ...
arXiv:1804.04264v1
fatcat:net73xus5be6hm2ukgpizrtpfu
Distributed Iterative Gating Networks for Semantic Segmentation
[article]
2019
arXiv
pre-print
by integrating feedback signals with a feed-forward architecture. ...
Experiments reveal the high degree of capability that this recurrent approach with cascaded feedback presents over feed-forward baselines and other recurrent models for pixel-wise labeling problems on ...
The modulators take signals from the propagator gates and modulate the feature representation received as input from the preceding feed-forward stage before forwarding it to the next stage. ...
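The modulation step described here is, at its core, an element-wise gating of the feed-forward feature map by a feedback-derived signal; the values below are illustrative.

```python
import numpy as np

def gated_stage(features, gate):
    # A feedback-derived gate in [0, 1] modulates the feed-forward feature
    # map element-wise before it is forwarded to the next stage.
    return features * gate

feat = np.array([[0.5, 2.0], [1.0, 0.0]])
gate = np.array([[1.0, 0.0], [0.5, 1.0]])  # hypothetical propagator-gate output
out = gated_stage(feat, gate)
assert np.allclose(out, [[0.5, 0.0], [0.5, 0.0]])
```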
arXiv:1909.12996v1
fatcat:uhqwjcnwjrgwtltrckduu3xx2q
Showing results 1 — 15 out of 59,846 results