2,268 Hits in 3.3 sec

Backpropagation Training for Fisher Vectors within Neural Networks [article]

Patrick Wieschollek, Fabian Groh, Hendrik P.A. Lensch
2017 arXiv   pre-print
We demonstrate that FV can be embedded into neural networks at arbitrary positions, allowing end-to-end training with back-propagation.  ...  Fisher Vectors (FV) encode higher-order statistics of a set of multiple local descriptors such as SIFT features.  ...  Conclusion and Outlook We introduce feature learning in combination with Fisher Vectors in the fashion of neural networks, which paves the way for a wider range of applications for Fisher Vectors.  ... 
arXiv:1702.02549v1 fatcat:qtyfvezsavaw7loviqoa7hd3dm
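
As a concrete illustration of what an FV layer computes, here is a minimal numpy sketch of the first-order (mean-derivative) part of a Fisher Vector under a diagonal-covariance GMM; because it is built from differentiable operations, gradients can flow through it by backpropagation. The restriction to mean derivatives and all names are illustrative, not the authors' implementation.

```python
import numpy as np

def fisher_vector_means(X, pi, mu, sigma):
    """First-order (mean-derivative) part of a Fisher Vector.

    X     : (N, D) local descriptors (e.g. SIFT)
    pi    : (K,)   GMM mixture weights
    mu    : (K, D) GMM means
    sigma : (K, D) GMM diagonal standard deviations
    Returns a (K*D,) encoding of soft-assignment-weighted, whitened residuals.
    """
    N, D = X.shape
    # log N(x | mu_k, diag(sigma_k^2)) for every descriptor/component pair
    diff = (X[:, None, :] - mu[None, :, :]) / sigma[None, :, :]   # (N, K, D)
    log_prob = -0.5 * (diff ** 2).sum(-1) - np.log(sigma).sum(-1) \
               - 0.5 * D * np.log(2 * np.pi)                      # (N, K)
    log_w = np.log(pi)[None, :] + log_prob
    # posterior responsibilities gamma_{nk}, computed stably
    log_w -= log_w.max(axis=1, keepdims=True)
    gamma = np.exp(log_w)
    gamma /= gamma.sum(axis=1, keepdims=True)                     # (N, K)
    # FV w.r.t. means: (1 / (N sqrt(pi_k))) * sum_n gamma_nk * diff_nk
    fv = (gamma[:, :, None] * diff).sum(0) / (N * np.sqrt(pi)[:, None])
    return fv.reshape(-1)
```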

Gabor Filter and Gershgorin Disk-based Convolutional Filter Constraining for Image Classification

Vijay John, Ali Boyali, Seiichi Mita
2017 International Journal of Machine Learning and Computing  
The Gabor filter-based training is investigated for its effectiveness in increasing the classification accuracy of the convolutional neural network.  ...  In a standard convolutional neural network, the different layers are trained using the backpropagation algorithm.  ...  Apart from Gabor filters, researchers have also incorporated Log mel filter banks [5], [6], Fisher vectors [7] and sparse filter banks [8] within the CNN framework to improve the classification  ... 
doi:10.18178/ijmlc.2017.7.4.620 fatcat:66oyy7enlvgozn3pgpbzoymwl4
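
A minimal sketch of the idea of seeding convolutional weights with a Gabor filter bank before backpropagation fine-tuning; the parameter grid and kernel size are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def gabor_kernel(size, theta, lam, sigma=2.0, gamma=0.5, psi=0.0):
    """Real part of a 2-D Gabor filter: a Gaussian envelope times a cosine
    carrier at orientation `theta` and wavelength `lam`."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)    # rotate coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr ** 2 + (gamma * yr) ** 2) / (2 * sigma ** 2))
    carrier = np.cos(2 * np.pi * xr / lam + psi)
    return envelope * carrier

# A bank of 8 orientations at 2 wavelengths; bank[:, None] gives a
# (16, 1, 7, 7) tensor of initial conv weights for later fine-tuning.
bank = np.stack([gabor_kernel(7, t, lam)
                 for lam in (3.0, 5.0)
                 for t in np.linspace(0, np.pi, 8, endpoint=False)])
print(bank.shape)  # (16, 7, 7)
```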

Neural-Network-Based Identification of Tissue-Type Plasminogen Activator Protein Production and Glycosylation in CHO Cell Culture under Shear Environment

R.S. Senger, M.N. Karim
2003 Biotechnology progress (Print)  
A series of hybrid feed-forward backpropagation neural networks was constructed to function as a software sensor.  ...  An artificial neural network (ANN) modeling scheme has been constructed for the identification of both recombinant tissue-type plasminogen activator (r-tPA) protein production and glycosylation from Chinese  ...  Acknowledgment This research was made possible through contributions from the Colorado Institute for Research in Biotechnology and the Colorado Bioprocessing Center.  ... 
doi:10.1021/bp034109x pmid:14656163 fatcat:ck2ys3qpmrfptjr74pbc5kcvhq

Symbolic and neural learning algorithms: An experimental comparison

Jude W. Shavlik, Raymond J. Mooney, Geoffrey G. Towell
1991 Machine Learning  
Despite the fact that many symbolic and neural network (connectionist) learning algorithms address the same problem of learning from classified examples, very little is known regarding their comparative  ...  Backpropagation occasionally outperforms the other two systems when given relatively small amounts of training data.  ...  Aha); Rob Holte and Peter Clark for Alen Shapiro's chess data; and Terry Sejnowski for the NETtalk data.  ... 
doi:10.1007/bf00114160 fatcat:saio5tkoz5b7zemmi72l5wxumy
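
In the spirit of the paper's experiments, a minimal scikit-learn sketch comparing a symbolic learner (a decision tree, standing in for ID3) against a backpropagation network on the same classified examples; the dataset and hyperparameters are illustrative, not those of the study.

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Same train/test split for both learners, as in a paired comparison.
for name, clf in [("decision tree", DecisionTreeClassifier(random_state=0)),
                  ("backprop MLP", MLPClassifier(hidden_layer_sizes=(32,),
                                                 max_iter=500, random_state=0))]:
    clf.fit(X_tr, y_tr)
    print(name, clf.score(X_te, y_te))
```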

Riemannian metrics for neural networks II: recurrent networks and learning symbolic data sequences

Y. Ollivier
2015 Information and Inference A Journal of the IMA  
This metric gradient ascent is designed to have an algorithmic cost close to backpropagation through time for sparsely connected networks.  ...  Recurrent neural networks are powerful models for sequential data, able to represent complex dependencies in the sequence that simpler models such as hidden Markov models cannot handle.  ...  anonymous referees for careful reading and helpful suggestions.  ... 
doi:10.1093/imaiai/iav007 fatcat:nnidrzswmjeorfcp4h6s3uid5e

Modular Block-diagonal Curvature Approximations for Feedforward Architectures [article]

Felix Dangel, Stefan Harmeling, Philipp Hennig
2020 arXiv   pre-print
We propose a modular extension of backpropagation for the computation of block-diagonal approximations to various curvature matrices of the training objective (in particular, the Hessian, generalized Gauss-Newton  ...  They subsume recently proposed block-diagonal approximations as special cases, and are extended to convolutional neural networks in this work.  ...  Instead, we use exact curvature matrix-vector products provided within HBP. The CNN possesses sigmoid activations and cannot be trained by SGD (cf. Figure 5a).  ... 
arXiv:1902.01813v3 fatcat:fxovjbrq4ba7voh2lkjgjsdpey
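
The exact curvature matrix-vector products mentioned in the snippet can be illustrated with the classical double-backpropagation (Pearlmutter) trick; this generic PyTorch sketch is a stand-in under assumed model and data, not the HBP library itself.

```python
import torch

# Placeholder model and batch; the technique is model-agnostic.
model = torch.nn.Sequential(torch.nn.Linear(10, 8), torch.nn.Sigmoid(),
                            torch.nn.Linear(8, 3))
x, y = torch.randn(32, 10), torch.randint(0, 3, (32,))
loss = torch.nn.functional.cross_entropy(model(x), y)

params = list(model.parameters())
grads = torch.autograd.grad(loss, params, create_graph=True)
v = [torch.randn_like(p) for p in params]            # the vector to multiply
gv = sum((g * vi).sum() for g, vi in zip(grads, v))  # g^T v, still differentiable
hv = torch.autograd.grad(gv, params)                 # H v, one tensor per parameter
print([h.shape for h in hv])
```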

RNN Fisher Vectors for Action Recognition and Image Annotation [chapter]

Guy Lev, Gil Sadeh, Benjamin Klein, Lior Wolf
2016 Lecture Notes in Computer Science  
The methodology we use is based on Fisher Vectors, where the RNNs are the generative probabilistic models and the partial derivatives are computed using backpropagation.  ...  Recurrent Neural Networks (RNNs) have had considerable success in classifying and predicting sequences.  ...  Acknowledgments This research is supported by the Intel Collaborative Research Institute for Computational Intelligence (ICRI-CI).  ... 
doi:10.1007/978-3-319-46466-4_50 fatcat:2wvcamwqdzebve35qbewjhf2hq
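
A minimal PyTorch sketch of the underlying recipe: treat an RNN as the generative model and describe a sequence by the gradient of its log-likelihood with respect to the parameters, computed by ordinary backpropagation. Model sizes and data are placeholders, not the paper's setup.

```python
import torch

vocab, hidden = 50, 16
emb = torch.nn.Embedding(vocab, hidden)
rnn = torch.nn.GRU(hidden, hidden, batch_first=True)
out = torch.nn.Linear(hidden, vocab)
params = list(emb.parameters()) + list(rnn.parameters()) + list(out.parameters())

seq = torch.randint(0, vocab, (1, 12))               # one token sequence
h, _ = rnn(emb(seq[:, :-1]))                         # predict next token
log_lik = -torch.nn.functional.cross_entropy(
    out(h).reshape(-1, vocab), seq[:, 1:].reshape(-1))
grads = torch.autograd.grad(log_lik, params)
fv = torch.cat([g.flatten() for g in grads])         # the sequence descriptor
print(fv.shape)
```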

Identification of Plant Types by Leaf Textures Based on the Backpropagation Neural Network

Taufik Hidayat, Asyaroh Ramadona Nilawati
2018 International Journal of Electrical and Computer Engineering (IJECE)  
The results of the extraction will then be selected for training using the backpropagation neural network.  ...  The result of the training (the formation of the training set) will be used to calculate the recognition accuracy with which the feature values of the leaf-image dataset are then to  ... 
doi:10.11591/ijece.v8i6.pp5389-5398 fatcat:wtvp2erimzcrthuajaabr37niu

Deep Linear Discriminant Analysis on Fisher Networks: A Hybrid Architecture for Person Re-identification [article]

Lin Wu, Chunhua Shen, Anton van den Hengel
2016 arXiv   pre-print
in Fisher vector encoding.  ...  In this paper, we introduce a hybrid architecture which combines Fisher vectors and deep neural networks to learn non-linear representations of person images in a space where the data are linearly separable  ...  Backpropagation in Fisher vectors: In this section, we introduce the procedure for the deep learning of LDA with Fisher vectors.  ... 
arXiv:1606.01595v1 fatcat:jigt3fxz45hxfcnrbawz2njbdm

A Simplified Natural Gradient Learning Algorithm

Michael R. Bastian, Jacob H. Gunther, Todd K. Moon
2011 Advances in Artificial Neural Systems  
It also uses a prior distribution on the neural network parameters and an annealed learning rate.  ...  However, it requires a larger number of additional parameters than ordinary backpropagation in the form of the Fisher information matrix.  ...  A Probability Model for Multilayer Perceptrons. In order for this analysis to be applied to neural networks, there must be a probability density function associated with a neural network.  ... 
doi:10.1155/2011/407497 fatcat:jcffb6dsk5ez5jejqsyzss675m
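
A minimal numpy sketch of the natural-gradient idea the paper simplifies: precondition the ordinary gradient with the inverse of an (empirical, damped) Fisher information matrix, which is exactly the extra P-by-P bookkeeping the snippet refers to. All names and settings are illustrative.

```python
import numpy as np

def natural_gradient_step(per_example_grads, lr=0.1, damping=1e-3):
    """per_example_grads: (N, P) matrix of per-example loss gradients."""
    g = per_example_grads.mean(axis=0)                      # ordinary gradient
    # Empirical Fisher: average outer product of per-example gradients.
    F = per_example_grads.T @ per_example_grads / len(per_example_grads)
    F += damping * np.eye(F.shape[0])                       # keep F invertible
    return -lr * np.linalg.solve(F, g)                      # -lr * F^{-1} g

rng = np.random.default_rng(0)
step = natural_gradient_step(rng.normal(size=(64, 5)))
print(step)
```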

A Bag-of-Words Equivalent Recurrent Neural Network for Action Recognition [article]

Alexander Richard
2017 arXiv   pre-print
In this work, we propose a recurrent neural network that is equivalent to the traditional bag-of-words approach but enables the application of discriminative training.  ...  The model further allows the kernel computation to be incorporated directly into the neural network, solving the complexity issue and allowing the complete classification system to be represented within a single network.  ...  Investigating structural similarities of neural networks and Fisher vectors, Sydorov et al.  ... 
arXiv:1703.08089v1 fatcat:5cu3ipw7cvcuda5ev62msprbcq
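
A sketch of the equivalence the paper exploits: a bag-of-words histogram is a recurrent accumulation of (soft) visual-word assignments over frames, so it can be written as a simple recurrent cell and then trained discriminatively. The softmax assignment and all sizes are illustrative assumptions.

```python
import numpy as np

def softmax(z):
    z = z - z.max(-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(-1, keepdims=True)

def bow_rnn(frames, codebook, temp=10.0):
    """frames: (T, D) descriptors; codebook: (K, D) visual words.
    The hidden state h is the running (soft) word histogram."""
    h = np.zeros(len(codebook))
    for x in frames:                       # one recurrent step per frame
        sim = -temp * ((codebook - x) ** 2).sum(-1)
        h = h + softmax(sim)               # accumulate soft assignments
    return h / len(frames)                 # normalized histogram

rng = np.random.default_rng(0)
hist = bow_rnn(rng.normal(size=(30, 8)), rng.normal(size=(16, 8)))
print(hist.sum())  # 1.0
```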

Combining Fisher Discriminant Analysis and probabilistic neural network for effective on-line signature recognition

Souham Meshoul, Mohamed Batouche
2010 10th International Conference on Information Science, Signal Processing and their Applications (ISSPA 2010)  
The second stage consists of tailoring a probabilistic neural network for effective classification purposes. Several experiments have been conducted using the SVC2004 database.  ...  This has opened a new perspective for the possible use of signatures as a basis for an authentication system that is accurate and trustworthy enough to be integrated into practical applications.  ...  PROBABILISTIC NEURAL NETWORK Probabilistic Neural Networks (PNN), introduced by Donald Specht in 1988 [5, 6], are a kind of radial basis network suitable for classification problems.  ... 
doi:10.1109/isspa.2010.5605586 dblp:conf/isspa/MeshoulB10 fatcat:tk52d4suyna7pgzwsgjs2x44qa
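
A minimal Parzen-window Probabilistic Neural Network in Specht's sense, as a sketch of the paper's second stage: one Gaussian kernel per training example, with per-class kernel averages as the summation layer. The smoothing parameter and data are illustrative.

```python
import numpy as np

def pnn_predict(X_train, y_train, x, sigma=0.5):
    """Classify a single feature vector x (e.g. an FDA-projected signature)."""
    d2 = ((X_train - x) ** 2).sum(axis=1)
    k = np.exp(-d2 / (2 * sigma ** 2))                   # pattern-layer activations
    classes = np.unique(y_train)
    scores = [k[y_train == c].mean() for c in classes]   # summation layer
    return classes[int(np.argmax(scores))]

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(3, 1, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
print(pnn_predict(X, y, np.array([2.8, 3.1])))           # -> 1
```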

Unsupervised nonparametric density estimation: A neural network approach

Edmondo Trentin, Antonino Freno
2009 2009 International Joint Conference on Neural Networks  
Artificial neural networks are, in principle, an alternative family of nonparametric models.  ...  So far, artificial neural networks have been extensively used to estimate probabilities (e.g., class-posterior probabilities).  ...  Results are reported in Table I (for an increasing number of training patterns). A standard k_1 = 1 was used for the k_n-Nearest Neighbor and k_n-Neural Network.  ... 
doi:10.1109/ijcnn.2009.5179010 dblp:conf/ijcnn/TrentinF09 fatcat:f6gvsc44arhp5hrn5k56xeklo4
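
For context, a small sketch of the classical k_n-nearest-neighbor density estimate the paper compares against: p(x) is approximated by k / (N * V), with V the volume of the smallest ball around x containing the k nearest training points; 1-D case, all settings illustrative.

```python
import numpy as np

def knn_density(x, samples, k=1):
    dist = np.sort(np.abs(samples - x))
    radius = dist[k - 1]                    # distance to the k-th neighbor
    volume = 2 * radius                     # length of the interval in 1-D
    return k / (len(samples) * max(volume, 1e-12))

rng = np.random.default_rng(0)
data = rng.normal(size=500)
print(knn_density(0.0, data, k=10))         # roughly the N(0,1) density, ~0.4
```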

Comparative performance of some popular artificial neural network algorithms on benchmark and function approximation problems

V. K. Dhar, A. K. Tickoo, R. Koul, B. P. Dubey
2010 Pramana (Bangalore)  
using artificial neural networks.  ...  We report an inter-comparison of some popular algorithms within the artificial neural network domain (viz., local search algorithms, global search algorithms, higher-order algorithms and the hybrid algorithms  ...  We would also like to thank the anonymous referee for his valuable comments and suggestions for improving the paper.  ... 
doi:10.1007/s12043-010-0029-4 fatcat:z4odxrv2lfdcfcjq3p7q5ahipq

AFS: An Attention-based mechanism for Supervised Feature Selection [article]

Ning Gui, Danni Ge, Ziyin Hu
2019 arXiv   pre-print
This paper introduces a novel neural network-based feature selection architecture, dubbed Attention-based Feature Selection (AFS).  ...  Feature weights are generated based on the distribution of respective feature selection patterns, adjusted by backpropagation during the training process.  ...  Currently supported network structures include deep neural networks (DNN), convolutional neural networks (CNN), and recurrent neural networks (RNN).  ... 
arXiv:1902.11074v1 fatcat:pdnnuoglbvf6xpx4zxvitnftty
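
A minimal sketch of attention-style supervised feature selection in the spirit of the abstract: one sigmoid gate per input feature, adjusted by backpropagation together with the classifier. The architecture below is an illustrative stand-in, not the AFS module itself.

```python
import torch

n_features, n_classes = 20, 3
gates = torch.nn.Parameter(torch.zeros(n_features))        # pre-sigmoid weights
clf = torch.nn.Linear(n_features, n_classes)
opt = torch.optim.Adam([gates, *clf.parameters()], lr=1e-2)

x = torch.randn(256, n_features)
y = (x[:, 0] + x[:, 1] > 0).long() + (x[:, 2] > 0).long()  # only 3 features matter

for _ in range(300):
    opt.zero_grad()
    a = torch.sigmoid(gates)                                # attention per feature
    loss = torch.nn.functional.cross_entropy(clf(x * a), y)
    (loss + 1e-2 * a.sum()).backward()                      # mild sparsity pressure
    opt.step()

print(torch.sigmoid(gates).topk(3).indices)                 # likely features 0, 1, 2
```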
Showing results 1 — 15 out of 2,268 results