
Artificial Neural Networks [chapter]

Hanspeter A Mallot
2013 Springer Series in Bio-/Neuroinformatics  
Diez Learning A-posteriori Probabilities with Multi-layer Perceptron Classifiers 273 M.J.J. Holt Stable States, Transitions and Convergence in Kohonen Self-organising Maps 281 D.A.  ...  Ho Unsupervised Hebbian Learning and the Shape of the Neuron Activation Function 179 J.L. Shapiro, A. Prügel-Bennett Ockham's Nets: Self-Adaptive Minimal Neural Networks 183 G.D.  ...  Self-organizing Neural Network Application to Technical Process Parameters Estimation 579 E. Govekar, E. Susie, P. Muzic, I. Grabec High-precision Robot Control: The Nested Network 583 A.  ... 
doi:10.1007/978-3-319-00861-5_4 fatcat:l3v3etbv6zfxlcfkxnzml4v3xu

My First Deep Learning System of 1991 + Deep Learning Timeline 1962-2013 [article]

Jürgen Schmidhuber
2013 arXiv   pre-print
Deep Learning has attracted significant attention in recent years.  ...  Here I present a brief overview of my first Deep Learner of 1991, and its historic context, with a timeline of Deep Learning highlights.  ...  Figure 1: My first Deep Learning system of 1991 used a deep stack of recurrent neural networks (a Neural Hierarchical Temporal Memory) pre-trained in unsupervised fashion to accelerate subsequent supervised  ... 
arXiv:1312.5548v1 fatcat:urehqlw7sbbr7pu4forrkbxr2q

Unsupervised Multi-net Simulation: An Application to Child Development

A. Nyamapfene, K. Ahmad
2006 The 2006 IEEE International Joint Conference on Neural Network Proceedings  
Model Simulation In simulating the gated multi-net, we chose a network size of 8×8 for both the counterpropagation network and temporal self-organising map.  ...  The Temporal Self-Organising Map The temporal self-organising map comprises a modified counterpropagation network whose neurons each have two sets of weights, context weights and pattern weights.  ... 
doi:10.1109/ijcnn.2006.246861 dblp:conf/ijcnn/NyamapfeneA06 fatcat:6fcoifyj45ae7bciwfq2r55qca
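The two-weight-set neuron this snippet describes can be sketched roughly as follows. The 8×8 grid matches the size quoted above, but the distance mixing parameter, learning rate, winner-take-all activation, and all names here are illustrative assumptions, not the authors' exact model.

```python
import numpy as np

class TemporalSOM:
    """Minimal sketch of a temporal self-organising map whose neurons each
    carry two weight sets: pattern weights matched against the current
    input, and context weights matched against the previous activation."""

    def __init__(self, grid=8, dim=4, alpha=0.7, lr=0.1, seed=0):
        n = grid * grid
        rng = np.random.default_rng(seed)
        self.pattern = rng.normal(size=(n, dim))  # pattern weights
        self.context = rng.normal(size=(n, n))    # context weights
        self.alpha, self.lr = alpha, lr
        self.prev = np.zeros(n)                   # previous activation vector

    def step(self, x):
        # Distance mixes the current-input match with the temporal-context match.
        d_pat = np.linalg.norm(self.pattern - x, axis=1)
        d_ctx = np.linalg.norm(self.context - self.prev, axis=1)
        winner = int(np.argmin(self.alpha * d_pat + (1 - self.alpha) * d_ctx))
        # Move both of the winner's weight sets toward what they matched.
        self.pattern[winner] += self.lr * (x - self.pattern[winner])
        self.context[winner] += self.lr * (self.prev - self.context[winner])
        # Winner-take-all activation becomes the next step's context.
        self.prev = np.zeros_like(self.prev)
        self.prev[winner] = 1.0
        return winner
```

Feeding a sequence through `step` makes the winning unit depend on both the current pattern and where the map was on the previous element, which is what lets such maps separate identical inputs arriving in different temporal contexts.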

An associative memory for the on-line recognition and prediction of temporal sequences [article]

J. Bose, S.B. Furber, J.L. Shapiro
2006 arXiv   pre-print
This paper presents the design of an associative memory with feedback that is capable of on-line temporal sequence learning.  ...  A framework for on-line sequence learning has been proposed, and different sequence learning models have been analysed according to this framework.  ...  Examples are Recurrent nets with back-propagation, such as by Schmidhuber [4] , Hopfield nets ([6]), temporal difference and reinforcement learning ( [7] , [8] ), hidden Markov models [9] , self organising  ... 
arXiv:cs/0611020v1 fatcat:rx5veoz3wfbivchvkxdudlvvca

BreizhCrops: A Time Series Dataset for Crop Type Mapping [article]

Marc Rußwurm, Charlotte Pelletier, Maximilian Zollner, Sébastien Lefèvre, Marco Körner
2020 arXiv   pre-print
We compare seven recently proposed deep neural networks along with a Random Forest baseline.  ...  We aggregated label data and Sentinel-2 top-of-atmosphere as well as bottom-of-atmosphere time series in the region of Brittany (Breizh in the local language), north-west France.  ...  These recurrent layers can be stacked in multiple cascaded layers where the sequence can be introduced bi-directionally in sequence and reversed-sequence orders.  ... 
arXiv:1905.11893v2 fatcat:nyz67pium5b3tczoxcgekd7gdm
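The stacked bi-directional recurrent layers mentioned in the snippet can be sketched as below. This is a generic illustration with a plain tanh RNN cell; the layer sizes, initialisation, and helper names are assumptions, not the architecture used in the paper.

```python
import numpy as np

def rnn_layer(seq, W, U, b):
    """Plain tanh RNN: the same weights are applied at every time step."""
    h = np.zeros(U.shape[0])
    states = []
    for x in seq:
        h = np.tanh(W @ x + U @ h + b)
        states.append(h)
    return np.stack(states)

def bidirectional(seq, fwd_params, bwd_params):
    """One bi-directional layer: run the sequence in forward and reversed
    order, re-align the backward states, and concatenate per time step."""
    fwd = rnn_layer(seq, *fwd_params)
    bwd = rnn_layer(seq[::-1], *bwd_params)[::-1]
    return np.concatenate([fwd, bwd], axis=1)

def make_params(rng, d_in, d_h):
    return (rng.normal(size=(d_h, d_in)) * 0.1,
            rng.normal(size=(d_h, d_h)) * 0.1,
            np.zeros(d_h))

rng = np.random.default_rng(0)
T, d_in, d_h = 12, 5, 8
seq = rng.normal(size=(T, d_in))
# Cascading: the 2*d_h-wide output of layer 1 is the input of layer 2.
layer1 = bidirectional(seq, make_params(rng, d_in, d_h),
                       make_params(rng, d_in, d_h))
layer2 = bidirectional(layer1, make_params(rng, 2 * d_h, d_h),
                       make_params(rng, 2 * d_h, d_h))
```

The point of the cascade is that each layer sees every time step with both past and future context, so deeper layers operate on progressively more abstract per-timestep features.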

Lateral interaction in accumulative computation: a model for motion detection

Antonio Fernández-Caballero, José Mira Mira, Ana E. Delgado, Miguel A. Fernández Graciani
2003 Neurocomputing  
Some of the major computer vision techniques make use of neural nets.  ...  In this paper we present a novel neural-network-based model called lateral interaction in accumulative computation (LIAC).  ...  changes, to filter adaptively, or to pre-process data in self-organising networks, for example.  ... 
doi:10.1016/s0925-2312(02)00571-4 fatcat:vhvbi54sc5hppnx4nigkdl3sfe

Predictive coding is a consequence of energy efficiency in recurrent neural networks [article]

Abdullahi Ali, Nasir Ahmad, Elgar de Groot, Marcel A. J. van Gerven, Tim C. Kietzmann
2021 bioRxiv   pre-print
When training recurrent neural networks to minimise their energy consumption while operating in predictive environments, the networks self-organise into prediction and error units with appropriate inhibitory and excitatory interconnections, and learn to inhibit predictable sensory input.  ...  The sequence length can be chosen in advance, but all sequences are constrained to have the same length (in our simulations we take a sequence length of 10). The sequences are organised into batches.  ... 
doi:10.1101/2021.02.16.430904 fatcat:t6wqlz7uabbdfd2qjdcaeomu4q
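An objective of the kind this entry describes can be sketched as a prediction-error term plus an activity ("energy") penalty; minimising the second term pushes the network to silence units whose input is already predicted. The weighting, shapes, and names below are illustrative assumptions, not the paper's exact loss.

```python
import numpy as np

def energy_loss(states, predictions, targets, w_energy=0.1):
    """Next-input prediction error plus a penalty on overall unit
    activity. w_energy trades accuracy against energy consumption."""
    pred_err = np.mean((predictions - targets) ** 2)
    activity = np.mean(np.abs(states))
    return pred_err + w_energy * activity

# Batched sequences of a fixed length of 10, as in the simulations described.
rng = np.random.default_rng(0)
batch = rng.normal(size=(32, 10, 8))    # 32 sequences, length 10, 8 features
states = rng.normal(size=(32, 10, 16))  # hidden states of a hypothetical RNN
# Predicting each next input from the current step:
loss = energy_loss(states, batch[:, 1:], batch[:, 1:])
```

With perfect prediction (as in the toy call above) only the energy term remains, which is exactly the regime in which the networks are reported to self-organise into prediction and error units.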

The grounding of higher order concepts in action and language: A cognitive robotics model

Francesca Stramandinoli, Davide Marocco, Angelo Cangelosi
2012 Neural Networks  
The use of recurrent neural networks also permits the learning of higher-order concepts based on temporal sequences of action primitives.  ...  In this model, sequences of linguistic inputs lead to the development of higher-order concepts grounded in basic concepts and actions.  ...  The second model uses recurrent neural networks, as this permits the learning of temporal sequences of actions.  ... 
doi:10.1016/j.neunet.2012.02.012 pmid:22386502 fatcat:kewpgq4pjvhytan5ge52ih7suu

BreizhCrops: A Time Series Dataset for Crop Type Mapping
M. Rußwurm, C. Pelletier, M. Zollner, S. Lefèvre, M. Körner
2020 The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences  
We compare seven recently proposed deep neural networks along with a Random Forest baseline.  ...  We aggregated label data and Sentinel-2 top-of-atmosphere as well as bottom-of-atmosphere time series in the region of Brittany (Breizh in the local language), north-west France.  ...  These recurrent layers can be stacked in multiple cascaded layers where the sequence can be introduced bi-directionally in sequence and reversed-sequence orders.  ... 
doi:10.5194/isprs-archives-xliii-b2-2020-1545-2020 fatcat:r3qdclzo3fcenc34chkdb3vkci

Nonmonotone BFGS-trained recurrent neural networks for temporal sequence processing

Chun-Cheng Peng, George D. Magoulas
2011 Applied Mathematics and Computation  
In this paper we propose a nonmonotone approach to recurrent neural network training for temporal sequence processing applications.  ...  This approach allows learning performance to deteriorate in some iterations; nevertheless, the network's performance is improved over time.  ...  In this paper, we focus on second-order learning algorithms for RNNs to process temporal sequences, i.e. sequences whose elements are nominal symbols from a particular alphabet.  ... 
doi:10.1016/j.amc.2010.12.012 fatcat:neps6mha25bbhetgt5tliy6jmq
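The nonmonotone idea in this entry — letting the error rise in some iterations while still trending downward — is commonly realised with an acceptance test against the maximum of a window of recent objective values (in the spirit of Grippo-type nonmonotone line searches). The window size and names below are illustrative assumptions.

```python
def nonmonotone_accept(f_trial, f_history, window=5):
    """Accept a trial point if its objective beats the *maximum* of the
    last `window` accepted values, not necessarily the most recent one,
    so occasional increases in error are tolerated."""
    return f_trial < max(f_history[-window:])

# Toy trace: dips and bumps are tolerated as long as the recent maximum falls.
history = [10.0]
for f_trial in [9.0, 9.5, 8.0, 8.6, 7.2]:
    if nonmonotone_accept(f_trial, history):
        history.append(f_trial)
```

Here 9.5 and 8.6 are accepted even though they are worse than their predecessors, which is the behaviour a strictly monotone (Armijo-style) test would forbid.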

Activation-Based Recursive Self-Organising Maps: A General Formulation and Empirical Results

Kevin I. Hynna, Mauri Kaipainen
2006 Neural Processing Letters  
We generalize a class of neural network models that extend the Kohonen self-organizing map (SOM) algorithm into the sequential and temporal domain using recurrent connections.  ...  Hynna, K.I., Kaipainen, M. (2006) "Activation-based recursive self-organising maps: A general formulation and empirical results" Neural Processing Letters, 24(2): 119-136  ... 
doi:10.1007/s11063-006-9015-8 fatcat:67pni6x6kjc4xilgg6dc3w32nm

Detecting tactical patterns in basketball: Comparison of merge self-organising maps and dynamic controlled neural networks

Matthias Kempe, Andreas Grunz, Daniel Memmert
2014 European Journal of Sport Science  
In this study, we applied the merge self-organising map (MSOM) to spatio-temporal data.  ...  Neural networks derived from self-organizing maps have established themselves as a useful tool to analyse static and temporal data.  ...  Acknowledgement The authors thank Roland Leser for his assistance in data collection.  ... 
doi:10.1080/17461391.2014.933882 pmid:24993662 fatcat:n6x4zrd2j5etfjjdovf4pvk37a

Improved Understanding of Urban Sprawl Using Neural Networks [chapter]

Lidia Diappi, Paola Bolchim, Massimo Buscema
2004 Recent Advances in Design and Decision Support Systems in Architecture and Urban Planning  
Self = Self Recurrent Network (Semeion); Tasm = Temporal Associative Subjective Memory (Semeion); Order: DA = Dynamic and Adaptive Recurrency (Semeion); SA = Static and Adaptive Recurrency (Semeion)  ...  As said before, SOMs are self-organised NNs, where the target is not predefined, but dynamically built up during the learning phase.  ... 
doi:10.1007/1-4020-2409-6_3 fatcat:xzj7evux6bcmze5mlyddz3hmia

Connectionist simulation of quantification skills

Khurshid Ahmad, Matthew Casey, Tracey Bale
2002 Connection science  
Acknowledgements Tracey Bale gratefully acknowledges the support of the UK Engineering and Physical Sciences Research Council in terms of a PhD Studentship (1994-97) and that of the EU-sponsored ACE Project (ESPRIT  ...  Unsupervised learning can be compared with an organism's built-in self-motivation.  ...  The subitizing system (section 3) can apprehend numerosity of up to 5 objects fairly accurately; this system uses a sequential multi-net system including two Kohonen (1997) Self Organising Maps (SOMs)  ... 
doi:10.1080/09540090208559326 fatcat:ei42qup4gbb2lolregdzbn4ys4

The use of machine learning algorithms for detecting advanced persistent threats

Hope Nkiruka Eke, Andrei Petrovski, Hatem Ahriz
2019 Proceedings of the 12th International Conference on Security of Information and Networks - SIN '19  
This paper explores the application of Artificial Immune System (AIS) and Recurrent Neural Networks (RNNs) variants for APT detection.  ...  Advanced Persistent Threats (APTs) have been a major challenge in securing both Information Technology (IT) and Operational Technology (OT) systems.  ...  In addition to learning the temporal patterns with cyclic connections, unfolding allows the RNN model to learn the association of static features between the input and output sequences.  ... 
doi:10.1145/3357613.3357618 dblp:conf/sin/EkePA19 fatcat:d5czfhnoh5eojlnhhwg7vkwjca