Approaches Based on Markovian Architectural Bias in Recurrent Neural Networks [chapter]

Matej Makula, Michal Čerňanský, Ľubica Beňušková
2004 Lecture Notes in Computer Science  
Recent studies show that the state-space dynamics of a randomly initialized recurrent neural network (RNN) have interesting and potentially useful properties even without training. More precisely, when an RNN is initialized with small weights, the activities of its recurrent units reflect the history of inputs presented to the network according to a Markovian scheme. This property of RNNs is called Markovian architectural bias. Our work focuses on various techniques that make use of architectural bias. The first approach is based on substituting the RNN output layer with a prediction model, which makes it possible to exploit the resulting state representation. The second approach, known as echo state networks (ESNs), is based on a large, untrained, randomly interconnected hidden layer that serves as a reservoir of interesting behavior. We have investigated both approaches and their combination, and performed simulations to demonstrate their usefulness.
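The ESN idea described above can be summarized in a minimal sketch: a fixed, randomly initialized recurrent layer with small (contractive) weights acts as the reservoir, and only a simple readout on top of its states is trained. The sketch below is illustrative, not the authors' implementation; the sizes, scaling factors, and the sine-wave prediction task are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration.
n_in, n_res = 1, 100

# Untrained reservoir: random recurrent weights rescaled to a small
# spectral radius, so the state is a contractive (Markovian) function
# of the recent input history.
W_in = rng.uniform(-0.1, 0.1, (n_res, n_in))
W_res = rng.uniform(-0.5, 0.5, (n_res, n_res))
W_res *= 0.9 / max(abs(np.linalg.eigvals(W_res)))  # spectral radius ~0.9

def run_reservoir(inputs):
    """Collect reservoir states for an input sequence (no training of W_in/W_res)."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W_in @ np.atleast_1d(u) + W_res @ x)
        states.append(x.copy())
    return np.array(states)

# Toy next-step prediction task on a sine wave (illustrative data only).
seq = np.sin(0.3 * np.arange(500))
X = run_reservoir(seq[:-1])
y = seq[1:]

# Only the linear readout is trained, here via ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)

pred = X @ W_out
print("train MSE:", np.mean((pred - y) ** 2))
```

Replacing the linear readout with a prediction model over the reservoir states corresponds, in spirit, to the first approach mentioned in the abstract: the untrained recurrent dynamics supply the state representation, and only the model reading it out is fitted.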
doi:10.1007/978-3-540-24618-3_22