Analytic Manifold Learning: Unifying and Evaluating Representations for Continuous Control [article]

Rika Antonova, Maksim Maydanskiy, Danica Kragic, Sam Devlin, Katja Hofmann
2020 arXiv pre-print
We address the problem of learning reusable state representations from streaming high-dimensional observations. This is important for areas like Reinforcement Learning (RL), which yields non-stationary data distributions during training. We make two key contributions. First, we propose an evaluation suite that measures alignment between latent and true low-dimensional states. We benchmark several widely used unsupervised learning approaches. This uncovers the strengths and limitations of existing approaches that impose additional constraints/objectives on the latent space. Our second contribution is a unifying mathematical formulation for learning latent relations. We learn analytic relations on source domains, then use these relations to help structure the latent space when learning on target domains. This formulation enables a more general, flexible and principled way of shaping the latent space. It formalizes the notion of learning independent relations, without imposing restrictive simplifying assumptions or requiring domain-specific information. We present mathematical properties, concrete algorithms for implementation and experimental validation of successful learning and transfer of latent relations.
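The abstract does not spell out how "alignment between latent and true low-dimensional states" is scored. One common choice for this kind of evaluation, sketched below as an assumption rather than the paper's actual metric, is a linear probe: fit an affine least-squares map from latent codes to ground-truth states and report the mean R². The function name `alignment_r2` and the synthetic data are illustrative only.

```python
import numpy as np

def alignment_r2(latents, true_states):
    """Fit a least-squares affine map from latents to true states
    and return the mean R^2 across state dimensions.

    A high score means the true low-dimensional state is linearly
    decodable from the latent space; a score near zero means it is not.
    """
    # Append a bias column so the probe is affine, not just linear.
    X = np.hstack([latents, np.ones((latents.shape[0], 1))])
    W, *_ = np.linalg.lstsq(X, true_states, rcond=None)
    pred = X @ W
    ss_res = ((true_states - pred) ** 2).sum(axis=0)
    ss_tot = ((true_states - true_states.mean(axis=0)) ** 2).sum(axis=0)
    return float(np.mean(1.0 - ss_res / ss_tot))

# Synthetic check: a latent that is a linear image of the state
# should align almost perfectly; random noise should not.
rng = np.random.default_rng(0)
s = rng.normal(size=(500, 3))        # true low-dimensional states
A = rng.normal(size=(3, 16))
good = s @ A                         # latent preserves the state linearly
bad = rng.normal(size=(500, 16))     # latent unrelated to the state
```

Nonlinear probes (e.g. a small MLP) are a natural extension when the latent encodes the state in a curved embedding, at the cost of a less interpretable score.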
arXiv:2006.08718v2