A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; the original URL is https://arxiv.org/pdf/1907.01657v2.pdf. The file type is application/pdf.
Dynamics-Aware Unsupervised Discovery of Skills
[article]
2020-02-14   arXiv   pre-print
Conventionally, model-based reinforcement learning (MBRL) aims to learn a global model for the dynamics of the environment. A good model can potentially enable planning algorithms to generate a large variety of behaviors and solve diverse tasks. However, learning an accurate model for complex dynamical systems is difficult, and even then, the model might not generalize well outside the distribution of states on which it was trained. In this work, we combine model-based learning with model-free learning of primitives that make model-based planning easy. To that end, we aim to answer the question: how can we discover skills whose outcomes are easy to predict? We propose an unsupervised learning algorithm, Dynamics-Aware Discovery of Skills (DADS), which simultaneously discovers predictable behaviors and learns their dynamics. Our method can leverage continuous skill spaces, theoretically allowing us to learn infinitely many behaviors even for high-dimensional state-spaces. We demonstrate that zero-shot planning in the learned latent space significantly outperforms standard MBRL and model-free goal-conditioned RL, can handle sparse-reward tasks, and substantially improves over prior hierarchical RL methods for unsupervised skill discovery.

arXiv:1907.01657v2 (https://arxiv.org/abs/1907.01657v2)
fatcat:hm4nmah6w5csxlinn7r3oiaqqy (https://fatcat.wiki/release/hm4nmah6w5csxlinn7r3oiaqqy)
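The abstract describes learning a skill-conditioned dynamics model alongside the skills themselves, and rewarding skills whose outcomes that model can predict. As a rough illustration (not the authors' code), the sketch below computes a mutual-information-style intrinsic reward from a toy skill-dynamics model q(s'|s, z): a transition scores highly when it is likely under its own skill but unlikely under skills drawn from the prior. The Gaussian model, the linear parameterization, and the sample-based estimate of the marginal are all assumptions made for this example.

```python
import numpy as np

class GaussianSkillDynamics:
    """Toy skill-conditioned dynamics model: predicts s' from (s, z) with a unit-variance Gaussian."""
    def __init__(self, state_dim, skill_dim, rng):
        self.W_s = rng.normal(scale=0.1, size=(state_dim, state_dim))
        self.W_z = rng.normal(scale=0.1, size=(state_dim, skill_dim))

    def log_prob(self, s_next, s, z):
        mean = s + self.W_s @ s + self.W_z @ z   # predicted next state (simple linear model for illustration)
        diff = s_next - mean
        return -0.5 * float(np.sum(diff ** 2))   # Gaussian log-density up to an additive constant

def intrinsic_reward(model, s, z, s_next, prior_samples):
    """r ~ log q(s'|s,z) - log E_{z'~p(z)}[q(s'|s,z')], with the marginal estimated from prior samples."""
    log_q = model.log_prob(s_next, s, z)
    log_q_prior = np.array([model.log_prob(s_next, s, z_i) for z_i in prior_samples])
    # numerically stable log-mean-exp over the sampled skills approximates the marginal over the prior
    m = log_q_prior.max()
    log_marginal = m + np.log(np.mean(np.exp(log_q_prior - m)))
    return log_q - log_marginal

rng = np.random.default_rng(0)
model = GaussianSkillDynamics(state_dim=4, skill_dim=2, rng=rng)
s, z, s_next = rng.normal(size=4), rng.normal(size=2), rng.normal(size=4)
prior_samples = [rng.normal(size=2) for _ in range(16)]   # samples from a continuous skill prior
print(intrinsic_reward(model, s, z, s_next, prior_samples))
```

In the paper, the learned skill-dynamics model is then reused for planning over skills in the latent space, which is what enables the zero-shot evaluation mentioned in the abstract.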
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20200321140121/https://arxiv.org/pdf/1907.01657v2.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext">
<button class="ui simple right pointing dropdown compact black labeled icon button serp-button">
<i class="icon ia-icon"></i>
Web Archive
[PDF]
</button>
</a>
<a target="_blank" rel="external noopener" href="https://arxiv.org/abs/1907.01657v2" title="arxiv.org access">
<button class="ui compact blue labeled icon button serp-button">
<i class="file alternate outline icon"></i>
arxiv.org
</button>
</a>