A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; you can also visit <a rel="external noopener" href="https://arxiv.org/pdf/1903.09650v1.pdf">the original URL</a>. The file type is <code>application/pdf</code>.
Differentiable Programming Tensor Networks
[article]
<span title="2019-03-22">2019</span>
<i>arXiv</i>
<span class="release-stage" >pre-print</span>
Differentiable programming is an emerging programming paradigm that composes parameterized algorithmic components and trains them using automatic differentiation (AD). The concept originates in deep learning but is not limited to training neural networks. We present the theory and practice of programming tensor network algorithms in a fully differentiable way. By formulating a tensor network algorithm as a computation graph, one can compute higher-order derivatives of the program accurately and efficiently using AD. We present essential techniques for differentiating through tensor network contractions, including stable AD for tensor decompositions and efficient backpropagation through fixed-point iterations. As a demonstration, we compute the specific heat of the Ising model directly by taking the second-order derivative of the free energy obtained in a tensor renormalization group calculation. Next, we perform gradient-based variational optimization of infinite projected entangled pair states for the quantum antiferromagnetic Heisenberg model and obtain state-of-the-art variational energy and magnetization with moderate effort. Differentiable programming removes the laborious human effort of deriving and implementing analytical gradients for tensor network programs, opening the door to further innovation in tensor network algorithms and applications.
<span class="external-identifiers">
<a target="_blank" rel="external noopener" href="https://arxiv.org/abs/1903.09650v1">arXiv:1903.09650v1</a>
<a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/n7l5zjfhhvemhianohbig7l4di">fatcat:n7l5zjfhhvemhianohbig7l4di</a>
</span>
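The abstract's central demonstration, obtaining the specific heat as a second-order derivative of the free energy via AD, can be sketched in a few lines. This is not the authors' code: it uses JAX and substitutes the exact free energy of the one-dimensional Ising chain for the tensor-renormalization-group free energy used in the paper, but the mechanism (nested automatic differentiation of a free-energy function) is the same.

```python
# Minimal sketch (assumed setup, not the paper's implementation):
# specific heat C(T) = -T * d^2F/dT^2, obtained by differentiating a
# free-energy function twice with JAX's automatic differentiation.
import jax
import jax.numpy as jnp

def free_energy(T, J=1.0):
    # Exact free energy per site of the 1D Ising chain at zero field:
    # f(T) = -T * log(2 * cosh(J / T))
    return -T * jnp.log(2.0 * jnp.cosh(J / T))

# Nested AD gives the second derivative f''(T) with no hand-derived formula.
d2f = jax.grad(jax.grad(free_energy))

def specific_heat(T):
    return -T * d2f(T)

print(float(specific_heat(1.0)))  # matches the exact (J/T)^2 * sech(J/T)^2
```

In the paper, `free_energy` would instead be the output of a tensor network contraction; the point of the abstract is that AD propagates through that entire computation graph just as it does through this closed-form stand-in.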
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20200923165606/https://arxiv.org/pdf/1903.09650v1.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext">
<button class="ui simple right pointing dropdown compact black labeled icon button serp-button">
<i class="icon ia-icon"></i>
Web Archive
[PDF]
<div class="menu fulltext-thumbnail">
<img src="https://blobs.fatcat.wiki/thumbnail/pdf/1a/0a/1a0a5512a95bcf1e341ec58ffb07bfce1db7e02a.180px.jpg" alt="fulltext thumbnail" loading="lazy">
</div>
</button>
</a>
<a target="_blank" rel="external noopener" href="https://arxiv.org/abs/1903.09650v1" title="arxiv.org access">
<button class="ui compact blue labeled icon button serp-button">
<i class="file alternate outline icon"></i>
arxiv.org
</button>
</a>