Tensor decompositions and algorithms, with applications to tensor learning [article]

Felipe Bottega Diniz
2021 arXiv   pre-print
A new algorithm for computing the canonical polyadic decomposition (CPD) is presented here. It features lower computational complexity and memory usage than the available state-of-the-art implementations. We begin with some examples of CPD applications to real-world problems, followed by a short summary of the main contributions of this work. In chapter 1 we review classical tensor algebra and geometry, with a focus on the CPD. Chapter 2 focuses on tensor compression, which is considered (in this work) to be one of the most important parts of the CPD algorithm. In chapter 3 we discuss the Gauss-Newton method, an iterative method for nonlinear least-squares problems. Chapter 4, the longest chapter of this thesis, introduces its main character: Tensor Fox, a tensor package that includes a CPD solver. After introducing Tensor Fox, we conduct extensive computational experiments comparing this solver with several others. At the end of the chapter we introduce the Tensor Train decomposition and show how to use it to compute higher-order CPDs. We also discuss important practical details such as regularization, preconditioning, conditioning, and parallelism. In chapter 5 we consider the intersection between tensor decompositions and machine learning: a novel model is introduced, which works as a tensor version of neural networks. Finally, in chapter 6 we present our conclusions and expectations for future developments.
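Since the CPD is the central object of this thesis, a minimal sketch may help fix ideas. The code below fits a rank-R CPD of a third-order tensor T ≈ Σ_r a_r ∘ b_r ∘ c_r by alternating least squares (ALS), the classical baseline method; it is *not* Tensor Fox's solver (which combines compression with a damped Gauss-Newton iteration), and every function name here is invented for the illustration.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n matricization of a 3rd-order tensor (C-order over the
    remaining axes), shape (T.shape[mode], product of the other dims)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    """Column-wise Khatri-Rao product: row (i*J + j) is A[i, :] * B[j, :]."""
    R = A.shape[1]
    return (A[:, None, :] * B[None, :, :]).reshape(-1, R)

def cpd_als(T, R, n_iter=200, seed=0):
    """Rank-R CPD of a 3rd-order tensor via alternating least squares.

    Each update fixes two factor matrices and solves an ordinary linear
    least-squares problem for the third, using the identity
    unfold(T, 0) = A @ khatri_rao(B, C).T (and its mode-1/2 analogues).
    """
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((T.shape[0], R))
    B = rng.standard_normal((T.shape[1], R))
    C = rng.standard_normal((T.shape[2], R))
    for _ in range(n_iter):
        A = unfold(T, 0) @ np.linalg.pinv(khatri_rao(B, C)).T
        B = unfold(T, 1) @ np.linalg.pinv(khatri_rao(A, C)).T
        C = unfold(T, 2) @ np.linalg.pinv(khatri_rao(A, B)).T
    return A, B, C

def cpd_reconstruct(A, B, C):
    """Rebuild the full tensor from its CPD factors."""
    return np.einsum('ir,jr,kr->ijk', A, B, C)
```

ALS makes each sub-problem linear, but it updates one factor at a time; the Gauss-Newton approach discussed in chapter 3 instead linearizes the full residual and updates all factors jointly, which is part of what distinguishes the solver developed in this thesis from the baseline above.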
arXiv:2110.05997v1