Randomized Methods for Low-Rank Tensor Decomposition in Unsupervised Learning
Tensor decomposition discovers latent structure in higher-order data sets and is the higher-order analogue of matrix factorization. Variations have proliferated, including symmetric tensor decomposition, tensor completion, higher-order singular value decomposition, generalized canonical tensor decomposition, streaming tensor decomposition, and more. All of these scenarios reduce to some form of nonconvex optimization. In this talk, we discuss the use of randomized optimization methods, including stochastic gradient methods and sketching. Such approaches show remarkable promise, improving not only the speed of the methods but also the quality of the solutions. We give examples of the intriguing successes as well as notable limitations of these methods. We close with a number of open questions, including important open theoretical challenges.
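To make the sketching idea concrete, the following is a minimal illustrative sketch (not the speaker's method, and all names and parameters here are hypothetical) of one randomized approach: CP alternating least squares (ALS) for a 3-way tensor, where each least-squares subproblem against a Khatri-Rao product is solved on a uniformly sampled subset of its rows rather than the full system.

```python
import numpy as np

def cp_als_sketched(T, rank, n_samples=100, n_iters=50, seed=0):
    """Illustrative randomized CP-ALS for a 3-way tensor.

    Each ALS subproblem is a least-squares solve whose coefficient
    matrix is a Khatri-Rao product of the other two factor matrices;
    here we sample its rows uniformly (a simple sketch) instead of
    forming the full product. Returns the three factor matrices.
    """
    rng = np.random.default_rng(seed)
    dims = T.shape
    # Random initial factor matrices, one per mode.
    A = [rng.standard_normal((d, rank)) for d in dims]
    for _ in range(n_iters):
        for mode in range(3):
            others = [m for m in range(3) if m != mode]
            # Mode-`mode` unfolding: rows indexed by that mode,
            # columns by the remaining two modes in C order.
            Tm = np.moveaxis(T, mode, 0).reshape(dims[mode], -1)
            # Sample rows of the (implicit) Khatri-Rao system.
            n_rows = dims[others[0]] * dims[others[1]]
            s = min(n_samples, n_rows)
            idx = rng.choice(n_rows, size=s, replace=False)
            i0, i1 = np.unravel_index(idx, (dims[others[0]], dims[others[1]]))
            KR = A[others[0]][i0] * A[others[1]][i1]  # sampled KR rows
            # Solve the sketched least-squares problem for this factor.
            A[mode] = np.linalg.lstsq(KR, Tm[:, idx].T, rcond=None)[0].T
    return A
```

Uniform sampling is the simplest possible sketch; in practice, leverage-score-based sampling or structured random projections give much stronger guarantees, which is part of what the talk addresses.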
Dr. Tamara G. Kolda is a Distinguished Member of the Technical Staff at Sandia National Laboratories. She has published work in a number of areas, including tensor decomposition, numerical optimization, network science, data mining, parallel computing, and scientific software. She was named a Distinguished Scientist of the Association for Computing Machinery (ACM) in 2011 and a Fellow of the Society for Industrial and Applied Mathematics (SIAM) in 2015. She is currently an elected member of the SIAM Board of Trustees and a member of the Board on Mathematical Sciences and Analytics of the United States National Academies. She is the founding Editor-in-Chief of the SIAM Journal on Mathematics of Data Science (SIMODS).