Generalized Canonical Polyadic Tensor Decomposition

Abstract

Tensor decomposition is a fundamental unsupervised machine learning method in data science, with applications including network analysis and sensor data processing. This work develops a generalized canonical polyadic (GCP) low-rank tensor decomposition that allows loss functions other than squared error. For instance, we can use logistic loss or Kullback-Leibler divergence, enabling tensor decomposition for binary or count data. We present a variety of statistically motivated loss functions for various scenarios. We provide a generalized framework for computing gradients and handling missing data that enables the use of standard optimization methods for fitting the model. We demonstrate the flexibility of GCP on several real-world examples including interactions in a social network, neural activity in a mouse, and monthly rainfall measurements in India.
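
To give a rough sense of the generalized framework, the sketch below (a minimal NumPy illustration, not the authors' implementation) computes the factor-matrix gradients of a 3-way GCP model under Poisson (count-data) loss with a binary mask marking observed entries. The names X, W, A, B, C, eps and the helper gcp_poisson_gradients are illustrative assumptions, not identifiers from the paper.

import numpy as np

def khatri_rao(U, V):
    # Column-wise Khatri-Rao product: (J x R) and (K x R) -> (J*K x R),
    # with row index j*K + k holding U[j, r] * V[k, r].
    J, R = U.shape
    K, _ = V.shape
    return (U[:, None, :] * V[None, :, :]).reshape(J * K, R)

def gcp_poisson_gradients(X, W, A, B, C, eps=1e-10):
    # Gradients of the masked Poisson GCP objective
    #   F = sum over observed (i,j,k) of  m_ijk - x_ijk * log(m_ijk),
    # where m_ijk = sum_r A[i,r] * B[j,r] * C[k,r] is the low-rank model.
    # X: data tensor with missing entries filled by 0; W: 0/1 observation mask.
    M = np.einsum('ir,jr,kr->ijk', A, B, C)   # model tensor
    Y = W * (1.0 - X / (M + eps))             # elementwise partials dF/dM, zeroed where unobserved
    I, J, K = X.shape
    # Each gradient is an unfolding of Y times the Khatri-Rao product of the other two factors.
    gA = Y.reshape(I, J * K) @ khatri_rao(B, C)
    gB = Y.transpose(1, 0, 2).reshape(J, I * K) @ khatri_rao(A, C)
    gC = Y.transpose(2, 0, 1).reshape(K, I * J) @ khatri_rao(A, B)
    return gA, gB, gC

Only the elementwise partial (here 1 - x/m for Poisson loss) depends on the chosen loss; swapping it in for, say, logistic loss leaves the rest of the gradient computation unchanged, and the resulting gradients can be handed to any standard first-order optimizer.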

Publication
SIAM Review
Citation
D. Hong, T. G. Kolda, J. A. Duersch. Generalized Canonical Polyadic Tensor Decomposition. SIAM Review, in press, 2020. http://arxiv.org/abs/1808.07452

Keywords

canonical polyadic (CP) tensor decomposition, CANDECOMP, PARAFAC, Poisson tensor factorization, Bernoulli tensor factorization, missing data

BibTeX

@article{HoKoDu20,
  author  = {David Hong and Tamara G. Kolda and Jed A. Duersch},
  title   = {Generalized Canonical Polyadic Tensor Decomposition},
  journal = {SIAM Review},
  year    = {2020},
  note    = {in press},
  eprint  = {1808.07452},
}