Making Tensor Factorizations Robust to Non-Gaussian Noise

Abstract

Tensors are multi-way arrays, and the CANDECOMP/PARAFAC (CP) tensor factorization has found application in many different domains. The CP model is typically fit using a least squares objective function, which corresponds to the maximum likelihood estimate under the assumption of independent and identically distributed (i.i.d.) Gaussian noise. We demonstrate that this loss function can be highly sensitive to non-Gaussian noise. We therefore propose a loss function based on the 1-norm, which can accommodate both Gaussian noise and grossly non-Gaussian perturbations. We also present an alternating majorization-minimization (MM) algorithm for fitting a CP model with the proposed loss (CPAL1) and compare its performance to that of the workhorse algorithm for fitting CP models, CP alternating least squares (CPALS).
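The following is a minimal NumPy sketch (not the report's implementation; all names are illustrative) of the point made in the abstract: when a few entries of a tensor are corrupted by grossly non-Gaussian noise, those entries dominate a least squares objective far more than a 1-norm objective, since squaring amplifies large residuals.

```python
# Illustrative sketch only: compares the squared-error and 1-norm losses of the
# true CP model against data corrupted by sparse, gross outliers.
import numpy as np

def cp_full(A, B, C):
    """Reconstruct a 3-way tensor from CP factor matrices A (I x R), B (J x R), C (K x R)."""
    return np.einsum('ir,jr,kr->ijk', A, B, C)

rng = np.random.default_rng(0)
I, J, K, R = 10, 10, 10, 3

# Ground-truth factors and the noise-free low-rank tensor
A, B, C = (rng.standard_normal((n, R)) for n in (I, J, K))
X_clean = cp_full(A, B, C)

# Dense Gaussian noise plus sparse, grossly non-Gaussian perturbations
X = X_clean + 0.01 * rng.standard_normal((I, J, K))
mask = rng.random((I, J, K)) < 0.02            # corrupt ~2% of entries
X[mask] += 10.0 * rng.standard_normal(mask.sum())

# Residuals of the true model against the corrupted data
resid = X - X_clean
print("least squares loss:", np.sum(resid**2))
print("1-norm loss:      ", np.sum(np.abs(resid)))
print("outlier share of squared loss:", np.sum(resid[mask]**2) / np.sum(resid**2))
print("outlier share of 1-norm loss: ", np.sum(np.abs(resid[mask])) / np.sum(np.abs(resid)))
```

Typically the handful of corrupted entries accounts for nearly all of the squared-error loss but a much smaller share of the 1-norm loss, which is the intuition behind fitting the CP model with the 1-norm objective (CPAL1) rather than least squares (CPALS).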

Publication
Tech. Rep., Sandia National Laboratories
Date
March 2011
Citation
E. C. Chi, T. G. Kolda. Making Tensor Factorizations Robust to Non-Gaussian Noise. Tech. Rep. No. SAND2011-1877, Sandia National Laboratories, 2011. https://doi.org/10.2172/1011706

BibTeX

@techreport{SAND2011-1877,
  author      = {Eric C. Chi and Tamara G. Kolda},
  title       = {Making Tensor Factorizations Robust to Non-{Gaussian} Noise},
  number      = {SAND2011-1877},
  institution = {Sandia National Laboratories},
  month       = {March},
  year        = {2011},
  doi         = {10.2172/1011706},
  url         = {http://www.osti.gov/scitech/biblio/1011706},
  urldate     = {2014-04-17},
}