Tensors are multi-way arrays, and the CANDECOMP/PARAFAC (CP) tensor factorization has found application in many domains. The CP model is typically fit with a least-squares objective, which yields the maximum likelihood estimate under the assumption of i.i.d. Gaussian noise. We demonstrate that this loss function can be highly sensitive to non-Gaussian noise. We therefore propose a loss function based on the 1-norm, which accommodates both Gaussian and grossly non-Gaussian perturbations, and present an alternating majorization-minimization algorithm for fitting a CP model under this loss.
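For context, the standard least-squares CP fit referenced above is conventionally computed by alternating least squares (ALS). The sketch below is a generic, minimal CP-ALS for a 3-way tensor in NumPy; the helper names (`unfold`, `khatri_rao`, `cp_als`) are our own, and this is the baseline least-squares method, not the authors' 1-norm majorization-minimization algorithm.

```python
import numpy as np

def unfold(T, mode):
    """Matricize tensor T along `mode` (rows index that mode, row-major layout)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(U, V):
    """Column-wise Kronecker (Khatri-Rao) product of U (I x R) and V (J x R)."""
    return np.einsum('ir,jr->ijr', U, V).reshape(-1, U.shape[1])

def cp_als(T, rank, iters=500, seed=0):
    """Fit a rank-`rank` CP model to a 3-way tensor by alternating least squares."""
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((s, rank)) for s in T.shape)
    for _ in range(iters):
        # Each mode's factor is the least-squares solution with the others fixed.
        A = unfold(T, 0) @ np.linalg.pinv(khatri_rao(B, C)).T
        B = unfold(T, 1) @ np.linalg.pinv(khatri_rao(A, C)).T
        C = unfold(T, 2) @ np.linalg.pinv(khatri_rao(A, B)).T
    return A, B, C

def cp_reconstruct(A, B, C):
    """Rebuild the full tensor from the CP factor matrices."""
    return np.einsum('ir,jr,kr->ijk', A, B, C)

# Usage: recover a noiseless rank-2 tensor.
rng = np.random.default_rng(42)
A0, B0, C0 = (rng.standard_normal((s, 2)) for s in (4, 5, 6))
T = cp_reconstruct(A0, B0, C0)
A, B, C = cp_als(T, rank=2)
rel_err = np.linalg.norm(cp_reconstruct(A, B, C) - T) / np.linalg.norm(T)
```

Because every subproblem here is a least-squares solve, a single grossly corrupted entry can pull all three factor matrices away from the truth, which is the sensitivity the paper addresses by replacing the squared loss with a 1-norm loss.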
Keywords: CANDECOMP, PARAFAC, 1-norm, non-Gaussian noise
Contributed paper at the NIPS Workshop on Tensors, Kernels, and Machine Learning, Whistler, BC, Canada, December 10, 2010
@inproceedings{arXiv_1010.3043,
author = {Eric C. Chi and Tamara G. Kolda},
title = {Making Tensor Factorizations Robust to Non-{G}aussian Noise},
booktitle = {NIPS Workshop on Tensors, Kernels, and Machine Learning},
venue = {Whistler, BC},
eventdate = {2010-12-10},
month = {December},
year = {2010},
eprint = {1010.3043},
eprinttype = {arXiv},
eprintclass = {math.NA},
}