Abstract:
The focus of this talk is the multivariate method of moments for parameter estimation. First, from a statistical standpoint, we show that in estimation problems where the noise is high, the sample complexity, that is, the number of observations necessary to estimate the parameters, is dictated by the moments of the distribution. This follows from a Taylor expansion of the KL divergence that holds in the low signal-to-noise ratio regime. Second, from a computational standpoint, we develop a method of moments that estimates the parameters of Gaussian Mixture Models (GMMs) implicitly, that is, without explicitly forming the moments. This addresses the curse of dimensionality: while the number of entries of a higher-order moment of a multivariate random variable scales exponentially with the order of the moment, our implicit approach has computational and storage costs comparable to those of expectation-maximization (EM), making the method of moments competitive with EM. Finally, I will discuss a new algorithm for tensor decomposition and its promising applications to the implicit decomposition of moment tensors.
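The statistical claim can be made concrete in a generic high-noise setting (a hedged sketch; the exact model, constants $c_k$, and norms are assumptions here, not the statement from the talk). For observations of the form $y = f(\theta) + \sigma\xi$ with Gaussian noise $\xi$ and large $\sigma$, a Taylor expansion of the KL divergence between the laws induced by two parameters takes the schematic form

$$\mathrm{KL}\bigl(P_{\theta} \,\|\, P_{\theta'}\bigr) \;=\; \sum_{k \ge 1} \frac{c_k}{\sigma^{2k}} \,\bigl\| m_k(\theta) - m_k(\theta') \bigr\|^2 \;+\; \text{lower-order terms},$$

where $m_k$ denotes the order-$k$ moment tensor. If two parameters agree on their first $K-1$ moments, the divergence is $O(\sigma^{-2K})$, so on the order of $\sigma^{2K}$ observations are needed to tell them apart: the sample complexity is dictated by the moments.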
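To make the "implicit" idea concrete, here is a minimal NumPy sketch illustrating the general principle (not the speaker's algorithm; the data, function names, and the use of the classical tensor power iteration of Anandkumar et al. are all illustrative assumptions). Contractions of the order-3 moment tensor M3 = E[x ⊗ x ⊗ x] against a vector v reduce to averages of the projections ⟨x_i, v⟩, so they cost O(nd) time per evaluation and never require the d³ entries of M3.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a 3-component spherical GMM in d dimensions, n samples.
# Forming the order-3 moment tensor E[x (x) x (x) x] would take d**3
# numbers; everything below works from the n x d sample matrix only.
n, d, k = 20_000, 300, 3
means = 3.0 * rng.standard_normal((k, d))
labels = rng.integers(k, size=n)
X = means[labels] + rng.standard_normal((n, d))

def m3_vvv(X, v):
    """Implicit scalar contraction M3(v, v, v) = E[<x, v>**3].
    Cost: O(n d) time and O(n) extra memory, versus O(d**3) to store M3."""
    p = X @ v
    return np.mean(p ** 3)

def m3_Ivv(X, v):
    """Implicit vector contraction M3(I, v, v) = E[<x, v>**2 x]."""
    p = X @ v
    return X.T @ (p ** 2) / X.shape[0]

# Classical tensor power iteration, run entirely through the implicit
# contraction above: it tracks a dominant direction of the empirical
# third-moment tensor without ever materializing the tensor.
v = rng.standard_normal(d)
v /= np.linalg.norm(v)
for _ in range(50):
    v = m3_Ivv(X, v)
    v /= np.linalg.norm(v)

print(m3_vvv(X, v))  # value of the contraction at the recovered direction
```

Each power step costs O(nd), matching the per-iteration cost of an E-step in EM; this is the sense in which implicit moment methods can have footprints comparable to EM.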