VIRTUAL: IDeAS Seminar, Speaker: Sheng Xu, Yale University

IDeAS
Dec 14, 2021
10 am
Virtual

Zoom: Meeting ID: 956 8252 2304

Efficient computational methods for nonconvex optimization

This talk discusses two types of nonconvex optimization problems arising from sparsity and mixture assumptions, and presents two computational methods to solve them efficiently:

  1. We aim to estimate a gradient-sparse parameter on a general graph. We introduce a tree-projected gradient descent algorithm and show that the resulting estimators achieve rate-optimal statistical guarantees.
  2. Motivated by applications to single-particle cryo-EM, we consider problems of function estimation from multiple independently rotated observations in a low signal-to-noise (SNR) regime. We study the unregularized MLE and characterize the geometry of the Fisher information and the log-likelihood landscape. We show the existence of spurious local optima under the continuous multi-reference alignment model. We further propose a frequency marching algorithm that recovers the signal without requiring a good initial guess.
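The tree-projected method in item 1 builds on the general projected gradient descent template: alternate a gradient step with a projection onto the constraint set. The sketch below illustrates that template with a simple hard-thresholding (k-sparse) projection on a least-squares objective; all names, sizes, and the projection choice here are illustrative assumptions, not the talk's actual tree projection, which operates on gradient-sparse signals over a graph.

```python
import numpy as np

def projected_gradient_descent(grad_f, x0, project, step=0.5, n_iters=300):
    """Generic projected gradient descent: gradient step, then project
    onto the constraint set. Illustrative sketch only."""
    x = x0
    for _ in range(n_iters):
        x = project(x - step * grad_f(x))
    return x

def hard_threshold(x, k=3):
    """Keep the k largest-magnitude entries, zero out the rest
    (a simple stand-in for the talk's tree projection)."""
    keep = np.argsort(np.abs(x))[-k:]
    out = np.zeros_like(x)
    out[keep] = x[keep]
    return out

# Toy noiseless sparse regression problem (illustrative data).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[[2, 7, 11]] = [1.0, -2.0, 1.5]
y = A @ x_true

grad = lambda x: A.T @ (A @ x - y) / len(y)  # least-squares gradient
x_hat = projected_gradient_descent(grad, np.zeros(20), hard_threshold)
```

With this projection the iteration is iterative hard thresholding; the talk's contribution concerns the harder setting where the sparsity pattern lives on the gradients of a signal over a general graph.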

Sheng is a final-year PhD student in the Department of Statistics and Data Science at Yale University, advised by Professors Zhou Fan and Sahand Negahban. Prior to joining Yale, he received a Bachelor's degree in mathematics and statistics from Peking University. His research interests lie broadly in high-dimensional statistics, signal processing, nonconvex optimization, and combinatorial algorithms.