Non-convex optimization and statistical properties

Speaker Name: Xiaoming Huo
Speaker Title: Professor, Stewart School of Industrial & Systems Engineering
Speaker Organization: Georgia Tech
Start Time: Monday, October 22, 2018 - 4:00pm
End Time: Monday, October 22, 2018 - 5:00pm
Location: BE 156
Organizer: STAT Department Seminars

Abstract: Non-convex optimization has been introduced into statistics with a range of applications. One application is model selection in the sparse regression framework, with celebrated methods such as the smoothly clipped absolute deviation (SCAD) penalty, the minimax concave penalty (MCP), and many more. Newly emerged deep-learning techniques often involve non-convex objective functions as well. A non-convex optimization problem is in general NP-hard, so no polynomial-time numerical solution is guaranteed; one can only hope to identify a local optimum. A difference-of-convex (DC) function can be expressed as the difference of two convex functions, even though the function itself may be non-convex. There is a large literature on optimization problems whose objectives and/or constraints involve DC functions; the associated solution methods are commonly referred to as difference-of-convex algorithms (DCA), and efficient numerical procedures have been proposed. Under the DC framework, directional-stationary (d-stationary) solutions are considered, and they are in general not unique. We show that, under some mild conditions, a certain subset of the d-stationary solutions of an optimization problem with a DC objective has some ideal statistical properties: namely, asymptotic estimation consistency, asymptotic model selection consistency, and asymptotic efficiency. These are the properties that have been proven by many researchers for a range of proposed non-convex penalties in sparse regression. Our analysis indicates that even with non-convex optimization, statistical guarantees can still be established in a fairly general sense. Our work potentially bridges the optimization and statistics communities. This is joint work with Shanshan Cao.
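For concreteness, the following is a minimal sketch (not taken from the talk) of a DCA iteration for sparse regression with the SCAD penalty, using the well-known DC split p_SCAD(b) = lam*||b||_1 - h(b) with h convex. The function names, the conventional choice a = 3.7, and the proximal-gradient (ISTA) inner solver are illustrative assumptions, not details of the speaker's method.

    import numpy as np

    def scad_h_grad(beta, lam, a=3.7):
        """Gradient of the convex part h(t) = lam*|t| - p_SCAD(t)."""
        t = np.abs(beta)
        g = np.where(t <= lam, 0.0,
                     np.where(t <= a * lam, (t - lam) / (a - 1.0), lam))
        return g * np.sign(beta)

    def soft_threshold(z, thr):
        return np.sign(z) * np.maximum(np.abs(z) - thr, 0.0)

    def dca_scad(X, y, lam, a=3.7, n_outer=50, n_inner=200):
        """DCA for (1/2n)||y - X b||^2 + p_SCAD(b), written as the
        convex function (1/2n)||y - X b||^2 + lam*||b||_1 minus the
        convex function h(b). Each outer step linearizes h at the
        current iterate and solves the resulting lasso-type convex
        subproblem by proximal gradient (ISTA)."""
        n, p = X.shape
        beta = np.zeros(p)
        L = np.linalg.norm(X, 2) ** 2 / n  # Lipschitz constant of the smooth part
        for _ in range(n_outer):
            v = scad_h_grad(beta, lam, a)  # supporting hyperplane of h at beta
            b = beta.copy()
            for _ in range(n_inner):
                grad = X.T @ (X @ b - y) / n - v
                b = soft_threshold(b - grad / L, lam / L)
            beta = b
        return beta

Because h is convex, replacing it by its supporting hyperplane at the current iterate yields a convex majorant of the objective, so each outer DCA step cannot increase the objective value; the iterates converge to the kind of stationary points whose statistical properties the talk addresses.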