Talks
Some recent talks
A majorized proximal point dual Newton algorithm for nonconvex statistical optimization problems (The Sixth International Conference on Continuous Optimization, Technical University (TU) of Berlin, Germany, August 3–8, 2019).
Matrix Cones and Spectral Operators of Matrices (Advances in the Geometric and Analytic Theory of Convex Cones, Sungkyunkwan University, Korea, May 27–31, 2019).
On the Relationships of ADMM and Proximal ALM for Convex Optimization Problems (Institute of Applied Physics and Computational Mathematics, Beijing, April 12, 2019).
Sparse semismooth Newton methods and big data composite optimization (New Computing-Driven Opportunities for Optimization, Wuyishan, August 13-17, 2018).
On the efficient computation of the projector over the Birkhoff polytope (International Symposium on Mathematical Programming 2018, Bordeaux, July 1-6, 2018).
A block symmetric Gauss-Seidel decomposition theorem and its applications in big data nonsmooth optimization (International Workshop on Modern Optimization and Applications, AMSS, Beijing, June 16-18, 2018).
On the Equivalence of Inexact Proximal ALM and ADMM for a Class of Convex Composite Programming (DIMACS Workshop on ADMM and Proximal Splitting Methods in Optimization, Rutgers University, June 11-13, 2018).
A block symmetric Gauss-Seidel decomposition theorem and its applications in big data nonsmooth optimization (The Hong Kong Mathematical Society Annual General Meeting 2018, May 26, 2018).
SDPNAL+: A MATLAB software package for large-scale SDPs with a user-friendly interface (SIAM-ALA18, May 2018).
Second order sparsity and big data optimization (October 2017).
Error bounds and the superlinear convergence rates of the augmented Lagrangian methods (October 2017).
Block symmetric Gauss-Seidel iteration and multi-block semidefinite programming (October 2017).
A two-phase augmented Lagrangian approach for linear and convex quadratic semidefinite programming problems (December 2016).
Linear rate convergence of the ADMM for multi-block convex conic programming (August 2016).
An efficient inexact accelerated block coordinate descent method for least squares semidefinite programming (June 2015).
Multi-stage convex relaxation approach for low-rank structured PSD matrix recovery (May 2014).
Some old talks
An inexact accelerated proximal gradient method for large scale linearly constrained convex SDP (April 2012).
From linear programming to matrix programming: A paradigm shift in optimization (December 2011).
A sequential convex programming approach for rank constrained matrix optimization problems (December 2011).
Low rank matrix optimization problems: Back to nonconvex regularizations (June 2011).
Matrix optimization: Searching between the first and second order methods (May 2011).
A majorized penalty approach for calibrating rank constrained correlation matrix problems (October 2010).
An introduction to a class of matrix cone programming (July 2010).
A majorized penalty approach for calibrating rank constrained correlation matrix problems (July 2010).
Rank constrained matrix optimization problems (May 2010).
An Introduction to Correlation Stress Testing (March 2010).
An Implementable Proximal Point Algorithmic Framework for Nuclear Norm Minimization (July 2009).
A Proximal Point Method for Matrix Least Squares Problem with Nuclear Norm Regularization (May 2009).
A Newton-CG Augmented Lagrangian Method for Large Scale Semidefinite Programming (based on a revised version) (March 2009).
A Newton-CG Augmented Lagrangian Method for Large Scale Semidefinite Programming (October 2008).
Calibrating least squares covariance matrix problems with equality and inequality constraints (July 2008).
The Role of Metric Projectors in Nonlinear Conic Optimization (August 2007).
Inverse Quadratic Eigenvalue Problems (July 2006).
Recent Developments in Nonlinear Optimization Theory (July 2006).
A short summer school course on modern optimization theory: optimality conditions and perturbation analysis Part I Part II Part III (July 2006).