http://videolectures.net/nips2010_wright_oaml/
Prof. Stephen J. Wright gave an excellent tutorial at NIPS 2010. The tutorial surveys several important aspects of optimization algorithms that are useful for practical, large-scale machine learning problems. Besides a high-level overview of each aspect, the talk provides pointers to key references. Topics covered are:
- First-order Methods
- Stochastic and Incremental Gradient Methods
- Shrinking/Thresholding for Regularized Formulations
- Optimal Manifold Identification and Higher-Order Methods
- Decomposition and Coordinate Relaxation
Also of interest are some tutorials/talks from the long-term program "Modern Trends in Optimization and Its Application" (Sep–Dec 2010) at UCLA (provided the slides are released):
- (Tutorial) Algorithms for Sparse Optimization
- (Tutorial) Introduction to Robust Optimization
- (Tutorial) First-Order Algorithms for Convex Minimization
- (Talk) Accelerating First-Order Methods for Large-Scale Well-Structured Convex Optimization
- (Talk) Rank-Sparsity Minimization and Latent Variable Graphical Model Selection
- (Talk) Recent Advances in Alternating Direction Method: Practice and Theory
- (Talk) Bundle-Type Methods Uniformly Optimal for Smooth and Non-Smooth Convex Optimization
- (Talk) The Convex Geometry of Inverse Problems
- (Talk) Weak Recovery Conditions Using Graph Partitioning Bounds
- (Talk) A Majorized Penalty Approach for Calibrating Rank Constrained Correlation Matrix Problems
Boss, can you stop being so academic for once? Keep up with the times and post some gossip and tabloid stuff instead.
You slave-driving Zhou Bapi, coming to such a vulgar place just to campaign against vulgarity.