2019-2 Convex Optimization for Large-Scale ML
In this course, we discuss modern convex optimization techniques for solving large-scale machine learning problems involving big data. Most machine learning problems can be written as convex optimization problems at their core, so an in-depth understanding of convex optimization is essential for solving large-scale machine learning problems efficiently. Topics include recent developments in SGD (stochastic gradient descent), proximal gradient descent, Nesterov-type acceleration (FISTA & smoothing), block coordinate descent, and ADMM (alternating direction method of multipliers).
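As a small taste of the techniques above, here is a minimal proximal gradient descent (ISTA) sketch for the lasso problem min_x ½‖Ax − b‖² + λ‖x‖₁. This is an illustrative example only, not taken from the course notes; the function names and the toy data are assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, n_iters=500):
    # Proximal gradient descent (ISTA) for the lasso:
    #   min_x 0.5 * ||A x - b||^2 + lam * ||x||_1
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)           # gradient of the smooth part
        x = soft_threshold(x - grad / L, lam / L)  # gradient step, then prox step
    return x

# Tiny sparse-recovery example with synthetic data
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [3.0, -2.0, 1.5]
b = A @ x_true
x_hat = ista(A, b, lam=0.1)
print(np.round(x_hat, 2))
```

FISTA, covered later in the course, adds a Nesterov-type momentum step on top of this same prox-gradient iteration, improving the convergence rate from O(1/k) to O(1/k²).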
Time: Wed 13:00-16:00
Location: Cluster Bldg. R509 (Academia-Industry Cluster Building, Room 509)
References:
Introductory Lectures on Convex Optimization: A Basic Course, Yurii Nesterov, Springer (2004)
Convex Optimization, Stephen Boyd & Lieven Vandenberghe, Cambridge University Press (2004)
Numerical Optimization, Jorge Nocedal & Stephen Wright, Springer (2006)
Grading: this course will follow the IC-PBL+ lecture format.
TA email: nomar0107@gmail.com
Lecture Notes
Lecture 01. Introduction [pdf]
Lecture 02. Background in machine learning [pdf] & optimization [pdf]
Lecture 03.
Lecture 04.
Lecture 05.
Midterm (Oct 24, in class)
You may bring a one-sided A4 cheat sheet.
Lecture 07. AGD [pdf]
Lecture 08. Compressed Sensing [pdf]
Lecture 09-10. ADMM [pdf]
Lecture 11.
Lecture 12. Duality 3-4 [pdf]
ICCV 2019 Paper Review
Presentation 01. (11/7)
Presentation 02. (11/14)
조형민 SinGAN: Learning a Generative Model from a Single Natural Image [pdf]
Presentation 03. (11/21)