2019-2 Convex Optimization for Large-Scale ML

In this course, we discuss modern convex optimization techniques for solving large-scale machine learning problems involving big data. Most machine learning problems are, at their core, formulated as convex optimization problems, so an in-depth understanding of convex optimization is essential for solving large-scale machine learning problems efficiently. Topics include recent developments in SGD (stochastic gradient descent), proximal gradient descent, Nesterov-type acceleration (FISTA and smoothing), block coordinate descent, and ADMM (alternating direction method of multipliers).
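
As a quick taste of these methods, the sketch below applies proximal gradient descent (ISTA) to a small lasso problem; the random problem instance, the 1/L step-size rule, and the iteration count are illustrative assumptions chosen only for this example.

```python
# Minimal sketch: proximal gradient descent (ISTA) for the lasso problem
#   min_x (1/2)||Ax - b||^2 + lam * ||x||_1
# The data below are randomly generated for illustration.
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, n_iters=500):
    # Step size 1/L, where L = ||A||_2^2 is the Lipschitz constant
    # of the gradient of the smooth part (1/2)||Ax - b||^2.
    L = np.linalg.norm(A, 2) ** 2
    step = 1.0 / L
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)                          # gradient step
        x = soft_threshold(x - step * grad, step * lam)   # proximal step
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((100, 50))
    x_true = np.zeros(50)
    x_true[:5] = rng.standard_normal(5)                   # sparse ground truth
    b = A @ x_true + 0.01 * rng.standard_normal(100)
    x_hat = ista(A, b, lam=0.1)
    print("nonzeros recovered:", np.count_nonzero(np.abs(x_hat) > 1e-3))
```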

Time: Wed 13:00-16:00

Location: Cluster Bldg. Room 509 (학연산클러스터 509호)

References:

    • Introductory Lectures on Convex Optimization, Yurii Nesterov, Springer (2004)

    • Convex Optimization, Boyd & Vandenberghe, Cambridge University Press (2004)

    • Numerical Optimization, Nocedal & Wright, Springer (2006)

Grading: this course follows the IC-PBL+ lecture format.

    • PBL: 30%

    • Midterm Exam: 30%

      • Practice Midterm questions [pdf]

    • Final Exam: 30% (coverage: all material)

      • You may bring a one-sided A4 cheat sheet

      • Practice Final questions [pdf]

      • The exam will include about three problems similar to these practice questions in topic and difficulty.

    • Attendance: 10%

TA email: nomar0107@gmail.com

Lecture Notes

  • Lecture 01. Introduction [pdf]

  • Lecture 02. Background in machine learning [pdf] & optimization [pdf]

  • Lecture 03.

    • Gradient descent [pdf]

    • Subgradient method [pdf]

  • Lecture 04.

  • Lecture 05.

    • Proximal Gradient Descent [pdf]

    • KKT (with SVM) [pdf]

  • Midterm (Oct 24, in class)

    • You may bring a one-sided A4 cheat sheet

  • Lecture 07.

  • Lecture 08.

    • Compressed Sensing [pdf]

  • Lecture 09-10.

  • Lecture 11.

  • Lecture 12.

    • Duality 3-4 [pdf]

ICCV2019 Paper Review

  • Presentation 01. (11/7)

    • 이정현 Sparse and Imperceivable Adversarial Attacks [pdf] [ppt]

    • 민동준 Drop an Octave: Reducing Spatial Redundancy in Convolutional Neural Networks with Octave Convolution [pdf] [ppt]

  • Presentation 02. (11/14)

    • 조형민 SinGAN: Learning a Generative Model from a Single Natural Image [pdf]

  • Presentation 03. (11/21)

    • 손재범 Fast AutoAugment [pdf]

    • 권준형 Mining GOLD Samples for Conditional GANs [pdf] [ppt]