ECE490: Introduction to Optimization (Spring 2024)

Course Information

  • Office Hours: Tu 2-3pm, CSL 145

  • Lectures: Tu/Th 11am-12:20pm, ECEB 3081

  • TA's Office Hours: Yichi Zhang, Wed 2-3pm, ECEB 2036; Ian George, Mon 9-10am, ECEB 2036

  • Prerequisites: Linear Algebra (MATH 257 or 415)

  • Announcements will be made through Canvas.

Course Description

This is a senior/first-year graduate-level course on optimization. Topics include necessary and sufficient conditions for local optima; characterization of convex sets and functions; unconstrained optimization, gradient descent and its variants; constrained optimization and the gradient projection method; optimization with equality and inequality constraints, Lagrange multipliers, and KKT conditions; penalty and barrier function methods; weak and strong duality and Slater's condition; augmented Lagrangian methods; subgradient methods; proximal gradient descent; and applications.
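For a flavor of the material, the short sketch below (not part of the official course materials; written in Python purely for illustration) shows the basic gradient descent iteration mentioned above, applied to a simple quadratic objective.

    # Illustrative sketch only: fixed-step gradient descent on
    # f(x) = 0.5 * x^T A x - b^T x, whose minimizer solves A x = b.
    import numpy as np

    def gradient_descent(A, b, x0, step=0.1, iters=100):
        """Minimize 0.5*x^T A x - b^T x for symmetric positive definite A."""
        x = x0
        for _ in range(iters):
            grad = A @ x - b        # gradient of the quadratic objective
            x = x - step * grad     # fixed-step gradient update
        return x

    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 1.0])
    x_star = gradient_descent(A, b, np.zeros(2))
    print(x_star)                   # approaches the solution of A x = b, (0.2, 0.4)

Here the step size is chosen small enough relative to the largest eigenvalue of A that the iteration converges; step-size conditions of this kind are covered in the course.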

Textbook

The recommended textbook is Nonlinear Programming by D. Bertsekas (3rd edition). We will follow the lecture notes closely.

Grading

  • Quizzes: Homework will be posted with solutions and hence will not be graded. There will be six quizzes, each based on the most recently posted homework. Each student's lowest quiz score will be dropped. The quiz schedule will be posted on the course website.

  • Midterm Exams: There will be three midterm exams, held during the regularly scheduled lecture time. The schedule will be posted on the course website.

  • Final Exam: There is no final exam for this course.

  • Grades for students in Section P3 (3 credits) will be weighted as follows: Class Participation (5%), Quizzes (50%), and Midterm Exams (45%).

  • Grades for students in Section P4 (4 credits) will be weighted as follows: Class Participation (4%), Quizzes (40%), Midterm Exams (36%), and Final Project (20%).

  • Final Project: Students in Section P4 (4 credits) are required to complete a final project. The task is to read about and write a report on two algorithms: Nesterov's accelerated gradient method and mirror descent. The TAs and the instructor will not provide help on the project. You must find appropriate sources on the Internet and write a report providing precise proofs of convergence, citing the sources used. You may use ChatGPT to find resources or to learn about the material, but you will be responsible for any errors it produces. The report must be typewritten by you, no more than 10 pages long, and in 11 pt font or larger. Due: May 8, midnight.