ORIE 6365

Continuous Optimization: Algorithms and Complexity

Spring 2026 | Cornell University | ORIE

Mon/Wed 10:10 AM – 11:25 AM
102 Upson Hall (Ithaca) and 397 Bloomberg Center (Cornell Tech)

Latest News: Homework 1 is now available; the submission deadline is February 21.

Instructor

Nikita Doikov
Assistant Professor, ORIE
Email: nikita.doikov at cornell.edu
Office: 218 Rhodes Hall
Office hours: Tuesdays 11:00 AM – 12:00 PM, Thursdays 2:00 PM – 3:00 PM, and by appointment

Course Description

This is a graduate-level course on the theory and algorithms of continuous optimization. It prepares students for research in optimization theory and for developing advanced methods for applications in operations research, machine learning, and related domains. The main emphasis is on understanding different classes of optimization problems and their theoretical limitations. We will rigorously study convergence rates and lower bounds for first-order and second-order optimization methods on both convex and non-convex problems, establishing the range of their applicability.

Syllabus


Homeworks

Please use Gradescope to upload your homework.
Homework 0: due on January 31
Homework 1: due on February 21

Schedule & Materials

Date Topic Materials
Jan 21 Introduction. Complexity of Optimization Problems. Grid Search. Notes | Slides | Whiteboard
Jan 26 Canceled due to snowstorm
Jan 28 Lower Bound for Global Optimization. Gradients. Notes | Slides | Whiteboard
Feb 2 Smooth Functions. Gradient Method for Finding Stationary Points. Notes | Whiteboard
Feb 4 Adaptive Search. Stochastic Gradient Method. Notes | Whiteboard
Feb 9 Smooth Convex Functions. Notes | Whiteboard
Feb 11 Strong Convexity. Polyak's Heavy Ball Method. Notes | Whiteboard
Feb 18 Lower Bounds for Smooth Convex Optimization.
Feb 23 TBA

Literature

There is no single required textbook. The following sources are recommended for primary reading, alongside the provided lecture materials.

© 2026 Nikita Doikov.