ORIE 6365

Continuous Optimization: Algorithms and Complexity

Spring 2026 | Cornell University | ORIE

Mon/Wed 10:10 AM – 11:25 AM
102 Upson Hall (Ithaca) and 397 Bloomberg Center (Cornell Tech)

Latest News: Homework 2 is now available; the submission deadline is March 14.

Instructor

Nikita Doikov
Assistant Professor, ORIE
Email: nikita.doikov at cornell.edu
Office: 218 Rhodes Hall
Office hours: Tuesdays 11:00 AM – 12:00 PM, Thursdays 2:00 PM – 3:00 PM, and by appointment

Course Description

This is a graduate-level course on the theory and algorithms of continuous optimization. It prepares students for research in optimization theory and for developing advanced methods for applications in operations research, machine learning, and related domains. The main emphasis is on understanding different classes of optimization problems and their theoretical limitations. We will rigorously study convergence rates and lower bounds for first-order and second-order optimization methods on both convex and non-convex problems, establishing the range of their applicability.

Syllabus
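As a toy illustration of the first-order methods analyzed in the course (this sketch is not part of the course materials), here is gradient descent with the classical 1/L stepsize on a smooth convex quadratic, where L is the smoothness constant (the largest eigenvalue of A):

```python
import numpy as np

def gradient_descent(A, b, x0, steps):
    """Minimize f(x) = 0.5 * x^T A x - b^T x by gradient descent."""
    # For an L-smooth function, the classical stepsize is 1/L;
    # for a quadratic, L is the largest eigenvalue of A.
    L = np.linalg.eigvalsh(A).max()
    x = x0.copy()
    for _ in range(steps):
        grad = A @ x - b      # gradient of the quadratic
        x = x - grad / L      # gradient step with stepsize 1/L
    return x

A = np.array([[2.0, 0.0], [0.0, 1.0]])
b = np.array([1.0, 1.0])
x = gradient_descent(A, b, np.zeros(2), 200)
# The exact minimizer solves A x* = b, i.e. x* = (0.5, 1.0),
# and the iterates converge to it.
print(x)
```

The course studies how fast such iterates approach the minimizer (e.g., the O(1/k) rate for smooth convex functions, and the faster rates achievable under strong convexity or with acceleration).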


Homeworks

Please upload your homework solutions via Gradescope.
Homework 0 Due on January 31
Homework 1 Due on February 21
Homework 2 Due on March 14

Schedule & Materials

Date Topic Materials
Jan 21 Introduction. Complexity of Optimization Problems. Grid Search. Notes | Slides | Whiteboard
Jan 26 Canceled due to snowstorm
Jan 28 Lower Bound for Global Optimization. Gradients. Notes | Slides | Whiteboard
Feb 2 Smooth Functions. Gradient Method for Finding Stationary Points. Notes | Whiteboard
Feb 4 Adaptive Search. Stochastic Gradient Method. Notes | Whiteboard
Feb 9 Smooth Convex Functions. Notes | Whiteboard
Feb 11 Strong Convexity. Polyak's Heavy Ball Method. Notes | Whiteboard
Feb 18 Lower Bounds for Smooth Convex Optimization. Notes | Whiteboard
Feb 23 Nesterov's Fast Gradient Method. Notes | Whiteboard
Feb 25 Applications: Machine Learning. Fully Composite Problems. Notes | Whiteboard
Mar 2 Separation. Subgradients. Binary Search. Notes | Whiteboard
Mar 4 Ellipsoid Method. Notes | Whiteboard
Mar 9 Subgradient Method. Normalized Stepsizes.
Mar 11 Lower Bounds for Nonsmooth Convex Optimization.
Mar 16 Adaptive Stepsizes for Stochastic Methods.
Mar 18 Variance Reduction Techniques.
Mar 23 Mirror Descent and Dual Averaging. Accuracy Certificates.
Mar 25 Smoothing. Applications.
Spring break
Apr 6 Newton's Method. Self-Concordant Functions.
Apr 8 TBA

Literature

There is no single required textbook. The following sources are recommended for primary reading, alongside the provided lecture materials.

© 2026 Nikita Doikov.