Welcome!
I do research in Computer Science with a focus on Numerical Optimization and Machine Learning. I am currently a postdoctoral researcher at EPFL, Switzerland, working with Martin Jaggi in the Machine Learning and Optimization Laboratory.
I am excited to explore provably efficient optimization algorithms that exploit problem structure and combine ideas from various fields. One of my areas of expertise is second-order methods and their global complexity bounds (see the sketch below). I believe that bridging the gap between second-order optimization theory and the best known computational practice can lead to new advances in how we train our models.
I defended my PhD in 2021 at UCLouvain, Belgium, supervised by Yurii Nesterov. My thesis is entitled "New second-order and tensor methods in Convex Optimization".
I received a BSc degree in Computational Mathematics and Cybernetics from Lomonosov Moscow State University in 2015, and an MSc degree from the Higher School of Economics in 2017, where I studied advanced statistical and optimization methods.
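A concrete example of this line of work is the cubically regularized Newton method of Nesterov and Polyak, which underlies several of the papers below. Each step minimizes the quadratic Taylor model of the objective augmented by a cubic penalty with parameter M, which gives the method global complexity guarantees that the plain Newton iteration lacks. The Python sketch below is purely illustrative rather than an implementation from any of these papers: it assumes a convex problem (positive semidefinite Hessian) and solves the cubic subproblem by a simple bisection on the step norm.

```python
import numpy as np

def cubic_newton_step(g, H, M):
    """One cubically regularized Newton step: minimize the model
    <g, h> + 1/2 <H h, h> + M/6 ||h||^3 over the step h.
    The minimizer satisfies (H + (M/2) r I) h = -g with r = ||h||,
    and r -> ||(H + (M/2) r I)^{-1} g|| is decreasing for H >= 0,
    so the fixed point r = ||h(r)|| can be found by bisection."""
    n = len(g)
    h = lambda r: np.linalg.solve(H + 0.5 * M * r * np.eye(n), -g)
    lo, hi = 0.0, 1.0
    while np.linalg.norm(h(hi)) > hi:   # grow the bracket until ||h(hi)|| <= hi
        hi *= 2.0
    for _ in range(60):                 # bisect on r = ||h||
        mid = 0.5 * (lo + hi)
        if np.linalg.norm(h(mid)) > mid:
            lo = mid
        else:
            hi = mid
    return h(hi)

# Toy usage: minimize f(x) = sum_i (exp(x_i) - x_i), minimized at x = 0.
grad = lambda x: np.exp(x) - 1.0
hess = lambda x: np.diag(np.exp(x))
x = np.full(3, 2.0)
for _ in range(10):
    x = x + cubic_newton_step(grad(x), hess(x), M=1.0)
print(x)  # close to [0, 0, 0]
```

In practice the subproblem is solved more efficiently (e.g., reusing a single factorization of the Hessian), but the fixed-point characterization above is the standard route to minimizing the cubic model despite it being non-quadratic.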
Papers
Preprints / Various
- Shuffle SGD is Always Better than SGD: Improved Analysis of SGD with Arbitrary Data Orders, with Anastasia Koloskova, Sebastian U. Stich, and Martin Jaggi, 2023 (arXiv)
- Super-Universal Regularized Newton Method, with Konstantin Mishchenko and Yurii Nesterov, 2022 (arXiv, code)
- Lower Complexity Bounds for Minimizing Regularized Functions, 2022 (arXiv)
Refereed Conference Publications
- Linearization Algorithms for Fully Composite Optimization, with Maria-Luiza Vladarean, Martin Jaggi, and Nicolas Flammarion, 2023 (COLT, arXiv)
- Polynomial Preconditioning for Gradient Methods, with Anton Rodomanov, 2023 (ICML, arXiv)
- Second-order optimization with lazy Hessians, with El Mahdi Chayti and Martin Jaggi, 2023 (ICML, arXiv)
- Convex optimization based on global lower second-order models, with Yurii Nesterov, 2020 (NeurIPS proceedings, arXiv, code)
- Stochastic Subspace Cubic Newton Method, with Filip Hanzely, Peter Richtárik, and Yurii Nesterov, 2020 (ICML proceedings, arXiv)
- Inexact Tensor Methods with Dynamic Accuracies, with Yurii Nesterov, 2020 (ICML proceedings, arXiv, code)
- Randomized Block Cubic Newton Method, with Peter Richtárik, 2018 (ICML proceedings, arXiv)
Journal Publications
- Gradient Regularization of Newton Method with Bregman Distances, with Yurii Nesterov, 2023 (Math. Program. journal, arXiv)
- High-Order Optimization Methods for Fully Composite Problems, with Yurii Nesterov, 2022 (SIOPT journal, arXiv)
- Affine-invariant contracting-point methods for Convex Optimization, with Yurii Nesterov, 2022 (Math. Program. journal, arXiv, code)
- Local convergence of tensor methods, with Yurii Nesterov, 2021 (Math. Program. journal, arXiv)
- Minimizing Uniformly Convex Functions by Cubic Regularization of Newton Method, with Yurii Nesterov, 2021 (JOTA journal, arXiv)
- Contracting Proximal Methods for Smooth Convex Optimization, with Yurii Nesterov, 2020 (SIOPT journal, arXiv)
Recent Talks
- June 3, 2023: Second-Order Optimization with Lazy Hessians, SIAM Conference on Optimization, Seattle (slides)
- September 27, 2022: Super-Universal Regularized Newton Method, TML Laboratory, EPFL (slides)
- July 29, 2022: Affine-invariant contracting-point methods for Convex Optimization, EUROPT, Lisbon (slides)
- June 3, 2022: Second-order methods with global convergence in Convex Optimization, the research team of Panos Patrinos, KULeuven (slides)
- May 5, 2022: Optimization Methods for Fully Composite Problems, FGP-22, Porto (slides)
- February 21, 2022: Second-order methods with global convergence in Convex Optimization, MLO Laboratory, EPFL (slides)
- July 7, 2021: Local convergence of tensor methods, EUROPT, online (slides)
- March 4, 2021: Affine-invariant contracting-point methods for Convex Optimization, Symposium on Numerical Analysis and Optimization (invited by Geovani Grapiglia), UFPR, online (slides)
- October 28, 2020: Convex optimization based on global lower second-order models, NeurIPS, online (slides, poster)
- June 17, 2020: Inexact Tensor Methods with Dynamic Accuracies, ICML, online (slides, poster, video)
- October 8, 2019: Proximal Method with Contractions for Smooth Convex Optimization, ICTEAM seminar, Louvain-la-Neuve
- September 23, 2019: Proximal Method with Contractions for Smooth Convex Optimization, Optimization and Learning for Data Science seminar (invited by Dmitry Grishchenko), Université Grenoble Alpes, Grenoble (slides)
- September 18, 2019: Complexity of Cubically Regularized Newton Method for Minimizing Uniformly Convex Functions, FGS-19, Nice (slides)
- August 5, 2019: Complexity of Cubically Regularized Newton Method for Minimizing Uniformly Convex Functions, ICCOPT, Berlin
- July 5, 2019: Randomized Block Cubic Newton Method, Summer School on Optimization, Big Data and Applications, Veroli
- June 28, 2019: Complexity of Cubically Regularized Newton Method for Minimizing Uniformly Convex Functions, EUROPT, Glasgow
- June 20, 2018: Randomized Block Cubic Newton Method, ICML, Stockholm (slides, poster, video)
- June 13, 2018: Randomized Block Cubic Newton Method, X Traditional Summer School on Optimization, Voronovo