Welcome!
I do research in Computational Mathematics, with a focus on Numerical Optimization and Machine Learning. I am currently a postdoctoral researcher at EPFL, Switzerland, working in the Machine Learning and Optimization Laboratory with Martin Jaggi.
I am excited to explore provably efficient optimization algorithms
that exploit the problem structure and combine ideas from various fields.
One of my areas of expertise is second-order methods and their global complexity bounds. I believe that bridging the gap between second-order optimization theory
and the best known computational practices is what
will lead us to new achievements in the training process of our models.

More broadly, I am interested in pursuing several related research areas.

I received a BSc degree in Computational Mathematics and Cybernetics from Lomonosov Moscow State University in 2015. I obtained an MSc degree from the Higher School of Economics in 2017, where I studied advanced statistical and machine learning methods.
Papers
Preprints / Various:

Complexity of Minimizing Regularized Convex Quadratic Functions.
Daniel Berg Thomsen and Nikita Doikov, 2024 (arXiv) 
First and zeroth-order implementations of the regularized Newton method with lazy approximated Hessians.
Nikita Doikov and Geovani Nunes Grapiglia, 2023 (arXiv) 
Minimizing Quasi-Self-Concordant Functions by Gradient Regularization of Newton Method.
Nikita Doikov, 2023 (arXiv) 
Unified Convergence Theory of Stochastic and Variance-Reduced Cubic Newton Methods.
El Mahdi Chayti, Nikita Doikov, and Martin Jaggi, 2023 (arXiv) 
Lower Complexity Bounds for Minimizing Regularized Functions.
Nikita Doikov, 2022 (arXiv)
Refereed Publications:
2024

Spectral Preconditioning for Gradient Methods on Graded Nonconvex Functions.
Nikita Doikov, Sebastian U. Stich, and Martin Jaggi, 2024 (International Conference on Machine Learning [ICML], arXiv) 
On Convergence of Incremental Gradient for Non-Convex Smooth Functions.
Anastasia Koloskova, Nikita Doikov, Sebastian U. Stich, and Martin Jaggi, 2023 (International Conference on Machine Learning [ICML], arXiv) 
Super-Universal Regularized Newton Method.
Nikita Doikov, Konstantin Mishchenko, and Yurii Nesterov, 2022 (SIAM Journal on Optimization [SIOPT]: open access, arXiv, code)
2023

Linearization Algorithms for Fully Composite Optimization.
Maria-Luiza Vladarean, Nikita Doikov, Martin Jaggi, and Nicolas Flammarion, 2023 (Conference on Learning Theory [COLT]: proceedings, arXiv) 
Polynomial Preconditioning for Gradient Methods.
Nikita Doikov and Anton Rodomanov, 2023 (International Conference on Machine Learning [ICML]: proceedings, arXiv) 
Second-order optimization with lazy Hessians.
Nikita Doikov, El Mahdi Chayti, and Martin Jaggi, 2022 (International Conference on Machine Learning [ICML] (oral presentation): proceedings, arXiv) 
Gradient Regularization of Newton Method with Bregman Distances.
Nikita Doikov and Yurii Nesterov, 2021 (Mathematical Programming Journal [Math.Prog]: open access, arXiv)
2022

High-Order Optimization Methods for Fully Composite Problems.
Nikita Doikov and Yurii Nesterov, 2021 (SIAM Journal on Optimization [SIOPT]: open access, arXiv) 
Affine-invariant contracting-point methods for Convex Optimization.
Nikita Doikov and Yurii Nesterov, 2020 (Mathematical Programming Journal [Math.Prog]: open access, arXiv, code)
2021

Local convergence of tensor methods.
Nikita Doikov and Yurii Nesterov, 2019 (Mathematical Programming Journal [Math.Prog]: open access, arXiv) 
Minimizing Uniformly Convex Functions by Cubic Regularization of Newton Method.
Nikita Doikov and Yurii Nesterov, 2019 (Journal of Optimization Theory and Applications [JOTA]: open access, arXiv)
2020

Convex optimization based on global lower secondorder models.
Nikita Doikov and Yurii Nesterov, 2020 (Conference on Neural Information Processing Systems [NeurIPS] (oral presentation): proceedings, arXiv, code) 
Stochastic Subspace Cubic Newton Method.
Filip Hanzely, Nikita Doikov, Peter Richtárik, and Yurii Nesterov, 2020 (International Conference on Machine Learning [ICML]: proceedings, arXiv) 
Inexact Tensor Methods with Dynamic Accuracies.
Nikita Doikov and Yurii Nesterov, 2020 (International Conference on Machine Learning [ICML]: proceedings, arXiv, code) 
Contracting Proximal Methods for Smooth Convex Optimization.
Nikita Doikov and Yurii Nesterov, 2019 (SIAM Journal on Optimization [SIOPT]: open access, arXiv)
2018

Randomized Block Cubic Newton Method.
Nikita Doikov and Peter Richtárik, 2018 (International Conference on Machine Learning [ICML] (oral presentation): proceedings, arXiv)
Recent Talks
 April 9, 2024: Minimizing quasi-self-concordant functions by gradient regularization of Newton method, NOPTA, University of Antwerp (slides)
 August 25, 2023: Super-Universal Regularized Newton Method, EUROPT, Budapest (slides)
 July 20, 2023: Second-Order Optimization with Lazy Hessians, ICML, Hawaii (slides, poster) [photo↓]
 July 19, 2023: Polynomial Preconditioning for Gradient Methods, ICML, Hawaii (poster)
 June 3, 2023: Second-Order Optimization with Lazy Hessians, SIAM Conference on Optimization, Seattle (slides) [photo↓]
 September 27, 2022: Super-Universal Regularized Newton Method, TML Laboratory, EPFL (slides)
 July 29, 2022: Affine-invariant contracting-point methods for Convex Optimization, EUROPT, Lisbon (slides)
 June 3, 2022: Second-order methods with global convergence in Convex Optimization, the research team of Panos Patrinos, KU Leuven (slides)
 May 5, 2022: Optimization Methods for Fully Composite Problems, FGP22, Porto (slides)
 February 21, 2022: Second-order methods with global convergence in Convex Optimization, MLO Laboratory, EPFL (slides)
 July 7, 2021: Local convergence of tensor methods, EUROPT, online (slides)
 March 4, 2021: Affineinvariant contractingpoint methods for Convex Optimization, Symposium on Numerical Analysis and Optimization (invited by Geovani Nunes Grapiglia), UFPR, online (slides)
 October 28, 2020: Convex optimization based on global lower secondorder models, NeurIPS, online (slides, poster)
 June 17, 2020: Inexact Tensor Methods with Dynamic Accuracies, ICML, online (slides, poster, video)
 October 8, 2019: Proximal Method with Contractions for Smooth Convex Optimization, ICTEAM seminar, Louvain-la-Neuve
 September 23, 2019: Proximal Method with Contractions for Smooth Convex Optimization, Optimization and Learning for Data Science seminar (invited by Dmitry Grishchenko), Université Grenoble Alpes, Grenoble (slides) [photo↓]
 September 18, 2019: Complexity of Cubically Regularized Newton Method for Minimizing Uniformly Convex Functions, FGS19, Nice (slides)
 August 5, 2019: Complexity of Cubically Regularized Newton Method for Minimizing Uniformly Convex Functions, ICCOPT, Berlin
 July 5, 2019: Randomized Block Cubic Newton Method, Summer School on Optimization, Big Data and Applications, Veroli [photo↓]
 June 28, 2019: Complexity of Cubically Regularized Newton Method for Minimizing Uniformly Convex Functions, EUROPT, Glasgow [photo↓]
 June 20, 2018: Randomized Block Cubic Newton Method, ICML, Stockholm (slides, poster, video) [photo↓]
 June 13, 2018: Randomized Block Cubic Newton Method, X Traditional summer school on Optimization, Voronovo [photo↓]