Dmitry Kovalev
Title · Cited by · Year
Don’t jump through hoops and remove those loops: SVRG and Katyusha are better without the outer loop
D Kovalev, S Horváth, P Richtárik
Algorithmic Learning Theory, 451-467, 2020
Cited by 32 · 2020
Stochastic distributed learning with gradient quantization and variance reduction
S Horváth, D Kovalev, K Mishchenko, S Stich, P Richtárik
arXiv preprint arXiv:1904.05115, 2019
Cited by 26 · 2019
Revisiting stochastic extragradient
K Mishchenko, D Kovalev, E Shulgin, P Richtárik, Y Malitsky
International Conference on Artificial Intelligence and Statistics, 4573-4582, 2020
Cited by 13 · 2020
Acceleration for Compressed Gradient Descent in Distributed and Federated Optimization
Z Li, D Kovalev, X Qian, P Richtárik
arXiv preprint arXiv:2002.11364, 2020
Cited by 12 · 2020
RSN: Randomized Subspace Newton
R Gower, D Kovalev, F Lieder, P Richtárik
Advances in Neural Information Processing Systems, 616-625, 2019
Cited by 11 · 2019
Accelerated methods for composite non-bilinear saddle point problem
M Alkousa, D Dvinskikh, F Stonyakin, A Gasnikov
arXiv preprint arXiv:1906.03620, 2019
Cited by 7 · 2019
Stochastic Newton and Cubic Newton Methods with Simple Local Linear-Quadratic Rates
D Kovalev, K Mishchenko, P Richtárik
arXiv preprint arXiv:1912.01597, 2019
Cited by 6 · 2019
Stochastic proximal Langevin algorithm: Potential splitting and nonasymptotic rates
A Salim, D Kovalev, P Richtárik
Advances in Neural Information Processing Systems, 6653-6664, 2019
Cited by 6 · 2019
Stochastic spectral and conjugate descent methods
D Kovalev, P Richtárik, E Gorbunov, E Gasanov
Advances in Neural Information Processing Systems, 3358-3367, 2018
Cited by 6 · 2018
From Local SGD to Local Fixed Point Methods for Federated Learning
G Malinovsky, D Kovalev, E Gasanov, L Condat, P Richtárik
arXiv preprint arXiv:2004.01442, 2020
Cited by 5 · 2020
Variance Reduced Coordinate Descent with Acceleration: New Method With a Surprising Application to Finite-Sum Problems
F Hanzely, D Kovalev, P Richtárik
arXiv preprint arXiv:2002.04670, 2020
Cited by 3 · 2020
A hypothesis about the rate of global convergence for optimal methods (Newton's type) in smooth convex optimization
AV Gasnikov, DA Kovalev
Computer Research and Modeling 10 (3), 305-314, 2018
Cited by 3 · 2018
Optimal and Practical Algorithms for Smooth and Strongly Convex Decentralized Optimization
D Kovalev, A Salim, P Richtárik
arXiv preprint arXiv:2006.11773, 2020
Cited by 1 · 2020
Distributed Fixed Point Methods with Compressed Iterates
S Chraibi, A Khaled, D Kovalev, P Richtárik, A Salim, M Takáč
arXiv preprint arXiv:1912.09925, 2019
Cited by 1 · 2019
Towards accelerated rates for distributed optimization over time-varying networks
A Rogozin, V Lukoshkin, A Gasnikov, D Kovalev, E Shulgin
arXiv preprint arXiv:2009.11069, 2020
2020
Fast Linear Convergence of Randomized BFGS
D Kovalev, RM Gower, P Richtárik, A Rogozin
arXiv preprint arXiv:2002.11337, 2020
2020