Filip Hanzely
PhD student, KAUST
Verified email address
Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
F Hanzely, P Richtárik, L Xiao
arXiv preprint arXiv:1808.03045, 2018
Accelerated stochastic matrix inversion: general theory and speeding up BFGS rules for faster second-order optimization
R Gower, F Hanzely, P Richtárik, SU Stich
Advances in Neural Information Processing Systems, 1619-1629, 2018
SEGA: Variance reduction via gradient sketching
F Hanzely, K Mishchenko, P Richtárik
Advances in Neural Information Processing Systems, 2082-2093, 2018
Accelerated coordinate descent with arbitrary sampling and best rates for minibatches
F Hanzely, P Richtárik
AISTATS 2019, 2019
A unified theory of SGD: variance reduction, sampling, quantization and coordinate descent
E Gorbunov, F Hanzely, P Richtárik
International Conference on Artificial Intelligence and Statistics, 680-690, 2020
Fastest rates for stochastic mirror descent methods
F Hanzely, P Richtárik
arXiv preprint arXiv:1803.07374, 2018
Privacy preserving randomized gossip algorithms
F Hanzely, J Konečný, N Loizou, P Richtárik, D Grishchenko
arXiv preprint arXiv:1706.07636, 2017
Testing for causality in reconstructed state spaces by an optimized mixed prediction method
A Krakovská, F Hanzely
Physical Review E 94 (5), 052203, 2016
A nonconvex projection method for robust PCA
A Dutta, F Hanzely, P Richtárik
Proceedings of the AAAI Conference on Artificial Intelligence 33, 1468-1476, 2019
One method to rule them all: variance reduction for data, parameters and many new methods
F Hanzely, P Richtárik
arXiv preprint arXiv:1905.11266, 2019
99% of parallel optimization is inevitably a waste of time
K Mishchenko, F Hanzely, P Richtárik
arXiv, 2019
Federated learning of a mixture of global and local models
F Hanzely, P Richtárik
arXiv preprint arXiv:2002.05516, 2020
Best pair formulation & accelerated scheme for non-convex principal component pursuit
A Dutta, F Hanzely, J Liang, P Richtárik
IEEE Transactions on Signal Processing, 2020
Stochastic Subspace Cubic Newton Method
F Hanzely, N Doikov, P Richtárik, Y Nesterov
ICML 2020, 2020
Variance Reduced Coordinate Descent with Acceleration: New Method With a Surprising Application to Finite-Sum Problems
F Hanzely, D Kovalev, P Richtárik
ICML 2020, 2020
A privacy preserving randomized gossip algorithm via controlled noise insertion
F Hanzely, J Konečný, N Loizou, P Richtárik, D Grishchenko
NeurIPS PPML workshop, 2018
99% of worker-master communication in distributed optimization is not needed
K Mishchenko, F Hanzely, P Richtárik
Conference on Uncertainty in Artificial Intelligence, 979-988, 2020
Optimization for Supervised Machine Learning: Randomized Algorithms for Data and Parameters
F Hanzely
arXiv preprint arXiv:2008.11824, 2020
Learning to Optimize via Dual Space Preconditioning
S Chraibi, A Salim, S Horváth, F Hanzely, P Richtárik