Sebastian Urban Stich
Research Scientist, EPFL
Verified email at epfl.ch - Homepage
Title
Cited by
Year
Advances and Open Problems in Federated Learning
P Kairouz, HB McMahan, B Avent, A Bellet, M Bennis, AN Bhagoji, ...
arXiv preprint arXiv:1912.04977, 2019
Cited by 136 · 2019
Local SGD Converges Fast and Communicates Little
SU Stich
ICLR 2019 - International Conference on Learning Representations, 2019
Cited by 120 · 2019
Sparsified SGD with memory
SU Stich, JB Cordonnier, M Jaggi
NeurIPS 2018 - Advances in Neural Information Processing Systems, 4448-4459, 2018
Cited by 104 · 2018
Don't Use Large Mini-Batches, Use Local SGD
T Lin, SU Stich, KK Patel, M Jaggi
ICLR 2020 - International Conference on Learning Representations, 2020
Cited by 93 · 2020
Efficiency of the Accelerated Coordinate Descent Method on Structured Optimization Problems
Y Nesterov, SU Stich
SIAM Journal on Optimization 27 (1), 110-123, 2017
Cited by 69 · 2017
Decentralized Stochastic Optimization and Gossip Algorithms with Compressed Communication
A Koloskova, SU Stich, M Jaggi
ICML 2019 - International Conference on Machine Learning, 2019
Cited by 64 · 2019
Error Feedback Fixes SignSGD and other Gradient Compression Schemes
SP Karimireddy, Q Rebjock, SU Stich, M Jaggi
ICML 2019 - International Conference on Machine Learning, 2019
Cited by 56 · 2019
Optimization of convex functions with random pursuit
SU Stich, CL Müller, B Gärtner
SIAM Journal on Optimization 23 (2), 1284-1309, 2013
Cited by 37 · 2013
On two problems regarding the Hamiltonian cycle game
D Hefetz, SU Stich
The Electronic Journal of Combinatorics 16 (1), 28, 2009
Cited by 33 · 2009
SCAFFOLD: Stochastic Controlled Averaging for Federated Learning
SP Karimireddy, S Kale, M Mohri, SJ Reddi, SU Stich, AT Suresh
ICML 2020 - International Conference on Machine Learning, 2019
Cited by 30* · 2019
Accelerated Stochastic Matrix Inversion: General Theory and Speeding up BFGS Rules for Faster Second-Order Optimization
RM Gower, F Hanzely, P Richtárik, SU Stich
NeurIPS 2018 - Advances in Neural Information Processing Systems, 1619-1629, 2018
Cited by 23 · 2018
Stochastic distributed learning with gradient quantization and variance reduction
S Horváth, D Kovalev, K Mishchenko, S Stich, P Richtárik
arXiv preprint arXiv:1904.05115, 2019
Cited by 20 · 2019
Decentralized Deep Learning with Arbitrary Communication Compression
A Koloskova, T Lin, SU Stich, M Jaggi
ICLR 2020 - International Conference on Learning Representations, 2020
Cited by 17 · 2020
Safe Adaptive Importance Sampling
SU Stich, A Raj, M Jaggi
NIPS 2017 - Advances in Neural Information Processing Systems, 4382-4392, 2017
Cited by 17 · 2017
On Matching Pursuit and Coordinate Descent
F Locatello, A Raj, SP Karimireddy, G Rätsch, B Schölkopf, S Stich, ...
ICML 2018 - International Conference on Machine Learning, 3204-3213, 2018
Cited by 15* · 2018
The Error-Feedback Framework: Better Rates for SGD with Delayed Gradients and Compressed Communication
SU Stich, SP Karimireddy
arXiv preprint arXiv:1909.05350, 2019
Cited by 14 · 2019
Variable metric random pursuit
SU Stich, CL Müller, B Gärtner
Mathematical Programming 156 (1-2), 549-579, 2016
Cited by 14 · 2016
A Unified Theory of Decentralized SGD with Changing Topology and Local Updates
A Koloskova, N Loizou, S Boreiri, M Jaggi, SU Stich
ICML 2020 - International Conference on Machine Learning, 2020
Cited by 12 · 2020
Global linear convergence of Newton's method without strong-convexity or Lipschitz gradients
SP Karimireddy, SU Stich, M Jaggi
arXiv preprint arXiv:1806.00413, 2018
Cited by 12 · 2018
Approximate Steepest Coordinate Descent
SU Stich, A Raj, M Jaggi
ICML 2017 - International Conference on Machine Learning, 3251-3259, 2017
Cited by 11 · 2017
Articles 1–20