Title: Optimal Guarantees for Algorithmic Reproducibility and Gradient Complexity in Convex Optimization

Authors: Liang Zhang, Junchi Yang, Amin Karbasi, Niao He

Abstract

Algorithmic reproducibility measures the deviation in outputs of machine learning algorithms upon minor changes in the training process. Previous work suggests that first-order methods would need to trade off convergence rate (gradient complexity) for better reproducibility. In this work, we challenge this perception and demonstrate that both optimal reproducibility and near-optimal convergence guarantees can be achieved for smooth convex minimization and smooth convex-concave minimax problems under various error-prone oracle settings.
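
As a minimal illustrative sketch (not the paper's method), the notion of reproducibility under an error-prone oracle can be made concrete as follows: run the same first-order method twice, differing only in small gradient noise, and measure the deviation of the two outputs. All names, constants, and the noise model below are illustrative assumptions.

```python
import numpy as np

def inexact_grad(x, A, b, rng, noise=1e-3):
    # Error-prone first-order oracle: true gradient of the convex
    # quadratic f(x) = 0.5 x'Ax - b'x, plus small random perturbation.
    return A @ x - b + noise * rng.standard_normal(x.shape)

def gradient_descent(A, b, seed, steps=500, lr=0.1):
    # Plain gradient descent driven by the inexact oracle.
    rng = np.random.default_rng(seed)
    x = np.zeros(b.shape)
    for _ in range(steps):
        x -= lr * inexact_grad(x, A, b, rng)
    return x

rng = np.random.default_rng(0)
A = np.diag(rng.uniform(0.5, 1.5, size=10))  # smooth, well-conditioned convex problem
b = rng.standard_normal(10)

# Two runs that differ only in the oracle's noise realization:
# the distance between their outputs is one common measure of
# (ir)reproducibility of the algorithm.
x1 = gradient_descent(A, b, seed=1)
x2 = gradient_descent(A, b, seed=2)
print("output deviation:", np.linalg.norm(x1 - x2))
```

In this toy setup, shrinking the step size or the oracle noise reduces the output deviation; the paper's contribution concerns achieving optimal such reproducibility guarantees without sacrificing gradient complexity.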

Full Text: [PDF]
