Title: Submodular Maximization with Nearly Optimal Approximation, Adaptivity and Query Complexity

Authors: Matthew Fahrbach, Vahab Mirrokni and Morteza Zadimoghaddam


Submodular optimization generalizes many classic problems in combinatorial optimization and has recently found a wide range of applications in machine learning (e.g., feature engineering and active learning). For many large-scale optimization problems, we are often concerned with the adaptivity complexity of an algorithm, which quantifies the number of sequential rounds in which polynomially many independent function evaluations can be executed in parallel. While low adaptivity is ideal, it is not sufficient for a distributed algorithm to be efficient, since in many practical applications of submodular optimization the number of function evaluations becomes prohibitively large. Motivated by these applications, we study the adaptivity and query complexity of submodular optimization.
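For readers unfamiliar with the setting, the sketch below illustrates the defining diminishing-returns property of a submodular function: for sets A ⊆ B and an element x outside B, the marginal gain of x on A is at least its gain on B. The set-coverage objective used here is a standard textbook example, not a function from the paper.

```python
def f(chosen):
    """Coverage objective: number of universe elements covered (illustrative)."""
    ground_sets = [{1, 2, 3}, {3, 4}, {4, 5}]  # hypothetical example data
    covered = set()
    for i in chosen:
        covered |= ground_sets[i]
    return len(covered)

# Diminishing returns: gain of adding x to the smaller set A is at least
# its gain when added to the larger set B (A subset of B, x not in B).
A, B, x = {0}, {0, 1}, 2
gain_A = f(A | {x}) - f(A)  # adding {4, 5} to {1, 2, 3}: gain 2
gain_B = f(B | {x}) - f(B)  # adding {4, 5} to {1, 2, 3, 4}: gain 1
assert gain_A >= gain_B
```

Each call to `f` here is one query to the evaluation oracle; the adaptivity complexity counts how many *rounds* of such queries must happen sequentially, where all queries within a round may run in parallel.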

Our main result is a distributed algorithm for maximizing a monotone submodular function subject to a cardinality constraint k that achieves a (1 – 1/e – ε)-approximation in expectation. This algorithm runs in O(log(n)) adaptive rounds and makes O(n) calls to the function evaluation oracle in expectation. The approximation guarantee and query complexity are optimal, and the adaptivity is nearly optimal. Moreover, the number of queries is substantially smaller than in previous work. We also extend our results to the submodular cover problem to demonstrate the generality of our algorithm and techniques.
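To put these bounds in context, the classic greedy algorithm of Nemhauser, Wolsey, and Fisher attains the same (1 – 1/e) approximation but needs k fully sequential rounds and O(nk) oracle queries; the result above reduces this to O(log(n)) rounds and O(n) expected queries. The following sketch is that greedy baseline (not the paper's algorithm), run on a small hypothetical coverage instance:

```python
def coverage(sets, chosen):
    """Monotone submodular objective: number of elements covered (illustrative)."""
    covered = set()
    for i in chosen:
        covered |= sets[i]
    return len(covered)

def greedy(sets, k):
    """Classic greedy baseline: k adaptive rounds, O(n * k) oracle queries."""
    chosen = []
    for _ in range(k):  # each iteration is one adaptive round
        base = coverage(sets, chosen)
        # Within a round, these n marginal-gain queries are independent
        # and could all be evaluated in parallel.
        gains = {i: coverage(sets, chosen + [i]) - base
                 for i in range(len(sets)) if i not in chosen}
        best = max(gains, key=gains.get)
        if gains[best] <= 0:
            break
        chosen.append(best)
    return chosen

sets = [{1, 2, 3}, {3, 4}, {4, 5, 6}, {1, 6}]
print(greedy(sets, 2))  # picks sets 0 and 2, covering all six elements
```

The dependence of round t on the choices of rounds 1, …, t − 1 is exactly what the adaptivity complexity measures, and what the paper's algorithm drives down to O(log(n)).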

Full Text: [PDF]
