Title: Continuous Submodular Maximization: Beyond DR-Submodularity

Authors: Moran Feldman, Amin Karbasi

Abstract

In this paper, we propose the first continuous optimization algorithms that achieve a constant factor approximation guarantee for the problem of monotone continuous submodular maximization subject to a linear constraint. We first prove that a simple variant of vanilla coordinate ascent, called Coordinate-Ascent+, achieves a ((e−1)/(2e−1) − ε)-approximation guarantee while performing O(n/ε) iterations, where the computational complexity of each iteration is roughly O(n/√ε + n log n) (here, n denotes the dimension of the optimization problem). We then propose Coordinate-Ascent++, which achieves the tight (1 − 1/e − ε)-approximation guarantee while performing the same number of iterations, but at a higher computational complexity of roughly O(n^3/ε^2.5 + n^3 log n/ε^2) per iteration. However, the computation of each round of Coordinate-Ascent++ can be easily parallelized so that the computational cost per machine scales as O(n/√ε + n log n).
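To give a concrete sense of the setting, the sketch below implements plain (vanilla) coordinate ascent for maximizing a monotone objective over [0, 1]^n subject to a linear budget constraint. This is an illustrative baseline only, not the paper's Coordinate-Ascent+ or Coordinate-Ascent++ algorithms; the step size, the toy separable objective F, and the greedy gain-per-cost rule are all assumptions made for the example.

```python
import math

def coordinate_ascent(F, n, costs, budget, step=0.05):
    """Vanilla coordinate ascent: maximize F(x) over x in [0,1]^n
    subject to costs . x <= budget.

    Illustrative sketch only (NOT the paper's Coordinate-Ascent+):
    repeatedly raise, by `step`, the coordinate with the best marginal
    gain per unit of cost, until no feasible increase remains.
    """
    x = [0.0] * n
    spent = 0.0
    while True:
        base = F(x)
        best_i, best_ratio = None, 0.0
        for i in range(n):
            # Skip moves that leave the box [0,1]^n or exceed the budget.
            if x[i] + step > 1.0 or spent + step * costs[i] > budget:
                continue
            x[i] += step
            gain = F(x) - base  # marginal gain of this coordinate move
            x[i] -= step
            ratio = gain / (step * costs[i])
            if ratio > best_ratio:
                best_i, best_ratio = i, ratio
        if best_i is None:  # no feasible improving move left
            return x
        x[best_i] += step
        spent += step * costs[best_i]

# Toy monotone continuous submodular objective: separable and concave
# in each coordinate, so coordinate-wise greedy behaves well here.
w = [3.0, 1.0, 2.0]
F = lambda x: sum(wi * math.sqrt(xi) for wi, xi in zip(w, x))
x = coordinate_ascent(F, 3, costs=[1.0, 1.0, 1.0], budget=1.5)
```

For the general (non-DR) continuous submodular objectives treated in the paper, such a naive ascent has no constant-factor guarantee; the point of the example is only to show the shape of the optimization problem the abstract refers to.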

Full Text: [PDF]
