Title: Submodularity in Action: From Machine Learning to Signal Processing Applications

Authors: Ehsan Tohidi, Rouhollah Amiri, Mario Coutino, David Gesbert, Geert Leus, Amin Karbasi

Abstract

Submodularity is a discrete-domain functional property that can be interpreted as mimicking the role of the well-known convexity/concavity properties in the continuous domain. Submodular functions exhibit a strong structure that leads to efficient optimization algorithms with provable near-optimality guarantees. These characteristics, namely, efficiency and provable performance bounds, are of particular interest for signal processing (SP) and machine learning (ML) practitioners, as a variety of discrete optimization problems are encountered in a wide range of applications. Conventionally, two general approaches exist to solve discrete problems: (i) relaxation into the continuous domain to obtain an approximate solution, or (ii) development of a tailored algorithm that applies directly in the discrete domain. In both approaches, worst-case performance guarantees are often hard to establish. Furthermore, the resulting methods are often complex and thus impractical for large-scale problems. In this paper, we show how certain scenarios lend themselves to exploiting submodularity so as to construct scalable solutions with provable worst-case performance guarantees. We introduce a variety of submodular-friendly applications and elucidate the relation of submodularity to convexity and concavity, which enables efficient optimization. With a mixture of theory and practice, we present different flavors of submodularity, accompanied by illustrative real-world case studies from modern SP and ML. In all cases, optimization algorithms are presented, along with hints on how optimality guarantees can be established.
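For readers unfamiliar with the term, a brief illustration (not taken from the paper itself): a set function f is submodular if it has diminishing returns, i.e., f(A ∪ {e}) − f(A) ≥ f(B ∪ {e}) − f(B) whenever A ⊆ B and e ∉ B. The "efficient optimization algorithms with provable near-optimality guarantees" mentioned above include the classical greedy algorithm, which for monotone submodular maximization under a cardinality constraint is within a factor (1 − 1/e) of the optimum (Nemhauser, Wolsey, and Fisher, 1978). The sketch below uses a simple coverage objective; the function and variable names are illustrative, not drawn from the paper.

```python
# Minimal sketch: greedy maximization of a monotone submodular set function
# (here, a coverage function) under a cardinality constraint. The greedy
# solution is guaranteed to be within a factor (1 - 1/e) of the optimum.

def coverage(selected, sets):
    """Example submodular objective: number of elements covered by the chosen sets."""
    covered = set()
    for i in selected:
        covered |= sets[i]
    return len(covered)

def greedy_max(sets, budget):
    """Pick up to `budget` sets, each time adding the one with the largest marginal gain."""
    chosen = []
    for _ in range(budget):
        gains = {i: coverage(chosen + [i], sets) - coverage(chosen, sets)
                 for i in range(len(sets)) if i not in chosen}
        if not gains:
            break
        best = max(gains, key=gains.get)
        if gains[best] <= 0:       # no further improvement possible
            break
        chosen.append(best)
    return chosen

# Toy usage: choose 2 out of 4 candidate sets to cover the most elements.
sets = [{1, 2, 3}, {3, 4}, {4, 5, 6}, {1, 6}]
print(greedy_max(sets, budget=2))  # -> [0, 2], covering {1, ..., 6}
```

The same greedy template underlies many of the SP/ML selection problems the paper surveys (e.g., sensor selection or data subset selection), with the coverage objective replaced by the application-specific submodular utility.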

