Practical and Parallelizable Algorithms for Non-Monotone Submodular Maximization with Size Constraint


Yixin Chen
Alan Kuhnle

Abstract

We present combinatorial and parallelizable algorithms for the maximization of a submodular function, not necessarily monotone, with respect to a size constraint. We improve the best approximation factor achieved by an algorithm with optimal adaptivity and nearly optimal query complexity to 1/6 − ε, and further to 0.193 − ε at the cost of increasing the adaptivity by a factor of O(log(k)). The conference version of this work mistakenly employed a subroutine that does not work for non-monotone submodular functions. In this version, we propose a fixed and improved subroutine, ThreshSeq, which adds a set with high average marginal gain and returns a solution in O(log(n)) adaptive rounds with high probability. Moreover, we provide two approximation algorithms: the first has approximation ratio 1/6 − ε, adaptivity O(log(n)), and query complexity O(n log(k)); the second has approximation ratio 0.193 − ε, adaptivity O(log(n) log(k)), and query complexity O(n log(k)). Our algorithms are empirically validated to use a small number of adaptive rounds and total queries while obtaining solutions of high objective value, in comparison with state-of-the-art approximation algorithms, including continuous algorithms that use the multilinear extension.
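
To make the role of the thresholding subroutine concrete, below is a minimal, sequential sketch of a generic threshold-greedy procedure: it adds elements whose marginal gain with respect to the current solution is at least a threshold tau, so each added element, and hence the added set on average, contributes at least tau. This is an illustration only, not the paper's ThreshSeq: the names (threshold_greedy, tau) and the toy coverage objective are assumptions made for the example, and the actual ThreshSeq adds elements in blocks so that it terminates in O(log(n)) adaptive rounds with high probability rather than one element at a time.

    import random


    def threshold_greedy(f, ground_set, k, tau, rng=random.Random(0)):
        """Add elements whose marginal gain is at least tau, up to size k.

        `f` is a set-function oracle mapping a frozenset to a float.
        This is a sequential illustration; a low-adaptivity routine would
        instead evaluate many marginal gains in parallel in each round.
        """
        S = frozenset()
        candidates = list(ground_set)
        rng.shuffle(candidates)  # random order; breaks ties arbitrarily
        for x in candidates:
            if len(S) >= k:
                break
            if f(S | {x}) - f(S) >= tau:  # marginal-gain test against tau
                S = S | {x}
        return S


    if __name__ == "__main__":
        # Toy coverage objective: f(S) = number of items covered by the
        # chosen sets (monotone submodular, used here only as an example).
        sets = {1: {1, 2, 3}, 2: {3, 4}, 3: {4, 5, 6}, 4: {1, 6}}
        cover = lambda S: len(set().union(*(sets[i] for i in S))) if S else 0
        print(threshold_greedy(cover, sets.keys(), k=2, tau=2.0))

As a general remark (not a claim about the paper's exact procedure), an outer algorithm typically drives such a subroutine with a geometrically decreasing sequence of thresholds, and additional machinery is needed in the non-monotone setting, where plain greedy selection can fail.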
