Enter an email to get links to download code (via GitHub) and replication data for my Python library, submodular, which contains a suite of state-of-the-art MPI-parallelized functions for maximizing submodular functions (including our new FAST algorithm). The submodular library scales easily to thousands of processors on Amazon AWS. It also includes examples that show how to find influential people in large social networks with millions of users, as well as how to optimize other classic objectives.

I'm happy to provide a quick tutorial and examples for running submodular on AWS (or locally) in parallel, or to discuss how to use submodular for your research. Email me at breuer `at' harvard.edu.

For details and when citing submodular, please refer to our paper:
Breuer, A., Balkanski, E., & Singer, Y. "The FAST Algorithm for Submodular Maximization." International Conference on Machine Learning (ICML), 2020.


In addition to all replication code for the experiments in our paper, submodular contains optimized parallel MPI implementations (as well as non-parallel versions) of the following classic and state-of-the-art submodular maximization algorithms:

Classic Submodular Maximization Algorithms:

  • Top K (returns the k elements with the highest marginal contributions to the empty set);

  • Random (returns the value of a uniformly random set of size k);

  • Greedy (Nemhauser 1978);

  • Lazy Greedy (Minoux 1978).
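
To fix ideas, the classic greedy algorithm (Nemhauser 1978) builds a solution by repeatedly adding the element with the largest marginal gain. The following is a minimal, self-contained Python sketch of that idea on a toy coverage objective; it illustrates the algorithm only and does not show the submodular library's actual API (function and variable names here are my own):

```python
def greedy(objective, ground_set, k):
    """Classic greedy (Nemhauser 1978): repeatedly add the element
    with the largest marginal gain until the solution has k elements."""
    S = set()
    for _ in range(k):
        best_elem, best_gain = None, float("-inf")
        for e in ground_set:
            if e in S:
                continue
            # Marginal contribution of e to the current solution S.
            gain = objective(S | {e}) - objective(S)
            if gain > best_gain:
                best_elem, best_gain = e, gain
        S.add(best_elem)
    return S

# Toy coverage objective: f(S) = number of distinct items covered by S.
coverage = {0: {1, 2, 3}, 1: {3, 4}, 2: {4, 5, 6}, 3: {1}}
f = lambda S: len(set().union(*(coverage[e] for e in S)))
print(greedy(f, coverage.keys(), 2))
```

Lazy Greedy (Minoux 1978) computes the same solution but keeps marginal gains in a priority queue and re-evaluates only the current top candidate, which is valid because submodularity guarantees gains can only shrink as S grows.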

State-of-the-art Submodular Maximization Algorithms:

  • FAST (Breuer, Balkanski, and Singer, ICML 2020);

  • Stochastic Greedy (Mirzasoleiman et al., 2015);

  • Lazier Than Lazy Greedy (Mirzasoleiman et al., 2015);

  • Amortized Filtering (Balkanski et al., SODA 2019);

  • Adaptive Sequencing (Balkanski et al., STOC 2019);

  • Exhaustive Maximization (Fahrbach et al., Algorithm 3 from "Submodular Maximization with Nearly Optimal Approximation, Adaptivity and Query Complexity");

  • Binary Search Maximization (Fahrbach et al., Algorithm 4 from "Submodular Maximization with Nearly Optimal Approximation, Adaptivity and Query Complexity");

  • Randomized Parallel Greedy (Chekuri et al. 2019; the randomized-parallel-greedy algorithm for cardinality constraints from Figure 2, p. 12 of "Submodular Function Maximization in Parallel via the Multilinear Relaxation").
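
Several of the algorithms above cut query costs by evaluating marginal gains on random samples rather than the full ground set. As one illustration, here is a hedged sketch of the sampling idea behind Stochastic Greedy / Lazier Than Lazy Greedy (Mirzasoleiman et al., 2015): each round evaluates only about (n/k)·ln(1/ε) random candidates. This is my own illustrative code, not the submodular library's implementation:

```python
import math
import random

def stochastic_greedy(objective, ground_set, k, eps=0.1, seed=0):
    """Stochastic greedy (Mirzasoleiman et al., 2015): each round, pick the
    best element from a random sample of size ~ (n/k) * ln(1/eps)."""
    rng = random.Random(seed)
    V = list(ground_set)
    sample_size = max(1, int((len(V) / k) * math.log(1 / eps)))
    S = set()
    for _ in range(k):
        candidates = [e for e in V if e not in S]
        sample = rng.sample(candidates, min(sample_size, len(candidates)))
        base = objective(S)
        # Best element from the sample by marginal gain.
        best = max(sample, key=lambda e: objective(S | {e}) - base)
        S.add(best)
    return S

# Same toy coverage objective as before.
coverage = {0: {1, 2, 3}, 1: {3, 4}, 2: {4, 5, 6}, 3: {1}}
f = lambda S: len(set().union(*(coverage[e] for e in S)))
print(stochastic_greedy(f, coverage, 2))
```

The payoff is that the number of objective evaluations per round is independent of n/k's full sweep, while the expected approximation guarantee degrades only from 1 - 1/e to 1 - 1/e - ε.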

Objective functions:

submodular also includes several canonical submodular objective functions, such as:

  • Movie Recommendation;
  • Network Max Cover;
  • Revenue Maximization on YouTube;
  • Influence Maximization;
  • Traffic Sensor Placement.
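
To give a flavor of one such objective, here is a small sketch of a network max cover objective, where a set of seed nodes covers themselves and their neighbors. The class name and interface below are illustrative assumptions for exposition, not the submodular library's actual API:

```python
class MaxCover:
    """Network max cover: f(S) = number of nodes covered by S, where a
    node covers itself and its neighbors. Monotone and submodular."""

    def __init__(self, adjacency):
        self.adjacency = adjacency  # node -> set of neighbor nodes

    def value(self, S):
        covered = set(S)
        for node in S:
            covered |= self.adjacency[node]
        return len(covered)

# Tiny undirected graph: node 0 is a hub, node 4 is isolated.
graph = {0: {1, 2}, 1: {0}, 2: {0, 3}, 3: {2}, 4: set()}
f = MaxCover(graph)
print(f.value({0}))
```

Any of the maximization algorithms listed above can then be run against such an objective by passing its value oracle and the graph's node set as the ground set.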