ICML Workshop on Negative Dependence in ML

NEGDEPML 2019

Whether selecting training data, finding an optimal experimental design, exploring in reinforcement learning, or designing recommender systems, selecting a high-quality but diverse set of items is a core challenge for ML.
Any task that requires selecting multiple, non-similar items leverages the concept of negative dependence. Negatively dependent measures and submodularity are powerful, theoretically grounded tools that can aid in this selection.
Determinantal point processes are arguably the most popular negatively dependent measure, with past applications including recommender systems, neural network pruning, ensemble learning, summarization, and kernel reconstruction. However, the spectrum of negatively dependent measures is much broader.
This workshop will bring the rich mathematical tools associated with negative dependence to the ICML audience, delving into the key theoretical concepts that underlie negatively dependent measures and investigating fundamental applications.
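To make the idea concrete, here is a minimal sketch (not part of the workshop materials) of how a determinantal point process encodes negative dependence: under a DPP with positive semidefinite kernel L, a subset S is selected with probability proportional to det(L restricted to S), so sets of similar items receive lower probability than diverse sets. The item features and kernel below are illustrative assumptions.

```python
import numpy as np

def unnormalized_dpp_prob(L, S):
    """Unnormalized DPP probability of subset S: det of the principal
    submatrix of the kernel L indexed by S."""
    idx = np.asarray(S)
    return np.linalg.det(L[np.ix_(idx, idx)])

# Three hypothetical items: 0 and 1 are near-duplicates, 2 is dissimilar.
feats = np.array([[1.00, 0.00],
                  [0.99, 0.14],   # almost parallel to item 0
                  [0.00, 1.00]])
L = feats @ feats.T  # Gram-matrix kernel, PSD by construction

similar = unnormalized_dpp_prob(L, [0, 1])  # near-duplicate pair
diverse = unnormalized_dpp_prob(L, [0, 2])  # dissimilar pair
print(similar, diverse)  # the diverse pair receives far more probability mass
```

The determinant equals the squared volume spanned by the items' feature vectors, so near-parallel (redundant) items shrink it toward zero, which is exactly the diversity-seeking behavior exploited in the applications above.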
SUBMISSIONS
We invite submissions of papers on any topic related to negative dependence in machine learning, including (but not limited to):
- Submodular optimization
- Determinantal point processes
- Volume sampling
- Recommender systems
- Experimental design
- Variance-reduction methods
- Exploration/exploitation trade-offs (reinforcement learning, Bayesian optimization, etc.)
- Batched active learning
- Strongly Rayleigh measures
ORGANIZERS
- Mike Gartrell (Criteo AI Lab)
- Jennifer Gillenwater (Google Research NY)
- Alex Kulesza (Google Research NY)
- Zelda Mariet (MIT)