NeurIPS 2019 Workshop on Information Theory and Machine Learning

ITML 2019

Submit at this site: https://cmt3.research.microsoft.com/ITML2019
We invite submissions in any of the following areas:
- Controlling information quantities for performance guarantees, such as PAC-Bayes, interactive data analysis, information bottleneck, fairness, and privacy.
- Information theoretic limitations / performance upper bounds of learning algorithms.
- Information theory for representation learning, semi-supervised learning, and unsupervised learning, for example its applications to generative models.
- Methods to estimate information theoretic quantities from high-dimensional observations, such as variational methods and sampling methods.
- Quantification of usable / useful information, e.g. information an algorithm can use for prediction.
- Machine learning applied to information theory, such as designing better codes or compression optimized for human perception.
- Any other topics related to information theory and machine learning.
A submission should take the form of an extended abstract of at most 3 pages in PDF format, using the NeurIPS style. Author names do not need to be anonymised, and references may extend as far as needed beyond the 3-page limit. Submissions may also run longer than 3 pages, but reviewers are not expected to read beyond the first 3 pages. If the research has previously appeared in a journal, workshop, or conference (including the NeurIPS 2019 conference), the workshop submission should extend that previous work. Parallel submissions (for example, to ICLR) are permitted.
Submissions will be accepted as contributed talks or poster presentations. Final versions will be posted on the workshop website; they are archival but do not constitute formal proceedings. We will do our best to guarantee workshop registration for all accepted submissions.