Workshop on Computational Aspects of Deep Learning - ICPR 2020

CADL 2020





AIMS AND SCOPE
===============
Deep Learning has been the most significant breakthrough of the past 10 years: it has radically changed the research methodology towards a data-oriented approach, in which learning involves all steps of the prediction pipeline. In this context, optimization and the careful design of neural architectures play an increasingly important role, directly affecting the pace of research, the effectiveness of state-of-the-art models, and their applicability at production scale.
The ICPR workshop on “Computational Aspects of Deep Learning” invites research works that focus on the development of optimized deep neural network architectures and on the optimization of existing ones, including their deployment on highly scalable systems. This includes training on large-scale or high-dimensional datasets, the design of novel architectures and operators that increase the efficacy or efficiency of feature extraction and classification, the optimization of hyperparameters to enhance model performance, and solutions for training on multi-node systems such as HPC clusters.
The workshop targets any research field related to pattern recognition, ranging from computer vision to natural language processing and multimedia, in which data- and computationally-intensive architectures are needed to solve key research issues. The workshop also welcomes constructive criticism of current data-intensive trends in machine learning and encourages new perspectives and solutions on the matter. Submissions should address computationally intensive scenarios from the point of view of architectural design, data preparation and processing, operator design, training strategies, and distributed and large-scale training. Quantitative comparisons of existing solutions and datasets are also welcome to raise awareness of the topic.
PRIZES
======
CADL is organized in collaboration with the NVIDIA AI Technology Center (NVAITC). The best paper will be awarded a Titan RTX GPU (or equivalent) offered by NVIDIA.
TOPICS
======
Topics of interest include, but are not limited to, the following:
- Design of innovative architectures and operators for data-intensive scenarios
- Video understanding and spatio-temporal feature extraction
- Distributed reinforcement learning algorithms
- Applications of large-scale pre-training techniques
- Distributed training approaches and architectures
- HPC and massively parallel architectures in Deep Learning
- Frameworks and optimization algorithms for training Deep Networks
- Model pruning and gradient compression techniques for reducing computational complexity
- Design, implementation and use of hardware accelerators
SUBMISSION GUIDELINES
======================
We invite submissions of full and short papers describing work in the domains suggested above or in closely related areas.
Accepted submissions will be presented either as oral presentations or as posters at the workshop, and published in the ICPR 2020 Workshops volume edited by Springer.
Full papers: 12-15 pages
Short papers: 6-8 pages
IMPORTANT DATES
================
- Paper submission deadline: October 10th, 2020
- Notification of acceptance: November 10th, 2020
- Camera ready deadline: November 15th, 2020
- Finalized workshop program: December 1st, 2020
ORGANIZERS
===========
- Frederic Pariente, Engineering Manager at NVIDIA and Deputy Director of NVAITC in EMEA
- Iuri Frosio, Senior Research Scientist at NVIDIA
- Lorenzo Baraldi, Assistant Professor at UNIMORE
- Claudio Baecchi, Post-Doc at MICC, University of Florence
CONTACTS
=========
For further information, please see the workshop website at http://www.cadl.it/