1st International Workshop on Deceptive AI @ ECAI2020

DeceptECAI 2020





There is no dominant theory of deception. The literature on deception treats different aspects and components of deception separately, sometimes offering contradictory evidence and opinions on these components. Emerging AI techniques offer an exciting and novel opportunity to expand our understanding of deception from a computational perspective. However, the design, modelling and engineering of deceptive machines is not trivial from conceptual, engineering, scientific, or ethical perspectives. The aim of DeceptECAI is to bring together people from academia, industry and policy-making in order to discuss and disseminate the current and future threats, risks, and even benefits of designing deceptive AI. The workshop proposes a multidisciplinary approach (Computer Science, Psychology, Sociology, Philosophy & Ethics, Military Studies, Law, etc.) to discuss the following aspects of deceptive AI:
1) Behaviour - What type of machine behaviour should be considered deceptive? How do we study deceptive behaviour in machines as opposed to humans?
2) Reasoning - What kinds of reasoning mechanisms lie behind deceptive behaviour? Also, which types of reasoning mechanisms are more prone to deception?
3) Cognition - How does cognition affect deception and how does deception affect cognition? Also, what role, if any, do agent cognitive architectures play in deception?
4) AI & Society - How does the ability of machines to deceive influence society? What kinds of measures do we need to take in order to neutralise or mitigate the negative effects of deceptive AI?
5) Engineering Principles - How should we engineer autonomous agents such that we are able to know why and when they deceive? Also, why should or shouldn’t we engineer or model deceptive machines?
Submission Guidelines
The following paper categories are welcome:
Long papers (12 pages + 1 page references): Long papers should present original research work and be no longer than thirteen pages in total: twelve pages for the main text of the paper (including all figures but excluding references), and one additional page for references.
Short papers (7 pages + 1 page references): Short papers may report on works in progress. Short paper submissions should be no longer than eight pages in total: seven pages for the main text of the paper (including all figures but excluding references), and one additional page for references.
Position papers regarding potential research challenges are also welcome, in either long or short paper format.
Submissions are NOT anonymous. The names and affiliations of the authors should be stated in the manuscript.
All papers must be original and not simultaneously submitted to another journal or conference.
All papers should be formatted according to the Springer Lecture Notes in Computer Science (LNCS/LNAI) style and submitted through the EasyChair link below.
Submission Link - https://easychair.org/conferences/?conf=deceptecai2020
List of Topics
Deceptive Machines
Multi-Agent Systems and Agent-Based Models
Trust and Security in AI
Machine Behaviour
Argumentation
Machine Learning
Explainable AI - XAI
Human-Computer (Agent) Interaction - HCI/HAI
Philosophical, Psychological, and Sociological aspects
Ethical, Moral, Political, Economic, and Legal aspects
Storytelling and Narration in AI
Computational Social Science
Applications related to deceptive AI
Organizing Committee
Stefan Sarkadi - King’s College London, UK
Peter McBurney - King’s College London, UK
Liz Sonenberg - University of Melbourne, Australia
Iyad Rahwan - Max Planck Institute for Human Development & MIT, Germany & USA
Publication
The DeceptECAI 2020 proceedings will be submitted to Springer LNCS/LNAI for publication.
We also plan a Special Issue on the topic of Deceptive AI in a highly ranked AI journal. Authors of selected papers will be invited to submit extended versions of their papers to this special issue.
Contact
All questions about submissions should be emailed to stefan.sarkadi@kcl.ac.uk