Can Explanations Support Privacy Awareness? A Research Roadmap

authored by
Wasja Brunotte, Larissa Chazette, Kai Korte
Abstract

Using systems as support tools for decision-making is a common part of a citizen's daily life. Systems support users in various tasks, collecting and processing data to learn about a user and provide more tailored services. This data collection, however, means that users' privacy is increasingly at stake. Informing users about what data is collected and how it is processed is key to achieving transparency, trustworthiness, and ethics in modern systems. While laws and regulations have come into existence to inform users about privacy terms, this information is still conveyed in such a complex and verbose way that it remains unintelligible to them. Meanwhile, explainability is seen as a way to disclose information about a system or its behavior in an intelligible manner. In this work, we propose explanations as a means to enhance users' privacy awareness. As a long-term goal, we want to understand how to achieve greater privacy awareness with respect to systems and to develop heuristics that support it, helping end-users protect their privacy. We present preliminary results on private sphere explanations and outline our research agenda towards this long-term goal.

Organisation(s)
Software Engineering Section
PhoenixD: Photonics, Optics, and Engineering - Innovation Across Disciplines
Institute of Legal Informatics
Type
Conference contribution
Pages
176-180
No. of pages
5
Publication date
2021
Publication status
Published
Peer reviewed
Yes
ASJC Scopus subject areas
Engineering (all), Computer Science (all), Strategy and Management
Electronic version(s)
https://doi.org/10.1109/REW53955.2021.00032 (Access: Closed)