Can Explanations Support Privacy Awareness? A Research Roadmap

Authors
Wasja Brunotte, Larissa Chazette, Kai Korte
Abstract

Using systems as support tools for decision-making is a common part of a citizen's daily life. Systems support users in various tasks, collecting and processing data to learn about a user and provide more tailored services. This data collection, however, means that users' private sphere is increasingly at stake. Informing users about what data is collected and how it is processed is key to achieving transparency, trustworthiness, and ethics in modern systems. While laws and regulations now require informing users about privacy terms, this information is still conveyed in such a complex and verbose way that it remains unintelligible to them. Explainability, meanwhile, is seen as a way to disclose information about a system or its behavior in an intelligible manner. In this work, we propose explanations as a means to enhance users' privacy awareness. As a long-term goal, we want to understand how to achieve greater privacy awareness with respect to systems and to develop heuristics that support it, helping end-users protect their privacy. We report preliminary results on private sphere explanations and outline our research agenda towards this long-term goal.

Organizational unit(s)
Fachgebiet Software Engineering
PhoenixD: Simulation, Fabrication and Application of Optical Systems
Institut für Rechtsinformatik (IRI)
Type
Paper in conference proceedings
Pages
176-180
Number of pages
5
Publication date
2021
Publication status
Published
Peer-reviewed
Yes
ASJC Scopus subject areas
Engineering (all), Computer Science (all), Strategy and Management
Electronic version(s)
https://doi.org/10.1109/REW53955.2021.00032 (Access: Closed)