Fighting Malicious Designs: Towards Visual Countermeasures Against Dark Patterns

Short Paper at ACM CHI '24
by René Schäfer, Paul Preuschoff, René Röpke, Sarah Sahabi, and Jan Borchers

Abstract

Dark patterns are malicious UI design strategies that nudge users towards decisions going against their best interests. To build technical countermeasures against them, dark patterns must be automatically detectable. While researchers have devised algorithms to detect some patterns automatically, little work has used these detection results to technically counter the effects of dark patterns when users encounter them on their devices. To address this, we tested three visual countermeasures against 13 common dark patterns in an interactive lab study. The countermeasures either (a) highlighted and explained the manipulation, (b) hid it from the user, or (c) let the user switch between the original view and the hidden version. From our data, we extracted multiple clusters of dark patterns for which participants preferred specific countermeasures for similar reasons. To support the creation of effective countermeasures, we discuss our findings in the context of a recent ontology of dark patterns.
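To make the three countermeasure styles concrete, below is a minimal, hypothetical TypeScript/DOM sketch. It assumes an element has already been flagged as a dark pattern by some external detector; the function name, element ID, and visual treatments are illustrative assumptions, not the implementation used in the paper.

// Hypothetical sketch: apply one of the three countermeasure styles to a
// DOM element already flagged as a dark pattern by an external detector.
type Countermeasure = "highlight" | "hide" | "toggle";

function applyCountermeasure(el: HTMLElement, mode: Countermeasure, explanation: string): void {
  switch (mode) {
    case "highlight":
      // (a) Highlight the manipulation and explain it to the user.
      el.style.outline = "3px solid #d32f2f";
      el.title = explanation; // explanation shown on hover
      break;
    case "hide":
      // (b) Hide the manipulative element from the user.
      el.style.display = "none";
      break;
    case "toggle": {
      // (c) Let the user switch between the original and the hidden view.
      const button = document.createElement("button");
      button.textContent = "Show/hide flagged element";
      button.addEventListener("click", () => {
        el.style.display = el.style.display === "none" ? "" : "none";
      });
      el.insertAdjacentElement("beforebegin", button);
      el.style.display = "none"; // start with the element hidden
      break;
    }
  }
}

// Usage example with a hypothetical pre-selected opt-in checkbox:
const optIn = document.querySelector<HTMLElement>("#newsletter-optin");
if (optIn) {
  applyCountermeasure(optIn, "highlight", "This checkbox was pre-selected for you.");
}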

Authors

René Schäfer
Paul Preuschoff
René Röpke
Sarah Sahabi
Jan Borchers

Publications

    2024

  • René Schäfer, Paul Preuschoff, René Röpke, Sarah Sahabi, and Jan Borchers. Fighting Malicious Designs: Towards Visual Countermeasures Against Dark Patterns. In Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems (CHI '24), 13 pages, Association for Computing Machinery, New York, NY, USA, May 2024. (Forthcoming)
