Fighting Malicious Designs: Towards Visual Countermeasures Against Dark Patterns
Short Paper at ACM CHI '24
by René Schäfer, Paul Preuschoff, René Röpke, Sarah Sahabi, and Jan Borchers
Abstract
Dark patterns are malicious UI design strategies that nudge users towards decisions going against their best interests. To build technical countermeasures against them, dark patterns must be automatically detectable. While researchers have devised algorithms to detect some patterns automatically, little work has used these detection results to technically counter the effects of dark patterns when users face them on their devices. To address this, we tested three visual countermeasures against 13 common dark patterns in an interactive lab study. The countermeasures we tested either (a) highlighted and explained the manipulation, (b) hid it from the user, or (c) let the user switch between the original view and the hidden version. From our data, we were able to extract multiple clusters of dark patterns for which participants preferred specific countermeasures for similar reasons. To support the creation of effective countermeasures, we discuss our findings in the context of a recent ontology of dark patterns.
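To make the three countermeasure styles concrete, here is a minimal, hypothetical sketch of how a browser content script might apply them to a page element that some (assumed) detector has already flagged as a dark pattern. The detector, the CSS selector, and the explanation text are illustrative assumptions and not part of the paper's implementation.

```typescript
// Hypothetical sketch: applying one of the three studied countermeasure styles
// to an element that an assumed detector has flagged as a dark pattern.

type Countermeasure = "highlight" | "hide" | "toggle";

function applyCountermeasure(el: HTMLElement, mode: Countermeasure, explanation: string): void {
  switch (mode) {
    case "highlight":
      // (a) Highlight the manipulation and explain it to the user.
      el.style.outline = "3px solid red";
      el.title = explanation;
      break;
    case "hide":
      // (b) Hide the manipulative element from the user.
      el.style.display = "none";
      break;
    case "toggle": {
      // (c) Let the user switch between the original and the hidden view.
      const button = document.createElement("button");
      button.textContent = "Toggle dark pattern";
      button.addEventListener("click", () => {
        el.style.display = el.style.display === "none" ? "" : "none";
      });
      el.insertAdjacentElement("beforebegin", button);
      break;
    }
  }
}

// Example usage with a hypothetical flagged element (selector is an assumption):
const flagged = document.querySelector<HTMLElement>(".preselected-addon");
if (flagged) {
  applyCountermeasure(flagged, "highlight",
    "This checkbox was pre-selected to sneak an extra item into your basket.");
}
```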
Publications
- René Schäfer, Paul Preuschoff, René Röpke, Sarah Sahabi, and Jan Borchers. Fighting Malicious Designs: Towards Visual Countermeasures Against Dark Patterns. In Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, CHI '24, 13 pages, Association for Computing Machinery, New York, NY, USA, May 2024.