Dossier: Protecting personal data from “dark patterns”
The aim of this dossier is to shed some light on manipulation techniques in the digital world, known as “dark patterns”. The scale of the phenomenon has significant consequences for the protection of personal data. This analysis, carried out within Orange Labs for the “Confidence and Security” domain, draws on documents from a variety of sources: academia, associations, regulatory authorities, research and the press. The approach chosen combines technical, legal and design perspectives. Above all, the aim is to provide food for thought and to raise awareness of a seemingly deep-rooted problem.
The dossier consists of three articles.
The first article defines “dark patterns”. It reports on recently published benchmark studies whose results make it possible to assess the impact on the processing of personal data, and it presents tools for monitoring the phenomenon.
The second article approaches the topic from the perspective of regulatory models and the types of sanctions provided for by legislation, mainly in the United States and Europe. Faced with the harmful effects on society of practices designed to increase profits from digital advertising, some people have expressed regret, and many converging warnings denounce the seriousness of the situation in the digital world.
The third article highlights the rejection of this situation and the challenges to be met to protect our freedoms and confidence in the digital world.
The purpose of these articles is to contribute, as far as possible, to shedding light on and provoking debate about this important subject.
Article 1: What are “dark patterns” and their impacts on personal data?
Perhaps you have noticed “dark patterns” being mentioned more and more often. We can all fall victim to them when we use digital technology. They can capture our attention, influence our choices and lead to the massive collection of personal data.
What are “dark patterns”? The subject sits at the crossroads of social science, interface design and behavioural marketing. What are its links with security and personal data protection? What findings are now available to measure the extent of the phenomenon? What tools can shed light on the matter? And are “dark patterns” not one of the essential facets of a broader issue surrounding digital advertising?
Since 2010, the designer Harry Brignull has been identifying and denouncing interfaces designed to manipulate users, and his website darkpatterns.org has become a point of reference. “Dark patterns” are sometimes associated with the term “nudge”[i], which has a more positive connotation and refers to tools or methods intended to influence people’s behaviour.
Dark patterns are defined as manipulation techniques used in a service or product to deliberately mislead users to the benefit of the supplier. They exploit our cognitive biases, which researchers such as A. Acquisti see as an explanation for the privacy paradox, “where we make huge amounts of personal information available online while worrying about the consequences of sharing it”.[ii]
“Dark patterns” apply to any type of user interface, regardless of the website’s domain (see the study by Mathur et al. on “dark patterns” in e-commerce sites[iii]). They can influence individuals’ attitudes, choices and consumption.
With regard to the principles of personal data protection, the French data protection authority (CNIL) proposed in March 2019, in the sixth Cahier Innovation & Prospective [Cahier IP6], “The Form of Choices”[iv], a non-exhaustive typology of potentially misleading designs that aim to influence consent, confuse the individual, hinder usability or push the individual to share more data than necessary, using five tactics: take advantage, seduce, lure, complicate, prohibit.
As Cahier IP6 highlights, some practices may remain compliant with the General Data Protection Regulation (GDPR)[v] but, depending on the timing, the manner and the data involved, may either raise ethical questions or become non-compliant.
For example, a potentially misleading design may ask for consent to data collection at a moment when the individual is known to be in a weak position, because they are in a hurry or eager to finish. Or the wording may make users feel guilty about their choice if they refuse to be tracked for advertising purposes. Other common techniques involve making the path to the right level of information or the most specific settings deliberately long and time-consuming, or making those settings so granular and complicated that users abandon the process before reaching their initial goal.
There may be large-scale manipulation, but there are tools to regain control
More and more studies have recently heightened awareness of the impact of interfaces on the way users behave. People from many disciplines have taken action, and cybersecurity researchers are showing an interest, as demonstrated by the talk given by Claude Castelluccia and Daniel Le Métayer of Inria (the French National Institute for Research in Digital Science and Technology) on “Influence or Manipulation? What Protections in the Digital World?”[vi] at the 12th CPDP (Computers, Privacy and Data Protection) international conference in January 2019.
In 2018, the Norwegian Consumer Council issued a frequently cited report entitled “Deceived by Design”[vii] denouncing the bad practices employed by Facebook, Google, and Microsoft.
CNIL’s Cahier IP6 elaborates on the problem of manipulative design and encourages public debate in favour of greater transparency. According to this study, the GDPR, applicable since May 2018, can only promote good design practices if all stakeholders are aware of the issue and take action. The CNIL report also encourages reverse engineering and market-based means of regulation.
In academic research, three studies on consent and tracking management practices have recently been published. Their results demonstrate the use of large-scale manipulation techniques and their impact on privacy, despite the principles set by the GDPR and the expectations of data subjects with regard to their rights.
First, the study conducted by Utz et al.[viii] provides a better understanding of how the design of consent and information management tools shapes the way users interact with them. The study suggests that “the current business models of many data-based Web services, which often use ‘dark patterns’ to encourage people to accept data collection, may no longer be sustainable if the default protection principle of GDPR is applied.”
Second, Matte et al.[ix] set out to verify the compliance with European principles of consent management tools built on “Consent Management Providers” (CMPs), the actors in charge of collecting end users’ consent and redistributing it to advertisers. To measure the compliance of banners, the researchers developed two tools. The first, “Cookinspect” (for Google Chrome), automatically visits websites with saved consent preferences and intercepts the transmission of consent to third parties. The second, “Cookie Glasses” (for Google Chrome and Firefox), lets users detect a banner and check whether their choice is correctly passed on to advertisers by the CMPs. According to the results of this study, of 1,426 European websites implementing an IAB Europe Transparency & Consent Framework (TCF) banner, 54% have at least one violation of the consent criteria.
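To see what these tools are checking, it helps to know that under TCF v1 the user’s choice travels as a compact, base64url-encoded bitfield (commonly stored in the “euconsent” cookie) that CMPs share with advertisers. The TypeScript sketch below decodes a few fields of that string, assuming the bit offsets of the TCF v1 core layout; it is a simplified illustration of the principle, not the actual code of Cookinspect or Cookie Glasses.

```typescript
// Minimal decoder for the core segment of an IAB TCF v1 consent string.
// Bit offsets assume the TCF v1 specification; check them against the
// official IAB documentation before relying on this sketch.
function base64UrlToBits(segment: string): string {
  const b64 = segment.replace(/-/g, "+").replace(/_/g, "/");
  const raw = atob(b64 + "=".repeat((4 - (b64.length % 4)) % 4));
  let bits = "";
  for (let i = 0; i < raw.length; i++) {
    bits += raw.charCodeAt(i).toString(2).padStart(8, "0");
  }
  return bits;
}

function decodeTcfV1(consentString: string) {
  const bits = base64UrlToBits(consentString.split(".")[0]);
  const read = (offset: number, length: number) =>
    parseInt(bits.slice(offset, offset + length), 2);
  const purposesAllowed: number[] = [];
  for (let i = 0; i < 24; i++) {        // 24 purpose bits start at bit 132
    if (bits[132 + i] === "1") purposesAllowed.push(i + 1);
  }
  return {
    version: read(0, 6),                // consent string format version
    cmpId: read(78, 12),                // id of the CMP that wrote the string
    vendorListVersion: read(120, 12),   // global vendor list version used
    purposesAllowed,                    // IAB purposes the user consented to
  };
}

// Usage (reading the consent string from the "euconsent" cookie):
// decodeTcfV1(document.cookie.match(/euconsent=([^;]+)/)?.[1] ?? "");
```

A banner that records a refusal should produce an empty purposesAllowed list; a non-empty list after the user has clicked “refuse” is precisely the kind of mismatch these studies count as a violation.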
Finally, in the study “Dark Patterns after the GDPR: Scraping Consent Pop-ups and Demonstrating their Influence”, published in early January 2020, Nouwens et al.[x], five researchers from MIT, UCL and Aarhus University, found that, among the 10,000 most popular websites in the UK, only 11.8% of the sites using the five CMPs studied (Cookiebot, Crownpeak, OneTrust, Quantcast and TrustArc) met the minimum requirements set by the European legal and regulatory framework. According to this paper, “dark patterns” and implied consent are ubiquitous. A browser extension was also designed for this study, which automatically answers consent pop-ups according to the user’s customisable preferences. This extension, called “Consent-o-Matic”[xi], is available for both Firefox and Chrome.
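The principle behind such an extension can be sketched in a few lines of a content script: watch the page for banners matching known CMP markup and click the control corresponding to the user’s stored preference. The sketch below is a simplified illustration with an invented rule format and invented selectors; Consent-o-Matic’s real rule lists are curated per CMP and considerably richer.

```typescript
// Sketch of an auto-answering content script in the spirit of Consent-o-Matic.
// The rule shape and the selectors below are illustrative placeholders.
interface CmpRule {
  banner: string;        // selector that identifies a known CMP banner
  rejectButton: string;  // selector of the control that refuses tracking
}

const rules: CmpRule[] = [
  { banner: "#exampleCmpDialog", rejectButton: "#exampleCmpDialog .reject-all" },
];

function answerKnownBanners(): void {
  for (const rule of rules) {
    if (document.querySelector(rule.banner)) {
      // Express the user's stored preference as soon as the banner appears.
      document.querySelector<HTMLElement>(rule.rejectButton)?.click();
    }
  }
}

// Banners are often injected after page load, so keep watching the DOM.
answerKnownBanners();
new MutationObserver(answerKnownBanners).observe(document.documentElement, {
  childList: true,
  subtree: true,
});
```

The design choice here is the same one the study’s extension makes: rather than parsing arbitrary pop-ups, it matches a maintained list of known banner structures, which keeps false clicks rare at the cost of having to update rules as CMPs change their markup.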
[i] Richard H. Thaler and Cass R. Sunstein. 2008. Nudge: Improving Decisions About Health, Wealth and Happiness. Yale University Press, New Haven, CT. https://www.researchgate.net/publication/235413094_NUDGE_Improving_Decisions_About_Health_Wealth_and_Happiness
[ii] Cahier IP6: https://www.cnil.fr/sites/default/files/atoms/files/cnil_cahiers_ip6.pdf
[iii] Arunesh Mathur, Gunes Acar, Michael J. Friedman, Elena Lucherini, Jonathan Mayer, Marshini Chetty, and Arvind Narayanan. 2019. Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites. Proc. ACM Hum.-Comput. Interact. 3, CSCW, Article 81 (November 2019), 32 pages. https://doi.org/10.1145/3359183
[iv] CNIL, Cahier IP6 (see note [ii]): https://www.cnil.fr/sites/default/files/atoms/files/cnil_cahiers_ip6.pdf
[v] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32016R0679
[vi] https://www.youtube.com/watch?v=7Gm2QLp3fl0
[vii] https://fil.forbrukerradet.no/wp-content/uploads/2018/06/2018-06-27-deceived-by-design-final.pdf
[viii] Christine Utz, Martin Degeling, Sascha Fahl, Florian Schaub, and Thorsten Holz. 2019. (Un)informed Consent: Studying GDPR Consent Notices in the Field. In 2019 ACM SIGSAC Conference on Computer and Communications Security (CCS ’19), 11–15 November 2019, London, United Kingdom. ACM, New York, NY, USA, 18 pages. https://doi.org/10.1145/3319535.3354212
[ix] Célestin Matte, Nataliia Bielova, and Cristiana Santos. 2019. Do Cookie Banners Respect my Choice? Measuring Legal Compliance of Banners from IAB Europe’s Transparency and Consent Framework (Under submission). https://arxiv.org/abs/1911.09964v1
[x] Midas Nouwens, Ilaria Liccardi, Michael Veale, David Karger, Lalana Kagal. 2020. Dark Patterns after the GDPR: Scraping Consent Pop-ups and Demonstrating their Influence. CHI ’20 CHI Conference on Human Factors in Computing Systems, 25–30 April 2020, Honolulu, HI, USA. https://arxiv.org/pdf/2001.02479v1.pdf
[xi] Firefox: https://addons.mozilla.org/en-US/firefox/addon/consent-o-matic/
Chrome: https://chrome.google.com/webstore/detail/consent-o-matic/mdjildafknihdffpkfmmpnpoiajfjnjd