Dossier: Protecting personal data from “dark patterns”
The aim of this dossier is to shed some light on manipulation techniques in the digital world, also known as “dark patterns”. The scale of the phenomenon has a significant impact on the protection of personal data. This analysis, carried out within Orange Labs for the “Confidence and security” domain, is based on documents from a variety of sources: academics, associations, regulatory authorities, research and press articles. The chosen approach covers technical, legal and design aspects. Above all, the aim is to provide food for thought and to raise awareness of a seemingly deep-rooted problem.
The dossier consists of three articles.
The first article defines “dark patterns”. It reports on recently published benchmark studies, whose results make it possible to assess the impact on the processing of personal data and to propose monitoring tools.
The second article approaches the topic from the perspective of regulatory models and the types of sanctions provided for by legislation, mainly in the United States and Europe. Faced with the harmful effects on society of practices designed to increase profits from digital advertising, some players now express regret. Many warnings converge in denouncing the gravity of the global situation in the digital world.
The third article highlights the rejection of this situation and the challenges to be met to protect our freedoms and confidence in the digital world.
The purpose of these articles is to contribute, as far as possible, to shedding light on and stimulating debate about this important subject.
Article 3: Dark patterns, an unacceptable downward spiral
Are we going from utopia to digital nightmare? Our digital lives are being undermined by “persuasive” techniques and the profit motive. What are the challenges, and how can we reject these abusive “dark patterns”?
We are supposedly living in “the goldfish civilisation”, according to the title of a book by Bruno Patino, editorial director of Arte France. In the essay, he points to the risks associated with predictive algorithms. This media specialist and web development pioneer notes that we are like a goldfish going around and around its bowl, rediscovering the world with each lap. According to the title of an August 2019 article in Marianne [1], it is a broken utopia in which Google engineers have put the attention span of millennials (who grew up with screens) at 9 seconds. Hence the so-called “persuasive” techniques designed to make us dependent and create addicted users, such as “random-reward systems” in which “uncertainty produces compulsion”. As the article points out, drawing on Bruno Patino’s essay, the adverse effects of increasing screen exposure have names: “nomophobia” (fear of being separated from one’s phone), “profile schizophrenia” (confusion between one’s Internet profile and one’s real identity) and “athazagoraphobia” (fear of being treated with indifference on social media).
The excessive, even addictive, digital consumption fostered by dark patterns especially affects children [2]: more than three-quarters (76.2%) of 11- to 14-year-olds spend more than two hours a day online on average, which worries parents. But overconsumption also has an impact on the climate and the environment [3].
Sir Tim Berners-Lee, who is deeply worried, warns that we must guard against a “dystopia” [4]. It may still be possible to save the Web, which in his view has fallen prey to commercialism over its 30 years of existence. The inventor of the World Wide Web is proposing a contract [5] that sets out rules for businesses, governments and individuals to follow. Is this a new utopia?
Amnesty International’s November 2019 report points to “the surveillance-based economic model put in place by Facebook and Google”. According to Amnesty International, it “is inherently incompatible with the right to privacy and poses a threat to a whole range of other rights: rights to freedom of opinion, expression and thought, rights to equality and the right to non-discrimination” [6].
While there still seems to be room for criticism and improvement today, by what means could we resist new versions of “dark patterns” installed on a mind/machine interface, such as the one developed by Elon Musk’s company Neuralink [7]? What safeguards are there against companies’ desire to control our behaviour in pursuit of ever-greater profits and new markets?
Criticism is growing increasingly sharp, and calls for vigilance now target a global system that Shoshana Zuboff calls “surveillance capitalism”. This professor emerita at Harvard Business School is examining this new power: “How is it transforming human nature in the name of its lucrative certainties?” [8]
Challenges
Based on fundamental principles and in response to society’s high expectations, how can we protect the rights of the individuals concerned in accordance with the values of a democratic system? What steps should be taken to improve the situation? What avenues give us enough perspective to think about solutions? As highlighted in an article presented at APVP’18 [9], “while confidence in the protection of personal data is often emphasised as a crucial factor in the adoption of digital services, the actual integration of protection measures, in particular Privacy Enhancing Technologies (PETs), is severely lacking. Yet there is an opportunity to create innovative services while ensuring privacy and data minimisation, using cutting-edge technologies designed and evaluated by a community of researchers from prestigious institutions. ENISA, the European Network and Information Security Agency, helps promote PETs and Privacy by Design. But the economic success of digital services that monetise reserves of personal data has stifled opportunities to offer ‘privacy-friendly’ products and services to most people.” Is the GDPR, with its principles of transparency, control by the individuals concerned, and accountability of all players processing data, in a position to create a real alternative and turn things around?
When it comes to choosing technical tools, can we come to a collective decision on strategies that are consistent with our values?
As suggested in the editorial of Le Monde of 10 December 2019 on “the use of health data”: “Let’s take action so that France and Europe are not under the thumb of the multinational digital giants.” The editorial proposes sharing transparent algorithms and analysis software by promoting free software and free culture. In practice, we are still a long way from this.
From a legal standpoint, shouldn’t other, complementary approaches to safeguarding personal data be implemented, given its impact on society as a whole? According to the critical analysis by researchers Antonio Casilli and Paola Tubaro, “our privacy is no longer an individual right but closer to the idea of a bundle of rights and prerogatives to be allocated among citizens, the state and digital companies” [10]. Dominique Cardon, head of the médialab at Sciences Po, is also calling on us to stop thinking about privacy individually and to think of it as a collective right, in order to address the subject of surveillance differently [11].
Practical tools and more ethical approaches are needed to provide solutions. The CNIL IP6 report proposes design recommendations to improve “the quality, accessibility and intelligibility of the information provided to those affected”. Those who design and develop services clearly need tools to formulate novel responses. The objective of producing “analyses for the design of privacy-friendly interfaces for users (acculturating them to data protection concepts, issues to be integrated into the design process, building blocks, principles and rules, etc.) and concrete recommendations (‘do’/‘don’t’, ‘design patterns’, types of transparency and loyalty mechanisms, etc.)” is an essential step. But can these recommendations change the prevailing mind-set of targeted advertising, whose intrinsic rules run counter to data minimisation and to control by those concerned?
How can we now offer simple, useful digital services that give users a customised experience geared towards exercising their rights? What resources are needed to achieve quality services that are valued by users and respect their privacy? The cost to society of “free” advertising-driven services must be reassessed in light of the threats posed by massive data collection, contempt for freedoms, the harmful effects of over-exposure to screens, and the pressure of endless consumption.
Conclusion
Security and personal data protection management already incorporates many ways of countering certain manipulations, such as social engineering, which exploits the “human weaknesses” of an information system as “leverage” to break through its security measures.
A careful study of dark patterns can help us target new threats, identify liabilities and improve security in people’s digital lives. It provides a better understanding of the magnitude of the problems and challenges. Concrete action against “dark patterns” at the design stage helps to reduce risks. The CNIL IP6 report and many scientific articles offer analytical frameworks and examples. Dark patterns are not the isolated malicious practices of a handful of unscrupulous developers. They are in widespread use, and they reveal the acceleration of an economic system that destroys values, foremost among them respect for privacy, self-determination and individual autonomy. Numerous studies demonstrate the mechanisms at work and the non-compliance with the GDPR of widely deployed technical solutions. Against this backdrop, the promise of human happiness and individual fulfilment through new technologies is illusory. Alerts about threats to our freedoms are multiplying.
In Europe, since 2018, the GDPR has provided a co-regulatory framework in which each stakeholder must take responsibility and guarantee solutions for the protection of personal data. Sanctions can be very severe but, for the moment, they remain limited in comparison with the fines levied in the USA. Market regulation could, as suggested in the CNIL IP6 report, serve as leverage for the adoption of best practices. Yet digital advertising giants are accumulating record volumes of data and record profits. For the moment, “dark patterns” seem to be prevailing at the expense of “privacy by design”. How are users who lack information or have no choice supposed to act in this situation? Change is possible if the principle of data protection by default is respected.
The abusive practices observed in connection with “dark patterns” raise many questions about the power acquired by digital giants funded by advertising and how to take collective action to confront the threat posed by manipulation techniques.
[2] Sources: Médiamétrie April 2019 and Pew Research Center
[3] https://theshiftproject.org/category/thematiques/numerique/
[4] https://cacm.acm.org/news/228232-from-topia-to-dystopia-and-back-again/fulltext
[5] https://contractfortheweb.org/fr/
[6] https://www.amnesty.fr/actualites/facebook-et-google-les-geants-de-la-surveillance
[8] https://www.monde-diplomatique.fr/2019/01/ZUBOFF/59443
[9] Stéphane Guilloteau, “Logique de conformité, introduction aux mécanismes de certification en matière de protection des données personnelles” (Compliance logic, an introduction to certification mechanisms for personal data protection), Workshop on Privacy Protection (APVP 2018), 3 to 6 June 2018, Porquerolles, France; publication pending in the journal TSI (Technique et Science Informatiques), https://project.inria.fr/apvp2018/
[10] “Notre vie privée, un concept négociable” (Our privacy, a negotiable concept), Antonio Casilli (Telecom ParisTech/EHESS) and Paola Tubaro (CNRS), opinion column in Le Monde, 22 January 2018. https://www.casilli.fr/2018/01/22/la-vie-privee-et-les-travailleurs-de-la-donnee-le-monde-22-janv-2018/
[11] “Culture numérique” (Digital culture), Dominique Cardon, Presses de Sciences Po, 2019