Dossier: Protecting personal data from “dark patterns”
The aim of this dossier is to shed some light on manipulation techniques in the digital world, also known as “dark patterns”. The scale of the phenomenon has significant impacts on the protection of personal data. This analysis, carried out within Orange Labs for the “Confidence and security” domain, is based on documents from a variety of sources: academics, associations, regulatory authorities, research and press articles. The chosen approach covers technical, legal and design aspects. Above all, the aim is to provide food for thought and to raise awareness of a seemingly deep-rooted problem.
The dossier consists of three articles.
The first article defines “dark patterns”. It reports the findings of recently published benchmark studies, which make it possible to assess the impact on the processing of personal data and to propose monitoring tools.
The second article approaches the topic from the perspective of regulatory models and the types of sanctions provided for by legislation, mainly in the United States and Europe. In light of the harmful effects on society of practices intended to increase profits from digital advertising, some industry figures have expressed regret, and numerous converging warnings denounce the gravity of the situation in the digital world.
The third article highlights the rejection of this situation and the challenges to be met to protect our freedoms and confidence in the digital world.
The purpose of these articles is to contribute, as far as possible, to shedding light on and provoking debate about this important subject.
Article 2: What are the measures against “dark patterns”?
What regulatory models and types of sanctions are in place to combat manipulation techniques? What frameworks are in force in the United States and Europe? These practices are intended to increase profits from digital advertising, but they are having adverse effects on society. In this context, some digital players are repenting, and numerous warnings seem to be converging to signal the gravity of the situation.
Since 2018, the French data protection authority [1] (Commission Nationale de l’Informatique et des Libertés – CNIL) has issued formal notices to several companies for processing geolocation data for targeted advertising without consent, and in January 2019 it imposed a €50 million penalty [2] on Google for lack of transparency, inadequate information and lack of valid consent for the personalisation of advertising.
As a reminder, the GDPR’s compliance logic is based on technical and organisational measures that allow data subjects to retain control over their data and to exercise their rights. Personal data processing, especially on the legal basis of consent, no longer seems able to guarantee an adequate level of protection when interfaces are manipulative by default. Where, then, does responsibility lie, and how are the regulations enforced?
In July 2019, the CNIL adopted new guidelines [3] on read and write operations on a user’s device (including cookies and other trackers). However, the CNIL seems hesitant about which position to take: at the same time, it announced that it would not sanction illegal behaviour in this area until after mid-2020, much to the chagrin of La Quadrature du Net [4], an association defending fundamental freedoms in the digital world. The association challenged the CNIL before the Council of State in order to have the obligations enforced immediately, but its application was rejected in October 2019.
Some design professionals are protesting against bad practices. Ethics is becoming an issue that some agencies, such as the design agency “Les Sismo” [5], have turned into a brand image or a point of differentiation. Various initiatives are emerging: for example, the collective “Designers Éthiques” (Ethical Designers) is behind the “Ethics by Design” [6] conference, first held at the ENS de Lyon in 2017.
The more these abuses appear, the more debate there is about commitments and ethical promises. However, it does not always seem easy to assess what is actually being done, or to describe precisely what “good” or “bad” design is, in a pragmatic approach that takes account of the divergent interests of the various stakeholders.
Rigorous criteria are required to establish best practice. This is the case for consent management: under the GDPR, four cumulative criteria must be met for consent to be validly collected (freely given, specific, informed, unambiguous).
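The cumulative nature of these criteria can be made concrete in code. The sketch below is purely illustrative; the interface and field names are assumptions for the example, not a real compliance API.

```typescript
// Minimal sketch: under the GDPR, consent is valid only when all four
// criteria hold cumulatively. Types and names are illustrative.
interface ConsentRecord {
  freelyGiven: boolean; // no bundling, no detriment for refusing
  specific: boolean;    // granted per purpose, not as a blanket opt-in
  informed: boolean;    // controller identity and purposes disclosed
  unambiguous: boolean; // clear affirmative act, no pre-ticked boxes
}

function isValidConsent(c: ConsentRecord): boolean {
  // A single failed criterion invalidates the consent as a whole.
  return c.freelyGiven && c.specific && c.informed && c.unambiguous;
}
```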
Nevertheless, the objectives of the GDPR are broad and their operational implementation is still hesitant. Protection by design and by default, minimisation of the data collected, accountability and portability rights all seem to be undermined by dark patterns. These are characterised by the priority given to suppliers’ financial interests at the expense of data subjects’ rights, and are likely to lead to litigation and sanctions, or to undermine the authorities and erode confidence.
Work such as that carried out by Utz et al., Matte et al. and Nouwens et al., cited in the first article, can help establish practical criteria for evaluating and improving interfaces.
In January 2020, as part of its action plan on targeted advertising, the CNIL put forward a draft recommendation [7] on how Internet users’ consent should be collected for the use of “cookies and other trackers”. To accompany its publication, the CNIL asked the French Institute of Public Opinion (Institut Français de l’Opinion Publique – IFOP) to survey 1,000 people. Respondents expressed a need for transparency and control: 70% considered it essential to be asked for consent before their browsing data is used via cookies, even if this takes a little longer, and 90% wanted to know which companies are likely to track their browsing, judging the information currently available on this subject insufficient.
At the same time as it published its recommendation, the CNIL released a new version of “Cookieviz” for download, a tool that enables Internet users to view the cookies transmitted while they browse the web. As an article in the newspaper Les Echos [8] highlights, Cookieviz reveals the ramifications between hundreds of players in online advertising and illustrates the complexity of the cookie-based data market. It should also be noted that some industry giants no longer use cookies, or have announced that they are looking for other solutions [9].
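Cookieviz itself is a standalone tool published by the CNIL, but the kind of observation it automates can be approximated in a few lines. The sketch below, which assumes the Puppeteer headless-browser library, simply loads a page and lists the distinct third-party hosts contacted during the visit; it is a rough illustration, not a substitute for the tool.

```typescript
import puppeteer from "puppeteer";

// Illustrative sketch only (not Cookieviz itself): load a page and list
// the third-party hosts the browser contacts while the page renders.
async function listThirdParties(url: string): Promise<void> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  const hosts = new Set<string>();
  page.on("request", (req) => hosts.add(new URL(req.url()).hostname));
  await page.goto(url, { waitUntil: "networkidle2" });
  const firstParty = new URL(url).hostname;
  for (const host of hosts) {
    if (host !== firstParty) console.log(host); // each host is a potential tracker
  }
  await browser.close();
}

listThirdParties("https://example.com"); // placeholder URL
```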
Cahier IP6, published by the CNIL, highlights how limited the means currently available to users are compared with the capacity for influence of platforms whose economic model is based on data mining and advertising: “The effectiveness of design techniques, in terms of capturing the attention of Internet users and influencing their behaviour, inevitably leads us to examine the structures that implement these strategies in the most striking way, at the forefront of which are leading data industrialists” [10].
For the sociologist Dominique Boullier, professor at Sciences Po in Paris, quoted in Cahier IP6, “the whole aim of this struggle to capture available brain time is to massively reduce hesitations and conscious decision-making”.
In a September 2019 article entitled “Dark Patterns: Which Approach to Regulate Them?”, Estelle Hary of the CNIL points out that “the issue of deceptive design is not exclusive to data protection and privacy. In fact, these practices initially came from the advertising world, which has used them to convince people to consume more. The arrival of the Internet allowed these practices to be adapted to a new medium, reinforcing and consolidating them along the way” [11]. The article also notes that the protection of consumer rights is enshrined in various pieces of legislation, notably the European directive on unfair business-to-consumer commercial practices (2005/29/EC) and, in the United States, the code of the Federal Trade Commission, the independent federal agency responsible for enforcing consumer law and regulating anti-competitive business practices.
Strengthening sanctions
The sanctions imposed by the Federal Trade Commission (FTC) in the United States are far larger than those imposed so far under the GDPR. In September 2019, the FTC formalised a $170 million fine against Google and YouTube for violations of the Children’s Online Privacy Protection Act (“COPPA” Rule), concerning data collected from children without parental consent [12]. Even though the FTC’s complaint does not directly target dark patterns in this case, it highlights that parents and children were misled by the service offered, and that YouTube earned millions of dollars by illegally using cookies to serve targeted ads to viewers of children’s channels.
Another example of a sanction in the United States came in early December 2019, when the FTC officially concluded that Cambridge Analytica had misled Facebook users. According to the FTC, Cambridge Analytica “engaged in deceptive practices to harvest personal information from tens of millions of Facebook users for voter profiling and targeting”. In the summer of 2019, the FTC had already imposed a record $5 billion fine on Facebook as punishment for the privacy violations revealed by the Cambridge Analytica case. In its press release, the FTC said that the purpose of this decision was “the creation of a new culture at Facebook where the company finally lives up to the privacy promises it has made to the millions of American consumers who use its platform” [13].
Will this sanction lead to real change, bearing in mind that advertising accounts for 97% of Facebook’s revenue and that its “hunting ground” for collecting ever more data is international in scale? We are reminded that CEO Mark Zuckerberg is accustomed to mea culpas [14] and to successive announcements promising greater privacy.
Might this awareness, encouraged by the CNIL in Cahier IP6, lead to a “market sanction” and give users the means to act?
These companies’ financial success and political influence give them exceptional power. Professor Siva Vaidhyanathan from the University of Virginia condemns an out-of-control situation and calls for radical action [15]. Can the market self-regulation model ensure lasting confidence? In Europe, can the GDPR and its co-regulatory model promote the development of digital systems that actually respect fundamental freedoms?
New sources of pressure are emerging in the United States. First, the California Consumer Privacy Act (CCPA) came into force in January 2020 [16]. Second, in April 2019 Senator Mark R. Warner introduced a bill called the “Deceptive Experiences To Online Users Reduction” or “DETOUR” Act, specifically targeting major web platforms, to regulate practices and prohibit manipulative designs. Finally, as an editorial in the newspaper Le Monde highlighted in September 2019, “the opening of an antitrust investigation against Google in the United States on 9th September shows that the American justice system has finally decided, under public pressure, to question the GAFA monopoly” [17]. Texas Attorney General Ken Paxton, who is leading the proceedings, says: “We have evidence that Google’s business practices may have undermined consumer choice, stifled innovation, violated users’ privacy and put Google in control of the flow and dissemination of online information.”
In Europe, Margrethe Vestager, the Commissioner for Competition, now in office for a second term, is returning to the front line. She is targeting America’s tech giants, such as Google and Apple, whose ultra-dominance of their markets has the effect of significantly reducing consumer choice: “so at some point we may have to look at how these ecosystems can trap consumers” [18].
A ruling by the Court of Justice of the European Union concerning Facebook’s “Like” social plugin should also be noted [19]. This plugin transmits data to Facebook about visitors to the sites that embed it. The Court ruled that these sites can be held jointly responsible with Facebook for the collection of this data, despite the imbalance of resources. Unable to hide behind Facebook, a site must therefore first obtain visitors’ “informed” consent, telling them about the collection of data and its transmission to Facebook.
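The mechanism behind this ruling is worth making explicit. The sketch below (the endpoint URL and function name are hypothetical, for illustration only) shows why merely embedding a third-party plugin transmits visitor data: the browser contacts the third party as soon as the element loads, sending its cookies for that domain and the address of the visited page, before the visitor clicks anything.

```typescript
// Hypothetical illustration of how embedding a social plugin leaks data.
// Inserting the script makes the visitor's browser request it from the
// third party, sending along any cookies it holds for that domain plus
// the Referer header identifying the page being visited.
function embedSocialPlugin(containerId: string): void {
  const script = document.createElement("script");
  script.src = "https://social-network.example/plugin.js"; // placeholder URL
  document.getElementById(containerId)?.appendChild(script);
  // No click is needed: the request above already identifies the visitor.
}
```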
Repentance
In this context, the most vocal criticism comes from the United States. Ethan Zuckerman, the inventor of the pop-up ad, apologised to the Internet in 2014 in a column published by the American magazine The Atlantic [20]. Today, he considers advertising to be the web’s “original sin”: “we’ve trained Internet users to expect that everything they say and do online will be aggregated into profiles (which they cannot review, challenge or change) that shape both what ads and what content they see”.
The “Center for Humane Technology” [21] has become the spearhead of this debate. Led by Tristan Harris, formerly of Google, its goal is to “realign technology with humanity’s best interests”. The organisation seeks to raise citizens’ awareness and to promote designs that protect us from intrinsic vulnerabilities such as cognitive biases, and it encourages political initiatives. Personalities such as Roger McNamee, James Williams and Cathy O’Neil are linked to it.
Roger McNamee is an early investor in Facebook. In a book published in French in 2019, he described an ongoing “catastrophe” but also the rise of resistance against the digital giants. For him, “Facebook is as bad for democracy as smoking is for health”.
Another former Google employee, James Williams of Oxford University’s Digital Ethics Lab [22], has become a figure of the “Ethics by Design” movement cited in Cahier IP6. According to him, all the devices and technology platforms we use every day are designed to capture our attention, and it is time to challenge this approach to digital design and to invent sustainable alternatives that respect our freedom of choice and guard against these risks.
As an article in the magazine Usbek & Rica [23] from February 2018 on Silicon Valley’s repentants notes, “the greatest concern of these professionals, across the board, seems to be focused on future generations or their children, potential victims of the possible abuse of these new technologies”.
For Cahier IP6, “we must nevertheless remain cautious with regard to the rhetoric of tech repentance, which merely fuels the belief that it would be useless to try and regulate these omnipotent companies”.
[3] https://www.cnil.fr/en/cookies-and-other-tracking-devices-cnil-publishes-new-guidelines
[4] https://www.laquadrature.net/2019/10/17/le-conseil-detat-autorise-la-cnil-a-ignorer-le-rgpd/
[5] http://www.sismodesign.com/fr/
[6] https://designersethiques.org/ethics-by-design/
[10] CNIL, Cahier IP6
[11] https://linc.cnil.fr/fr/dark-patterns-quelle-grille-de-lecture-pour-les-reguler
[15] https://www.theguardian.com/commentisfree/2019/jul/26/google-facebook-regulation-ftc-settlement
[16] and its critique by Calimaq: https://scinfolex.com/2020/01/07/un-rgpd-californien-qui-transforme-les-donnees-personnelles-en-marchandises-fictives/
[22] https://digitalethicslab.oii.ox.ac.uk/
[23] https://usbeketrica.com/article/les-repentis-de-la-silicon-valley-s-organisent