From 5c7f93f6b9dbc626cfa98124ca70bb968af2c1c4 Mon Sep 17 00:00:00 2001 From: Robin Berjon Date: Wed, 6 Sep 2023 12:48:06 -0400 Subject: [PATCH] Editorial pass, 1. to 1.1.2 (fixes #298) (#334) * 'incorporating direct feedback from wide review' * 'sec 1' * 'sec 1.1' * 'sec 1.1.1' * 'sec 1.1.2' * Update index.html Co-authored-by: Jeffrey Yasskin * 'less affirmative' * 'align with EWP' * 'switch web grafs' * 'move sentence up' * Update index.html Co-authored-by: Jeffrey Yasskin * Update index.html Co-authored-by: Jeffrey Yasskin * Update index.html Co-authored-by: Jeffrey Yasskin * Update index.html Co-authored-by: Jeffrey Yasskin * Update index.html Co-authored-by: Jeffrey Yasskin * Update index.html Co-authored-by: Jeffrey Yasskin * Update index.html Co-authored-by: Jeffrey Yasskin * Update index.html Co-authored-by: Jeffrey Yasskin * Update index.html Co-authored-by: Jeffrey Yasskin * Update index.html Co-authored-by: Jeffrey Yasskin * 'sense of loss' * 'out to our from JY' * 'missing a verb' * Update index.html Co-authored-by: Jeffrey Yasskin * Update index.html Co-authored-by: Jeffrey Yasskin * 'which what' --------- Co-authored-by: Robin Berjon Co-authored-by: Jeffrey Yasskin Co-authored-by: Daniel Appelquist --- index.html | 192 ++++++++++++++++++++++++++--------------------------- 1 file changed, 93 insertions(+), 99 deletions(-) diff --git a/index.html b/index.html index 80454148..09c230c6 100644 --- a/index.html +++ b/index.html @@ -544,16 +544,16 @@ This is a document containing technical guidelines. However, in order to put those guidelines in context we must first define some terms and explain what we mean by privacy. -The Web is for everyone ([[?For-Everyone]]). It is "a platform that helps people and provides a -net positive social benefit" ([[?ETHICAL-WEB]], [[?design-principles]]). 
One of the ways in which the -Web serves people is by protecting them in the face of asymmetries of power, and this includes -establishing and enforcing rules to govern the power of data. - The Web is a social and technical system made up of [=information flows=]. Because this document is specifically about [=privacy=] as it applies to the Web, it focuses on privacy with respect to information flows. -Information is power. It can be used to predict and to influence people, as well as to design online +The Web is for everyone ([[?For-Everyone]]). It should be "a platform that helps people and provides a +net positive social benefit" ([[?ETHICAL-WEB]]). One of the ways in which the +Web serves people is by seeking to protect them from surveillance and the types of manipulation that data can +enable. + +Information can be used to predict and to influence people, as well as to design online spaces that control people's behaviour. The collection and [=processing=] of information in greater volume, with greater precision and reliability, with increasing interoperability across a growing variety of data types, and at intensifying speed is leading to a concentration of power that threatens @@ -562,22 +562,24 @@ behaviours that would be more easily kept in check if the perpetrator had to be in the same room as the victim. -These asymmetries of information and of automation create significant asymmetries of power. - -Data governance is the system of principles that regulate [=information flows=]. When -[=people=] are involved in [=information flows=], [=data governance=] determines how -these principles constrain and distribute the power of information between different [=actors=]. -Such principles describe the ways in which different [=actors=] may, must, -or must not produce or [=process=] flows of information from, to, or about other [=actors=] -([[?GKC-Privacy]], [[?IAD]]). 
+When an [=actor=] can collect [=data=] about a [=person=] and process it automatically, and that
+[=person=] cannot automatically protect their [=data=] and prevent its processing ([=automation asymmetry=]),
+this creates an imbalance of power that favors that [=actor=] and decreases the [=person=]'s agency.
+This document focuses on the impact that [=data=] [=processing=] can have on people, but it can also
+impact other [=actors=], such as companies or governments.
 
 It is important to keep in mind that not all people are equal in how they can resist
-the imposition of unfair principles: some [=people=] are more [=vulnerable=] and therefore in greater
-need of protection. This document focuses on the impact that differences in information power can
-have on people, but those differences can also impact other [=actors=], such as companies or governments.
+an imbalance of power: some [=people=] are more [=vulnerable=] and therefore in greater
+need of protection.
+
+Data governance is the system of principles that regulate [=information flows=].
+[=Data governance=] determines
+which [=actors=] can collect what [=data=] and how they may, must, or must not [=process=] it
+([[?GKC-Privacy]], [[?IAD]]). This document provides building blocks for [=data governance=]
+that puts [=people=] first.
 
-Principles vary from [=context=] to [=context=] ([[?Understanding-Privacy]], [[?Contextual-Integrity]]): people
-have different expectations of [=privacy=] at work, at a café, or at home for instance. Understanding and
+Principles vary from [=context=] to [=context=] ([[?Understanding-Privacy]], [[?Contextual-Integrity]]).
+For instance, people have different expectations of [=privacy=] at work, at a café, or at home.
Understanding and evaluating a privacy situation is best done by clearly identifying: * Its [=actors=], which include the subject of the information as well as the sender and the recipient @@ -585,28 +587,22 @@ * The type of data involved in the [=information flow=]. * The principles that are in use in this context. -It is important to keep in mind that there are always privacy principles and that all -of them imply different power dynamics. Some sets of principles may be more permissive, but that does -not make them neutral — it means that they support the power dynamic that -comes with permissive [=processing=]. We must therefore determine which principles -best align with ethical Web values in Web [=contexts=] ([[?ETHICAL-WEB]], [[?Why-Privacy]]). - -Information flows as used in this document refer to information -exchanged or processed by [=actors=]. The information itself need not necessarily be -[=personal data=]. Disruptive or interruptive information flowing to a -person is in scope, as is [=de-identified=] [=data=] that can be used to manipulate people or that -was extracted by observing people's behaviour on a website. - -[=Information flows=] need to be understood from more than one perspective: there is the flow of information -about a person (the subject) being processed or transmitted to any other [=actor=], and there is -the flow of information towards a person (the recipient). Recipients can have their privacy violated in multiple ways such as -unexpected shocking images, loud noises while they intend to sleep, manipulative information, -interruptive messages when their focus is on something else, or harassment when they seek -social interactions. +There are always privacy principles at work. Some sets of principles may be more +permissive, but that does not make them neutral. 
All privacy principles have an impact on +[=people=] and we must therefore determine which principles best align with ethical Web values in +Web [=contexts=] ([[?ETHICAL-WEB]], [[?Why-Privacy]]). + +Information flows are information exchanged or processed by +[=actors=]. A person's privacy can be harmed both by their information flowing from them to +other actors and by information flowing toward them. Examples of the latter include: +unexpected shocking images, +loud noises while they intend to sleep, manipulative information, interruptive +messages when their focus is on something else, or harassment when they seek social interactions. +(In some of these cases, the information may not be [=personal data=].) On the Web, [=information flows=] may involve a wide variety of [=actors=] that are not always recognizable or obvious to a user within a particular interaction. Visiting a website may involve -the actors that operate that site and its functionality, but also actors with network access, +the actors that contribute to operating that site, but also actors with network access, which may include: Internet service providers; other network operators; local institutions providing a network connection including schools, libraries or universities; government intelligence services; malicious hackers who have gained access to the network or the systems of any of the other actors. @@ -623,66 +619,62 @@ A [=person=]'s autonomy is their ability to make decisions of their own personal will, without undue influence from other [=actors=]. People have limited intellectual resources and -time with which to weigh decisions, and by necessity rely on shortcuts when making -decisions. This makes their preferences, including privacy preferences, malleable and susceptible to -manipulation ([[?Privacy-Behavior]], [[?Digital-Market-Manipulation]]). 
A [=person=]'s [=autonomy=] is enhanced by a -system or device when that system offers a shortcut that aligns more with what that [=person=] would -have decided given arbitrary amounts of time and relatively unlimited intellectual ability; -and [=autonomy=] is decreased when a similar shortcut goes against decisions made under such -ideal conditions. +time with which to weigh decisions, and they have to rely on shortcuts when making decisions. This makes it possible +to manipulate their preferences, including their privacy preferences ([[?Privacy-Behavior]], [[?Digital-Market-Manipulation]]). +A [=person=]'s [=autonomy=] is improved by a system when that system offers a shortcut that is closer to what +that [=person=] would have decided given unlimited time and intellectual ability. [=Autonomy=] is decreased +when a similar shortcut goes against decisions made under these ideal conditions. Affordances and interactions that decrease [=autonomy=] are known as deceptive patterns (or dark patterns). A [=deceptive pattern=] does not have to be intentional ([[?Dark-Patterns]], [[?Dark-Pattern-Dark]]). - -Because we are all subject to motivated reasoning, the design of defaults and affordances -that may impact [=autonomy=] should be the subject of independent scrutiny. +When building something that may impact people's [=autonomy=], it is important that reviewers +from multiple independent perspectives check that it does not introduce [=deceptive patterns=]. Given the large volume of potential [=data=]-related decisions in today's data economy, -complete informational self-determination is impossible. This fact, however, should not be -confused with the idea that privacy is dead. Studies show that [=people=] remain concerned over how -their [=data=] is [=processed=], feeling powerless and like they have lost agency -([[?Privacy-Concerned]]). 
Careful design of our technological infrastructure can ensure that -people's [=autonomy=] with respect to their own [=data=] is enhanced through [=appropriate=] -defaults and choice architectures. +it is impossible for people to have detailed control over how their data is processed. +This fact does not imply that privacy is dead. Studies show that +[=people=] remain concerned over how their [=data=] is [=processed=], that they feel powerless, +and sense that they have lost agency ([[?Privacy-Concerned]]). If we design our technological infrastructure +carefully, we can give people greater [=autonomy=] with respect to their own [=data=]. This is +done by setting [=appropriate=], privacy-protective defaults and designing user-friendly choice +architectures. ### Opt-in, Consent, Opt-out, Global Controls {#opt-in-out} Several kinds of mechanisms exist to enable [=people=] to control how they interact -with systems in the world. Mechanisms that increase the number of [=purposes=] for which -their [=data=] is being [=processed=] or the amount their [=data=] is [=processed=] +with data-processing systems. Mechanisms that increase the number of [=purposes=] for which +their [=data=] is being [=processed=] or the amount of their [=data=] that is [=processed=] are referred to as [=opt-in=] or consent. Mechanisms -that decrease this number of [=purposes=] or amount of [=processing=] are known as +that decrease this number of [=purposes=] or the amount of [=data=] being [=processed=] are known as opt-out. -When deployed thoughtfully, these mechanisms can enhance [=people=]'s [=autonomy=]. Often, +When deployed thoughtfully, these mechanisms can improve [=people=]'s [=autonomy=]. Often, however, they are used as a way to avoid putting in the difficult work of deciding which types of [=processing=] are [=appropriate=] and which are not, offloading [=privacy labour=] to the people using a system. 
-In specific cases, [=people=] should be able to [=consent=] to data sharing that would otherwise be restricted, -such as having their [=identity=] or reading history shared across contexts. -[=Actors=] need to take care that their users are *informed* when granting this [=consent=] and -*aware* enough about what's going on that they can know to revoke their consent -when they want to. -[=Consent=] is comparable to the general problem of permissions on the Web -platform. Both consent and permissions should be requested in a way that lets -people delay or avoid answering if they're trying to do something else. If -either results in persistent data access, there should be an indicator that lets -people notice and that lets them turn off the access if it has lasted longer -than they want. In general, providing [=consent=] should be rare, intentional, -and temporary. - -When an [=opt-out=] mechanism exists, it should preferably be complemented by a +[=People=] should be able to [=consent=] to data sharing that would +otherwise be restricted, such as granting access to their pictures or geolocation. +[=Actors=] need to take care that their users are [*informed*](#consent-principles) when +granting this [=consent=] and *aware* enough about what's going on that they can know to +revoke their consent when they want to. +[=Consent=] to data processing and granting permissions to access Web platform APIs are +similar problems. Both consent and permissions should be requested in a way that lets +people delay or avoid answering if they're trying to do something else. If the user +grants some form of persistent access to data, there should be an indicator that lets +people notice this ongoing access and that lets them turn it off whenever they wish to. +In general, providing [=consent=] should be rare, intentional, and temporary. + +When an [=opt-out=] mechanism exists, it should preferably work with a global opt-out mechanism. 
The function of a [=global opt-out=] mechanism is to -rectify the automation asymmetry whereby service providers can automate +rectify the automation asymmetry whereby service providers can automate [=data processing=] but [=people=] have to take manual action to prevent it. A good example of a [=global opt-out=] mechanism is the Global Privacy Control [[?GPC]]. Conceptually, a [=global opt-out=] mechanism is an automaton operating as part of the -[=user agent=], which is to say that it is equivalent to a robot that would carry out a -[=person=]'s bidding by pressing an [=opt-out=] button with every interaction that the -[=person=] has with a site, or more generally conveys an expression of the [=person=]'s -rights in a relevant jurisdiction. (For instance, the [=person=] may be objecting to [=processing=] +[=user agent=]. It is equivalent to a robot that would carry out a [=person=]'s instructions +by pressing an [=opt-out=] button (or a similar expression of the [=person=]'s rights) with every +interaction that the [=person=] has with a site. (For instance, the [=person=] may be objecting to [=processing=] based on legitimate interest, withdrawing [=consent=] to specific [=purposes=], or requesting that their data not be sold or shared.) The [=user=] is effectively delegating the expression of their [=opt-out=] to their [=user agent=], which helps rectify [=automation asymmetry=]. @@ -692,7 +684,7 @@ [=user agent=] but rather as a preference that they have chosen to automatically reaffirm with every interaction with the site. -One implementation strategy for [=opt-outs=] and other data rights is +One implementation strategy for [=opt-outs=] or other data rights is to assign [=people=] stable [=identifiers=] and to maintain a central registry to map these [=identifiers=] to [=people=]'s preferences. 
[=Actors=] that wish to process a given person's data are then expected to fetch that person's preferences from the central registry and to @@ -705,32 +697,31 @@ ### Privacy Labour {#privacy-labour} -Privacy labour is the practice of having a [=person=] carry out +Privacy labour is the practice of having a [=person=] do the work of ensuring [=data processing=] of which they are the subject or recipient is [=appropriate=], instead of putting the responsibility on the [=actors=] who are doing the processing. Data systems that are based on asking [=people=] for their [=consent=] tend to increase [=privacy labour=]. -More generally, implementations of [=privacy=] are often dominated by self-governing approaches that -offload [=labour=] to [=people=]. This is notably true of the regimes descended from the -Fair Information Practices ([=FIPs=]), a loose set of principles initially -elaborated in the 1970s in support of individual [=autonomy=] in the face of growing concerns with databases. The -[=FIPs=] generally assume that there is sufficiently little [=data processing=] taking place that any -[=person=] will be able to carry out sufficient diligence to enable [=autonomy=] in their -decision-making. Since they offload the [=privacy labour=] -to people and assume perfect, unlimited [=autonomy=], the [=FIPs=] do not forbid specific -types of [=data processing=] but only place them under different procedural requirements. -This approach is no longer appropriate. - -One notable issue with procedural, self-governing approaches to privacy is that they tend to have the same +More generally, implementations of [=privacy=] often offload [=labour=] to [=people=]. This is +notably true of the regimes descended from the Fair Information Practices +([=FIPs=]), a loose set of principles initially elaborated in the 1970s in support of individual +[=autonomy=] in the face of growing concerns with databases. 
The [=FIPs=] generally assume that +there is sufficiently little [=data processing=] taking place that any [=person=] will be able to +carry out sufficient diligence to be [=autonomous=] in their decision-making. Since they offload +the [=privacy labour=] to people and assume perfect, unlimited [=autonomy=], the [=FIPs=] do not +forbid specific types of [=data processing=] but only place them under different procedural +requirements. This approach is no longer [=appropriate=]. + +One notable issue with procedural approaches to privacy is that they tend to have the same requirements in situations where people find themselves in a significant asymmetry of power with another [=actor=] — for instance a [=person=] using an essential service provided by a monopolistic platform — and those where a person and the other [=actor=] are very much on equal footing, or even where the [=person=] may have greater power, as is the case with small -businesses operating in a competitive environment. They further do not consider cases in +businesses operating in a competitive environment. They also do not consider cases in which one [=actor=] may coerce other [=actors=] into facilitating its [=inappropriate=] -practices, as is often the case with dominant players in advertising or -in content aggregation ([[?Consent-Lackeys]], [[?CAT]]). +practices, as is often the case with dominant players in advertising or in content aggregation +([[?Consent-Lackeys]], [[?CAT]]). Reference to the [=FIPs=] survives to this day. They are often referenced as "transparency and choice", which, in today's digital environment, is often an indication that @@ -743,7 +734,11 @@ of [=privacy=] in a given context can be contested ([[?Privacy-Contested]]). This makes privacy a problem of collective action ([[?GKC-Privacy]]). Group-level [=data processing=] may impact populations or individuals, including in -ways that [=people=] could not control even under the optimistic assumptions of [=consent=]. 
+ways that [=people=] could not control even under the optimistic assumptions of [=consent=]. For instance, +it's possible that the only thing that a person is willing to reveal to a particular actor is that they +are part of a given group. However, other members of the same group may be interacting with the same +actor and revealing a lot more information, which can enable effective statistical inferences about +people who refrain from providing information about themselves. What we consider is therefore not just the relation between the [=people=] who share data and the [=actors=] that invite that sharing ([[?Relational-Turn]]), but also between the [=people=] @@ -816,8 +811,7 @@ to ensure that "broad testing and audit continues to be possible" where [=information flows=] and automated decisions are involved. -Such transparency can only function if there are strong rights -of access to data (including data +Such transparency can only function if there are strong rights of access to data (including data derived from one's personal data) as well as mechanisms to explain the outcomes of automated decisions. @@ -1208,7 +1202,7 @@ For example, the URLs of resources, the timing of link clicks, and the referrer chain within a single origin are not guarded by anything; the scroll position is guarded by the setting to turn off JavaScript; and access to the camera or geolocation are guarded by permission prompts. - + When the `` attribute was added, the designers realized that it exposed the scroll position, so it's also guarded by the setting to turn off JavaScript. @@ -1269,7 +1263,7 @@
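
Note: the patch above describes the Global Privacy Control [[?GPC]] as a [=global opt-out=] mechanism that the [=user agent=] reasserts with every interaction. As a minimal sketch of what honoring that signal could look like on the receiving side: GPC conveys the opt-out via a `Sec-GPC: 1` request header (pages can also read `navigator.globalPrivacyControl`), while the helper names `hasGpcSignal` and `handleRequest` below are hypothetical, not part of any spec.

```javascript
// Sketch: honoring a Global Privacy Control opt-out signal server-side.
// GPC sends a `Sec-GPC: 1` request header; `hasGpcSignal` and
// `handleRequest` are illustrative names, not defined by the proposal.

// Returns true when the request headers carry a GPC opt-out signal.
function hasGpcSignal(headers) {
  // HTTP header names are case-insensitive; the only defined value is "1".
  const entry = Object.entries(headers).find(
    ([name]) => name.toLowerCase() === "sec-gpc"
  );
  return entry !== undefined && String(entry[1]).trim() === "1";
}

// Treats the signal as an automatic, per-interaction opt-out: the user
// agent reasserts the person's preference on every request, which is how
// the mechanism rectifies the automation asymmetry described above.
function handleRequest(request) {
  const optedOut = hasGpcSignal(request.headers || {});
  return {
    dataSaleOrSharingAllowed: !optedOut,
    reason: optedOut ? "GPC opt-out signal present" : "no opt-out signal",
  };
}
```

Because the signal rides along on every request, the person expresses their preference once in the user agent and automation does the rest, with no per-site [=privacy labour=] and no central registry of identifiers.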