From d668c964a5b6b370dddca18ac3070a8f492bc762 Mon Sep 17 00:00:00 2001
From: Robin Berjon
Date: Wed, 6 Sep 2023 16:49:30 +0000
Subject: [PATCH] Editorial pass, 1. to 1.1.2 (fixes #298) (#334)

SHA: 5c7f93f6b9dbc626cfa98124ca70bb968af2c1c4
Reason: push, by darobin

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
---
 index.html | 828 ++++++++++++++++++++++++++---------------------------
 1 file changed, 411 insertions(+), 417 deletions(-)

diff --git a/index.html b/index.html
index bb066675..1e5e9344 100644
--- a/index.html
+++ b/index.html
@@ -917,14 +917,14 @@

Privacy Principles

1. An Introduction to Privacy on the Web

This is a document containing technical guidelines. However, in order to put those guidelines in context we must first define some terms and explain what we mean by privacy.

-The Web is for everyone ([For-Everyone]). It is "a platform that helps people and provides a
-net positive social benefit" ([ETHICAL-WEB], [design-principles]). One of the ways in which the
-Web serves people is by protecting them in the face of asymmetries of power, and this includes
-establishing and enforcing rules to govern the power of data.

The Web is a social and technical system made up of information flows. Because this document
is specifically about privacy as it applies to the Web, it focuses on privacy with respect to
information flows.

-Information is power. It can be used to predict and to influence people, as well as to design online
+The Web is for everyone ([For-Everyone]). It should be "a platform that helps people and provides a
+net positive social benefit" ([ETHICAL-WEB]). One of the ways in which the
+Web serves people is by seeking to protect them from surveillance and the types of manipulation that data can
+enable.
+
+Information can be used to predict and to influence people, as well as to design online
spaces that control people's behaviour. The collection and processing of information in greater
volume, with greater precision and reliability, with increasing interoperability across a growing
variety of data types, and at intensifying speed is leading to a concentration of power that threatens

@@ -932,45 +932,42 @@


of our lives both increase the power of information and decrease the cost of a number of intrusive behaviours that would be more easily kept in check if the perpetrator had to be in the same room as the victim.

-These asymmetries of information and of automation create significant asymmetries of power.

-Data governance is the system of principles that regulate information flows. When
-people are involved in information flows, data governance determines how
-these principles constrain and distribute the power of information between different actors.
-Such principles describe the ways in which different actors may, must,
-or must not produce or process flows of information from, to, or about other actors
-([GKC-Privacy], [IAD]).

+When an actor can collect data about a person and process it automatically, and that
+person cannot automatically protect their data and prevent its processing (automation asymmetry),
+this creates an imbalance of power that favors that actor and decreases the person's agency.
+This document focuses on the impact that data processing can have on people, but it can also
+impact other actors, such as companies or governments.

It is important to keep in mind that not all people are equal in how they can resist
-the imposition of unfair principles: some people are more vulnerable and therefore in greater
-need of protection. This document focuses on the impact that differences in information power can
-have on people, but those differences can also impact other actors, such as companies or governments.

-Principles vary from context to context ([Understanding-Privacy], [Contextual-Integrity]): people
-have different expectations of privacy at work, at a café, or at home for instance. Understanding and
+an imbalance of power: some people are more vulnerable and therefore in greater
+need of protection.
+
+Data governance is the system of principles that regulate information flows.
+Data governance determines which actors can collect what data and how they
+may, must, or must not process it ([GKC-Privacy], [IAD]). This document provides
+building blocks for data governance that puts people first.
+
+Principles vary from context to context ([Understanding-Privacy], [Contextual-Integrity]).
+For instance, people have different expectations of privacy at work, at a café, or at home. Understanding and
evaluating a privacy situation is best done by clearly identifying:

-It is important to keep in mind that there are always privacy principles and that all
-of them imply different power dynamics. Some sets of principles may be more permissive, but that does
-not make them neutral — it means that they support the power dynamic that
-comes with permissive processing. We must therefore determine which principles
-best align with ethical Web values in Web contexts ([ETHICAL-WEB], [Why-Privacy]).

-Information flows as used in this document refer to information
-exchanged or processed by actors. The information itself need not necessarily be
-personal data. Disruptive or interruptive information flowing to a
-person is in scope, as is de-identified data that can be used to manipulate people or that
-was extracted by observing people's behaviour on a website.

-Information flows need to be understood from more than one perspective: there is the flow of information
-about a person (the subject) being processed or transmitted to any other actor, and there is
-the flow of information towards a person (the recipient). Recipients can have their privacy violated in multiple ways such as
-unexpected shocking images, loud noises while they intend to sleep, manipulative information,
-interruptive messages when their focus is on something else, or harassment when they seek
-social interactions.

-On the Web, information flows may involve a wide variety of actors that are not always
+There are always privacy principles at work. Some sets of principles may be more
+permissive, but that does not make them neutral. All privacy principles have an impact on
+people, and we must therefore determine which principles best align with ethical Web values in
+Web contexts ([ETHICAL-WEB], [Why-Privacy]).
+
+Information flows are information exchanged or processed by
+actors. A person's privacy can be harmed both by their information flowing from them to
+other actors and by information flowing toward them. Examples of the latter include:
+unexpected shocking images,
+loud noises while they intend to sleep, manipulative information, interruptive
+messages when their focus is on something else, or harassment when they seek social interactions.
+(In some of these cases, the information may not be personal data.)
+
+On the Web, information flows may involve a wide variety of actors that are not always
recognizable or obvious to a user within a particular interaction. Visiting a website may involve
-the actors that operate that site and its functionality, but also actors with network access,
+the actors that contribute to operating that site, but also actors with network access,
which may include: Internet service providers; other network operators; local institutions providing
a network connection including schools, libraries or universities; government intelligence services;
malicious hackers who have gained access to the network or the systems of any of the other actors.

@@ -981,99 +978,95 @@


which could include friends, family members, teachers, strangers, or government officials. Some threats to privacy, including both disclosure and harassment, may be particular to the other people involved in the information flow.

1.1 Individual Autonomy

-A person's autonomy is their ability to make decisions of their own personal will,
-without undue influence from other actors. People have limited intellectual resources and
-time with which to weigh decisions, and by necessity rely on shortcuts when making
-decisions. This makes their preferences, including privacy preferences, malleable and susceptible to
-manipulation ([Privacy-Behavior], [Digital-Market-Manipulation]). A person's autonomy is enhanced by a
-system or device when that system offers a shortcut that aligns more with what that person would
-have decided given arbitrary amounts of time and relatively unlimited intellectual ability;
-and autonomy is decreased when a similar shortcut goes against decisions made under such
-ideal conditions.
+A person's autonomy is their ability to make decisions of their own personal will,
+without undue influence from other actors. People have limited intellectual resources and
+time with which to weigh decisions, and they have to rely on shortcuts when making decisions. This makes it possible
+to manipulate their preferences, including their privacy preferences ([Privacy-Behavior], [Digital-Market-Manipulation]).
+A person's autonomy is improved by a system when that system offers a shortcut that is closer to what
+that person would have decided given unlimited time and intellectual ability. Autonomy is decreased
+when a similar shortcut goes against decisions made under these ideal conditions.

Affordances and interactions that decrease autonomy are known as deceptive patterns (or dark patterns).
-A deceptive pattern does not have to be intentional ([Dark-Patterns], [Dark-Pattern-Dark]).

-Because we are all subject to motivated reasoning, the design of defaults and affordances
-that may impact autonomy should be the subject of independent scrutiny.

-Given the large volume of potential data-related decisions in today's data economy,
-complete informational self-determination is impossible. This fact, however, should not be
-confused with the idea that privacy is dead. Studies show that people remain concerned over how
-their data is processed, feeling powerless and like they have lost agency
-([Privacy-Concerned]). Careful design of our technological infrastructure can ensure that
-people's autonomy with respect to their own data is enhanced through appropriate
-defaults and choice architectures.


2. Principles for Privacy on the Web

@@ -1350,14 +1346,14 @@


These principles should be enforced by user agents. When this is not possible, additional enforcement mechanisms are needed.

2.1 Identity on the Web

Principle: A user agent
should help its user present the identity they want in each context
they are in, and should prevent or support recognition as appropriate.

A person's identity is the set of characteristics that define
them. Their identity in a context is the set of characteristics they present under particular
circumstances.

People can present different identities to different contexts, and can

@@ -1365,24 +1361,24 @@


People may wish to present an ephemeral or anonymous identity. This is a set of characteristics that is too small or unstable to be useful for following them through time.

A person's identities may often be distinct from whatever legal identity
or identities they hold.

In some circumstances, the best way for a user agent to uphold this
principle is to prevent recognition (e.g. so that one site can't
learn anything about its user's behavior on another site).

In other circumstances, the best way for a user agent to uphold this
principle is to support recognition (e.g. to help its user prove
to one site that they have a particular identity on another site).

Similarly, a user agent can help its user by preventing or supporting
recognition across repeat visits to the same site.

User agents should do their best to distinguish contexts within a site and
adjust their partitions to prevent or support recognition across those intra-site contexts
according to their users' wishes.

2.2 Data Minimization

Principle: Sites, user agents, and other actors
should minimize the amount of personal data they transfer.

Principle: Web APIs should be designed to minimize the amount of data that sites need
to request to carry out their users' goals and provide granularity and user controls over personal
data that is communicated to sites.
Principle: In maintaining duties of protection, discretion and loyalty, user agents
should share data only when it either is needed to satisfy a user's immediate goals
or aligns with the user's wishes and interests.

@@ -1392,12 +1388,12 @@


Data minimization limits the risks of data being disclosed or misused. It also
helps user agents and other actors more meaningfully explain the decisions their users need
to make. For more information, see Data Minimization in Web APIs.

Web APIs should be designed to minimize the amount of data that sites need
to request to pursue their users' goals and interests. They should also provide granular
user controls over personal data that is communicated to sites.
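
For instance, a site that only needs coarse location can ask for exactly that. A minimal
TypeScript sketch using the standard Geolocation API (the store-locator framing and the
function name are invented for illustration):

  // A store locator needs only city-level accuracy, so it declines
  // high-accuracy (GPS-grade) readings and accepts a cached position.
  function locateForStoreSearch(): Promise<GeolocationPosition> {
    return new Promise((resolve, reject) => {
      navigator.geolocation.getCurrentPosition(resolve, reject, {
        enableHighAccuracy: false, // no more precision than the goal needs
        maximumAge: 10 * 60 * 1000, // a position up to ten minutes old is fine
        timeout: 5000,
      });
    });
  }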

The principle of data minimization applies to all personal data, even if it
is not known to be identifying, sensitive, or otherwise harmful. See: 2.4 Sensitive Information.

Different users will want to share different kinds and amounts of
ancillary data with sites. Some people will not want to share any
ancillary data at all.

Users may be willing to share ancillary data if it is aggregated with
the data of other users, or de-identified. This can be useful
when ancillary data contributes to a collective benefit in a way that
reduces privacy threats to individuals (see collective privacy).

@@ -1490,7 +1486,7 @@


into information about people, web servers, and other things.

User-controlled settings or permissions can guard access to data on the web. When designing
a Web API, use access guards
to ensure the API exposes information in appropriate ways.
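
As a non-normative TypeScript sketch of such a guard (error handling elided), a site can
consult the Permissions API before calling a data-exposing API, and treat a denial as final
rather than re-prompting:

  // Check the permission state before reading guarded data.
  async function readPositionIfPermitted(): Promise<GeolocationPosition | null> {
    const status = await navigator.permissions.query({ name: "geolocation" });
    if (status.state === "denied") {
      return null; // respect the user's decision; do not nag
    }
    // "granted" or "prompt": the call below may still surface a prompt.
    return new Promise((resolve) =>
      navigator.geolocation.getCurrentPosition(resolve, () => resolve(null)),
    );
  }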

When considering whether a class of information is likely to be sensitive to

@@ -1629,9 +1625,9 @@


  • whether it enables other threats, like intrusion.
2.5 Data Rights

Principle:
People have certain rights over data that is about themselves, and these rights should
be facilitated by their user agent and the actors that are processing their
data.
    @@ -1639,36 +1635,36 @@


    While data rights alone are not sufficient to satisfy all privacy principles for the Web, they do support self-determination and help improve accountability. Such rights include:

• The right to access data about oneself.

This right includes both being able to review what information has been collected or inferred about
oneself and being able to discover what actors have collected information about oneself. As a
result, databases cannot be kept secret and data collected about people needs to be meaningfully
discoverable by those people.

• The right to erase data about oneself.

The right to erase applies whether or not one terminates use of a service altogether, though what
data can be erased may differ between those two cases. On the Web, people may wish to erase
data on their device, on a server, or both, and the distinctions may not always be clear.

• The right to
port data, including data one has stored with another actor, so it can easily be reused or
transferred elsewhere.

Portability is needed to realize the ability for people to make choices about services with
different data practices. Standards for interoperability are essential for effective re-use
(a sketch of a portable format follows this list).

• The right to correct data about oneself, to ensure that one's
identity is properly reflected in a system.

• The right to be free from automated decision-making based on data
about oneself.

For some kinds of decision-making with substantial consequences, there is a privacy interest in
being able to exclude oneself from automated profiling. For example, some services may alter the
price of products (price discrimination) or offers for credit or insurance based on data
collected about a person. Those alterations may be consequential (financially, say) and
objectionable to people who believe those decisions based on data about them are inaccurate or
unjust. As another example, some services may draw inferences about a user's identity, humanity, or

@@ -1679,13 +1675,13 @@


  • The right to object, withdraw consent, and restrict use of data about oneself.
People may change their decisions about consent or may object to subsequent uses of data about
themselves. Retaining rights requires ongoing control, not just at the time of collection.

The OECD Privacy Principles [OECD-Guidelines], [Records-Computers-Rights], and the [GDPR],
among other places, include many of the rights people have as data subjects. These participatory
rights by people over data about themselves are inherent to autonomy.
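
To make the portability right concrete, here is a deliberately small, hypothetical TypeScript
export shape; a real service would follow an agreed, documented schema so that other actors
can ingest the data:

  // Hypothetical minimal export format for data portability.
  interface PortableExport {
    formatVersion: string; // lets importers handle schema changes
    exportedAt: string; // ISO 8601 timestamp
    profile: { displayName: string; contactEmail?: string };
    items: Array<{ id: string; createdAt: string; body: string }>;
  }

  function serializeExport(data: PortableExport): string {
    return JSON.stringify(data, null, 2); // readable by humans and machines
  }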

    2.6 De-identified Data

Principle:
Whenever possible, processors should work with data that has been de-identified.

@@ -1693,7 +1689,7 @@


Data is de-identified when there exists a high level of confidence
that no person described by the data can be identified, directly or indirectly
(e.g. via association with an identifier, user agent, or device), by that data alone or in
combination with other available information. Note that further considerations relating to
groups are covered in the

@@ -1702,44 +1698,44 @@


1. The state of the data is such that the information that could be used to re-identify an individual has been removed or altered, and
2. there is a process in place to prevent attempts to re-identify people and the inadvertent
release of the de-identified data. ([De-identification-Privacy-Act])

Different situations involving controlled de-identified data will require different
controls. For instance, if the controlled de-identified data is only being processed by one
actor, typical controls include making sure that the identifiers used in the data are unique
to that dataset, that any person (e.g. an employee of the actor) with access to the data is
barred (e.g. based on legal terms) from sharing the data further, and that technical measures
exist to prevent re-identification or the joining of different data sets involving this data.

    In general, the goal is to ensure that controlled de-identified data is used in a manner that provides a viable degree of oversight and accountability such that technical and procedural means to guarantee the maintenance of pseudonymity are preserved.

This is more difficult when the controlled de-identified data is shared between several
actors. In such cases, good examples of typical controls that are representative of best
practices would include making sure that (a sketch of the first two controls follows the list):

• the identifiers used in the data are under the direct and exclusive control of the
first party (the actor a person is directly interacting with) who is prevented by strict controls
from matching the identifiers with the data;

• when these identifiers are shared with a third party, they are made unique to that third
party such that, if identifiers are shared with more than one third party, those third parties
cannot match them up with one another;

• there is a strong level of confidence that no third party can match the data
with any data other than that obtained through interactions with the first party;

• any third party receiving such data is barred (e.g. based on legal terms)
from sharing it further;

• technical measures exist to prevent re-identification or the joining of
different data sets involving this data; and

    • there exist contractual terms between the first party and third party describing the limited purpose for which the data is being shared.
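
One way to realise the first two controls above (illustrative only; names and key handling
are simplified) is for the first party to derive a different pseudonymous identifier per
third party, for example with an HMAC keyed by a partner-specific secret, so that two
recipients cannot match their datasets up:

  const encoder = new TextEncoder();

  // Derive a per-partner pseudonymous ID: stable for one (user, partner)
  // pair, unlinkable across partners without the first party's secrets.
  async function derivePartnerId(
    firstPartyUserId: string,
    partnerSecret: Uint8Array, // distinct secret held by the first party per partner
  ): Promise<string> {
    const key = await crypto.subtle.importKey(
      "raw", partnerSecret, { name: "HMAC", hash: "SHA-256" }, false, ["sign"],
    );
    const mac = await crypto.subtle.sign("HMAC", key, encoder.encode(firstPartyUserId));
    return Array.from(new Uint8Array(mac))
      .map((b) => b.toString(16).padStart(2, "0"))
      .join(""); // hex-encode the MAC
  }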

Note that controlled de-identified data, on its own, is not sufficient to make
data processing appropriate.

    2.7 Collective Privacy

Principle: Groups and various forms of institutions should best protect and support autonomy by
making decisions collectively rather than individually to either prevent or enable data

@@ -1758,39 +1754,39 @@


individuals have a collective interest in the creation of information about the group, and actions
taken on its behalf.” ([Individual-Group-Privacy]) This justifies ensuring that grouped people can
benefit from both individual and collective means to support their autonomy with respect
to data processing. It should be noted that processing can be unjust even if individuals
remain anonymous, not from the violation of individual autonomy but because it violates ideals of
social equality ([Relational-Governance]).

Another case in which collective decision-making is preferable is for processing for which informed
individual decision-making is unrealistic (due to the complexity of the processing, the volume or
frequency of processing, or both). Expecting laypeople (or even experts) to make informed
decisions relating to complex data processing or to make decisions on a very frequent
basis — even if the processing is relatively simple — is unrealistic if we also want them to have
reasonable levels of autonomy in making these decisions.

The purpose of this principle is to require that data governance provide ways to distinguish
appropriate data processing without relying on individual decisions whenever the latter
are impossible, which is often ([Relational-Governance], [Relational-Turn]).

    Which forms of collective governance are recognised as legitimate will depend on domains. These may take many forms, such as governmental bodies at various administrative levels, standards organisations, worker bargaining units, or civil society fora.

It must be noted that, even though collective decision-making can be better than offloading
privacy labour to individuals, it is not necessarily a panacea. When considering such
collective arrangements it is important to keep in mind the principles that are likely to
support viable and effective institutions at any level of complexity ([IAD]).

A good example of a failure in collective privacy decisions was the standardisation of the
ping attribute. Search engines, social sites, and other algorithmic media in the same vein
have an interest in knowing which of the sites they link to people choose to visit (which in turn
could improve the service for everyone). But people may have an interest in keeping that
information private from algorithmic media companies (as do the sites being linked to, as that
facilitates timing attacks to recognise people there). A person's exit through a specific
link can either be tracked with JavaScript tricks or through bounce tracking, both of which are
slow and difficult for user agents to defend against. The value proposition of the ping
attribute in this context is therefore straightforward: by providing declarative support for this
functionality it can be made fast (the browser sends an asynchronous notification to a ping
endpoint after the person exits through a link) and the user agent can provide its user with
the option to opt out of such tracking — or disable it by default.
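
For illustration, the markup side is just a declarative attribute on a link (for example
ping="https://search.example/exit", with a hypothetical URL), and the browser behaviour it
standardises amounts to roughly this TypeScript sketch:

  // Approximation of what a browser does when a pinged link is followed:
  // an asynchronous, fire-and-forget POST (HTML specifies the body "PING")
  // that does not delay the navigation itself.
  function sendLinkAuditPing(pingUrl: string): void {
    void fetch(pingUrl, {
      method: "POST",
      body: "PING",
      headers: { "Content-Type": "text/ping" },
      keepalive: true, // lets the request outlive the unloading page
    });
  }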

Unfortunately, this arrangement proved to be unworkable on the privacy side (the performance gains,
however, are real). What prevents a site from using ping for people who have it activated
and bounce tracking for others? What prevents a browser from opting everyone out because it wishes
to offer better protection by default? Given the contested nature of the ping attribute and
the absence of a forcing function to support collective enforcement, the scheme failed to deliver

@@ -1814,7 +1810,7 @@


    configure the programs that run on them. As a program running on a device, a user agent generally can't tell whether the administrator who has installed and configured it was authorized by the device's actual owner.

Sometimes the person using a device doesn't own the device or have
administrator access to it (e.g. an employer providing a device to an
employee; a friend loaning a device to their guest; or a parent providing a
device to their young child). Other times, the owner and primary user of a

@@ -1824,13 +1820,13 @@


    prevent their partner from having administrator access to their devices. An employee might have to agree to use their employer's devices in order to keep their job.

While a device owner has an interest and sometimes a responsibility to make sure their device is
used in the ways they intended, the person using the device still has a right to privacy while
using it. This principle enforces this right to privacy in two ways:

1. User agent developers need to consider whether requests from device owners and administrators are reasonable, and refuse to implement unreasonable requests, even if that means fewer sales. Owner/administrator needs do not supersede user needs in the priority of constituencies.
2. Even when information disclosure is reasonable, the person whose data is being disclosed
needs to know about it so that they can avoid doing things that would lead to unwanted
consequences.
    @@ -1898,15 +1894,15 @@


  • Restricting what new users of a service can post, e.g. limiting links and media until a user has interacted a sufficient number of times over a given period with a larger group. This helps to raise the cost of producing sock puppet accounts and gives new users time to understand local norms before posting.
• Only accepting communication between people who have an established relationship of some kind,
such as being part of a shared group. Protocols should consider requiring a handshake between
people prior to enabling communication.
• Requiring a deliberate action from the recipient before rendering media coming from an
untrusted source.
• Supporting the ability for people to block another actor such that they cannot send information
again.
• Pooling mitigation information, for instance shared block lists, shared spam-detection
information, or public information about misbehaving actors.
• Enabling users to filter out or hide information or media based on tags or content warnings.
@@ -1928,7 +1924,7 @@


Sometimes particular groups are classed as “vulnerable” (e.g. children, or the elderly),
but anyone could become privacy vulnerable in a given context.
A person may not realise when they disclose personal data that
they are vulnerable or could become vulnerable.

    Some individuals may be more vulnerable to privacy risks or harm as a result of collection, misuse, loss or theft of personal data because:

    @@ -1937,7 +1933,7 @@


  • of the situation or setting (e.g. where there is information asymmetry or other power imbalances);
  • they lack the capacity to fully assess the risks;
• choices are not presented in an easy-to-understand meaningful way (e.g. deceptive patterns);
  • they have not been consulted about their privacy needs and expectations;
  • they have not been considered in the decisions about the design of the product or service.
@@ -1989,7 +1985,7 @@


    2.13 Non-Retaliation

Principle:
Actors must not retaliate against people who protect their data against
non-essential processing or exercise rights over their data.
Whenever people have the ability to cause an actor to process less of their data or to stop
carrying out some given set of data processing that is not essential to the service, they must be
allowed to do so without the actor retaliating, for instance by artificially removing an
unrelated feature, by decreasing the quality of the service, or by trying to cajole, badger, or
trick the person into opting back into the processing.

    Issue 3

    2.14 Support Choosing Which Information to Present

Principle:
User agents should support people in choosing which information they provide to actors that
request it, up to and including allowing users to provide arbitrary information.
Actors can invest time and energy into automating ways of gathering data from people and can
design their products in ways that make it a lot easier for people to disclose information than not, whereas
people typically have to manually wade through options, repeated prompts, and deceptive patterns. In many
cases, the absence of data — when a person refuses to provide some information — can also be identifying
or revealing. Additionally, APIs can be defined or implemented in rigid ways that can prevent
people from accessing useful functionality. For example, I might want to look for restaurants in a
city I will be visiting this weekend, but if my geolocation is forcefully set to match my GPS, a
restaurant-finding site might only allow searches in my current location. In other cases, sites do
not abide by data minimisation principles and request more information than they require. This
principle supports people in minimising their own data.

    -

    User agents should make it simple for people to present the identity they wish +people in minimising their own data.

    +

    User agents should make it simple for people to present the identity they wish to and to provide information about themselves or their devices in -ways that they control. This helps people to live in obscurity ([Lost-In-Crowd], +ways that they control. This helps people to live in obscurity ([Lost-In-Crowd], [Obscurity-By-Design]), including by obfuscating information about themselves ([Obfuscation]).

Principle: APIs should be designed such that data returned through an API does not assert a
fact or make a

@@ -2130,17 +2126,17 @@


    -

Instead, the API could indicate a person's preference, a person's chosen identity, a
person's query or interest, or a person's selected communication style.

    For example, a user agent might support this principle by:

      -
• Generating domain-specific email addresses or other directed identifiers so that people can
log into the site without becoming recognisable across contexts.
• Offering the option to generate geolocation and accelerometry data with parameters specified by
the user (sketched after this list).
• Uploading a stored video stream in response to a camera prompt.
• Automatically granting or denying permission prompts based on user configuration.
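
As a sketch of the second option above (hypothetical names; a real user agent would do this
internally), the user agent can answer a geolocation request with coordinates the user chose
rather than the device's true position:

  interface LocationOverride {
    latitude: number; // e.g. the city the user plans to visit
    longitude: number;
    accuracyMeters: number; // deliberately coarse
  }

  // Build a synthetic position to hand to the requesting site.
  function makeSyntheticPosition(o: LocationOverride): GeolocationPosition {
    const coords = {
      latitude: o.latitude,
      longitude: o.longitude,
      accuracy: o.accuracyMeters,
      altitude: null, altitudeAccuracy: null, heading: null, speed: null,
    } as GeolocationCoordinates;
    return { coords, timestamp: Date.now() } as GeolocationPosition;
  }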
    @@ -2156,11 +2152,11 @@


    mitigating forms of data collection, including browser fingerprinting.

    A. Common Concepts


A.1 People

A person (also user or
data subject) is any natural person. Throughout this document, we primarily use person or
people to refer to human beings, as a reminder of their humanity. When we use the term user,
it is to talk about the specific person who happens to be using a given system at that time.

A vulnerable person is a person who may be unable to
exercise sufficient self-determination in a context. Amongst other things, they should be
treated with greater default privacy protections and may be considered unable to consent to
various interactions with a system.

@@ -2168,25 +2164,25 @@


    are employees with respect to their employers, are facing a steep asymmetry of power, are people in some situations of intellectual or psychological impairment, are refugees, etc.


A.2 Contexts

A context is a physical or digital environment in which people interact with other
actors, and which the people understand as distinct from other contexts.

A context is not defined in terms of who owns or controls it. Sharing
data between different contexts of a single company is
a privacy violation, just as if the same data were shared between unrelated actors.

A.3 Server-Side Actors

An actor is an entity that a person can reasonably understand as a single "thing"
they're interacting with. Actors can be people or collective entities like companies,
associations, or governmental bodies. Uses of this document in a particular domain are expected to
describe how the core concepts of that domain combine into a user-comprehensible actor, and
those refined definitions are likely to differ between domains.


User agents tend to explain to people which origin or site provided the
web page they're looking at. The actor that controls this origin or site is
known as the web page's first party. When a person
interacts with a UI element on a web page, the first party of that interaction
is usually the web page's first party. However, if a different actor controls
how data collected with the UI element is used, and a reasonable person with
a realistic cognitive budget would realize
that this other actor has this control, this other
actor is the first party for the interaction instead.

    Issue 4

    A.4 Acting on Data

We define personal data as any information that is directly or
indirectly related to an identified or identifiable person, such as by reference to an
identifier ([GDPR], [OECD-Guidelines], [Convention-108]).

On the web, an identifier of some type is typically assigned for an
identity as seen by a website, which makes it easier for an automated
system to store data about that person.

Examples of identifiers for a person can be:

    • their name,
• an identification number including those mapping to a device that this
person may be using,
    • their phone number,
    • their location data,
    • an online identifier such as email or IP addresses,
    • @@ -2224,27 +2220,27 @@


      configuration characteristics), or
• factors specific to their physical, physiological, genetic, mental, economic,
cultural, social, or behavioral identity,
    • strings derived from other identifiers, for instance through hashing.

If a person could reasonably be identified or re-identified through the combination of data with other
data, then that data is personal data.


Privacy is achieved in a given context that either involves personal data or
involves information being presented to people when the principles of that context are
followed appropriately. When the principles for that context are not followed, there is a
privacy violation. Similarly, we say that a particular interaction is appropriate
when the principles are adhered to, or inappropriate otherwise.


An actor processes data if it
carries out operations on personal data, whether or not by automated means, such as
collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval,
consultation, use, disclosure by transmission, sharing, dissemination or otherwise making
available, selling, alignment or combination, restriction, erasure or destruction.


An actor shares data if it provides it to any other
actor. Note that, under this definition, an actor that provides data to its own
service providers is not sharing it.


An actor sells data when it shares it in exchange
for consideration, monetary or otherwise.

The purpose of a given processing of data is an anticipated, intended, or
planned outcome of this processing which is achieved or aimed for within a given

@@ -2256,35 +2252,35 @@


    level and not necessarily all the way down to implementation details. Example: a person will have their preferences restored (purpose) by looking up their identifier in a preferences store (means).


A data controller is an actor that determines the means and purposes
of data processing. Any actor that is not a service provider is a data controller.

A service provider or data processor is considered to be in
the same category of first party or third party as the actor contracting it to
perform the relevant processing if it (a checklist sketch follows the list):

• is processing the data on behalf of that actor;
• ensures that the data is only retained, accessed, and used as directed by that
actor and solely for the list of explicitly-specified purposes
detailed by the directing actor or data controller;
• may determine implementation details of the data processing in question but does not
determine the purpose for which the data is being processed nor the overarching means through
which the purpose is carried out;
• has no independent right to use the data other than in a de-identified form (e.g., for
monitoring service integrity, load balancing, capacity planning, or billing); and,
• has a contract in place with the actor which is consistent with the above limitations.
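
Restated non-normatively as a checklist (hypothetical TypeScript, purely to make the tests
concrete; all of them must hold):

  interface ProcessorAssessment {
    processesOnlyOnBehalfOfActor: boolean;
    usesDataOnlyAsDirectedForSpecifiedPurposes: boolean;
    determinesOnlyImplementationDetails: boolean; // never purposes or overarching means
    independentUseLimitedToDeIdentifiedForm: boolean; // e.g. integrity, billing
    hasConsistentContractInPlace: boolean;
  }

  function isServiceProvider(a: ProcessorAssessment): boolean {
    return a.processesOnlyOnBehalfOfActor
      && a.usesDataOnlyAsDirectedForSpecifiedPurposes
      && a.determinesOnlyImplementationDetails
      && a.independentUseLimitedToDeIdentifiedForm
      && a.hasConsistentContractInPlace;
  }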

A.5 Recognition

Recognition is the act of realising that a given identity
corresponds to the same person as another identity which may have been
observed either in another context, or in the same context but at a different time.
Recognition can be probabilistic, if someone realises there's
a high probability that two identities correspond to the same person,
even if they aren't certain.


A person can be recognized whether or not their legal identity or
characteristics of their legal identity are included in the recognition.

    A.5.1 Recognition Types

    There are several types of recognition that may take place.

    Cross-context recognition is recognition between different contexts.


Cross-context recognition is only appropriate when the person being recognized
can reasonably expect recognition to happen, and can control whether it does.

If a person uses a piece of identifying information in two different
contexts (e.g. their email or phone number), this does not automatically

@@ -2306,9 +2302,9 @@


    different contexts, cross-site recognition is a privacy harm in the same cases as cross-context recognition.

Same-site recognition is when a single site recognizes a
person across two or more visits.

A privacy harm occurs if a person reasonably expects that they'll be using
a different identity for different visits to a single site, but the site
recognizes them anyway.

Note that these categories overlap: cross-site recognition is usually
cross-context recognition (and always recognizes across partitions); and

@@ -2420,11 +2416,11 @@


    needs to collect extra data from its users in order to protect those or other users, it must take extra technical and legal measures to ensure that this data can't be then used for other purposes, like to grow the service.
• Principle: A user agent should help its user present the identity they want in each context
they are in, and should prevent or support recognition as appropriate.
• Principle: Sites, user agents, and other actors
should minimize the amount of personal data they transfer.
• Principle: Web APIs should be designed to minimize the amount of data that sites need
to request to carry out their users' goals and provide granularity and user controls over personal
data that is communicated to sites.
  • Principle: In maintaining duties of protection, discretion and loyalty, user agents should share data only when it either is needed to satisfy a user's immediate goals or aligns with the user's wishes and interests.
• Principle: New Web APIs must guard users' information at least

@@ -2432,14 +2428,14 @@


System designers should not assume that particular information is or is not
sensitive. Whether information is considered sensitive can vary depending on a
person's circumstances and the context of an interaction, and it can
change over time.
• Principle: People have certain rights over data that is about themselves, and these rights should
be facilitated by their user agent and the actors that are processing their
data.
• Principle: Whenever possible, processors should work with data that has been de-identified.
• Principle: Groups and various forms of institutions should best protect and support autonomy by
making decisions collectively rather than individually to either prevent or enable data

@@ -2463,26 +2459,26 @@


    degradation of features which may be incompatible with stronger privacy protections.
• Principle: When any actor obtains consent for processing from a person, the
actor should design the consent request so as to learn the person's true intent to consent or
not, and not to maximize the processing consented to.
• Principle: An actor should avoid interrupting a person's use of a site for
consent requests when an alternative is available.
• Principle: It should be as easy for a person to check what consent they have given, to
withdraw consent, or to opt out or object, as to give consent.
• Principle: Actors should provide functionality to access, correct, and remove data about
people to those people when that data has been provided by someone else.
  • Principle: A user agent should help users control notifications and other interruptive UI that can be used to manipulate behavior.
  • Principle: Web sites should use notifications only for information that their users have specifically requested.
• Principle: Actors must not retaliate against people who protect their data against
non-essential processing or exercise rights over their data.
• Principle: User agents should support people in choosing which information they provide to
actors that request it, up to and including allowing users to provide arbitrary information.
• Principle: APIs should be designed such that data returned through an API does not assert a
fact or make a

@@ -2642,25 +2638,25 @@


    Referenced in:

• § 1.1.1 Opt-in, Consent, Opt-out, Global Controls
• § 1.1.2 Privacy Labour (2)
• § 1.2 Collective Governance
• § 1.3 User Agents
• § 1.4 Incorporating Different Privacy Principles
• § 2.3 Information access (2)
• § 2.3.1 Unavoidable information exposure
• § 2.6 De-identified Data
• § 2.7 Collective Privacy
• § A.3 Server-Side Actors (2)
• § A.5.1 Recognition Types