Harassment, for #310 and #311 #328

Merged
merged 6 commits into from Aug 23, 2023
Changes from 1 commit
13 changes: 7 additions & 6 deletions index.html
@@ -1738,7 +1738,7 @@
Link this to [[[#guardians]]].
</aside>

-## Abuse
+## Protecting users from abusive behaviour

<div class="practice">
<p>
@@ -1763,7 +1763,7 @@
Harassment may include: sending [=unwanted information=]; directing others to contact
or bother a person ("dogpiling"); disclosing [sensitive information](#sensitive-information) about a person; posting false information about a person; impersonating a person; insults; threats; and hateful or demeaning speech.

-Disclosure of identifying or contact information (including "doxxing") can be used to send often persistent [=unwanted information=] that amounts to harassment.
+Disclosure of identifying or contact information (including "doxxing") can often be used to cause additional attackers to send persistent [=unwanted information=] that amounts to harassment.
Disclosure of location information can be used to intrude on a
person's physical safety or space.

@@ -1773,11 +1773,11 @@
Effective reporting is likely to require:

* standardized mechanisms to identify abuse reporting contacts;
-* visible, usable ways provided by sites and user agents to report abuse;
+* sites and user agents to provide visible and usable ways to report abuse;
* identifiers to refer to senders and content;
* the ability to provide context and explanation of harms;
* people responsible for promptly responding to reports;
-* tools for pooling mitigation information (see [[[#example-10]]]).
+* tools for pooling mitigation information (see [[[#example-reducing-unwanted-information]]]).
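
As a non-normative sketch of how several of these requirements might fit together, the snippet below models a structured abuse report and a site-provided reporting endpoint. Every name in it (the `/.well-known/abuse-reporting` path and the report fields) is an illustrative assumption, not a mechanism defined by this document.

```typescript
// Hypothetical report structure; field names are invented for illustration.
interface AbuseReport {
  reporterId: string;    // identifier referring to the reporting person
  subjectId: string;     // identifier referring to the reported sender
  contentIds: string[];  // identifiers referring to the offending content
  category: "harassment" | "doxxing" | "impersonation" | "spam";
  context?: string;      // free-text explanation of the harm caused
}

// Assumed discovery mechanism: a well-known URL naming the site's abuse
// reporting contact, loosely analogous to existing .well-known registrations.
async function reportAbuse(origin: string, report: AbuseReport): Promise<void> {
  const discovery = await fetch(`${origin}/.well-known/abuse-reporting`);
  const { reportUrl } = await discovery.json();

  const response = await fetch(reportUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(report),
  });
  if (!response.ok) {
    throw new Error(`report not accepted: ${response.status}`);
  }
}
```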

<aside class="note">
Some useful research overviews of online harassment include: [[?PEW-Harassment]],
@@ -1788,13 +1788,14 @@
that are typically harmless individually but that become a nuisance in aggregate (spam) to the
sending of explicit, graphic, or violent images.

-<aside class="example">
+System designers should take steps to make the sending of unwanted information more difficult
+or more costly, and to make the senders more accountable.
+
+<aside class="example" id="example-reducing-unwanted-information">
Examples of mitigations include:

-* Restricting what new users of a service can post, e.g. limiting links and media until they have
+* Restricting what new users of a service can post, e.g. limiting links and media until a user has
interacted a sufficient number of times over a given period with a larger group. This helps to
raise the cost of producing [sock puppet accounts](https://en.wikipedia.org/wiki/Sock_puppet_account) and gives new users time to understand local norms before posting.
* Only accepting communication between [=people=] who have an established relationship of some kind,
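
A minimal sketch of the first mitigation in the list above: gating links and media behind a minimum account age and interaction count. The thresholds and field names are assumptions made for illustration, not values recommended by this document.

```typescript
interface Account {
  createdAt: Date;
  // Interactions with a larger, established group, per the mitigation above.
  establishedInteractions: number;
}

// Invented thresholds; a real service would tune these empirically.
const MIN_ACCOUNT_AGE_DAYS = 7;
const MIN_INTERACTIONS = 20;

// New accounts can post plain text, but links and media are held back until
// the account has a history, raising the cost of producing sock puppets.
function mayPostLinksOrMedia(account: Account, now: Date = new Date()): boolean {
  const ageDays = (now.getTime() - account.createdAt.getTime()) / 86_400_000;
  return (
    ageDays >= MIN_ACCOUNT_AGE_DAYS &&
    account.establishedInteractions >= MIN_INTERACTIONS
  );
}
```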