
Add basic badges to readme
tsterbak committed Jun 1, 2023
1 parent ed68947 commit 010650c
Showing 2 changed files with 44 additions and 2 deletions.
README.md — 23 changes: 22 additions & 1 deletion
@@ -1,4 +1,20 @@
# biaslyze - The NLP Bias Identification Toolkit

<p align="center">
<h1>biaslyze - The NLP Bias Identification Toolkit</h1>
</p>

<p align="center">
<a href="https://github.com/biaslyze-dev/biaslyze/blob/main/LICENSE">
<img alt="licence" src="https://img.shields.io/github/license/biaslyze-dev/biaslyze">
</a>
<a href="https://pypi.org/project/biaslyze/">
<img alt="pypi" src="https://img.shields.io/pypi/v/biaslyze">
</a>
<a href="https://pypi.org/project/biaslyze/">
<img alt="pypi" src="https://img.shields.io/pypi/pyversions/biaslyze">
</a>
</p>


Bias is often subtle and difficult to detect in NLP models, as protected attributes are less obvious and can take many forms in language (e.g., proxies, double meanings, ambiguities). Therefore, technical bias testing is a key step in avoiding algorithmically mediated discrimination. However, it is currently conducted too rarely due to the effort involved, missing resources, or lack of awareness of the problem.

@@ -11,6 +27,11 @@ Installation can be done using pip:
pip install biaslyze
```

Then you need to download the required spaCy model:
```bash
python -m spacy download en_core_web_sm
```
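To verify the download, you can try loading the model in Python (a minimal sketch using spaCy's standard API; the sample sentence is just an illustration):
```python
import spacy

# Raises OSError if the model was not downloaded successfully
nlp = spacy.load("en_core_web_sm")

# Quick smoke test: parse a sentence and show the detected tokens
doc = nlp("Biaslyze helps detect bias in NLP models.")
print([token.text for token in doc])
```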

## Quickstart

docs/sources/index.md — 23 changes: 22 additions & 1 deletion
@@ -1,4 +1,20 @@
# biaslyze - The NLP Bias Identification Toolkit

<p align="center">
<h1>biaslyze - The NLP Bias Identification Toolkit</h1>
</p>

<p align="center">
<a href="https://github.com/biaslyze-dev/biaslyze/blob/main/LICENSE">
<img alt="licence" src="https://img.shields.io/github/license/biaslyze-dev/biaslyze">
</a>
<a href="https://pypi.org/project/biaslyze/">
<img alt="pypi" src="https://img.shields.io/pypi/v/biaslyze">
</a>
<a href="https://pypi.org/project/biaslyze/">
<img alt="pypi" src="https://img.shields.io/pypi/pyversions/biaslyze">
</a>
</p>


Bias is often subtle and difficult to detect in NLP models, as protected attributes are less obvious and can take many forms in language (e.g., proxies, double meanings, ambiguities). Therefore, technical bias testing is a key step in avoiding algorithmically mediated discrimination. However, it is currently conducted too rarely due to the effort involved, missing resources, or lack of awareness of the problem.

@@ -11,6 +27,11 @@ Installation can be done using pip:
pip install biaslyze
```

Then you need to download the required spaCy model:
```bash
python -m spacy download en_core_web_sm
```

## Quickstart

