---

license: cc
language: es
widget:
- text: "Me cae muy bien."
  example_title: "Non-racist example"
- text: "Unos menas agreden a una mujer."
  example_title: "Racist example"

---



Model to predict whether a given Spanish text is racist or not:
* `LABEL_0` output indicates non-racist text
* `LABEL_1` output indicates racist text

Usage:

```python
from transformers import pipeline

RACISM_MODEL = "davidmasip/racism"

racism_analysis_pipe = pipeline("text-classification",
                                model=RACISM_MODEL, tokenizer=RACISM_MODEL)

results = racism_analysis_pipe("Unos menas agreden a una mujer.")


def clean_labels(results):
    # Replace the raw LABEL_0 / LABEL_1 outputs with readable names, in place
    for result in results:
        label = "Non-racist" if result["label"] == "LABEL_0" else "Racist"
        result["label"] = label


clean_labels(results)
print(results)
```
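The pipeline returns a list of dicts with `label` and `score` keys, which is the structure `clean_labels` rewrites in place. A minimal sketch of that mapping on a mock output (the dicts and scores below are illustrative, not real model predictions):

```python
def clean_labels(results):
    # Replace the raw LABEL_0 / LABEL_1 outputs with readable names, in place
    for result in results:
        result["label"] = "Non-racist" if result["label"] == "LABEL_0" else "Racist"


# Mock pipeline-style output (scores are made up for illustration)
mock_results = [
    {"label": "LABEL_1", "score": 0.98},
    {"label": "LABEL_0", "score": 0.95},
]
clean_labels(mock_results)
print(mock_results)
# → [{'label': 'Racist', 'score': 0.98}, {'label': 'Non-racist', 'score': 0.95}]
```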