David committed
Commit aea99f0 · 1 Parent(s): 20f9940
Update README.md

README.md CHANGED
@@ -28,9 +28,22 @@ We release a `small` and `medium` version with the following configuration:
 ```python
 from transformers import ElectraForPreTraining, ElectraTokenizerFast
 
-discriminator = ElectraForPreTraining.from_pretrained("
-tokenizer = ElectraTokenizerFast.from_pretrained("
+discriminator = ElectraForPreTraining.from_pretrained("...")
+tokenizer = ElectraTokenizerFast.from_pretrained("...")
+
+sentence_with_fake_token = "Estamos desayunando pan rosa con tomate y aceite de oliva."
+
+inputs = tokenizer.encode(sentence_with_fake_token, return_tensors="pt")
+logits = discriminator(inputs).logits.tolist()[0]
+
+print("\t".join(tokenizer.tokenize(sentence_with_fake_token)))
+print("\t".join(map(lambda x: str(x)[:4], logits[1:-1])))
+"""Output:
+Estamos desayun ##ando pan  rosa con  tomate y    aceite de   oliva .
+-3.1    -3.6    -6.9   -3.0 0.19 -4.5 -3.3   -5.1 -5.7   -7.7 -4.4  -4.2
+"""
 ```
+
 - Links to our zero-shot-classifiers
 
 ## Metrics
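For context on the example added above: `ElectraForPreTraining` emits one logit per token, and a positive logit means the discriminator flags that token as replaced ("fake"). A minimal sketch of how one might read the printed output — using the token/logit values copied from the example, with no model download required:

```python
# Tokens and per-token logits copied from the example output above
# (special tokens [CLS]/[SEP] already stripped by logits[1:-1]).
tokens = "Estamos desayun ##ando pan rosa con tomate y aceite de oliva .".split()
logits = [-3.1, -3.6, -6.9, -3.0, 0.19, -4.5, -3.3, -5.1, -5.7, -7.7, -4.4, -4.2]

# A positive logit marks a token the discriminator considers replaced.
fake_tokens = [tok for tok, logit in zip(tokens, logits) if logit > 0]
print(fake_tokens)  # → ['rosa']
```

Here only "rosa" crosses the threshold, matching the intent of `sentence_with_fake_token`.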