Update README.md #1
opened by andreas122001

README.md CHANGED
@@ -16,14 +16,20 @@ tags:
 - mgt-detection
 - ai-detection
 metrics:
-- accuracy
-
-
-
+- type: accuracy
+  value: 0.973
+- type: precision
+  value: 1.000
+- type: recall
+  value: 0.945
+- type: f1
+  value: 0.972
 ---
 
 The hosted inference here, for some reason, does not work. It seems like this is because the hosted inference ignores the attention-mask when doing predictions. It works with pipelines however, and when including attention-mask in the model, e.g. `**encodings`.
 
-
+This is a text classification model for detecting machine-generated text, based on bloomz-560m by BigScience (see https://huggingface.co/bigscience/bloomz-560m).
+The model is fine-tuned on generations by GPT-2.
 
-
+**NOTE**: the hosted inference does not work. To run inference, please download the model and run it locally.
+**NOTE**: when the attention mask is not included in the model inputs, the predictions will be wrong.