Removed model args from use example
README.md CHANGED
@@ -31,12 +31,12 @@ model_args = {
 
 The same pipeline was run with two other transformer models and `fasttext` for comparison. Accuracy and macro F1 score were recorded for each of the 6 fine-tuning sessions and analyzed post festum.
 
-| model|average accuracy|average macro F1|
-|---|---|---|
-|bcms-bertic-frenk-hate|0.8313|0.8219|
-|EMBEDDIA/crosloengual-bert |0.8054|0.796|
-|xlm-roberta-base|0.7175|0.7049|
-|fasttext|0.771|0.754|
+| model                      | average accuracy | average macro F1 |
+|----------------------------|------------------|------------------|
+| bcms-bertic-frenk-hate     | 0.8313           | 0.8219           |
+| EMBEDDIA/crosloengual-bert | 0.8054           | 0.796            |
+| xlm-roberta-base           | 0.7175           | 0.7049           |
+| fasttext                   | 0.771            | 0.754            |
 
 
 
@@ -44,19 +44,19 @@ From recorded accuracies and macro F1 scores p-values were also calculated:
 
 Comparison with `crosloengual-bert`:
 
-| test|accuracy p-value|macro F1 p-value|
-|---|---|---|
-|Wilcoxon|0.00781|0.00781|
-|Mann-Whitney|0.00108|0.00108|
-|Student t-test |2.43e-10|1.27e-10|
+| test           | accuracy p-value | macro F1 p-value |
+|----------------|------------------|------------------|
+| Wilcoxon       | 0.00781          | 0.00781          |
+| Mann-Whitney   | 0.00108          | 0.00108          |
+| Student t-test | 2.43e-10         | 1.27e-10         |
 
 Comparison with `xlm-roberta-base`:
 
-| test|accuracy p-value|macro F1 p-value|
-|---|---|---|
-|Wilcoxon|0.00781|0.00781|
-|Mann-Whitney|0.00107|0.00108|
-|Student t-test |4.83e-11|5.61e-11|
+| test           | accuracy p-value | macro F1 p-value |
+|----------------|------------------|------------------|
+| Wilcoxon       | 0.00781          | 0.00781          |
+| Mann-Whitney   | 0.00107          | 0.00108          |
+| Student t-test | 4.83e-11         | 5.61e-11         |
 
 
 
@@ -64,14 +64,10 @@ Comparison with `xlm-roberta-base`:
 
 ```python
 from simpletransformers.classification import ClassificationModel
-model_args = {
-    "num_train_epochs": 12,
-    "learning_rate": 1e-5,
-    "train_batch_size": 74}
 
 model = ClassificationModel(
     "bert", "5roop/bcms-bertic-frenk-hate", use_cuda=True,
-    args=model_args
+
 
 )
 
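The p-values in the tables above come from three significance tests over the per-session scores. A minimal sketch of how such a comparison can be run with `scipy.stats`; the six per-run accuracies below are hypothetical stand-ins, since the card lists only the averages:

```python
from scipy import stats

# Hypothetical per-session accuracies for the six fine-tuning runs.
# The card reports only the averages (0.8313 vs 0.8054), so these
# numbers illustrate the API, not the real scores.
bertic_acc = [0.829, 0.834, 0.830, 0.833, 0.831, 0.831]
csebert_acc = [0.803, 0.807, 0.805, 0.806, 0.804, 0.807]

# Wilcoxon signed-rank test (paired, non-parametric)
print(stats.wilcoxon(bertic_acc, csebert_acc))

# Mann-Whitney U test (unpaired, non-parametric)
print(stats.mannwhitneyu(bertic_acc, csebert_acc))

# Student's t-test (unpaired, parametric)
print(stats.ttest_ind(bertic_acc, csebert_acc))
```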
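The dropped `model_args` are training hyperparameters, so they have no effect when loading the already fine-tuned checkpoint; that is presumably the point of this commit. A sketch of where they would apply during fine-tuning, assuming the base `classla/bcms-bertic` checkpoint and a toy training frame (both assumptions):

```python
import pandas as pd
from simpletransformers.classification import ClassificationModel

# The hyperparameters removed from the use example; they matter only
# for training, not for loading the fine-tuned model.
model_args = {
    "num_train_epochs": 12,
    "learning_rate": 1e-5,
    "train_batch_size": 74,
}

# Assumption: fine-tuning starts from the base BERTic checkpoint.
model = ClassificationModel(
    "bert", "classla/bcms-bertic", use_cuda=True, args=model_args
)

# Toy stand-in for the FRENK training data: text plus a 0/1 label.
train_df = pd.DataFrame({
    "text": ["sasvim pristojan komentar", "uvredljiv komentar"],
    "labels": [0, 1],
})
model.train_model(train_df)
```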
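After the change, the use example only constructs the model; classification itself goes through simpletransformers' `predict`. A minimal sketch with a made-up input sentence:

```python
from simpletransformers.classification import ClassificationModel

model = ClassificationModel(
    "bert", "5roop/bcms-bertic-frenk-hate", use_cuda=True,
)

# predict() returns (predicted class ids, raw outputs) for a list of strings.
predictions, raw_outputs = model.predict(
    ["Ovo je sasvim pristojan komentar."]  # made-up example input
)
print(predictions)
```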