This model is a fine-tuned version of [allegro/herbert-large-cased](https://huggingface.co/allegro/herbert-large-cased) on the universal_dependencies dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3266
- : {'precision': 0.9565217391304348, 'recall': 0.9295774647887324, 'f1': 0.9428571428571428, 'number': 71}
- Arataxis:insert: {'precision': 0.7380952380952381, 'recall': 0.4626865671641791, 'f1': 0.5688073394495413, 'number': 67}
- Arataxis:obj: {'precision': 0.8181818181818182, 'recall': 0.7758620689655172, 'f1': 0.7964601769911505, 'number': 58}
- Ark: {'precision': 0.9248554913294798, 'recall': 0.8888888888888888, 'f1': 0.9065155807365439, 'number': 180}
- Ase: {'precision': 0.9654178674351584, 'recall': 0.9429978888106967, 'f1': 0.9540761836952653, 'number': 1421}
- Bj: {'precision': 0.9072978303747534, 'recall': 0.8846153846153846, 'f1': 0.8958130477117819, 'number': 520}
- Bl: {'precision': 0.8146718146718147, 'recall': 0.8554054054054054, 'f1': 0.834541858932103, 'number': 740}
- Bl:agent: {'precision': 0.875, 'recall': 0.875, 'f1': 0.875, 'number': 16}
- Bl:arg: {'precision': 0.8407407407407408, 'recall': 0.7138364779874213, 'f1': 0.772108843537415, 'number': 318}
- Bl:cmpr: {'precision': 0.75, 'recall': 0.7058823529411765, 'f1': 0.7272727272727272, 'number': 17}
- C: {'precision': 0.9197530864197531, 'recall': 0.863768115942029, 'f1': 0.890881913303438, 'number': 345}
- C:preconj: {'precision': 0.8, 'recall': 0.6666666666666666, 'f1': 0.7272727272727272, 'number': 6}
- Cl: {'precision': 0.8733333333333333, 'recall': 0.8397435897435898, 'f1': 0.8562091503267975, 'number': 156}
- Cl:relcl: {'precision': 0.9056603773584906, 'recall': 0.631578947368421, 'f1': 0.7441860465116278, 'number': 76}
- Comp: {'precision': 0.8118279569892473, 'recall': 0.7475247524752475, 'f1': 0.7783505154639175, 'number': 202}
- Comp:cleft: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 4}
- Comp:obj: {'precision': 0.55, 'recall': 0.4583333333333333, 'f1': 0.5, 'number': 24}
- Comp:pred: {'precision': 0.7272727272727273, 'recall': 0.8, 'f1': 0.761904761904762, 'number': 10}
- Comp:subj: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1}
- Dvcl: {'precision': 0.8270676691729323, 'recall': 0.8661417322834646, 'f1': 0.8461538461538461, 'number': 127}
- Dvcl:cmpr: {'precision': 0.6666666666666666, 'recall': 0.5, 'f1': 0.5714285714285715, 'number': 4}
- Dvmod: {'precision': 0.8817204301075269, 'recall': 0.8631578947368421, 'f1': 0.8723404255319148, 'number': 380}
- Dvmod:arg: {'precision': 0.4, 'recall': 0.5, 'f1': 0.4444444444444445, 'number': 4}
- Dvmod:emph: {'precision': 0.8571428571428571, 'recall': 0.8516129032258064, 'f1': 0.8543689320388348, 'number': 155}
- Dvmod:neg: {'precision': 0.9411764705882353, 'recall': 0.8888888888888888, 'f1': 0.9142857142857143, 'number': 126}
- Et: {'precision': 0.9320388349514563, 'recall': 0.8648648648648649, 'f1': 0.8971962616822431, 'number': 111}
- Et:numgov: {'precision': 0.9473684210526315, 'recall': 0.9, 'f1': 0.9230769230769231, 'number': 20}
- Et:nummod: {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 1}
- Et:poss: {'precision': 0.9482758620689655, 'recall': 0.9482758620689655, 'f1': 0.9482758620689655, 'number': 58}
- Iscourse:intj: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2}
- Ist: {'precision': 1.0, 'recall': 0.6666666666666666, 'f1': 0.8, 'number': 9}
- Ixed: {'precision': 0.875, 'recall': 0.5697674418604651, 'f1': 0.6901408450704225, 'number': 86}
- Lat: {'precision': 0.819672131147541, 'recall': 0.6944444444444444, 'f1': 0.7518796992481204, 'number': 72}
- Mod: {'precision': 0.8294849023090586, 'recall': 0.7861952861952862, 'f1': 0.8072601555747624, 'number': 1188}
- Mod:arg: {'precision': 0.6832298136645962, 'recall': 0.5365853658536586, 'f1': 0.6010928961748634, 'number': 205}
- Mod:flat: {'precision': 0.6326530612244898, 'recall': 0.5254237288135594, 'f1': 0.5740740740740742, 'number': 59}
- Mod:poss: {'precision': 1.0, 'recall': 0.25, 'f1': 0.4, 'number': 4}
- Mod:pred: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1}
- Obj: {'precision': 0.8617021276595744, 'recall': 0.7330316742081447, 'f1': 0.7921760391198045, 'number': 221}
- Ocative: {'precision': 0.9090909090909091, 'recall': 1.0, 'f1': 0.9523809523809523, 'number': 10}
- Onj: {'precision': 0.830316742081448, 'recall': 0.7474541751527495, 'f1': 0.7867095391211147, 'number': 491}
- Oot: {'precision': 0.9740777666999003, 'recall': 0.977, 'f1': 0.9755366949575637, 'number': 1000}
- Op: {'precision': 0.8470588235294118, 'recall': 0.8780487804878049, 'f1': 0.8622754491017964, 'number': 82}
- Ppos: {'precision': 0.6612903225806451, 'recall': 0.6949152542372882, 'f1': 0.6776859504132231, 'number': 59}
- Rphan: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 3}
- Subj: {'precision': 0.9516129032258065, 'recall': 0.918562874251497, 'f1': 0.9347958561852528, 'number': 835}
- Subj:pass: {'precision': 0.8275862068965517, 'recall': 0.8275862068965517, 'f1': 0.8275862068965517, 'number': 29}
- Ummod: {'precision': 0.9393939393939394, 'recall': 0.96875, 'f1': 0.9538461538461539, 'number': 64}
- Ummod:gov: {'precision': 0.9555555555555556, 'recall': 0.86, 'f1': 0.9052631578947369, 'number': 50}
- Unct: {'precision': 0.9564315352697096, 'recall': 0.9146825396825397, 'f1': 0.9350912778904665, 'number': 2016}
- Ux: {'precision': 0.9285714285714286, 'recall': 0.7222222222222222, 'f1': 0.8125000000000001, 'number': 36}
- Ux:clitic: {'precision': 0.9661016949152542, 'recall': 0.95, 'f1': 0.957983193277311, 'number': 60}
- Ux:cnd: {'precision': 0.9090909090909091, 'recall': 0.9090909090909091, 'f1': 0.9090909090909091, 'number': 22}
- Ux:imp: {'precision': 1.0, 'recall': 0.75, 'f1': 0.8571428571428571, 'number': 4}
- Ux:pass: {'precision': 0.8461538461538461, 'recall': 0.8461538461538461, 'f1': 0.8461538461538461, 'number': 39}
- Xpl:pv: {'precision': 0.940677966101695, 'recall': 0.9288702928870293, 'f1': 0.9347368421052632, 'number': 239}
- Overall Precision: 0.9013
- Overall Recall: 0.8597
- Overall F1: 0.88
- Overall Accuracy: 0.8941
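The checkpoint can presumably be loaded with the standard `transformers` token-classification API, since the metrics above come from per-token dependency-relation labels. The sketch below is a minimal, unofficial example; it assumes the model was saved with `AutoModelForTokenClassification`, and the repository id is a placeholder, not this model's actual id.

```python
# Minimal usage sketch (assumption: checkpoint saved as a token-classification model).
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

# Hypothetical placeholder; replace with the actual repository id of this model.
model_id = "your-username/herbert-large-cased-universal-dependencies"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# Tag each word with its predicted dependency-relation label.
tagger = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

# Polish example sentence, since the base model (HerBERT) is a Polish encoder.
print(tagger("Ala ma kota."))
```
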
## Model description