update model card README.md
README.md CHANGED
@@ -14,15 +14,15 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [Jean-Baptiste/camembert-ner](https://huggingface.co/Jean-Baptiste/camembert-ner) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.
-- Loc: {'precision': 0.
-- Misc: {'precision': 0.
-- Org: {'precision': 0.
-- Per: {'precision': 0.
-- Overall Precision: 0.
-- Overall Recall: 0.
-- Overall F1: 0.
-- Overall Accuracy: 0.
+- Loss: 0.0716
+- Loc: {'precision': 0.7296511627906976, 'recall': 0.7943037974683544, 'f1': 0.7606060606060605, 'number': 316}
+- Misc: {'precision': 0.7857142857142857, 'recall': 0.39285714285714285, 'f1': 0.5238095238095237, 'number': 56}
+- Org: {'precision': 0.7745098039215687, 'recall': 0.7821782178217822, 'f1': 0.7783251231527093, 'number': 303}
+- Per: {'precision': 0.8176100628930818, 'recall': 0.8074534161490683, 'f1': 0.8125000000000001, 'number': 322}
+- Overall Precision: 0.7731
+- Overall Recall: 0.7723
+- Overall F1: 0.7727
+- Overall Accuracy: 0.9826
 
 ## Model description
 
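Per-entity dictionaries of this shape ({'precision', 'recall', 'f1', 'number'}) are the format produced by the seqeval metric. A minimal sketch, assuming these numbers come from the `evaluate` library's seqeval wrapper; the tag sequences below are illustrative placeholders, not the model's actual evaluation data:

```python
import evaluate

# Load the seqeval wrapper; it scores entity spans, not individual tokens.
seqeval = evaluate.load("seqeval")

# Placeholder IOB2-tagged sequences for illustration only.
predictions = [["B-PER", "I-PER", "O", "B-LOC", "O"]]
references = [["B-PER", "I-PER", "O", "B-ORG", "O"]]

results = seqeval.compute(predictions=predictions, references=references)
# `results` holds one dict per entity type, e.g.
#   results["PER"] -> {'precision': ..., 'recall': ..., 'f1': ..., 'number': ...}
# plus 'overall_precision', 'overall_recall', 'overall_f1', and
# 'overall_accuracy', matching the per-entity and overall values above.
print(results)
```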
@@ -47,27 +47,20 @@ The following hyperparameters were used during training:
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
-- num_epochs:
+- num_epochs: 3
 
 ### Training results
 
-| Training Loss | Epoch | Step
-|
-|
-| 0.
-| 0.
-| 0.0423 | 4.0 | 8132 | 0.4695 | {'precision': 0.7260940032414911, 'recall': 0.4254510921177588, 'f1': 0.5365269461077844, 'number': 1053} | {'precision': 0.04154204054503157, 'recall': 0.7225433526011561, 'f1': 0.07856693903205532, 'number': 173} | {'precision': 0.6585903083700441, 'recall': 0.3014112903225806, 'f1': 0.4135546334716459, 'number': 992} | {'precision': 0.552366175329713, 'recall': 0.6973555337904016, 'f1': 0.6164502164502165, 'number': 1021} | 0.2950 | 0.4890 | 0.3680 | 0.9094 |
-| 0.0354 | 5.0 | 10165 | 0.5324 | {'precision': 0.6918518518518518, 'recall': 0.44349477682811017, 'f1': 0.5405092592592593, 'number': 1053} | {'precision': 0.03929273084479371, 'recall': 0.6936416184971098, 'f1': 0.07437248218159281, 'number': 173} | {'precision': 0.6088794926004228, 'recall': 0.2903225806451613, 'f1': 0.39317406143344713, 'number': 992} | {'precision': 0.5702054794520548, 'recall': 0.6523016650342801, 'f1': 0.6084970306075833, 'number': 1021} | 0.2870 | 0.4758 | 0.3580 | 0.9067 |
-| 0.0254 | 6.0 | 12198 | 0.5827 | {'precision': 0.684593023255814, 'recall': 0.4472934472934473, 'f1': 0.5410683515221136, 'number': 1053} | {'precision': 0.031561996779388084, 'recall': 0.5664739884393064, 'f1': 0.05979255643685173, 'number': 173} | {'precision': 0.6134301270417423, 'recall': 0.3407258064516129, 'f1': 0.43810758263123784, 'number': 992} | {'precision': 0.5973684210526315, 'recall': 0.6669931439764937, 'f1': 0.6302637667746414, 'number': 1021} | 0.2896 | 0.4903 | 0.3641 | 0.9046 |
-| 0.0206 | 7.0 | 14231 | 0.5821 | {'precision': 0.6886657101865137, 'recall': 0.45584045584045585, 'f1': 0.5485714285714286, 'number': 1053} | {'precision': 0.03427093436792758, 'recall': 0.6127167630057804, 'f1': 0.06491120636864667, 'number': 173} | {'precision': 0.5944055944055944, 'recall': 0.34274193548387094, 'f1': 0.43478260869565216, 'number': 992} | {'precision': 0.5777964676198486, 'recall': 0.672869735553379, 'f1': 0.6217194570135747, 'number': 1021} | 0.2906 | 0.4980 | 0.3670 | 0.9063 |
-| 0.0147 | 8.0 | 16264 | 0.6210 | {'precision': 0.7218045112781954, 'recall': 0.45584045584045585, 'f1': 0.5587892898719441, 'number': 1053} | {'precision': 0.03523542251325226, 'recall': 0.653179190751445, 'f1': 0.06686390532544378, 'number': 173} | {'precision': 0.6188524590163934, 'recall': 0.30443548387096775, 'f1': 0.40810810810810816, 'number': 992} | {'precision': 0.6246362754607178, 'recall': 0.6307541625857003, 'f1': 0.6276803118908383, 'number': 1021} | 0.2855 | 0.4751 | 0.3567 | 0.9047 |
-| 0.0123 | 9.0 | 18297 | 0.6424 | {'precision': 0.7036496350364964, 'recall': 0.4577397910731244, 'f1': 0.5546605293440737, 'number': 1053} | {'precision': 0.04029899252518687, 'recall': 0.7167630057803468, 'f1': 0.07630769230769231, 'number': 173} | {'precision': 0.6023391812865497, 'recall': 0.31149193548387094, 'f1': 0.41063122923588036, 'number': 992} | {'precision': 0.5862068965517241, 'recall': 0.682664054848188, 'f1': 0.6307692307692307, 'number': 1021} | 0.2950 | 0.4977 | 0.3704 | 0.9074 |
-| 0.0093 | 10.0 | 20330 | 0.6420 | {'precision': 0.706140350877193, 'recall': 0.4586894586894587, 'f1': 0.5561312607944733, 'number': 1053} | {'precision': 0.037747920665387076, 'recall': 0.6820809248554913, 'f1': 0.07153682934222491, 'number': 173} | {'precision': 0.6212424849699398, 'recall': 0.3125, 'f1': 0.4158283031522468, 'number': 992} | {'precision': 0.5895458440445587, 'recall': 0.67384916748286, 'f1': 0.6288848263254113, 'number': 1021} | 0.2920 | 0.4937 | 0.3670 | 0.9079 |
+| Training Loss | Epoch | Step | Validation Loss | Loc | Misc | Org | Per | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+|:-------------:|:-----:|:----:|:---------------:|:---:|:----:|:---:|:---:|:-----------------:|:--------------:|:----------:|:----------------:|
+| No log | 1.0 | 476 | 0.0740 | {'precision': 0.6106442577030813, 'recall': 0.689873417721519, 'f1': 0.6478454680534919, 'number': 316} | {'precision': 0.6666666666666666, 'recall': 0.2857142857142857, 'f1': 0.4, 'number': 56} | {'precision': 0.665680473372781, 'recall': 0.7425742574257426, 'f1': 0.7020280811232449, 'number': 303} | {'precision': 0.7469879518072289, 'recall': 0.7701863354037267, 'f1': 0.7584097859327217, 'number': 322} | 0.6727 | 0.7091 | 0.6904 | 0.9794 |
+| 0.1185 | 2.0 | 952 | 0.0647 | {'precision': 0.7383720930232558, 'recall': 0.8037974683544303, 'f1': 0.7696969696969697, 'number': 316} | {'precision': 0.6363636363636364, 'recall': 0.375, 'f1': 0.47191011235955066, 'number': 56} | {'precision': 0.7966101694915254, 'recall': 0.7755775577557755, 'f1': 0.785953177257525, 'number': 303} | {'precision': 0.8158730158730159, 'recall': 0.7981366459627329, 'f1': 0.8069073783359498, 'number': 322} | 0.7771 | 0.7693 | 0.7732 | 0.9831 |
+| 0.0509 | 3.0 | 1428 | 0.0716 | {'precision': 0.7296511627906976, 'recall': 0.7943037974683544, 'f1': 0.7606060606060605, 'number': 316} | {'precision': 0.7857142857142857, 'recall': 0.39285714285714285, 'f1': 0.5238095238095237, 'number': 56} | {'precision': 0.7745098039215687, 'recall': 0.7821782178217822, 'f1': 0.7783251231527093, 'number': 303} | {'precision': 0.8176100628930818, 'recall': 0.8074534161490683, 'f1': 0.8125000000000001, 'number': 322} | 0.7731 | 0.7723 | 0.7727 | 0.9826 |
 
 
 ### Framework versions
 
-- Transformers 4.29.
+- Transformers 4.29.1
 - Pytorch 2.0.0+cu118
 - Datasets 2.12.0
 - Tokenizers 0.13.3
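For context, a sketch of how the hyperparameters listed in this hunk would map onto `transformers.TrainingArguments` for the `Trainer` API. The `output_dir`, `learning_rate`, and batch size are placeholders: those values sit outside this hunk and are not shown in the diff.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="camembert-ner-finetuned",  # placeholder output path
    learning_rate=2e-5,                    # placeholder; not shown in this hunk
    per_device_train_batch_size=16,        # placeholder; not shown in this hunk
    num_train_epochs=3,                    # num_epochs: 3
    seed=42,                               # seed: 42
    lr_scheduler_type="linear",            # lr_scheduler_type: linear
    adam_beta1=0.9,                        # optimizer: Adam with betas=(0.9,0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,                     # ...and epsilon=1e-08
)
```

These are also the `Trainer` defaults for the optimizer and scheduler, which is why auto-generated cards like this one list Adam with betas=(0.9,0.999), epsilon=1e-08, and a linear schedule.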
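Once the updated card is in place, the fine-tuned checkpoint can be exercised with the standard token-classification pipeline. A minimal inference sketch; the repository id below is a placeholder, since the fine-tuned model's own repo name does not appear in this diff:

```python
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="your-username/camembert-ner-finetuned",  # placeholder repo id
    aggregation_strategy="simple",  # merge word pieces into entity spans
)

# French example sentence; the output is a list of dicts with an
# entity_group (PER, LOC, ORG, or MISC), a confidence score, and
# character offsets into the input string.
print(ner("Emmanuel Macron a visité Marseille avec une délégation de l'ONU."))
```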