fgiauna committed
Commit
6e26020
1 Parent(s): 39010d1

update model card README.md

Files changed (1): README.md (+20 -13)
README.md CHANGED
@@ -14,15 +14,15 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [Jean-Baptiste/camembert-ner](https://huggingface.co/Jean-Baptiste/camembert-ner) on the None dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.0716
- - Loc: {'precision': 0.7296511627906976, 'recall': 0.7943037974683544, 'f1': 0.7606060606060605, 'number': 316}
- - Misc: {'precision': 0.7857142857142857, 'recall': 0.39285714285714285, 'f1': 0.5238095238095237, 'number': 56}
- - Org: {'precision': 0.7745098039215687, 'recall': 0.7821782178217822, 'f1': 0.7783251231527093, 'number': 303}
- - Per: {'precision': 0.8176100628930818, 'recall': 0.8074534161490683, 'f1': 0.8125000000000001, 'number': 322}
- - Overall Precision: 0.7731
- - Overall Recall: 0.7723
- - Overall F1: 0.7727
- - Overall Accuracy: 0.9826
 
 ## Model description
 
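Each per-entity entry in the metrics above is a seqeval-style dict of precision, recall, F1, and support (`number`). As a sanity check on the reported numbers, F1 is the harmonic mean of precision and recall; a minimal sketch using the Loc values from this hunk:

```python
# F1 is the harmonic mean of precision and recall: f1 = 2*p*r / (p + r).
# p and r are the Loc precision/recall reported in this card; the result
# reproduces the reported f1 of 0.7606060606060605.
p = 0.7296511627906976
r = 0.7943037974683544
f1 = 2 * p * r / (p + r)
print(f"{f1:.4f}")  # 0.7606
```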
@@ -47,15 +47,22 @@ The following hyperparameters were used during training:
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
- - num_epochs: 3
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Loc | Misc | Org | Per | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
 |:-------------:|:-----:|:----:|:---------------:|:---:|:----:|:---:|:---:|:-----------------:|:--------------:|:----------:|:----------------:|
- | No log | 1.0 | 476 | 0.0740 | {'precision': 0.6106442577030813, 'recall': 0.689873417721519, 'f1': 0.6478454680534919, 'number': 316} | {'precision': 0.6666666666666666, 'recall': 0.2857142857142857, 'f1': 0.4, 'number': 56} | {'precision': 0.665680473372781, 'recall': 0.7425742574257426, 'f1': 0.7020280811232449, 'number': 303} | {'precision': 0.7469879518072289, 'recall': 0.7701863354037267, 'f1': 0.7584097859327217, 'number': 322} | 0.6727 | 0.7091 | 0.6904 | 0.9794 |
- | 0.1185 | 2.0 | 952 | 0.0647 | {'precision': 0.7383720930232558, 'recall': 0.8037974683544303, 'f1': 0.7696969696969697, 'number': 316} | {'precision': 0.6363636363636364, 'recall': 0.375, 'f1': 0.47191011235955066, 'number': 56} | {'precision': 0.7966101694915254, 'recall': 0.7755775577557755, 'f1': 0.785953177257525, 'number': 303} | {'precision': 0.8158730158730159, 'recall': 0.7981366459627329, 'f1': 0.8069073783359498, 'number': 322} | 0.7771 | 0.7693 | 0.7732 | 0.9831 |
- | 0.0509 | 3.0 | 1428 | 0.0716 | {'precision': 0.7296511627906976, 'recall': 0.7943037974683544, 'f1': 0.7606060606060605, 'number': 316} | {'precision': 0.7857142857142857, 'recall': 0.39285714285714285, 'f1': 0.5238095238095237, 'number': 56} | {'precision': 0.7745098039215687, 'recall': 0.7821782178217822, 'f1': 0.7783251231527093, 'number': 303} | {'precision': 0.8176100628930818, 'recall': 0.8074534161490683, 'f1': 0.8125000000000001, 'number': 322} | 0.7731 | 0.7723 | 0.7727 | 0.9826 |
 
 
 ### Framework versions
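The hyperparameter list in this hunk maps directly onto `transformers.TrainingArguments`. A hedged sketch of that mapping, assuming the commit's new epoch count; `output_dir` is a placeholder, and learning rate and batch sizes are not shown in this hunk:

```python
# Sketch only: expresses the card's listed hyperparameters as TrainingArguments.
# output_dir is a placeholder; learning_rate/batch sizes are omitted because
# they are not visible in this diff hunk.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="camembert-ner-finetuned",  # placeholder name
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,  # raised from 3 in this commit
)
```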
 
 
 This model is a fine-tuned version of [Jean-Baptiste/camembert-ner](https://huggingface.co/Jean-Baptiste/camembert-ner) on the None dataset.
 It achieves the following results on the evaluation set:
+ - Loss: 0.1016
+ - Loc: {'precision': 0.7560975609756098, 'recall': 0.7848101265822784, 'f1': 0.7701863354037267, 'number': 316}
+ - Misc: {'precision': 0.5681818181818182, 'recall': 0.44642857142857145, 'f1': 0.5, 'number': 56}
+ - Org: {'precision': 0.8184818481848185, 'recall': 0.8184818481848185, 'f1': 0.8184818481848186, 'number': 303}
+ - Per: {'precision': 0.8456591639871383, 'recall': 0.8167701863354038, 'f1': 0.8309636650868879, 'number': 322}
+ - Overall Precision: 0.7951
+ - Overall Recall: 0.7864
+ - Overall F1: 0.7907
+ - Overall Accuracy: 0.9834
 
 ## Model description
 
 
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
+ - num_epochs: 10
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Loc | Misc | Org | Per | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
 |:-------------:|:-----:|:----:|:---------------:|:---:|:----:|:---:|:---:|:-----------------:|:--------------:|:----------:|:----------------:|
+ | No log | 1.0 | 476 | 0.0747 | {'precision': 0.6298342541436464, 'recall': 0.7215189873417721, 'f1': 0.672566371681416, 'number': 316} | {'precision': 0.6666666666666666, 'recall': 0.2857142857142857, 'f1': 0.4, 'number': 56} | {'precision': 0.6743515850144092, 'recall': 0.7722772277227723, 'f1': 0.7199999999999999, 'number': 303} | {'precision': 0.7070422535211267, 'recall': 0.7795031055900621, 'f1': 0.741506646971935, 'number': 322} | 0.6700 | 0.7312 | 0.6993 | 0.9795 |
+ | 0.1172 | 2.0 | 952 | 0.0640 | {'precision': 0.7471264367816092, 'recall': 0.8227848101265823, 'f1': 0.7831325301204819, 'number': 316} | {'precision': 0.6363636363636364, 'recall': 0.375, 'f1': 0.47191011235955066, 'number': 56} | {'precision': 0.7660256410256411, 'recall': 0.7887788778877888, 'f1': 0.7772357723577236, 'number': 303} | {'precision': 0.8138801261829653, 'recall': 0.8012422360248447, 'f1': 0.8075117370892019, 'number': 322} | 0.7703 | 0.7803 | 0.7753 | 0.9839 |
+ | 0.0503 | 3.0 | 1428 | 0.0718 | {'precision': 0.707774798927614, 'recall': 0.8354430379746836, 'f1': 0.7663280116110305, 'number': 316} | {'precision': 0.75, 'recall': 0.42857142857142855, 'f1': 0.5454545454545454, 'number': 56} | {'precision': 0.7639751552795031, 'recall': 0.8118811881188119, 'f1': 0.7872, 'number': 303} | {'precision': 0.8024316109422492, 'recall': 0.8198757763975155, 'f1': 0.8110599078341013, 'number': 322} | 0.7557 | 0.8004 | 0.7774 | 0.9831 |
+ | 0.0273 | 4.0 | 1904 | 0.0859 | {'precision': 0.7601246105919003, 'recall': 0.7721518987341772, 'f1': 0.7660910518053375, 'number': 316} | {'precision': 0.5897435897435898, 'recall': 0.4107142857142857, 'f1': 0.4842105263157895, 'number': 56} | {'precision': 0.75, 'recall': 0.8514851485148515, 'f1': 0.7975270479134468, 'number': 303} | {'precision': 0.8178913738019169, 'recall': 0.7950310559006211, 'f1': 0.8062992125984252, 'number': 322} | 0.7679 | 0.7834 | 0.7756 | 0.9823 |
+ | 0.0179 | 5.0 | 2380 | 0.0924 | {'precision': 0.7652733118971061, 'recall': 0.7531645569620253, 'f1': 0.759170653907496, 'number': 316} | {'precision': 0.575, 'recall': 0.4107142857142857, 'f1': 0.47916666666666663, 'number': 56} | {'precision': 0.8116883116883117, 'recall': 0.8250825082508251, 'f1': 0.8183306055646481, 'number': 303} | {'precision': 0.8264984227129337, 'recall': 0.8136645962732919, 'f1': 0.8200312989045383, 'number': 322} | 0.7920 | 0.7753 | 0.7836 | 0.9834 |
+ | 0.0126 | 6.0 | 2856 | 0.0881 | {'precision': 0.7197640117994101, 'recall': 0.7721518987341772, 'f1': 0.7450381679389314, 'number': 316} | {'precision': 0.5333333333333333, 'recall': 0.42857142857142855, 'f1': 0.4752475247524753, 'number': 56} | {'precision': 0.8245033112582781, 'recall': 0.8217821782178217, 'f1': 0.8231404958677686, 'number': 303} | {'precision': 0.8096676737160121, 'recall': 0.8322981366459627, 'f1': 0.8208269525267994, 'number': 322} | 0.7719 | 0.7874 | 0.7795 | 0.9830 |
+ | 0.0094 | 7.0 | 3332 | 0.0988 | {'precision': 0.7570977917981072, 'recall': 0.759493670886076, 'f1': 0.7582938388625592, 'number': 316} | {'precision': 0.575, 'recall': 0.4107142857142857, 'f1': 0.47916666666666663, 'number': 56} | {'precision': 0.8305647840531561, 'recall': 0.8250825082508251, 'f1': 0.8278145695364238, 'number': 303} | {'precision': 0.8213166144200627, 'recall': 0.8136645962732919, 'f1': 0.8174726989079563, 'number': 322} | 0.7932 | 0.7773 | 0.7852 | 0.9833 |
+ | 0.0057 | 8.0 | 3808 | 0.1023 | {'precision': 0.7704402515723271, 'recall': 0.7753164556962026, 'f1': 0.7728706624605678, 'number': 316} | {'precision': 0.6153846153846154, 'recall': 0.42857142857142855, 'f1': 0.5052631578947369, 'number': 56} | {'precision': 0.8283828382838284, 'recall': 0.8283828382838284, 'f1': 0.8283828382838284, 'number': 303} | {'precision': 0.8436482084690554, 'recall': 0.8043478260869565, 'f1': 0.8235294117647061, 'number': 322} | 0.8056 | 0.7813 | 0.7933 | 0.9838 |
+ | 0.0058 | 9.0 | 4284 | 0.1005 | {'precision': 0.7584097859327217, 'recall': 0.7848101265822784, 'f1': 0.7713841368584758, 'number': 316} | {'precision': 0.5555555555555556, 'recall': 0.44642857142857145, 'f1': 0.4950495049504951, 'number': 56} | {'precision': 0.8239202657807309, 'recall': 0.8184818481848185, 'f1': 0.8211920529801325, 'number': 303} | {'precision': 0.8538961038961039, 'recall': 0.8167701863354038, 'f1': 0.834920634920635, 'number': 322} | 0.7992 | 0.7864 | 0.7927 | 0.9836 |
+ | 0.0045 | 10.0 | 4760 | 0.1016 | {'precision': 0.7560975609756098, 'recall': 0.7848101265822784, 'f1': 0.7701863354037267, 'number': 316} | {'precision': 0.5681818181818182, 'recall': 0.44642857142857145, 'f1': 0.5, 'number': 56} | {'precision': 0.8184818481848185, 'recall': 0.8184818481848185, 'f1': 0.8184818481848186, 'number': 303} | {'precision': 0.8456591639871383, 'recall': 0.8167701863354038, 'f1': 0.8309636650868879, 'number': 322} | 0.7951 | 0.7864 | 0.7907 | 0.9834 |
 
 
 ### Framework versions
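One observation the new 10-epoch table supports: validation loss bottoms out at epoch 2 (0.0640), while the best overall F1 occurs at epoch 8 (0.7933) rather than at the final epoch (0.7907). A minimal sketch of picking the best checkpoint by overall F1, with the per-epoch values copied from the table above:

```python
# Overall F1 per epoch, copied from the training-results table in this card.
overall_f1 = {
    1: 0.6993, 2: 0.7753, 3: 0.7774, 4: 0.7756, 5: 0.7836,
    6: 0.7795, 7: 0.7852, 8: 0.7933, 9: 0.7927, 10: 0.7907,
}
# Select the epoch with the highest overall F1 (what a
# load_best_model_at_end-style selection would keep).
best_epoch = max(overall_f1, key=overall_f1.get)
print(best_epoch, overall_f1[best_epoch])  # 8 0.7933
```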