HueyNemud committed on
Commit b9bb906
1 Parent(s): f6e86d0

update model card README.md

Files changed (1)
  1. README.md +21 -16
README.md CHANGED
@@ -13,13 +13,13 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [HueyNemud/das22-10-camembert_pretrained](https://huggingface.co/HueyNemud/das22-10-camembert_pretrained) on the None dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.0191
- - Ebegin: {'precision': 0.993127147766323, 'recall': 0.9626915389740173, 'f1': 0.9776725304465493, 'number': 3002}
- - Eend: {'precision': 0.9910313901345291, 'recall': 0.9576666666666667, 'f1': 0.9740634005763689, 'number': 3000}
- - Overall Precision: 0.9921
- - Overall Recall: 0.9602
- - Overall F1: 0.9759
- - Overall Accuracy: 0.9954
+ - Loss: 0.0319
+ - Ebegin: {'precision': 0.9928057553956835, 'recall': 0.9341857841293719, 'f1': 0.9626041464832397, 'number': 2659}
+ - Eend: {'precision': 0.9772727272727273, 'recall': 0.9641255605381166, 'f1': 0.9706546275395034, 'number': 2676}
+ - Overall Precision: 0.9848
+ - Overall Recall: 0.9492
+ - Overall F1: 0.9667
+ - Overall Accuracy: 0.9936
 
 ## Model description
 
@@ -44,24 +44,29 @@ The following hyperparameters were used during training:
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
- - training_steps: 6000
+ - training_steps: 7500
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
 |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
- | No log | 0.07 | 300 | 0.0312 | 0.9804 | 0.9791 | 0.9797 | 0.9960 |
- | 0.1423 | 0.14 | 600 | 0.0232 | 0.9920 | 0.9670 | 0.9793 | 0.9958 |
- | 0.1423 | 0.21 | 900 | 0.0164 | 0.9959 | 0.9696 | 0.9825 | 0.9965 |
- | 0.0242 | 0.29 | 1200 | 0.0174 | 0.9850 | 0.9744 | 0.9797 | 0.9959 |
- | 0.0169 | 0.36 | 1500 | 0.0165 | 0.9913 | 0.9696 | 0.9803 | 0.9960 |
- | 0.0169 | 0.43 | 1800 | 0.0168 | 0.9908 | 0.9715 | 0.9810 | 0.9962 |
- | 0.0153 | 0.5 | 2100 | 0.0164 | 0.9884 | 0.9702 | 0.9793 | 0.9960 |
+ | No log | 0.07 | 300 | 0.0422 | 0.9698 | 0.9675 | 0.9686 | 0.9945 |
+ | 0.2178 | 0.14 | 600 | 0.0249 | 0.9900 | 0.9637 | 0.9767 | 0.9952 |
+ | 0.2178 | 0.21 | 900 | 0.0236 | 0.9859 | 0.9721 | 0.9790 | 0.9957 |
+ | 0.0267 | 0.29 | 1200 | 0.0187 | 0.9908 | 0.9711 | 0.9808 | 0.9961 |
+ | 0.0209 | 0.36 | 1500 | 0.0191 | 0.9869 | 0.9727 | 0.9798 | 0.9959 |
+ | 0.0209 | 0.43 | 1800 | 0.0199 | 0.9886 | 0.9712 | 0.9798 | 0.9959 |
+ | 0.0167 | 0.5 | 2100 | 0.0178 | 0.9912 | 0.9715 | 0.9813 | 0.9962 |
+ | 0.0167 | 0.57 | 2400 | 0.0176 | 0.9937 | 0.9595 | 0.9763 | 0.9952 |
+ | 0.0147 | 0.64 | 2700 | 0.0213 | 0.9869 | 0.9692 | 0.9779 | 0.9955 |
+ | 0.0142 | 0.72 | 3000 | 0.0181 | 0.9854 | 0.9767 | 0.9810 | 0.9962 |
+ | 0.0142 | 0.79 | 3300 | 0.0222 | 0.9865 | 0.9744 | 0.9804 | 0.9960 |
+ | 0.0121 | 0.86 | 3600 | 0.0190 | 0.9855 | 0.9770 | 0.9813 | 0.9962 |
 
 
 ### Framework versions
 
- - Transformers 4.26.0
+ - Transformers 4.26.1
 - Pytorch 1.13.1+cu116
 - Datasets 2.9.0
 - Tokenizers 0.13.2
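
The hyperparameters listed in the card map directly onto `transformers.TrainingArguments`. Below is a minimal sketch using only the values visible in this diff (seed, Adam betas and epsilon, linear scheduler, 7500 training steps); the output directory, learning rate and batch sizes are not shown in the hunks, so the placeholders are assumptions, while the 300-step evaluation cadence is read off the results table.

```python
from transformers import TrainingArguments

# Sketch of the card's hyperparameters as a TrainingArguments config.
# Values not shown in this diff (output_dir, learning rate, batch sizes)
# are placeholders or omitted on purpose.
training_args = TrainingArguments(
    output_dir="out",             # placeholder, not stated in the card diff
    max_steps=7500,               # "training_steps: 7500"
    seed=42,                      # "seed: 42"
    lr_scheduler_type="linear",   # "lr_scheduler_type: linear"
    adam_beta1=0.9,               # "Adam with betas=(0.9,0.999)"
    adam_beta2=0.999,
    adam_epsilon=1e-08,           # "epsilon=1e-08"
    evaluation_strategy="steps",  # results table reports metrics every 300 steps
    eval_steps=300,
    logging_steps=300,
)
```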
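
For completeness, a hedged usage sketch with the Transformers token-classification pipeline. The commit does not name the published repo id of this fine-tuned checkpoint, so `checkpoint` below is a placeholder, the input sentence is invented, and the entity label names are only inferred from the Ebegin/Eend metrics above.

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

# Placeholder: replace with the actual repo id of this fine-tuned checkpoint.
checkpoint = "path/to/this-fine-tuned-checkpoint"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForTokenClassification.from_pretrained(checkpoint)

# The Ebegin/Eend metrics above suggest the model tags entry boundaries
# (labels such as EBEGIN / EEND); aggregation groups sub-tokens into spans.
tagger = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

print(tagger("Dupont (Jean), fabricant de bronzes, rue de Turenne, 45."))
```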