muhtasham committed on
Commit 40e34e0
1 Parent(s): fdaea74

update model card README.md

Files changed (1)
  1. README.md +19 -7
README.md CHANGED
@@ -43,7 +43,7 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [google/bert_uncased_L-2_H-128_A-2](https://huggingface.co/google/bert_uncased_L-2_H-128_A-2) on the wnut_17 dataset.
  It achieves the following results on the evaluation set:
- - Loss: 1.4820
+ - Loss: 0.6054
  - Precision: 0.0
  - Recall: 0.0
  - F1: 0.0
@@ -72,15 +72,27 @@ The following hyperparameters were used during training:
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
- - num_epochs: 3
+ - num_epochs: 15
 
  ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
- |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
- | No log        | 1.0   | 27   | 1.8665          | 0.0732    | 0.0119 | 0.0205 | 0.8878   |
- | No log        | 2.0   | 54   | 1.5702          | 0.0       | 0.0    | 0.0    | 0.8961   |
- | No log        | 3.0   | 81   | 1.4820          | 0.0       | 0.0    | 0.0    | 0.8961   |
+ | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1  | Accuracy |
+ |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:---:|:--------:|
+ | No log        | 1.0   | 27   | 1.1060          | 0.0       | 0.0    | 0.0 | 0.8961   |
+ | No log        | 2.0   | 54   | 0.9075          | 0.0       | 0.0    | 0.0 | 0.8961   |
+ | No log        | 3.0   | 81   | 0.7978          | 0.0       | 0.0    | 0.0 | 0.8961   |
+ | No log        | 4.0   | 108  | 0.7333          | 0.0       | 0.0    | 0.0 | 0.8961   |
+ | No log        | 5.0   | 135  | 0.6929          | 0.0       | 0.0    | 0.0 | 0.8961   |
+ | No log        | 6.0   | 162  | 0.6661          | 0.0       | 0.0    | 0.0 | 0.8961   |
+ | No log        | 7.0   | 189  | 0.6477          | 0.0       | 0.0    | 0.0 | 0.8961   |
+ | No log        | 8.0   | 216  | 0.6346          | 0.0       | 0.0    | 0.0 | 0.8961   |
+ | No log        | 9.0   | 243  | 0.6251          | 0.0       | 0.0    | 0.0 | 0.8961   |
+ | No log        | 10.0  | 270  | 0.6182          | 0.0       | 0.0    | 0.0 | 0.8961   |
+ | No log        | 11.0  | 297  | 0.6132          | 0.0       | 0.0    | 0.0 | 0.8961   |
+ | No log        | 12.0  | 324  | 0.6097          | 0.0       | 0.0    | 0.0 | 0.8961   |
+ | No log        | 13.0  | 351  | 0.6073          | 0.0       | 0.0    | 0.0 | 0.8961   |
+ | No log        | 14.0  | 378  | 0.6059          | 0.0       | 0.0    | 0.0 | 0.8961   |
+ | No log        | 15.0  | 405  | 0.6054          | 0.0       | 0.0    | 0.0 | 0.8961   |
 
 
  ### Framework versions
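
For readers who want to mirror this setup, here is a minimal sketch of a `transformers.TrainingArguments` configuration covering only the hyperparameters visible in this card: seed 42, Adam with betas (0.9, 0.999) and epsilon 1e-08, a linear learning-rate schedule, and the 15 epochs introduced by this commit. The learning rate, batch sizes, and output directory are not shown in the hunk above, so those values are placeholders rather than the author's settings.

```python
# Minimal sketch, not the author's training script: it only mirrors the
# hyperparameters visible in this diff (seed, Adam betas/epsilon, linear
# scheduler, 15 epochs). learning_rate and batch size are placeholders.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert-tiny-finetuned-wnut17",  # hypothetical output directory
    seed=42,
    num_train_epochs=15,                      # raised from 3 to 15 in this commit
    lr_scheduler_type="linear",
    adam_beta1=0.9,                           # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-08,                       # and epsilon=1e-08
    evaluation_strategy="epoch",              # assumed: the table reports one eval per epoch
    learning_rate=3e-5,                       # placeholder; not shown in this hunk
    per_device_train_batch_size=128,          # placeholder; not shown in this hunk
)
```

The 27 optimizer steps per epoch reported in the table would be consistent with an effective batch size around 128 on wnut_17's roughly 3.4k training sentences, but that value is an inference, not something stated in the visible part of the card.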
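
The zero entity-level precision, recall, and F1 alongside a constant 0.8961 accuracy suggests the model is predicting the majority "O" tag for every token. A quick way to spot-check that once the checkpoint is on the Hub is sketched below; the repository id is a placeholder, since the diff does not show the model's actual repo name.

```python
# Sanity-check sketch: with aggregation, tokens tagged "O" are dropped from the
# output, so an empty result on entity-rich text indicates the model predicts
# only "O". The repo id below is a placeholder, not necessarily the real one.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="muhtasham/bert-tiny-finetuned-wnut17",  # placeholder repo id
    aggregation_strategy="simple",
)

print(ner("Empire State Building is located in New York City."))
# Expected for a collapsed model: []  (no entities surface above the "O" baseline)
```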