xshubhamx committed
Commit: a349dfd
Parent: c8ebedc

End of training

README.md CHANGED
@@ -1,8 +1,8 @@
 ---
 license: cc-by-sa-4.0
+base_model: nlpaueb/legal-bert-base-uncased
 tags:
 - generated_from_trainer
-base_model: nlpaueb/legal-bert-base-uncased
 metrics:
 - accuracy
 - precision
@@ -19,21 +19,21 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [nlpaueb/legal-bert-base-uncased](https://huggingface.co/nlpaueb/legal-bert-base-uncased) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 1.2564
-- Accuracy: 0.8273
-- Precision: 0.8292
-- Recall: 0.8273
-- Precision Macro: 0.7794
-- Recall Macro: 0.7759
-- Macro Fpr: 0.0153
-- Weighted Fpr: 0.0147
-- Weighted Specificity: 0.9772
-- Macro Specificity: 0.9870
-- Weighted Sensitivity: 0.8273
-- Macro Sensitivity: 0.7759
-- F1 Micro: 0.8273
-- F1 Macro: 0.7741
-- F1 Weighted: 0.8269
+- Loss: 1.5469
+- Accuracy: 0.8242
+- Precision: 0.8220
+- Recall: 0.8242
+- Precision Macro: 0.7660
+- Recall Macro: 0.7548
+- Macro Fpr: 0.0156
+- Weighted Fpr: 0.0150
+- Weighted Specificity: 0.9766
+- Macro Specificity: 0.9867
+- Weighted Sensitivity: 0.8242
+- Macro Sensitivity: 0.7548
+- F1 Micro: 0.8242
+- F1 Macro: 0.7566
+- F1 Weighted: 0.8221
 
 ## Model description
 
@@ -58,23 +58,28 @@ The following hyperparameters were used during training:
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
-- num_epochs: 10
+- num_epochs: 15
 - mixed_precision_training: Native AMP
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | Precision Macro | Recall Macro | Macro Fpr | Weighted Fpr | Weighted Specificity | Macro Specificity | Weighted Sensitivity | Macro Sensitivity | F1 Micro | F1 Macro | F1 Weighted |
 |:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:---------------:|:------------:|:---------:|:------------:|:--------------------:|:-----------------:|:--------------------:|:-----------------:|:--------:|:--------:|:-----------:|
-| 1.2105 | 1.0 | 643 | 0.7916 | 0.7761 | 0.7729 | 0.7761 | 0.6230 | 0.5920 | 0.0214 | 0.0202 | 0.9664 | 0.9828 | 0.7761 | 0.5920 | 0.7761 | 0.5703 | 0.7551 |
-| 0.6521 | 2.0 | 1286 | 0.6834 | 0.8025 | 0.8067 | 0.8025 | 0.7779 | 0.7152 | 0.0180 | 0.0173 | 0.9721 | 0.9850 | 0.8025 | 0.7152 | 0.8025 | 0.7181 | 0.7983 |
-| 0.513 | 3.0 | 1929 | 0.8107 | 0.8141 | 0.8142 | 0.8141 | 0.7859 | 0.7227 | 0.0168 | 0.0160 | 0.9740 | 0.9859 | 0.8141 | 0.7227 | 0.8141 | 0.7261 | 0.8083 |
-| 0.2635 | 4.0 | 2572 | 0.8442 | 0.8249 | 0.8285 | 0.8249 | 0.8298 | 0.7733 | 0.0156 | 0.0149 | 0.9759 | 0.9867 | 0.8249 | 0.7733 | 0.8249 | 0.7812 | 0.8242 |
-| 0.1821 | 5.0 | 3215 | 0.9549 | 0.8226 | 0.8287 | 0.8226 | 0.8135 | 0.7623 | 0.0157 | 0.0152 | 0.9766 | 0.9866 | 0.8226 | 0.7623 | 0.8226 | 0.7758 | 0.8233 |
-| 0.1123 | 6.0 | 3858 | 1.0790 | 0.8273 | 0.8316 | 0.8273 | 0.7865 | 0.7758 | 0.0152 | 0.0147 | 0.9779 | 0.9870 | 0.8273 | 0.7758 | 0.8273 | 0.7671 | 0.8268 |
-| 0.0465 | 7.0 | 4501 | 1.1538 | 0.8280 | 0.8324 | 0.8280 | 0.7857 | 0.8054 | 0.0152 | 0.0146 | 0.9780 | 0.9871 | 0.8280 | 0.8054 | 0.8280 | 0.7890 | 0.8285 |
-| 0.0256 | 8.0 | 5144 | 1.2413 | 0.8180 | 0.8263 | 0.8180 | 0.7780 | 0.8012 | 0.0162 | 0.0156 | 0.9771 | 0.9863 | 0.8180 | 0.8012 | 0.8180 | 0.7792 | 0.8196 |
-| 0.0166 | 9.0 | 5787 | 1.2510 | 0.8218 | 0.8222 | 0.8218 | 0.7782 | 0.7600 | 0.0159 | 0.0152 | 0.9755 | 0.9865 | 0.8218 | 0.7600 | 0.8218 | 0.7660 | 0.8210 |
-| 0.0107 | 10.0 | 6430 | 1.2564 | 0.8273 | 0.8292 | 0.8273 | 0.7794 | 0.7759 | 0.0153 | 0.0147 | 0.9772 | 0.9870 | 0.8273 | 0.7759 | 0.8273 | 0.7741 | 0.8269 |
+| 1.1096 | 1.0 | 643 | 0.6748 | 0.7978 | 0.7855 | 0.7978 | 0.6239 | 0.6340 | 0.0188 | 0.0178 | 0.9702 | 0.9845 | 0.7978 | 0.6340 | 0.7978 | 0.6134 | 0.7840 |
+| 0.6187 | 2.0 | 1286 | 0.6449 | 0.8110 | 0.8196 | 0.8110 | 0.7806 | 0.7327 | 0.0169 | 0.0164 | 0.9755 | 0.9858 | 0.8110 | 0.7327 | 0.8110 | 0.7268 | 0.8090 |
+| 0.4747 | 3.0 | 1929 | 0.8151 | 0.8149 | 0.8192 | 0.8149 | 0.7659 | 0.7390 | 0.0166 | 0.0160 | 0.9761 | 0.9861 | 0.8149 | 0.7390 | 0.8149 | 0.7370 | 0.8125 |
+| 0.2645 | 4.0 | 2572 | 0.9345 | 0.8218 | 0.8198 | 0.8218 | 0.7446 | 0.7413 | 0.0158 | 0.0152 | 0.9774 | 0.9866 | 0.8218 | 0.7413 | 0.8218 | 0.7385 | 0.8189 |
+| 0.1901 | 5.0 | 3215 | 1.0929 | 0.8195 | 0.8242 | 0.8195 | 0.8264 | 0.7432 | 0.0161 | 0.0155 | 0.9750 | 0.9863 | 0.8195 | 0.7432 | 0.8195 | 0.7595 | 0.8166 |
+| 0.1131 | 6.0 | 3858 | 1.1536 | 0.8203 | 0.8212 | 0.8203 | 0.7968 | 0.7786 | 0.0159 | 0.0154 | 0.9766 | 0.9865 | 0.8203 | 0.7786 | 0.8203 | 0.7840 | 0.8197 |
+| 0.063 | 7.0 | 4501 | 1.3218 | 0.8118 | 0.8184 | 0.8118 | 0.7518 | 0.7526 | 0.0166 | 0.0163 | 0.9773 | 0.9859 | 0.8118 | 0.7526 | 0.8118 | 0.7495 | 0.8136 |
+| 0.0264 | 8.0 | 5144 | 1.3863 | 0.8257 | 0.8262 | 0.8257 | 0.7784 | 0.7768 | 0.0155 | 0.0149 | 0.9768 | 0.9868 | 0.8257 | 0.7768 | 0.8257 | 0.7730 | 0.8247 |
+| 0.03 | 9.0 | 5787 | 1.5542 | 0.8079 | 0.8167 | 0.8079 | 0.7639 | 0.7653 | 0.0172 | 0.0167 | 0.9744 | 0.9855 | 0.8079 | 0.7653 | 0.8079 | 0.7595 | 0.8096 |
+| 0.0149 | 10.0 | 6430 | 1.5835 | 0.8141 | 0.8155 | 0.8141 | 0.7545 | 0.7361 | 0.0168 | 0.0160 | 0.9730 | 0.9858 | 0.8141 | 0.7361 | 0.8141 | 0.7412 | 0.8127 |
+| 0.005 | 11.0 | 7073 | 1.5325 | 0.8242 | 0.8250 | 0.8242 | 0.7805 | 0.7812 | 0.0156 | 0.0150 | 0.9758 | 0.9867 | 0.8242 | 0.7812 | 0.8242 | 0.7681 | 0.8226 |
+| 0.003 | 12.0 | 7716 | 1.5714 | 0.8288 | 0.8299 | 0.8288 | 0.7701 | 0.7679 | 0.0152 | 0.0145 | 0.9765 | 0.9870 | 0.8288 | 0.7679 | 0.8288 | 0.7626 | 0.8276 |
+| 0.0033 | 13.0 | 8359 | 1.5511 | 0.8249 | 0.8219 | 0.8249 | 0.7676 | 0.7598 | 0.0156 | 0.0149 | 0.9760 | 0.9867 | 0.8249 | 0.7598 | 0.8249 | 0.7608 | 0.8225 |
+| 0.0018 | 14.0 | 9002 | 1.5510 | 0.8249 | 0.8225 | 0.8249 | 0.7686 | 0.7554 | 0.0155 | 0.0149 | 0.9767 | 0.9868 | 0.8249 | 0.7554 | 0.8249 | 0.7572 | 0.8224 |
+| 0.0008 | 15.0 | 9645 | 1.5469 | 0.8242 | 0.8220 | 0.8242 | 0.7660 | 0.7548 | 0.0156 | 0.0150 | 0.9766 | 0.9867 | 0.8242 | 0.7548 | 0.8242 | 0.7566 | 0.8221 |
 
 
 ### Framework versions
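
For reference, a minimal inference sketch for the checkpoint added in this commit, using the 🤗 Transformers sequence-classification API. The repository id and the label mapping are assumptions (the model card does not list the label set), so treat this as illustrative rather than the author's own usage example.

```python
# Minimal sketch, assuming the fine-tuned checkpoint is published under a repo id
# like the one below (hypothetical) and saved via Trainer as in the model card.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "xshubhamx/legal-bert-base-uncased"  # assumed repo id, adjust as needed

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

text = "The appellant filed a petition challenging the order of the lower court."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(dim=-1).item()
# Falls back to the raw class index if the config carries no label names.
print(model.config.id2label.get(predicted_class, predicted_class))
```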
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:b8d15abcec7eb5845e054fc3e83e9a47ece3adb5435f7e1245b9837f2c34871d
+oid sha256:aed7da85f80a27ffabe8816d204179fa449531d5ddddb8e7873efbe4755999ce
 size 437998636
runs/Apr14_00-53-02_3e1d3a604afe/events.out.tfevents.1713055983.3e1d3a604afe.34.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:159e8b82e3c4c3f6a9eb83efaf58275cbc6d251350af6452e0bb50c098827b49
-size 25324
+oid sha256:48ea28ab4ecae1af9628780bf4be4d8a900ed7ecf6f7a52e3d34aa1ea883c29a
+size 25678
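
The LFS pointer files above record only the object's SHA-256 (`oid`) and byte size. Below is a small sketch of checking a locally downloaded `model.safetensors` against the pointer committed here; the local path is an assumption and should point at the actual downloaded object, not the pointer file.

```python
# Sketch: verify a downloaded LFS object against the pointer's oid and size.
import hashlib
from pathlib import Path

path = Path("model.safetensors")  # hypothetical local path to the downloaded object
expected_oid = "aed7da85f80a27ffabe8816d204179fa449531d5ddddb8e7873efbe4755999ce"
expected_size = 437998636

# Hash in chunks so the ~438 MB file is never fully loaded into memory.
digest = hashlib.sha256()
with path.open("rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        digest.update(chunk)

assert path.stat().st_size == expected_size, "size does not match the LFS pointer"
assert digest.hexdigest() == expected_oid, "sha256 does not match the LFS pointer"
print("LFS pointer verified")
```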