lobrien001 committed
Commit c0efb4e
1 Parent(s): 389d751

Model save
README.md CHANGED
@@ -20,7 +20,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.5495
+- Loss: 0.5509
 - Precision: 0.8574
 - Recall: 0.9049
 - F1: 0.8805
@@ -44,8 +44,8 @@ More information needed
 
 The following hyperparameters were used during training:
 - learning_rate: 2e-05
-- train_batch_size: 3
-- eval_batch_size: 4
+- train_batch_size: 16
+- eval_batch_size: 16
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
@@ -55,77 +55,19 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
 |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
-| No log | 0.03 | 10 | 0.5589 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.06 | 20 | 0.5598 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.08 | 30 | 0.5778 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.11 | 40 | 0.5731 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.14 | 50 | 0.5984 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.17 | 60 | 0.5754 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.2 | 70 | 0.5766 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.22 | 80 | 0.5582 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.25 | 90 | 0.6164 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.28 | 100 | 0.5652 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.31 | 110 | 0.5564 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.34 | 120 | 0.5625 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.36 | 130 | 0.5569 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.39 | 140 | 0.5577 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.42 | 150 | 0.5639 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.45 | 160 | 0.5636 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.47 | 170 | 0.5539 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.5 | 180 | 0.5532 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.53 | 190 | 0.5563 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.56 | 200 | 0.5550 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.59 | 210 | 0.5560 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.61 | 220 | 0.5616 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.64 | 230 | 0.5533 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.67 | 240 | 0.5569 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.7 | 250 | 0.5528 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.73 | 260 | 0.5558 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.75 | 270 | 0.5631 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.78 | 280 | 0.5561 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.81 | 290 | 0.5547 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.84 | 300 | 0.5559 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.87 | 310 | 0.5551 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.89 | 320 | 0.5515 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.92 | 330 | 0.5641 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.95 | 340 | 0.5693 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 0.98 | 350 | 0.5558 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 1.01 | 360 | 0.5573 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 1.03 | 370 | 0.5522 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 1.06 | 380 | 0.5557 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 1.09 | 390 | 0.5523 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 1.12 | 400 | 0.5526 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 1.15 | 410 | 0.5566 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 1.17 | 420 | 0.5552 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 1.2 | 430 | 0.5534 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 1.23 | 440 | 0.5643 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 1.26 | 450 | 0.5746 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 1.28 | 460 | 0.5534 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 1.31 | 470 | 0.5534 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 1.34 | 480 | 0.5512 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| No log | 1.37 | 490 | 0.5576 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| 0.655 | 1.4 | 500 | 0.5510 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| 0.655 | 1.42 | 510 | 0.5531 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| 0.655 | 1.45 | 520 | 0.5519 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| 0.655 | 1.48 | 530 | 0.5530 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| 0.655 | 1.51 | 540 | 0.5528 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| 0.655 | 1.54 | 550 | 0.5493 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| 0.655 | 1.56 | 560 | 0.5495 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| 0.655 | 1.59 | 570 | 0.5504 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| 0.655 | 1.62 | 580 | 0.5540 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| 0.655 | 1.65 | 590 | 0.5558 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| 0.655 | 1.68 | 600 | 0.5513 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| 0.655 | 1.7 | 610 | 0.5507 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| 0.655 | 1.73 | 620 | 0.5503 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| 0.655 | 1.76 | 630 | 0.5499 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| 0.655 | 1.79 | 640 | 0.5501 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| 0.655 | 1.82 | 650 | 0.5499 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| 0.655 | 1.84 | 660 | 0.5526 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| 0.655 | 1.87 | 670 | 0.5541 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| 0.655 | 1.9 | 680 | 0.5511 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| 0.655 | 1.93 | 690 | 0.5496 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| 0.655 | 1.96 | 700 | 0.5496 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
-| 0.655 | 1.98 | 710 | 0.5495 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
+| No log | 0.15 | 10 | 0.5834 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
+| No log | 0.3 | 20 | 0.5566 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
+| No log | 0.45 | 30 | 0.5712 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
+| No log | 0.6 | 40 | 0.5596 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
+| No log | 0.75 | 50 | 0.5566 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
+| No log | 0.9 | 60 | 0.5554 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
+| No log | 1.04 | 70 | 0.5671 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
+| No log | 1.19 | 80 | 0.5510 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
+| No log | 1.34 | 90 | 0.5534 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
+| No log | 1.49 | 100 | 0.5526 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
+| No log | 1.64 | 110 | 0.5583 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
+| No log | 1.79 | 120 | 0.5507 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
+| No log | 1.94 | 130 | 0.5509 | 0.8574 | 0.9049 | 0.8805 | 0.8574 |
 
 
 ### Framework versions
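
The batch-size change explains the shorter log: at batch size 16 the run covers roughly 67 steps per epoch instead of ~357, so the every-10-steps evaluation table shrinks from 71 rows to 13. For orientation, here is a minimal sketch of a `transformers` Trainer setup consistent with the updated card; the task head, label count, `output_dir`, and epoch count are assumptions (the card reports token-classification-style metrics but names no dataset):

```python
# Minimal sketch of a Trainer configuration matching the updated card.
# Assumptions (not stated in the card): token-classification task, 2 labels,
# the output path, and num_train_epochs=2 (the log table ends near epoch 2.0).
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("roberta-base", add_prefix_space=True)
model = AutoModelForTokenClassification.from_pretrained("roberta-base", num_labels=2)

args = TrainingArguments(
    output_dir="roberta-base-finetuned",  # hypothetical path
    learning_rate=2e-5,                   # from the card
    per_device_train_batch_size=16,       # train_batch_size after this commit
    per_device_eval_batch_size=16,        # eval_batch_size after this commit
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=2,
    evaluation_strategy="steps",
    eval_steps=10,                        # the card evaluates every 10 steps
    logging_steps=500,                    # matches "No log" until step 500 in the old table
)

# Adam betas (0.9, 0.999) and epsilon 1e-08 are the transformers defaults,
# so no explicit optimizer arguments are needed.
# trainer = Trainer(model=model, args=args, train_dataset=..., eval_dataset=...)
# trainer.train()
```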
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:21e8c7fce0ffe093da7b529e9d32903fddff713d4eb36a2f721d0dbef841b62b
+oid sha256:9ae4b59ee07752f62af4f337096109e2c98a2ab18bb2b25ea8da8ff963982841
 size 496256392
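
Only the checkpoint's LFS hash changes here; the tensor contents can be spot-checked with the `safetensors` API. A small sketch, assuming the file has been pulled into a local clone of the repository:

```python
# Peek at the updated checkpoint's tensor names and shapes; assumes the blob
# has been fetched locally (e.g. via `git lfs pull` or huggingface_hub).
from safetensors import safe_open

with safe_open("model.safetensors", framework="pt", device="cpu") as f:
    for name in list(f.keys())[:5]:  # first few entries only
        print(name, f.get_slice(name).get_shape())
```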
runs/Apr23_07-35-33_e0477794c3e0/events.out.tfevents.1713858384.e0477794c3e0.2322.1 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:eaa35c8c824508ec5416d085cb7abcb7fc4f517ec08169c1689573ad8e24b68f
-size 560
+oid sha256:ecc68c5dcd45b32263c91c8b0bf1fb21d1ad8bf53a1ccdab484b643bb0aabf06
+size 38992
runs/Apr23_07-58-11_e0477794c3e0/events.out.tfevents.1713859180.e0477794c3e0.2322.2 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ae2bacfff9a09d19e36bda2d95025d5e4022e9eeb613da1cfa98be40f6af74db
+size 10950
runs/Apr23_07-58-11_e0477794c3e0/events.out.tfevents.1713859518.e0477794c3e0.2322.3 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:1c756b2b877aff22aa3963820c49c66b062d4db98faad123626bc2d58aac5e07
+size 560
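
The `runs/` entries are TensorBoard event logs emitted by the Trainer; the growth of the first file and the two new ones reflect scalar summaries appended during this run. A sketch of reading them programmatically, assuming a local clone (the available tag names depend on what the run actually logged):

```python
# Read scalar summaries from one run directory shown in this commit.
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

acc = EventAccumulator("runs/Apr23_07-58-11_e0477794c3e0")
acc.Reload()                    # parse all event files in the directory
tags = acc.Tags()["scalars"]
print(tags)                     # e.g. eval/loss, eval/f1, ... (run-dependent)
for tag in tags[:1]:
    for event in acc.Scalars(tag):
        print(event.step, event.value)
```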
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:3918971863bd69127fc8a72cd7e9ddb56815c808d732dadaa822d0c68921baee
+oid sha256:e8684799bc46eb70397199021a54f21cac51b0db8f29f578480690e66fe2176b
 size 4728
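
Every binary diff above is a Git LFS pointer rather than the blob itself: the repository tracks only the spec version, a `sha256` object id, and the byte size, so each change is a three-line diff regardless of file size. A minimal sketch of verifying a fetched blob against its pointer (both paths are placeholders):

```python
# Check a fetched file against its Git LFS pointer; paths are hypothetical.
import hashlib
from pathlib import Path

def lfs_pointer_matches(pointer_path: str, blob_path: str) -> bool:
    """True if blob_path has the oid and size recorded in the pointer file."""
    fields = dict(
        line.split(" ", 1)
        for line in Path(pointer_path).read_text().splitlines()
        if line.strip()
    )
    expected_oid = fields["oid"].removeprefix("sha256:")
    blob = Path(blob_path).read_bytes()  # fine for a one-off check
    return (
        len(blob) == int(fields["size"])
        and hashlib.sha256(blob).hexdigest() == expected_oid
    )

# Example: verify training_args.bin against the pointer shown in this diff.
# print(lfs_pointer_matches("training_args.bin.pointer", "training_args.bin"))
```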