SimoneJLaudani committed on
Commit a8d5d1b
1 Parent(s): 943d23f

End of training

README.md CHANGED
@@ -3,11 +3,6 @@ license: apache-2.0
  base_model: distilbert-base-cased
  tags:
  - generated_from_trainer
- metrics:
- - precision
- - recall
- - f1
- - accuracy
  model-index:
  - name: trainerL
    results: []
@@ -20,11 +15,16 @@ should probably proofread and complete it, then remove this comment. -->

  This model is a fine-tuned version of [distilbert-base-cased](https://huggingface.co/distilbert-base-cased) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 1.8547
- - Precision: 0.8345
- - Recall: 0.8291
- - F1: 0.8288
- - Accuracy: 0.8291
+ - eval_loss: 0.8068
+ - eval_precision: 0.8217
+ - eval_recall: 0.8168
+ - eval_f1: 0.8162
+ - eval_accuracy: 0.8168
+ - eval_runtime: 58.8181
+ - eval_samples_per_second: 9.283
+ - eval_steps_per_second: 1.173
+ - epoch: 1.23
+ - step: 420

  ## Model description

@@ -51,86 +51,6 @@ The following hyperparameters were used during training:
  - lr_scheduler_type: linear
  - num_epochs: 5

- ### Training results
-
- | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
- |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
- | 0.0003 | 0.07 | 15 | 1.3457 | 0.8377 | 0.8291 | 0.8283 | 0.8291 |
- | 0.0002 | 0.14 | 30 | 1.4594 | 0.8514 | 0.8375 | 0.8374 | 0.8375 |
- | 0.0269 | 0.2 | 45 | 1.5729 | 0.8408 | 0.8319 | 0.8315 | 0.8319 |
- | 0.06 | 0.27 | 60 | 1.7336 | 0.8270 | 0.8123 | 0.8122 | 0.8123 |
- | 0.0288 | 0.34 | 75 | 1.6085 | 0.8421 | 0.8319 | 0.8328 | 0.8319 |
- | 0.0002 | 0.41 | 90 | 1.7197 | 0.8115 | 0.8039 | 0.8029 | 0.8039 |
- | 0.0485 | 0.47 | 105 | 2.0998 | 0.8058 | 0.7759 | 0.7732 | 0.7759 |
- | 0.005 | 0.54 | 120 | 1.9132 | 0.8125 | 0.7955 | 0.7963 | 0.7955 |
- | 0.2031 | 0.61 | 135 | 1.6651 | 0.8207 | 0.8151 | 0.8143 | 0.8151 |
- | 0.0742 | 0.68 | 150 | 1.7838 | 0.7996 | 0.7927 | 0.7901 | 0.7927 |
- | 0.0937 | 0.74 | 165 | 1.9159 | 0.8244 | 0.8123 | 0.8064 | 0.8123 |
- | 0.2124 | 0.81 | 180 | 1.6964 | 0.8131 | 0.8067 | 0.8053 | 0.8067 |
- | 0.001 | 0.88 | 195 | 1.6233 | 0.8105 | 0.8095 | 0.8091 | 0.8095 |
- | 0.2167 | 0.95 | 210 | 1.6295 | 0.8447 | 0.8375 | 0.8358 | 0.8375 |
- | 0.1011 | 1.01 | 225 | 1.6364 | 0.8261 | 0.8179 | 0.8168 | 0.8179 |
- | 0.0003 | 1.08 | 240 | 1.6730 | 0.8178 | 0.8123 | 0.8113 | 0.8123 |
- | 0.0389 | 1.15 | 255 | 1.6269 | 0.8180 | 0.8123 | 0.8121 | 0.8123 |
- | 0.0114 | 1.22 | 270 | 1.6331 | 0.8095 | 0.8039 | 0.8014 | 0.8039 |
- | 0.0036 | 1.28 | 285 | 1.7491 | 0.8023 | 0.7955 | 0.7952 | 0.7955 |
- | 0.1957 | 1.35 | 300 | 1.6646 | 0.8276 | 0.8067 | 0.8074 | 0.8067 |
- | 0.0174 | 1.42 | 315 | 1.6649 | 0.8266 | 0.8207 | 0.8207 | 0.8207 |
- | 0.0177 | 1.49 | 330 | 1.7413 | 0.8404 | 0.8291 | 0.8287 | 0.8291 |
- | 0.1063 | 1.55 | 345 | 1.8075 | 0.8342 | 0.8179 | 0.8194 | 0.8179 |
- | 0.0339 | 1.62 | 360 | 1.6491 | 0.8393 | 0.8319 | 0.8316 | 0.8319 |
- | 0.0582 | 1.69 | 375 | 1.7306 | 0.8279 | 0.8235 | 0.8232 | 0.8235 |
- | 0.0002 | 1.76 | 390 | 1.7161 | 0.8299 | 0.8263 | 0.8258 | 0.8263 |
- | 0.0001 | 1.82 | 405 | 1.7225 | 0.8326 | 0.8291 | 0.8286 | 0.8291 |
- | 0.0 | 1.89 | 420 | 1.7279 | 0.8326 | 0.8291 | 0.8286 | 0.8291 |
- | 0.0001 | 1.96 | 435 | 1.7450 | 0.8359 | 0.8319 | 0.8319 | 0.8319 |
- | 0.0 | 2.03 | 450 | 1.7531 | 0.8333 | 0.8291 | 0.8291 | 0.8291 |
- | 0.0001 | 2.09 | 465 | 1.7483 | 0.8327 | 0.8291 | 0.8292 | 0.8291 |
- | 0.0021 | 2.16 | 480 | 1.8257 | 0.8287 | 0.8207 | 0.8204 | 0.8207 |
- | 0.0002 | 2.23 | 495 | 1.8655 | 0.8306 | 0.8235 | 0.8236 | 0.8235 |
- | 0.0 | 2.3 | 510 | 1.8684 | 0.8285 | 0.8207 | 0.8210 | 0.8207 |
- | 0.0022 | 2.36 | 525 | 1.8991 | 0.8345 | 0.8263 | 0.8266 | 0.8263 |
- | 0.0001 | 2.43 | 540 | 1.9231 | 0.8244 | 0.8179 | 0.8177 | 0.8179 |
- | 0.052 | 2.5 | 555 | 1.9714 | 0.8131 | 0.8095 | 0.8092 | 0.8095 |
- | 0.0 | 2.57 | 570 | 2.0368 | 0.8110 | 0.8067 | 0.8062 | 0.8067 |
- | 0.0001 | 2.64 | 585 | 2.0347 | 0.8109 | 0.8067 | 0.8061 | 0.8067 |
- | 0.0001 | 2.7 | 600 | 2.0200 | 0.8137 | 0.8095 | 0.8089 | 0.8095 |
- | 0.0001 | 2.77 | 615 | 1.8724 | 0.8313 | 0.8263 | 0.8263 | 0.8263 |
- | 0.0 | 2.84 | 630 | 1.8784 | 0.8258 | 0.8207 | 0.8203 | 0.8207 |
- | 0.0 | 2.91 | 645 | 1.8808 | 0.8258 | 0.8207 | 0.8203 | 0.8207 |
- | 0.0 | 2.97 | 660 | 1.8829 | 0.8258 | 0.8207 | 0.8203 | 0.8207 |
- | 0.0 | 3.04 | 675 | 1.8849 | 0.8289 | 0.8235 | 0.8231 | 0.8235 |
- | 0.0 | 3.11 | 690 | 1.8872 | 0.8289 | 0.8235 | 0.8231 | 0.8235 |
- | 0.0 | 3.18 | 705 | 1.8890 | 0.8289 | 0.8235 | 0.8231 | 0.8235 |
- | 0.1104 | 3.24 | 720 | 1.9084 | 0.8306 | 0.8207 | 0.8198 | 0.8207 |
- | 0.0 | 3.31 | 735 | 1.9457 | 0.8269 | 0.8151 | 0.8139 | 0.8151 |
- | 0.0002 | 3.38 | 750 | 1.9062 | 0.8304 | 0.8207 | 0.8201 | 0.8207 |
- | 0.0001 | 3.45 | 765 | 1.9034 | 0.8298 | 0.8207 | 0.8200 | 0.8207 |
- | 0.0001 | 3.51 | 780 | 1.8853 | 0.8282 | 0.8207 | 0.8204 | 0.8207 |
- | 0.0 | 3.58 | 795 | 1.9085 | 0.8311 | 0.8235 | 0.8235 | 0.8235 |
- | 0.0 | 3.65 | 810 | 1.9161 | 0.8272 | 0.8207 | 0.8207 | 0.8207 |
- | 0.0442 | 3.72 | 825 | 1.8897 | 0.8334 | 0.8263 | 0.8263 | 0.8263 |
- | 0.0 | 3.78 | 840 | 1.8337 | 0.8376 | 0.8319 | 0.8316 | 0.8319 |
- | 0.0 | 3.85 | 855 | 1.8333 | 0.8415 | 0.8347 | 0.8345 | 0.8347 |
- | 0.0 | 3.92 | 870 | 1.8339 | 0.8415 | 0.8347 | 0.8345 | 0.8347 |
- | 0.0 | 3.99 | 885 | 1.8350 | 0.8415 | 0.8347 | 0.8345 | 0.8347 |
- | 0.0 | 4.05 | 900 | 1.8377 | 0.8415 | 0.8347 | 0.8345 | 0.8347 |
- | 0.0 | 4.12 | 915 | 1.8457 | 0.8347 | 0.8291 | 0.8287 | 0.8291 |
- | 0.0 | 4.19 | 930 | 1.8496 | 0.8318 | 0.8263 | 0.8260 | 0.8263 |
- | 0.0001 | 4.26 | 945 | 1.8585 | 0.8372 | 0.8291 | 0.8291 | 0.8291 |
- | 0.0517 | 4.32 | 960 | 1.8551 | 0.8372 | 0.8291 | 0.8291 | 0.8291 |
- | 0.0 | 4.39 | 975 | 1.8431 | 0.8330 | 0.8263 | 0.8262 | 0.8263 |
- | 0.0 | 4.46 | 990 | 1.8457 | 0.8320 | 0.8263 | 0.8259 | 0.8263 |
- | 0.0001 | 4.53 | 1005 | 1.8428 | 0.8378 | 0.8319 | 0.8316 | 0.8319 |
- | 0.0 | 4.59 | 1020 | 1.8415 | 0.8378 | 0.8319 | 0.8316 | 0.8319 |
- | 0.0401 | 4.66 | 1035 | 1.8479 | 0.8378 | 0.8319 | 0.8316 | 0.8319 |
- | 0.0003 | 4.73 | 1050 | 1.8510 | 0.8378 | 0.8319 | 0.8316 | 0.8319 |
- | 0.0004 | 4.8 | 1065 | 1.8525 | 0.8345 | 0.8291 | 0.8288 | 0.8291 |
- | 0.0 | 4.86 | 1080 | 1.8542 | 0.8345 | 0.8291 | 0.8288 | 0.8291 |
- | 0.0 | 4.93 | 1095 | 1.8545 | 0.8345 | 0.8291 | 0.8288 | 0.8291 |
- | 0.0025 | 5.0 | 1110 | 1.8547 | 0.8345 | 0.8291 | 0.8288 | 0.8291 |
-
-
  ### Framework versions

  - Transformers 4.39.1
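The updated card replaces the summary metrics with the raw `eval_*` dictionary that the Trainer logs during evaluation (loss, precision, recall, F1, accuracy, plus runtime counters) at epoch 1.23 / step 420. As a usage sketch only, not part of the card: the per-token precision/recall/F1/accuracy suggest a token-classification checkpoint, and the Hub id below is assumed from the author name and the `model-index` name; the label set comes from the unnamed training dataset.

```python
# Usage sketch only (not from the model card). Assumptions: the checkpoint is a
# token-classification model (inferred from the precision/recall/F1/accuracy
# metrics) published at "SimoneJLaudani/trainerL" (author + model-index name).
from transformers import pipeline

tagger = pipeline(
    "token-classification",
    model="SimoneJLaudani/trainerL",   # assumed repo id
    aggregation_strategy="simple",     # merge word-piece tokens into whole spans
)

print(tagger("Hugging Face is based in New York City."))
```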
 
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:ac310f8ed541e959bfb50961b39f7822771facd164e22e0387709d024511f72e
+ oid sha256:d91ee5b80164c0c424db6def7d2a7da8b15a1e99d2738968de8e6b4924143694
  size 263160068
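The weight update is a Git LFS pointer change: only the `oid` moves while the size stays the same, and that `oid sha256:` value is just the SHA-256 digest of the file contents. A small verification sketch, assuming `model.safetensors` has already been downloaded locally; the path and helper function are illustrative, not part of the repository.

```python
# Verify a downloaded model.safetensors against the new LFS pointer above.
import hashlib

EXPECTED_OID = "d91ee5b80164c0c424db6def7d2a7da8b15a1e99d2738968de8e6b4924143694"

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream the file in chunks so a large checkpoint never sits in memory at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

digest = sha256_of("model.safetensors")  # assumed local path
print("match" if digest == EXPECTED_OID else f"mismatch: {digest}")
```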
runs/Mar23_17-06-08_9cfae778a1ed/events.out.tfevents.1711213574.9cfae778a1ed.298.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:822cfb59f0cffac7381f3f016b563c009f54dfcd6f8c88977676327e905ab02b
+ size 24007
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:66d8328421ae5ec96c843e642d3ca344e190471f48c8edf77933df6be77f5a62
+ oid sha256:0f15ab20112200eba08c3c8ffa3a7feff115da30d85fbb9e13791d1ad2b59972
  size 4920
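`training_args.bin` is the pickled `TrainingArguments` object the Trainer saves alongside the weights, so the hyperparameters behind this run can be inspected directly. A minimal sketch, assuming a local clone and an installed `transformers` (needed to unpickle the class); `weights_only=False` matters on recent PyTorch versions, where it defaults to `True` and would reject a pickled Python object.

```python
# Illustrative only: inspect the updated training_args.bin from a local clone.
# transformers must be importable, since the pickle references TrainingArguments.
import torch

args = torch.load("training_args.bin", weights_only=False)  # path assumed
print(args.num_train_epochs)    # 5, per the card
print(args.lr_scheduler_type)   # linear, per the card
print(args.learning_rate)
```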