Narkantak committed
Commit 2fc82b0
1 Parent(s): 274e6e0

End of training

Files changed (2)
  1. README.md +225 -0
  2. model.safetensors +1 -1
README.md ADDED
@@ -0,0 +1,225 @@
---
license: apache-2.0
base_model: google-bert/bert-large-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
- precision
- recall
model-index:
- name: Intent-classification-BERT-Large-Ashuv2
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# Intent-classification-BERT-Large-Ashuv2

This model is a fine-tuned version of [google-bert/bert-large-uncased](https://huggingface.co/google-bert/bert-large-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7819
- Accuracy: 0.8571
- F1: 0.7838
- Precision: 0.7803
- Recall: 0.7898

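The checkpoint can be loaded like any Hugging Face sequence-classification model. The snippet below is a minimal inference sketch; the repo id `Narkantak/Intent-classification-BERT-Large-Ashuv2` is inferred from this repository, and the intent label set is not documented here, so both are assumptions.

```python
from transformers import pipeline

# Assumed Hub repo id (inferred from this repository); adjust if the model lives elsewhere.
MODEL_ID = "Narkantak/Intent-classification-BERT-Large-Ashuv2"

# Loads the fine-tuned BERT-large checkpoint and its tokenizer for text classification.
classifier = pipeline("text-classification", model=MODEL_ID)

# The returned label comes from the id2label mapping saved with the checkpoint;
# the actual intent names are not documented in this card.
print(classifier("I want to cancel my order and get a refund."))
```
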
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100

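For reference, these settings correspond roughly to the `TrainingArguments` sketch below. The evaluation cadence (`evaluation_strategy`/`eval_steps`) is inferred from the 10-step interval in the results table and is an assumption, as is the output directory name; the Adam betas and epsilon listed above are the library defaults.

```python
from transformers import TrainingArguments

# Sketch of the listed hyperparameters as a TrainingArguments object (Transformers 4.38.x).
# output_dir and the eval cadence are assumptions; they are not stated in this card.
args = TrainingArguments(
    output_dir="Intent-classification-BERT-Large-Ashuv2",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=32,
    num_train_epochs=100,
    seed=42,
    lr_scheduler_type="linear",
    adam_beta1=0.9,               # default Adam betas/epsilon, matching the values above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",  # assumed from the 10-step eval interval in the table below
    eval_steps=10,
)
```
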
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
| 1.4771 | 0.62 | 10 | 1.4650 | 0.5484 | 0.3724 | 0.3262 | 0.4815 |
| 1.1928 | 1.25 | 20 | 1.2691 | 0.5968 | 0.4620 | 0.4652 | 0.5370 |
| 0.9911 | 1.88 | 30 | 1.1678 | 0.6129 | 0.4794 | 0.4577 | 0.5556 |
| 0.7512 | 2.5 | 40 | 0.9525 | 0.6774 | 0.5424 | 0.4873 | 0.6296 |
| 0.7064 | 3.12 | 50 | 0.8495 | 0.6613 | 0.5319 | 0.4973 | 0.6111 |
| 0.5449 | 3.75 | 60 | 0.8052 | 0.6774 | 0.5744 | 0.6563 | 0.6349 |
| 0.4537 | 4.38 | 70 | 0.8058 | 0.7097 | 0.6281 | 0.6737 | 0.6772 |
| 0.398 | 5.0 | 80 | 0.5916 | 0.7581 | 0.7026 | 0.7035 | 0.7434 |
| 0.2933 | 5.62 | 90 | 0.8724 | 0.6935 | 0.6113 | 0.6623 | 0.6587 |
| 0.2834 | 6.25 | 100 | 0.6894 | 0.7419 | 0.7046 | 0.6973 | 0.7376 |
| 0.263 | 6.88 | 110 | 0.7285 | 0.7419 | 0.7244 | 0.7212 | 0.7556 |
| 0.181 | 7.5 | 120 | 0.6566 | 0.7419 | 0.7546 | 0.7617 | 0.7670 |
| 0.1736 | 8.12 | 130 | 1.0789 | 0.7903 | 0.7539 | 0.7372 | 0.7963 |
| 0.1837 | 8.75 | 140 | 0.8295 | 0.7419 | 0.7244 | 0.7212 | 0.7556 |
| 0.1696 | 9.38 | 150 | 1.1323 | 0.7581 | 0.7431 | 0.7313 | 0.7741 |
| 0.1758 | 10.0 | 160 | 0.8965 | 0.7258 | 0.7360 | 0.7516 | 0.7485 |
| 0.152 | 10.62 | 170 | 1.0633 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.1169 | 11.25 | 180 | 1.1007 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.1407 | 11.88 | 190 | 1.0659 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0788 | 12.5 | 200 | 1.2677 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.2394 | 13.12 | 210 | 0.8819 | 0.7419 | 0.7645 | 0.7639 | 0.7744 |
| 0.114 | 13.75 | 220 | 1.1865 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.1454 | 14.38 | 230 | 1.3365 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.1023 | 15.0 | 240 | 1.2334 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.132 | 15.62 | 250 | 1.3341 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.1199 | 16.25 | 260 | 1.1251 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.1161 | 16.88 | 270 | 1.2843 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0924 | 17.5 | 280 | 1.4196 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.1167 | 18.12 | 290 | 1.2224 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.1063 | 18.75 | 300 | 1.2558 | 0.7581 | 0.7549 | 0.7397 | 0.7815 |
| 0.1121 | 19.38 | 310 | 1.4312 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.1198 | 20.0 | 320 | 1.4862 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.1152 | 20.62 | 330 | 1.4057 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0827 | 21.25 | 340 | 1.4738 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.1257 | 21.88 | 350 | 1.4706 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.1021 | 22.5 | 360 | 1.3139 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.1244 | 23.12 | 370 | 1.4685 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.1173 | 23.75 | 380 | 1.5196 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0951 | 24.38 | 390 | 1.5036 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.1069 | 25.0 | 400 | 1.5056 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.1051 | 25.62 | 410 | 1.5297 | 0.7581 | 0.7549 | 0.7397 | 0.7815 |
| 0.1073 | 26.25 | 420 | 1.5805 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.0913 | 26.88 | 430 | 1.6029 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.0826 | 27.5 | 440 | 1.6013 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.0926 | 28.12 | 450 | 1.5705 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0981 | 28.75 | 460 | 1.5954 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0823 | 29.38 | 470 | 1.6280 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.1233 | 30.0 | 480 | 1.6143 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.098 | 30.62 | 490 | 1.5885 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.072 | 31.25 | 500 | 1.5868 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.1248 | 31.88 | 510 | 1.6264 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.1007 | 32.5 | 520 | 1.6531 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0829 | 33.12 | 530 | 1.6675 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0892 | 33.75 | 540 | 1.6814 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.1048 | 34.38 | 550 | 1.6926 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.1189 | 35.0 | 560 | 1.6922 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.0904 | 35.62 | 570 | 1.6460 | 0.7581 | 0.7549 | 0.7397 | 0.7815 |
| 0.088 | 36.25 | 580 | 1.6609 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.0902 | 36.88 | 590 | 1.7090 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.1151 | 37.5 | 600 | 1.7120 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0665 | 38.12 | 610 | 1.7139 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.1057 | 38.75 | 620 | 1.7650 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0926 | 39.38 | 630 | 1.7536 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.1225 | 40.0 | 640 | 1.6866 | 0.7581 | 0.7549 | 0.7397 | 0.7815 |
| 0.073 | 40.62 | 650 | 1.5809 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.1006 | 41.25 | 660 | 1.6110 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.096 | 41.88 | 670 | 1.6937 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.0824 | 42.5 | 680 | 1.7297 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0803 | 43.12 | 690 | 1.7237 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.1029 | 43.75 | 700 | 1.7103 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0923 | 44.38 | 710 | 1.7442 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.0939 | 45.0 | 720 | 1.7685 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.0894 | 45.62 | 730 | 1.7926 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.0954 | 46.25 | 740 | 1.7750 | 0.7581 | 0.7549 | 0.7397 | 0.7815 |
| 0.0947 | 46.88 | 750 | 1.7498 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.0621 | 47.5 | 760 | 1.7799 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.1132 | 48.12 | 770 | 1.7738 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.1054 | 48.75 | 780 | 1.7489 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0764 | 49.38 | 790 | 1.7737 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.1055 | 50.0 | 800 | 1.7924 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0754 | 50.62 | 810 | 1.7958 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.112 | 51.25 | 820 | 1.7691 | 0.7581 | 0.7549 | 0.7397 | 0.7815 |
| 0.0937 | 51.88 | 830 | 1.7532 | 0.7581 | 0.7451 | 0.7394 | 0.7688 |
| 0.0865 | 52.5 | 840 | 1.7491 | 0.7581 | 0.7451 | 0.7394 | 0.7688 |
| 0.0942 | 53.12 | 850 | 1.7697 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0833 | 53.75 | 860 | 1.8022 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.0979 | 54.38 | 870 | 1.8034 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.0949 | 55.0 | 880 | 1.7938 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.0836 | 55.62 | 890 | 1.7926 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0988 | 56.25 | 900 | 1.7862 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0872 | 56.88 | 910 | 1.7967 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0891 | 57.5 | 920 | 1.8087 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0836 | 58.12 | 930 | 1.8217 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.085 | 58.75 | 940 | 1.8281 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0917 | 59.38 | 950 | 1.8320 | 0.7581 | 0.7549 | 0.7397 | 0.7815 |
| 0.0931 | 60.0 | 960 | 1.8480 | 0.7581 | 0.7549 | 0.7397 | 0.7815 |
| 0.091 | 60.62 | 970 | 1.8438 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0782 | 61.25 | 980 | 1.8527 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.1032 | 61.88 | 990 | 1.8643 | 0.7581 | 0.7549 | 0.7397 | 0.7815 |
| 0.1105 | 62.5 | 1000 | 1.8522 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0732 | 63.12 | 1010 | 1.8443 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0879 | 63.75 | 1020 | 1.8477 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0991 | 64.38 | 1030 | 1.8533 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0827 | 65.0 | 1040 | 1.8358 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0942 | 65.62 | 1050 | 1.8442 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.0935 | 66.25 | 1060 | 1.8537 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0818 | 66.88 | 1070 | 1.8601 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0993 | 67.5 | 1080 | 1.8696 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.1181 | 68.12 | 1090 | 1.8594 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.1096 | 68.75 | 1100 | 1.8438 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.0545 | 69.38 | 1110 | 1.8344 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.0994 | 70.0 | 1120 | 1.8409 | 0.7581 | 0.7549 | 0.7397 | 0.7815 |
| 0.0905 | 70.62 | 1130 | 1.8529 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.1115 | 71.25 | 1140 | 1.8463 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0775 | 71.88 | 1150 | 1.8440 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.1055 | 72.5 | 1160 | 1.8457 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.074 | 73.12 | 1170 | 1.8525 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.1023 | 73.75 | 1180 | 1.8586 | 0.7258 | 0.7333 | 0.7325 | 0.7466 |
| 0.1012 | 74.38 | 1190 | 1.8704 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0814 | 75.0 | 1200 | 1.8778 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0786 | 75.62 | 1210 | 1.8753 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0852 | 76.25 | 1220 | 1.8770 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.112 | 76.88 | 1230 | 1.8797 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0876 | 77.5 | 1240 | 1.8838 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0779 | 78.12 | 1250 | 1.8866 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0949 | 78.75 | 1260 | 1.8897 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0946 | 79.38 | 1270 | 1.8907 | 0.7581 | 0.7549 | 0.7397 | 0.7815 |
| 0.0812 | 80.0 | 1280 | 1.8892 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.0844 | 80.62 | 1290 | 1.8903 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.0977 | 81.25 | 1300 | 1.8894 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.0787 | 81.88 | 1310 | 1.8935 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
| 0.1164 | 82.5 | 1320 | 1.8920 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0752 | 83.12 | 1330 | 1.8886 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0898 | 83.75 | 1340 | 1.8896 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0983 | 84.38 | 1350 | 1.8847 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.095 | 85.0 | 1360 | 1.8840 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0727 | 85.62 | 1370 | 1.8853 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.1182 | 86.25 | 1380 | 1.8857 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0681 | 86.88 | 1390 | 1.8829 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.1079 | 87.5 | 1400 | 1.8880 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0897 | 88.12 | 1410 | 1.8882 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0675 | 88.75 | 1420 | 1.8889 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.1091 | 89.38 | 1430 | 1.8894 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0831 | 90.0 | 1440 | 1.8917 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0815 | 90.62 | 1450 | 1.8949 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0903 | 91.25 | 1460 | 1.8959 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0937 | 91.88 | 1470 | 1.9001 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0797 | 92.5 | 1480 | 1.9006 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.1141 | 93.12 | 1490 | 1.9017 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0696 | 93.75 | 1500 | 1.9018 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0979 | 94.38 | 1510 | 1.9038 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0846 | 95.0 | 1520 | 1.9055 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.078 | 95.62 | 1530 | 1.9060 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0947 | 96.25 | 1540 | 1.9067 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0823 | 96.88 | 1550 | 1.9081 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.1367 | 97.5 | 1560 | 1.9081 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0597 | 98.12 | 1570 | 1.9085 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.1036 | 98.75 | 1580 | 1.9086 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0826 | 99.38 | 1590 | 1.9089 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
| 0.0917 | 100.0 | 1600 | 1.9090 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |

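The accuracy, F1, precision, and recall columns above correspond to a `compute_metrics` callback along the lines of the sketch below; macro averaging is an assumption, since the card does not state which averaging scheme was used.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    # eval_pred is the (logits, labels) pair the Trainer passes to the callback.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # Macro averaging is assumed; the card does not document the averaging scheme.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="macro", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```
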
### Framework versions

- Transformers 4.38.2
- Pytorch 2.1.2
- Datasets 2.1.0
- Tokenizers 0.15.2
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:b2c10631842bcc1f1cf7d4d69894227b35de6aa83d3692e04e42139a34df605d
+oid sha256:452e6f43045619e182108d1b5784b08bdd077ae8182448479194855a81a86dce
 size 1340639160