matchten committed
Commit 4c078b7
1 Parent(s): 10c6836

End of training

README.md CHANGED
@@ -2,11 +2,38 @@
  base_model: cardiffnlp/twitter-roberta-base-sentiment-latest
  tags:
  - generated_from_trainer
+ datasets:
+ - daily_dialog
  metrics:
  - accuracy
+ - f1
+ - precision
+ - recall
  model-index:
  - name: text-message-analyzer-finetuned
-   results: []
+   results:
+   - task:
+       name: Text Classification
+       type: text-classification
+     dataset:
+       name: daily_dialog
+       type: daily_dialog
+       config: default
+       split: validation
+       args: default
+     metrics:
+     - name: Accuracy
+       type: accuracy
+       value: 0.762
+     - name: F1
+       type: f1
+       value: 0.7650409655164931
+     - name: Precision
+       type: precision
+       value: 0.7705665981905709
+     - name: Recall
+       type: recall
+       value: 0.762
  ---

  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -14,10 +41,13 @@ should probably proofread and complete it, then remove this comment. -->

  # text-message-analyzer-finetuned

- This model is a fine-tuned version of [cardiffnlp/twitter-roberta-base-sentiment-latest](https://huggingface.co/cardiffnlp/twitter-roberta-base-sentiment-latest) on the None dataset.
+ This model is a fine-tuned version of [cardiffnlp/twitter-roberta-base-sentiment-latest](https://huggingface.co/cardiffnlp/twitter-roberta-base-sentiment-latest) on the daily_dialog dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.1048
- - Accuracy: 0.9587
+ - Loss: 0.5913
+ - Accuracy: 0.762
+ - F1: 0.7650
+ - Precision: 0.7706
+ - Recall: 0.762

  ## Model description

@@ -46,14 +76,197 @@ The following hyperparameters were used during training:

  ### Training results

- | Training Loss | Epoch | Step | Validation Loss | Accuracy |
- |:-------------:|:-----:|:----:|:---------------:|:--------:|
- | 0.1972 | 1.0 | 935 | 0.1048 | 0.9587 |
+ | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
+ |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
+ | No log | 0.01 | 5 | 0.8666 | 0.589 | 0.5903 | 0.6104 | 0.589 |
+ | No log | 0.01 | 10 | 0.7596 | 0.661 | 0.6590 | 0.6603 | 0.661 |
+ | No log | 0.02 | 15 | 1.1783 | 0.521 | 0.5244 | 0.7242 | 0.521 |
+ | No log | 0.02 | 20 | 0.8909 | 0.615 | 0.6318 | 0.6910 | 0.615 |
+ | No log | 0.03 | 25 | 0.7995 | 0.666 | 0.6743 | 0.6918 | 0.666 |
+ | No log | 0.03 | 30 | 0.7699 | 0.65 | 0.6585 | 0.6935 | 0.65 |
+ | No log | 0.04 | 35 | 0.7344 | 0.662 | 0.6691 | 0.6857 | 0.662 |
+ | No log | 0.04 | 40 | 0.7326 | 0.654 | 0.6675 | 0.7036 | 0.654 |
+ | No log | 0.05 | 45 | 0.9608 | 0.603 | 0.5705 | 0.7211 | 0.603 |
+ | No log | 0.05 | 50 | 0.8593 | 0.628 | 0.6338 | 0.7262 | 0.628 |
+ | No log | 0.06 | 55 | 0.8635 | 0.626 | 0.6066 | 0.7400 | 0.626 |
+ | No log | 0.07 | 60 | 0.7101 | 0.682 | 0.6782 | 0.6911 | 0.682 |
+ | No log | 0.07 | 65 | 0.7569 | 0.67 | 0.6780 | 0.7067 | 0.67 |
+ | No log | 0.08 | 70 | 0.7694 | 0.653 | 0.6608 | 0.7271 | 0.653 |
+ | No log | 0.08 | 75 | 0.6941 | 0.691 | 0.6925 | 0.7202 | 0.691 |
+ | No log | 0.09 | 80 | 0.8646 | 0.606 | 0.6168 | 0.7450 | 0.606 |
+ | No log | 0.09 | 85 | 0.6853 | 0.677 | 0.6895 | 0.7369 | 0.677 |
+ | No log | 0.1 | 90 | 0.6410 | 0.727 | 0.7264 | 0.7272 | 0.727 |
+ | No log | 0.1 | 95 | 0.7059 | 0.693 | 0.7020 | 0.7410 | 0.693 |
+ | No log | 0.11 | 100 | 0.7398 | 0.665 | 0.6734 | 0.7441 | 0.665 |
+ | No log | 0.11 | 105 | 0.7205 | 0.683 | 0.6884 | 0.7243 | 0.683 |
+ | No log | 0.12 | 110 | 0.7492 | 0.661 | 0.6741 | 0.7410 | 0.661 |
+ | No log | 0.12 | 115 | 0.7273 | 0.676 | 0.6932 | 0.7388 | 0.676 |
+ | No log | 0.13 | 120 | 0.6670 | 0.678 | 0.6853 | 0.7079 | 0.678 |
+ | No log | 0.14 | 125 | 0.7238 | 0.663 | 0.6707 | 0.7348 | 0.663 |
+ | No log | 0.14 | 130 | 0.7109 | 0.68 | 0.6948 | 0.7333 | 0.68 |
+ | No log | 0.15 | 135 | 0.6813 | 0.685 | 0.6832 | 0.7324 | 0.685 |
+ | No log | 0.15 | 140 | 0.6859 | 0.692 | 0.7002 | 0.7304 | 0.692 |
+ | No log | 0.16 | 145 | 0.7968 | 0.622 | 0.6231 | 0.7268 | 0.622 |
+ | No log | 0.16 | 150 | 0.6754 | 0.695 | 0.7022 | 0.7212 | 0.695 |
+ | No log | 0.17 | 155 | 0.6520 | 0.698 | 0.6981 | 0.7296 | 0.698 |
+ | No log | 0.17 | 160 | 0.6198 | 0.726 | 0.7282 | 0.7334 | 0.726 |
+ | No log | 0.18 | 165 | 0.6745 | 0.703 | 0.6974 | 0.7346 | 0.703 |
+ | No log | 0.18 | 170 | 0.6724 | 0.707 | 0.7182 | 0.7486 | 0.707 |
+ | No log | 0.19 | 175 | 0.7787 | 0.636 | 0.6392 | 0.7409 | 0.636 |
+ | No log | 0.2 | 180 | 0.7098 | 0.667 | 0.6663 | 0.7338 | 0.667 |
+ | No log | 0.2 | 185 | 0.6340 | 0.728 | 0.7290 | 0.7340 | 0.728 |
+ | No log | 0.21 | 190 | 0.6561 | 0.698 | 0.7023 | 0.7229 | 0.698 |
+ | No log | 0.21 | 195 | 0.6790 | 0.678 | 0.6804 | 0.7318 | 0.678 |
+ | No log | 0.22 | 200 | 0.7213 | 0.654 | 0.6497 | 0.7337 | 0.654 |
+ | No log | 0.22 | 205 | 0.7410 | 0.652 | 0.6609 | 0.7242 | 0.652 |
+ | No log | 0.23 | 210 | 0.6848 | 0.703 | 0.7084 | 0.7332 | 0.703 |
+ | No log | 0.23 | 215 | 0.6946 | 0.689 | 0.6796 | 0.7291 | 0.689 |
+ | No log | 0.24 | 220 | 0.7092 | 0.674 | 0.6870 | 0.7311 | 0.674 |
+ | No log | 0.24 | 225 | 0.6285 | 0.705 | 0.7085 | 0.7295 | 0.705 |
+ | No log | 0.25 | 230 | 0.6449 | 0.696 | 0.6990 | 0.7166 | 0.696 |
+ | No log | 0.25 | 235 | 0.7303 | 0.671 | 0.6694 | 0.7366 | 0.671 |
+ | No log | 0.26 | 240 | 0.7583 | 0.67 | 0.6822 | 0.7399 | 0.67 |
+ | No log | 0.27 | 245 | 0.7154 | 0.678 | 0.6866 | 0.7443 | 0.678 |
+ | No log | 0.27 | 250 | 0.7337 | 0.686 | 0.6852 | 0.7369 | 0.686 |
+ | No log | 0.28 | 255 | 0.6486 | 0.711 | 0.7136 | 0.7362 | 0.711 |
+ | No log | 0.28 | 260 | 0.6231 | 0.736 | 0.7350 | 0.7410 | 0.736 |
+ | No log | 0.29 | 265 | 0.6963 | 0.709 | 0.7211 | 0.7532 | 0.709 |
+ | No log | 0.29 | 270 | 0.6847 | 0.693 | 0.7028 | 0.7403 | 0.693 |
+ | No log | 0.3 | 275 | 0.6581 | 0.696 | 0.6969 | 0.7464 | 0.696 |
+ | No log | 0.3 | 280 | 0.6182 | 0.702 | 0.7061 | 0.7187 | 0.702 |
+ | No log | 0.31 | 285 | 0.6653 | 0.682 | 0.6898 | 0.7144 | 0.682 |
+ | No log | 0.31 | 290 | 0.6917 | 0.699 | 0.7091 | 0.7372 | 0.699 |
+ | No log | 0.32 | 295 | 0.6722 | 0.704 | 0.7067 | 0.7285 | 0.704 |
+ | No log | 0.33 | 300 | 0.6582 | 0.703 | 0.7073 | 0.7238 | 0.703 |
+ | No log | 0.33 | 305 | 0.6568 | 0.687 | 0.6934 | 0.7146 | 0.687 |
+ | No log | 0.34 | 310 | 0.6912 | 0.665 | 0.6605 | 0.7292 | 0.665 |
+ | No log | 0.34 | 315 | 0.6223 | 0.71 | 0.7119 | 0.7311 | 0.71 |
+ | No log | 0.35 | 320 | 0.6409 | 0.714 | 0.7146 | 0.7244 | 0.714 |
+ | No log | 0.35 | 325 | 0.7169 | 0.689 | 0.7023 | 0.7385 | 0.689 |
+ | No log | 0.36 | 330 | 0.7887 | 0.649 | 0.6580 | 0.7435 | 0.649 |
+ | No log | 0.36 | 335 | 0.6594 | 0.694 | 0.6987 | 0.7111 | 0.694 |
+ | No log | 0.37 | 340 | 0.6559 | 0.713 | 0.7121 | 0.7137 | 0.713 |
+ | No log | 0.37 | 345 | 0.6490 | 0.686 | 0.6927 | 0.7076 | 0.686 |
+ | No log | 0.38 | 350 | 0.6964 | 0.67 | 0.6837 | 0.7424 | 0.67 |
+ | No log | 0.39 | 355 | 0.7011 | 0.669 | 0.6873 | 0.7460 | 0.669 |
+ | No log | 0.39 | 360 | 0.6987 | 0.668 | 0.6875 | 0.7409 | 0.668 |
+ | No log | 0.4 | 365 | 0.6375 | 0.696 | 0.7057 | 0.7340 | 0.696 |
+ | No log | 0.4 | 370 | 0.6365 | 0.695 | 0.6972 | 0.7270 | 0.695 |
+ | No log | 0.41 | 375 | 0.6212 | 0.712 | 0.7190 | 0.7488 | 0.712 |
+ | No log | 0.41 | 380 | 0.7102 | 0.667 | 0.6770 | 0.7532 | 0.667 |
+ | No log | 0.42 | 385 | 0.7385 | 0.66 | 0.6616 | 0.7498 | 0.66 |
+ | No log | 0.42 | 390 | 0.6221 | 0.723 | 0.7276 | 0.7533 | 0.723 |
+ | No log | 0.43 | 395 | 0.6174 | 0.74 | 0.7469 | 0.7651 | 0.74 |
+ | No log | 0.43 | 400 | 0.6092 | 0.748 | 0.7538 | 0.7644 | 0.748 |
+ | No log | 0.44 | 405 | 0.5978 | 0.737 | 0.7412 | 0.7483 | 0.737 |
+ | No log | 0.44 | 410 | 0.6645 | 0.697 | 0.6964 | 0.7402 | 0.697 |
+ | No log | 0.45 | 415 | 0.7153 | 0.67 | 0.6654 | 0.7372 | 0.67 |
+ | No log | 0.46 | 420 | 0.6236 | 0.728 | 0.7343 | 0.7560 | 0.728 |
+ | No log | 0.46 | 425 | 0.7162 | 0.682 | 0.6915 | 0.7441 | 0.682 |
+ | No log | 0.47 | 430 | 0.6658 | 0.712 | 0.7228 | 0.7530 | 0.712 |
+ | No log | 0.47 | 435 | 0.6350 | 0.725 | 0.7326 | 0.7535 | 0.725 |
+ | No log | 0.48 | 440 | 0.5977 | 0.725 | 0.7293 | 0.7378 | 0.725 |
+ | No log | 0.48 | 445 | 0.5900 | 0.722 | 0.7246 | 0.7312 | 0.722 |
+ | No log | 0.49 | 450 | 0.5993 | 0.716 | 0.7198 | 0.7327 | 0.716 |
+ | No log | 0.49 | 455 | 0.6322 | 0.711 | 0.7189 | 0.7450 | 0.711 |
+ | No log | 0.5 | 460 | 0.7598 | 0.668 | 0.6824 | 0.7507 | 0.668 |
+ | No log | 0.5 | 465 | 0.7033 | 0.7 | 0.7133 | 0.7620 | 0.7 |
+ | No log | 0.51 | 470 | 0.6343 | 0.726 | 0.7348 | 0.7525 | 0.726 |
+ | No log | 0.52 | 475 | 0.6080 | 0.729 | 0.7352 | 0.7507 | 0.729 |
+ | No log | 0.52 | 480 | 0.5939 | 0.741 | 0.7455 | 0.7539 | 0.741 |
+ | No log | 0.53 | 485 | 0.6038 | 0.739 | 0.7448 | 0.7560 | 0.739 |
+ | No log | 0.53 | 490 | 0.6240 | 0.734 | 0.7386 | 0.7566 | 0.734 |
+ | No log | 0.54 | 495 | 0.6442 | 0.724 | 0.7323 | 0.7560 | 0.724 |
+ | 0.7055 | 0.54 | 500 | 0.7067 | 0.71 | 0.7237 | 0.7583 | 0.71 |
+ | 0.7055 | 0.55 | 505 | 0.7353 | 0.704 | 0.7133 | 0.7484 | 0.704 |
+ | 0.7055 | 0.55 | 510 | 0.6534 | 0.733 | 0.7377 | 0.7475 | 0.733 |
+ | 0.7055 | 0.56 | 515 | 0.7046 | 0.729 | 0.7315 | 0.7533 | 0.729 |
+ | 0.7055 | 0.56 | 520 | 0.7140 | 0.711 | 0.7130 | 0.7487 | 0.711 |
+ | 0.7055 | 0.57 | 525 | 0.6423 | 0.716 | 0.7193 | 0.7443 | 0.716 |
+ | 0.7055 | 0.57 | 530 | 0.6074 | 0.733 | 0.7377 | 0.7481 | 0.733 |
+ | 0.7055 | 0.58 | 535 | 0.6066 | 0.735 | 0.7405 | 0.7513 | 0.735 |
+ | 0.7055 | 0.59 | 540 | 0.5945 | 0.732 | 0.7374 | 0.7486 | 0.732 |
+ | 0.7055 | 0.59 | 545 | 0.6231 | 0.705 | 0.7112 | 0.7439 | 0.705 |
+ | 0.7055 | 0.6 | 550 | 0.6108 | 0.737 | 0.7460 | 0.7660 | 0.737 |
+ | 0.7055 | 0.6 | 555 | 0.5846 | 0.754 | 0.7572 | 0.7675 | 0.754 |
+ | 0.7055 | 0.61 | 560 | 0.5965 | 0.748 | 0.7496 | 0.7640 | 0.748 |
+ | 0.7055 | 0.61 | 565 | 0.5849 | 0.753 | 0.7577 | 0.7687 | 0.753 |
+ | 0.7055 | 0.62 | 570 | 0.6037 | 0.723 | 0.7269 | 0.7514 | 0.723 |
+ | 0.7055 | 0.62 | 575 | 0.5773 | 0.742 | 0.7455 | 0.7598 | 0.742 |
+ | 0.7055 | 0.63 | 580 | 0.5661 | 0.751 | 0.7545 | 0.7607 | 0.751 |
+ | 0.7055 | 0.63 | 585 | 0.5717 | 0.752 | 0.7555 | 0.7626 | 0.752 |
+ | 0.7055 | 0.64 | 590 | 0.5905 | 0.762 | 0.7674 | 0.7808 | 0.762 |
+ | 0.7055 | 0.65 | 595 | 0.5876 | 0.759 | 0.7649 | 0.7773 | 0.759 |
+ | 0.7055 | 0.65 | 600 | 0.5651 | 0.77 | 0.7717 | 0.7741 | 0.77 |
+ | 0.7055 | 0.66 | 605 | 0.5791 | 0.748 | 0.7465 | 0.7502 | 0.748 |
+ | 0.7055 | 0.66 | 610 | 0.6135 | 0.721 | 0.7210 | 0.7434 | 0.721 |
+ | 0.7055 | 0.67 | 615 | 0.6268 | 0.723 | 0.7242 | 0.7523 | 0.723 |
+ | 0.7055 | 0.67 | 620 | 0.6211 | 0.71 | 0.7106 | 0.7449 | 0.71 |
+ | 0.7055 | 0.68 | 625 | 0.5829 | 0.757 | 0.7607 | 0.7742 | 0.757 |
+ | 0.7055 | 0.68 | 630 | 0.5718 | 0.765 | 0.7681 | 0.7744 | 0.765 |
+ | 0.7055 | 0.69 | 635 | 0.5685 | 0.775 | 0.7769 | 0.7830 | 0.775 |
+ | 0.7055 | 0.69 | 640 | 0.5731 | 0.752 | 0.7545 | 0.7653 | 0.752 |
+ | 0.7055 | 0.7 | 645 | 0.5903 | 0.733 | 0.7356 | 0.7570 | 0.733 |
+ | 0.7055 | 0.7 | 650 | 0.5973 | 0.73 | 0.7327 | 0.7575 | 0.73 |
+ | 0.7055 | 0.71 | 655 | 0.6056 | 0.72 | 0.7213 | 0.7535 | 0.72 |
+ | 0.7055 | 0.72 | 660 | 0.5617 | 0.763 | 0.7648 | 0.7703 | 0.763 |
+ | 0.7055 | 0.72 | 665 | 0.5781 | 0.761 | 0.7576 | 0.7688 | 0.761 |
+ | 0.7055 | 0.73 | 670 | 0.5993 | 0.745 | 0.7409 | 0.7650 | 0.745 |
+ | 0.7055 | 0.73 | 675 | 0.6027 | 0.746 | 0.7504 | 0.7675 | 0.746 |
+ | 0.7055 | 0.74 | 680 | 0.5825 | 0.751 | 0.7534 | 0.7600 | 0.751 |
+ | 0.7055 | 0.74 | 685 | 0.5742 | 0.745 | 0.7469 | 0.7513 | 0.745 |
+ | 0.7055 | 0.75 | 690 | 0.5907 | 0.731 | 0.7313 | 0.7462 | 0.731 |
+ | 0.7055 | 0.75 | 695 | 0.6017 | 0.734 | 0.7340 | 0.7555 | 0.734 |
+ | 0.7055 | 0.76 | 700 | 0.5767 | 0.746 | 0.7477 | 0.7599 | 0.746 |
+ | 0.7055 | 0.76 | 705 | 0.5859 | 0.747 | 0.7510 | 0.7676 | 0.747 |
+ | 0.7055 | 0.77 | 710 | 0.6001 | 0.747 | 0.7518 | 0.7690 | 0.747 |
+ | 0.7055 | 0.78 | 715 | 0.6427 | 0.719 | 0.7233 | 0.7541 | 0.719 |
+ | 0.7055 | 0.78 | 720 | 0.6600 | 0.72 | 0.7247 | 0.7556 | 0.72 |
+ | 0.7055 | 0.79 | 725 | 0.6365 | 0.744 | 0.7468 | 0.7640 | 0.744 |
+ | 0.7055 | 0.79 | 730 | 0.6089 | 0.754 | 0.7555 | 0.7596 | 0.754 |
+ | 0.7055 | 0.8 | 735 | 0.6050 | 0.749 | 0.7484 | 0.7494 | 0.749 |
+ | 0.7055 | 0.8 | 740 | 0.6120 | 0.745 | 0.7442 | 0.7518 | 0.745 |
+ | 0.7055 | 0.81 | 745 | 0.6205 | 0.736 | 0.7356 | 0.7490 | 0.736 |
+ | 0.7055 | 0.81 | 750 | 0.6174 | 0.737 | 0.7376 | 0.7544 | 0.737 |
+ | 0.7055 | 0.82 | 755 | 0.6222 | 0.733 | 0.7358 | 0.7585 | 0.733 |
+ | 0.7055 | 0.82 | 760 | 0.6216 | 0.737 | 0.7428 | 0.7636 | 0.737 |
+ | 0.7055 | 0.83 | 765 | 0.6138 | 0.749 | 0.7548 | 0.7691 | 0.749 |
+ | 0.7055 | 0.84 | 770 | 0.5977 | 0.76 | 0.7628 | 0.7682 | 0.76 |
+ | 0.7055 | 0.84 | 775 | 0.5930 | 0.762 | 0.7639 | 0.7671 | 0.762 |
+ | 0.7055 | 0.85 | 780 | 0.6002 | 0.762 | 0.7632 | 0.7682 | 0.762 |
+ | 0.7055 | 0.85 | 785 | 0.6029 | 0.76 | 0.7621 | 0.7676 | 0.76 |
+ | 0.7055 | 0.86 | 790 | 0.6068 | 0.751 | 0.7544 | 0.7615 | 0.751 |
+ | 0.7055 | 0.86 | 795 | 0.6188 | 0.746 | 0.7508 | 0.7615 | 0.746 |
+ | 0.7055 | 0.87 | 800 | 0.6398 | 0.725 | 0.7300 | 0.7486 | 0.725 |
+ | 0.7055 | 0.87 | 805 | 0.6555 | 0.717 | 0.7205 | 0.7461 | 0.717 |
+ | 0.7055 | 0.88 | 810 | 0.6550 | 0.726 | 0.7282 | 0.7578 | 0.726 |
+ | 0.7055 | 0.88 | 815 | 0.6376 | 0.726 | 0.7283 | 0.7474 | 0.726 |
+ | 0.7055 | 0.89 | 820 | 0.6115 | 0.741 | 0.7436 | 0.7524 | 0.741 |
+ | 0.7055 | 0.89 | 825 | 0.6048 | 0.756 | 0.7583 | 0.7638 | 0.756 |
+ | 0.7055 | 0.9 | 830 | 0.6039 | 0.753 | 0.7548 | 0.7591 | 0.753 |
+ | 0.7055 | 0.91 | 835 | 0.6018 | 0.754 | 0.7559 | 0.7605 | 0.754 |
+ | 0.7055 | 0.91 | 840 | 0.5967 | 0.757 | 0.7597 | 0.7653 | 0.757 |
+ | 0.7055 | 0.92 | 845 | 0.5937 | 0.766 | 0.7687 | 0.7738 | 0.766 |
+ | 0.7055 | 0.92 | 850 | 0.5945 | 0.766 | 0.7689 | 0.7740 | 0.766 |
+ | 0.7055 | 0.93 | 855 | 0.5951 | 0.764 | 0.7669 | 0.7722 | 0.764 |
+ | 0.7055 | 0.93 | 860 | 0.5953 | 0.761 | 0.7640 | 0.7699 | 0.761 |
+ | 0.7055 | 0.94 | 865 | 0.5977 | 0.762 | 0.7651 | 0.7726 | 0.762 |
+ | 0.7055 | 0.94 | 870 | 0.5969 | 0.763 | 0.7659 | 0.7733 | 0.763 |
+ | 0.7055 | 0.95 | 875 | 0.5957 | 0.764 | 0.7667 | 0.7740 | 0.764 |
+ | 0.7055 | 0.95 | 880 | 0.5927 | 0.762 | 0.7650 | 0.7717 | 0.762 |
+ | 0.7055 | 0.96 | 885 | 0.5916 | 0.763 | 0.7660 | 0.7715 | 0.763 |
+ | 0.7055 | 0.97 | 890 | 0.5935 | 0.762 | 0.7654 | 0.7717 | 0.762 |
+ | 0.7055 | 0.97 | 895 | 0.5934 | 0.759 | 0.7625 | 0.7689 | 0.759 |
+ | 0.7055 | 0.98 | 900 | 0.5919 | 0.763 | 0.7660 | 0.7715 | 0.763 |
+ | 0.7055 | 0.98 | 905 | 0.5913 | 0.762 | 0.7650 | 0.7705 | 0.762 |
+ | 0.7055 | 0.99 | 910 | 0.5916 | 0.764 | 0.7671 | 0.7726 | 0.764 |
+ | 0.7055 | 0.99 | 915 | 0.5916 | 0.762 | 0.7650 | 0.7706 | 0.762 |
+ | 0.7055 | 1.0 | 920 | 0.5913 | 0.762 | 0.7650 | 0.7706 | 0.762 |


  ### Framework versions

- - Transformers 4.35.0
+ - Transformers 4.35.2
  - Pytorch 2.1.0+cu118
- - Datasets 2.14.6
- - Tokenizers 0.14.1
+ - Datasets 2.15.0
+ - Tokenizers 0.15.0
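The updated card documents a text-classification fine-tune on daily_dialog. For orientation only, here is a minimal inference sketch; it is not part of the commit. The Hub repo id is assumed from the commit author and model name, and the label names depend on the fine-tuned head, so treat both as assumptions.

```python
# Minimal sketch: running the fine-tuned checkpoint with the transformers pipeline.
# The repo id below is assumed from the commit author ("matchten") and the model
# name in the card; adjust it if the checkpoint lives elsewhere.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="matchten/text-message-analyzer-finetuned",  # assumed repo id
)

print(classifier("Sounds good, see you at seven!"))
# Returns a list like [{'label': ..., 'score': ...}]; the label set comes from
# the fine-tuned classification head.
```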
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:16afc3c45031edf8debf9ada13ce17e83cb9a55d98d7937bcdfc150755e38427
+ oid sha256:cc2799b9a28e80efe8deaa09ed9c62a40f2ca29a1880579da62bb08ec914c00c
  size 498615900
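The binary files in this commit are stored as Git LFS pointers: the `oid sha256:` field is the SHA-256 digest of the actual file contents and `size` is its byte length, so a download can be checked against the pointer recorded here. A minimal verification sketch, assuming `model.safetensors` has already been downloaded to the working directory:

```python
# Sketch: verify a downloaded model.safetensors against the LFS pointer in this
# commit (expected oid and size copied from the diff above). The local path is
# an assumption about where the file was saved.
import hashlib
from pathlib import Path

EXPECTED_OID = "cc2799b9a28e80efe8deaa09ed9c62a40f2ca29a1880579da62bb08ec914c00c"
EXPECTED_SIZE = 498615900

path = Path("model.safetensors")
digest = hashlib.sha256()
with path.open("rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
        digest.update(chunk)

print("size matches:", path.stat().st_size == EXPECTED_SIZE)
print("oid matches:", digest.hexdigest() == EXPECTED_OID)
```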
runs/Nov19_19-07-15_cd53a32cc461/events.out.tfevents.1700420839.cd53a32cc461.517.7 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:35142b1cae24042df789652876b2f70c7cea6cb2ab309064fe9e5a794a0a58ff
- size 51654
+ oid sha256:4b76fef90b2aca60892fc1672472e63bd2c613a4607ce50575a54acc4881406c
+ size 91656