martimfasantos committed
Commit 8eaf049
1 Parent(s): c5b4368

Model save
README.md ADDED
@@ -0,0 +1,186 @@
---
license: apache-2.0
base_model: martimfasantos/tinyllama-1.1b-sum-sft-full_old
tags:
- trl
- dpo
- generated_from_trainer
model-index:
- name: tinyllama-1.1b-sum-dpo-full_LR1e-7_2epochs_old
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# tinyllama-1.1b-sum-dpo-full_LR1e-7_2epochs_old

This model is a fine-tuned version of [martimfasantos/tinyllama-1.1b-sum-sft-full_old](https://huggingface.co/martimfasantos/tinyllama-1.1b-sum-sft-full_old) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6549
- Rewards/chosen: -0.4976
- Rewards/rejected: -0.6011
- Rewards/accuracies: 0.6194
- Rewards/margins: 0.1035
- Logps/rejected: -123.2918
- Logps/chosen: -108.4708
- Logits/rejected: -2.5511
- Logits/chosen: -2.5579

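For reference: the `trl` and `dpo` tags indicate this checkpoint was trained with TRL's DPO implementation, so (assuming the standard DPO objective) the reward metrics above are the implicit rewards

$$
r(x, y) = \beta \, \log \frac{\pi_\theta(y \mid x)}{\pi_{\mathrm{ref}}(y \mid x)},
\qquad
\mathcal{L}_{\mathrm{DPO}} = -\log \sigma\big(r(x, y_{\mathrm{chosen}}) - r(x, y_{\mathrm{rejected}})\big),
$$

where $\pi_{\mathrm{ref}}$ is the frozen SFT base model and $\beta$ is the DPO temperature (not recorded in this card). `Rewards/margins` is the mean difference between chosen and rejected rewards (here $-0.4976 - (-0.6011) = 0.1035$, matching the value above), and `Rewards/accuracies` is the fraction of pairs whose chosen response receives the higher reward.
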
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-07
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 2

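The training script itself is not part of this commit, but as a minimal sketch these hyperparameters map naturally onto TRL's `DPOConfig`/`DPOTrainer`. The dataset name, `beta`, and any values not listed above are placeholders, not settings recorded in this card:

```python
# Hypothetical reconstruction from the hyperparameter list above; the real
# script, preference dataset, and DPO beta are not recorded in this commit.
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

base = "martimfasantos/tinyllama-1.1b-sum-sft-full_old"
model = AutoModelForCausalLM.from_pretrained(base)
tokenizer = AutoTokenizer.from_pretrained(base)

# Placeholder: any preference dataset with prompt/chosen/rejected columns.
train_dataset = load_dataset("org/preference-dataset", split="train")

args = DPOConfig(
    output_dir="tinyllama-1.1b-sum-dpo-full_LR1e-7_2epochs_old",
    learning_rate=1e-7,
    per_device_train_batch_size=8,   # train_batch_size: 8
    per_device_eval_batch_size=8,    # eval_batch_size: 8
    gradient_accumulation_steps=2,   # total_train_batch_size: 16
    num_train_epochs=2,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    seed=42,
    beta=0.1,                        # assumption: DPO beta is not recorded
)

trainer = DPOTrainer(
    model=model,
    ref_model=None,                  # TRL keeps a frozen copy as the reference
    args=args,
    train_dataset=train_dataset,
    tokenizer=tokenizer,             # `processing_class=` in newer TRL releases
)
trainer.train()
```
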
### Training results

| Training Loss | Epoch | Step | Validation Loss | Rewards/chosen | Rewards/rejected | Rewards/accuracies | Rewards/margins | Logps/rejected | Logps/chosen | Logits/rejected | Logits/chosen |
|:-------------:|:------:|:-----:|:---------------:|:--------------:|:----------------:|:------------------:|:---------------:|:--------------:|:------------:|:---------------:|:-------------:|
| 0.6932 | 0.0172 | 100 | 0.6932 | -0.0000 | 0.0000 | 0.4930 | -0.0001 | -63.1768 | -58.7146 | -3.1573 | -3.1630 |
| 0.6932 | 0.0345 | 200 | 0.6932 | -0.0001 | -0.0000 | 0.4772 | -0.0001 | -63.1802 | -58.7210 | -3.1574 | -3.1630 |
| 0.6931 | 0.0517 | 300 | 0.6932 | -0.0000 | 0.0001 | 0.4840 | -0.0001 | -63.1670 | -58.7127 | -3.1573 | -3.1630 |
| 0.693 | 0.0689 | 400 | 0.6932 | -0.0000 | 0.0001 | 0.4828 | -0.0001 | -63.1728 | -58.7120 | -3.1575 | -3.1632 |
| 0.6931 | 0.0861 | 500 | 0.6932 | 0.0002 | 0.0003 | 0.4775 | -0.0001 | -63.1514 | -58.6883 | -3.1571 | -3.1627 |
| 0.6924 | 0.1034 | 600 | 0.6931 | 0.0004 | 0.0003 | 0.5021 | 0.0001 | -63.1466 | -58.6704 | -3.1564 | -3.1621 |
| 0.6926 | 0.1206 | 700 | 0.6931 | 0.0006 | 0.0004 | 0.5163 | 0.0002 | -63.1388 | -58.6536 | -3.1556 | -3.1613 |
| 0.6922 | 0.1378 | 800 | 0.6930 | 0.0011 | 0.0007 | 0.5328 | 0.0004 | -63.1062 | -58.6016 | -3.1544 | -3.1601 |
| 0.6919 | 0.1551 | 900 | 0.6928 | 0.0015 | 0.0008 | 0.5467 | 0.0008 | -63.1024 | -58.5586 | -3.1525 | -3.1581 |
| 0.6924 | 0.1723 | 1000 | 0.6926 | 0.0018 | 0.0007 | 0.5632 | 0.0011 | -63.1061 | -58.5285 | -3.1495 | -3.1551 |
| 0.6913 | 0.1895 | 1100 | 0.6924 | 0.0021 | 0.0006 | 0.5748 | 0.0015 | -63.1198 | -58.5001 | -3.1456 | -3.1512 |
| 0.6911 | 0.2068 | 1200 | 0.6921 | 0.0023 | 0.0001 | 0.5829 | 0.0022 | -63.1702 | -58.4863 | -3.1409 | -3.1465 |
| 0.6911 | 0.2240 | 1300 | 0.6918 | 0.0018 | -0.0011 | 0.5783 | 0.0029 | -63.2862 | -58.5324 | -3.1359 | -3.1415 |
| 0.6871 | 0.2412 | 1400 | 0.6914 | 0.0005 | -0.0030 | 0.5718 | 0.0036 | -63.4832 | -58.6569 | -3.1301 | -3.1358 |
| 0.6865 | 0.2584 | 1500 | 0.6910 | -0.0015 | -0.0060 | 0.5760 | 0.0045 | -63.7806 | -58.8602 | -3.1249 | -3.1305 |
| 0.6876 | 0.2757 | 1600 | 0.6906 | -0.0038 | -0.0091 | 0.5860 | 0.0053 | -64.0945 | -59.0966 | -3.1178 | -3.1235 |
| 0.6883 | 0.2929 | 1700 | 0.6903 | -0.0066 | -0.0127 | 0.5846 | 0.0061 | -64.4541 | -59.3744 | -3.1115 | -3.1171 |
| 0.684 | 0.3101 | 1800 | 0.6900 | -0.0121 | -0.0190 | 0.5843 | 0.0069 | -65.0824 | -59.9254 | -3.1036 | -3.1093 |
| 0.6834 | 0.3274 | 1900 | 0.6895 | -0.0157 | -0.0236 | 0.5881 | 0.0078 | -65.5351 | -60.2850 | -3.0983 | -3.1039 |
| 0.6852 | 0.3446 | 2000 | 0.6890 | -0.0228 | -0.0319 | 0.5888 | 0.0091 | -66.3715 | -60.9889 | -3.0904 | -3.0961 |
| 0.6827 | 0.3618 | 2100 | 0.6883 | -0.0310 | -0.0417 | 0.5885 | 0.0107 | -67.3509 | -61.8145 | -3.0840 | -3.0897 |
| 0.6745 | 0.3790 | 2200 | 0.6876 | -0.0382 | -0.0505 | 0.5860 | 0.0123 | -68.2293 | -62.5301 | -3.0753 | -3.0810 |
| 0.678 | 0.3963 | 2300 | 0.6872 | -0.0406 | -0.0536 | 0.5890 | 0.0131 | -68.5438 | -62.7670 | -3.0691 | -3.0748 |
| 0.6808 | 0.4135 | 2400 | 0.6867 | -0.0471 | -0.0614 | 0.5881 | 0.0143 | -69.3158 | -63.4223 | -3.0596 | -3.0652 |
| 0.683 | 0.4307 | 2500 | 0.6861 | -0.0556 | -0.0712 | 0.5897 | 0.0157 | -70.3045 | -64.2686 | -3.0500 | -3.0557 |
| 0.6754 | 0.4480 | 2600 | 0.6856 | -0.0611 | -0.0780 | 0.5885 | 0.0169 | -70.9754 | -64.8212 | -3.0432 | -3.0489 |
| 0.6768 | 0.4652 | 2700 | 0.6851 | -0.0674 | -0.0855 | 0.5927 | 0.0181 | -71.7327 | -65.4567 | -3.0371 | -3.0427 |
| 0.6767 | 0.4824 | 2800 | 0.6846 | -0.0729 | -0.0920 | 0.5943 | 0.0192 | -72.3822 | -65.9983 | -3.0311 | -3.0368 |
| 0.677 | 0.4997 | 2900 | 0.6843 | -0.0755 | -0.0955 | 0.5997 | 0.0200 | -72.7311 | -66.2650 | -3.0233 | -3.0290 |
| 0.678 | 0.5169 | 3000 | 0.6838 | -0.0814 | -0.1025 | 0.6008 | 0.0211 | -73.4252 | -66.8486 | -3.0141 | -3.0198 |
| 0.67 | 0.5341 | 3100 | 0.6836 | -0.0822 | -0.1038 | 0.6018 | 0.0216 | -73.5633 | -66.9356 | -3.0096 | -3.0153 |
| 0.6718 | 0.5513 | 3200 | 0.6827 | -0.0939 | -0.1175 | 0.6034 | 0.0236 | -74.9309 | -68.1066 | -2.9982 | -3.0040 |
| 0.6724 | 0.5686 | 3300 | 0.6821 | -0.0998 | -0.1249 | 0.6041 | 0.0251 | -75.6721 | -68.6965 | -2.9850 | -2.9907 |
| 0.6625 | 0.5858 | 3400 | 0.6819 | -0.1010 | -0.1266 | 0.6066 | 0.0256 | -75.8434 | -68.8117 | -2.9759 | -2.9817 |
| 0.6743 | 0.6030 | 3500 | 0.6814 | -0.1069 | -0.1336 | 0.6113 | 0.0267 | -76.5408 | -69.4021 | -2.9688 | -2.9746 |
| 0.6721 | 0.6203 | 3600 | 0.6810 | -0.1127 | -0.1405 | 0.6078 | 0.0278 | -77.2252 | -69.9806 | -2.9599 | -2.9657 |
| 0.664 | 0.6375 | 3700 | 0.6804 | -0.1212 | -0.1504 | 0.6073 | 0.0292 | -78.2202 | -70.8276 | -2.9486 | -2.9544 |
| 0.6644 | 0.6547 | 3800 | 0.6795 | -0.1329 | -0.1643 | 0.6104 | 0.0313 | -79.6058 | -72.0042 | -2.9392 | -2.9450 |
| 0.6665 | 0.6720 | 3900 | 0.6787 | -0.1452 | -0.1785 | 0.6059 | 0.0333 | -81.0310 | -73.2281 | -2.9298 | -2.9357 |
| 0.6653 | 0.6892 | 4000 | 0.6781 | -0.1559 | -0.1908 | 0.6062 | 0.0349 | -82.2593 | -74.3019 | -2.9178 | -2.9236 |
| 0.6534 | 0.7064 | 4100 | 0.6777 | -0.1615 | -0.1973 | 0.6080 | 0.0359 | -82.9142 | -74.8574 | -2.9005 | -2.9063 |
| 0.6736 | 0.7236 | 4200 | 0.6769 | -0.1724 | -0.2103 | 0.6069 | 0.0379 | -84.2087 | -75.9475 | -2.8890 | -2.8949 |
| 0.6617 | 0.7409 | 4300 | 0.6764 | -0.1802 | -0.2194 | 0.6071 | 0.0392 | -85.1160 | -76.7326 | -2.8792 | -2.8851 |
| 0.6625 | 0.7581 | 4400 | 0.6756 | -0.1938 | -0.2351 | 0.6039 | 0.0413 | -86.6852 | -78.0909 | -2.8681 | -2.8740 |
| 0.6604 | 0.7753 | 4500 | 0.6746 | -0.2102 | -0.2541 | 0.6076 | 0.0439 | -88.5854 | -79.7309 | -2.8589 | -2.8650 |
| 0.6436 | 0.7926 | 4600 | 0.6736 | -0.2248 | -0.2712 | 0.6066 | 0.0463 | -90.2984 | -81.1957 | -2.8510 | -2.8569 |
| 0.6527 | 0.8098 | 4700 | 0.6728 | -0.2396 | -0.2882 | 0.6078 | 0.0486 | -92.0000 | -82.6740 | -2.8433 | -2.8492 |
| 0.6604 | 0.8270 | 4800 | 0.6721 | -0.2501 | -0.3005 | 0.6066 | 0.0504 | -93.2272 | -83.7222 | -2.8340 | -2.8399 |
| 0.6665 | 0.8442 | 4900 | 0.6713 | -0.2626 | -0.3152 | 0.6053 | 0.0526 | -94.6995 | -84.9707 | -2.8265 | -2.8324 |
| 0.65 | 0.8615 | 5000 | 0.6706 | -0.2707 | -0.3251 | 0.5936 | 0.0543 | -95.6856 | -85.7848 | -2.8110 | -2.8169 |
| 0.6625 | 0.8787 | 5100 | 0.6697 | -0.2838 | -0.3407 | 0.5941 | 0.0569 | -97.2505 | -87.0959 | -2.8023 | -2.8083 |
| 0.6511 | 0.8959 | 5200 | 0.6695 | -0.2869 | -0.3443 | 0.5983 | 0.0574 | -97.6072 | -87.3982 | -2.7964 | -2.8024 |
| 0.6473 | 0.9132 | 5300 | 0.6691 | -0.2904 | -0.3488 | 0.5992 | 0.0584 | -98.0594 | -87.7473 | -2.7880 | -2.7940 |
| 0.6492 | 0.9304 | 5400 | 0.6687 | -0.2941 | -0.3536 | 0.6004 | 0.0594 | -98.5365 | -88.1234 | -2.7825 | -2.7885 |
| 0.6409 | 0.9476 | 5500 | 0.6682 | -0.3026 | -0.3636 | 0.5978 | 0.0609 | -99.5376 | -88.9754 | -2.7736 | -2.7795 |
| 0.6531 | 0.9649 | 5600 | 0.6679 | -0.2997 | -0.3615 | 0.6006 | 0.0617 | -99.3275 | -88.6850 | -2.7683 | -2.7743 |
| 0.6523 | 0.9821 | 5700 | 0.6671 | -0.3127 | -0.3766 | 0.6018 | 0.0639 | -100.8429 | -89.9807 | -2.7604 | -2.7664 |
| 0.6355 | 0.9993 | 5800 | 0.6663 | -0.3277 | -0.3941 | 0.6078 | 0.0664 | -102.5891 | -91.4845 | -2.7485 | -2.7544 |
| 0.6363 | 1.0165 | 5900 | 0.6654 | -0.3506 | -0.4200 | 0.6013 | 0.0695 | -105.1840 | -93.7690 | -2.7327 | -2.7388 |
| 0.6587 | 1.0338 | 6000 | 0.6654 | -0.3455 | -0.4149 | 0.6090 | 0.0694 | -104.6700 | -93.2587 | -2.7256 | -2.7317 |
| 0.6335 | 1.0510 | 6100 | 0.6650 | -0.3500 | -0.4204 | 0.6085 | 0.0704 | -105.2201 | -93.7083 | -2.7173 | -2.7233 |
| 0.637 | 1.0682 | 6200 | 0.6641 | -0.3684 | -0.4416 | 0.6083 | 0.0731 | -107.3361 | -95.5533 | -2.7081 | -2.7143 |
| 0.6557 | 1.0855 | 6300 | 0.6634 | -0.3813 | -0.4567 | 0.6092 | 0.0754 | -108.8497 | -96.8372 | -2.7011 | -2.7073 |
| 0.6406 | 1.1027 | 6400 | 0.6629 | -0.3842 | -0.4611 | 0.6104 | 0.0769 | -109.2875 | -97.1323 | -2.6938 | -2.7001 |
| 0.6445 | 1.1199 | 6500 | 0.6627 | -0.3897 | -0.4671 | 0.6104 | 0.0774 | -109.8874 | -97.6783 | -2.6856 | -2.6919 |
| 0.6444 | 1.1371 | 6600 | 0.6626 | -0.3914 | -0.4693 | 0.6087 | 0.0779 | -110.1084 | -97.8481 | -2.6817 | -2.6880 |
| 0.6412 | 1.1544 | 6700 | 0.6621 | -0.3997 | -0.4794 | 0.6094 | 0.0796 | -111.1156 | -98.6842 | -2.6724 | -2.6787 |
| 0.6223 | 1.1716 | 6800 | 0.6614 | -0.4163 | -0.4982 | 0.6145 | 0.0819 | -113.0004 | -100.3420 | -2.6623 | -2.6687 |
| 0.6439 | 1.1888 | 6900 | 0.6612 | -0.4231 | -0.5061 | 0.6106 | 0.0830 | -113.7915 | -101.0268 | -2.6555 | -2.6619 |
| 0.6269 | 1.2061 | 7000 | 0.6606 | -0.4424 | -0.5279 | 0.6099 | 0.0855 | -115.9700 | -102.9478 | -2.6489 | -2.6553 |
| 0.6301 | 1.2233 | 7100 | 0.6603 | -0.4383 | -0.5243 | 0.6122 | 0.0860 | -115.6095 | -102.5456 | -2.6439 | -2.6503 |
| 0.625 | 1.2405 | 7200 | 0.6600 | -0.4436 | -0.5309 | 0.6129 | 0.0873 | -116.2657 | -103.0681 | -2.6385 | -2.6450 |
| 0.653 | 1.2578 | 7300 | 0.6599 | -0.4335 | -0.5204 | 0.6134 | 0.0868 | -115.2167 | -102.0655 | -2.6367 | -2.6430 |
| 0.6456 | 1.2750 | 7400 | 0.6600 | -0.4315 | -0.5182 | 0.6113 | 0.0866 | -114.9959 | -101.8630 | -2.6344 | -2.6409 |
| 0.6454 | 1.2922 | 7500 | 0.6597 | -0.4307 | -0.5182 | 0.6162 | 0.0875 | -114.9953 | -101.7817 | -2.6295 | -2.6359 |
| 0.6769 | 1.3094 | 7600 | 0.6593 | -0.4390 | -0.5278 | 0.6162 | 0.0888 | -115.9591 | -102.6077 | -2.6216 | -2.6281 |
| 0.6367 | 1.3267 | 7700 | 0.6591 | -0.4402 | -0.5295 | 0.6166 | 0.0893 | -116.1309 | -102.7307 | -2.6170 | -2.6235 |
| 0.621 | 1.3439 | 7800 | 0.6587 | -0.4486 | -0.5395 | 0.6190 | 0.0909 | -117.1267 | -103.5701 | -2.6111 | -2.6176 |
| 0.6413 | 1.3611 | 7900 | 0.6581 | -0.4553 | -0.5479 | 0.6201 | 0.0926 | -117.9684 | -104.2417 | -2.6072 | -2.6137 |
| 0.6228 | 1.3784 | 8000 | 0.6580 | -0.4586 | -0.5519 | 0.6217 | 0.0932 | -118.3658 | -104.5737 | -2.6039 | -2.6105 |
| 0.6409 | 1.3956 | 8100 | 0.6577 | -0.4652 | -0.5596 | 0.6213 | 0.0944 | -119.1380 | -105.2326 | -2.5999 | -2.6065 |
| 0.6504 | 1.4128 | 8200 | 0.6572 | -0.4709 | -0.5666 | 0.6166 | 0.0958 | -119.8450 | -105.8004 | -2.5972 | -2.6038 |
| 0.6468 | 1.4300 | 8300 | 0.6573 | -0.4657 | -0.5609 | 0.6231 | 0.0953 | -119.2726 | -105.2789 | -2.5953 | -2.6019 |
| 0.6301 | 1.4473 | 8400 | 0.6574 | -0.4609 | -0.5559 | 0.6211 | 0.0950 | -118.7683 | -104.8034 | -2.5927 | -2.5993 |
| 0.6207 | 1.4645 | 8500 | 0.6575 | -0.4578 | -0.5526 | 0.6187 | 0.0948 | -118.4422 | -104.4934 | -2.5884 | -2.5951 |
| 0.6661 | 1.4817 | 8600 | 0.6570 | -0.4650 | -0.5611 | 0.6206 | 0.0961 | -119.2866 | -105.2096 | -2.5845 | -2.5911 |
| 0.6475 | 1.4990 | 8700 | 0.6572 | -0.4575 | -0.5529 | 0.6197 | 0.0954 | -118.4655 | -104.4587 | -2.5841 | -2.5908 |
| 0.6478 | 1.5162 | 8800 | 0.6569 | -0.4607 | -0.5569 | 0.6199 | 0.0962 | -118.8732 | -104.7842 | -2.5812 | -2.5879 |
| 0.6338 | 1.5334 | 8900 | 0.6566 | -0.4694 | -0.5668 | 0.6201 | 0.0974 | -119.8600 | -105.6548 | -2.5766 | -2.5833 |
| 0.6283 | 1.5507 | 9000 | 0.6565 | -0.4721 | -0.5700 | 0.6199 | 0.0979 | -120.1781 | -105.9173 | -2.5752 | -2.5819 |
| 0.6462 | 1.5679 | 9100 | 0.6564 | -0.4728 | -0.5710 | 0.6187 | 0.0982 | -120.2769 | -105.9869 | -2.5728 | -2.5796 |
| 0.6228 | 1.5851 | 9200 | 0.6562 | -0.4767 | -0.5756 | 0.6194 | 0.0989 | -120.7382 | -106.3830 | -2.5720 | -2.5787 |
| 0.6224 | 1.6023 | 9300 | 0.6561 | -0.4771 | -0.5764 | 0.6197 | 0.0993 | -120.8189 | -106.4213 | -2.5689 | -2.5756 |
| 0.6286 | 1.6196 | 9400 | 0.6558 | -0.4825 | -0.5830 | 0.6211 | 0.1004 | -121.4753 | -106.9631 | -2.5668 | -2.5735 |
| 0.6221 | 1.6368 | 9500 | 0.6558 | -0.4833 | -0.5838 | 0.6199 | 0.1005 | -121.5581 | -107.0399 | -2.5650 | -2.5717 |
| 0.6358 | 1.6540 | 9600 | 0.6557 | -0.4891 | -0.5901 | 0.6194 | 0.1010 | -122.1902 | -107.6185 | -2.5614 | -2.5681 |
| 0.6358 | 1.6713 | 9700 | 0.6556 | -0.4886 | -0.5899 | 0.6206 | 0.1013 | -122.1670 | -107.5694 | -2.5605 | -2.5673 |
| 0.6243 | 1.6885 | 9800 | 0.6554 | -0.4898 | -0.5916 | 0.6211 | 0.1019 | -122.3449 | -107.6895 | -2.5598 | -2.5665 |
| 0.5825 | 1.7057 | 9900 | 0.6554 | -0.4917 | -0.5936 | 0.6211 | 0.1019 | -122.5433 | -107.8852 | -2.5589 | -2.5656 |
| 0.6181 | 1.7229 | 10000 | 0.6552 | -0.4927 | -0.5951 | 0.6208 | 0.1024 | -122.6864 | -107.9799 | -2.5578 | -2.5645 |
| 0.6364 | 1.7402 | 10100 | 0.6553 | -0.4917 | -0.5940 | 0.6201 | 0.1023 | -122.5787 | -107.8781 | -2.5562 | -2.5630 |
| 0.6272 | 1.7574 | 10200 | 0.6552 | -0.4947 | -0.5974 | 0.6206 | 0.1027 | -122.9187 | -108.1824 | -2.5552 | -2.5620 |
| 0.6576 | 1.7746 | 10300 | 0.6551 | -0.4968 | -0.5997 | 0.6204 | 0.1029 | -123.1503 | -108.3895 | -2.5543 | -2.5610 |
| 0.6036 | 1.7919 | 10400 | 0.6552 | -0.4950 | -0.5977 | 0.6187 | 0.1027 | -122.9548 | -108.2141 | -2.5535 | -2.5603 |
| 0.6174 | 1.8091 | 10500 | 0.6551 | -0.4961 | -0.5990 | 0.6194 | 0.1029 | -123.0769 | -108.3228 | -2.5536 | -2.5603 |
| 0.6403 | 1.8263 | 10600 | 0.6551 | -0.4962 | -0.5992 | 0.6197 | 0.1030 | -123.0967 | -108.3300 | -2.5527 | -2.5595 |
| 0.6341 | 1.8436 | 10700 | 0.6551 | -0.4973 | -0.6004 | 0.6185 | 0.1031 | -123.2222 | -108.4462 | -2.5520 | -2.5588 |
| 0.627 | 1.8608 | 10800 | 0.6549 | -0.4976 | -0.6011 | 0.6211 | 0.1035 | -123.2887 | -108.4688 | -2.5518 | -2.5586 |
| 0.6336 | 1.8780 | 10900 | 0.6549 | -0.4972 | -0.6009 | 0.6201 | 0.1037 | -123.2694 | -108.4345 | -2.5519 | -2.5587 |
| 0.626 | 1.8952 | 11000 | 0.6550 | -0.4983 | -0.6016 | 0.6206 | 0.1034 | -123.3421 | -108.5379 | -2.5516 | -2.5584 |
| 0.6314 | 1.9125 | 11100 | 0.6551 | -0.4974 | -0.6004 | 0.6194 | 0.1030 | -123.2212 | -108.4520 | -2.5517 | -2.5585 |
| 0.6239 | 1.9297 | 11200 | 0.6549 | -0.4976 | -0.6012 | 0.6192 | 0.1036 | -123.3044 | -108.4749 | -2.5519 | -2.5587 |
| 0.6632 | 1.9469 | 11300 | 0.6550 | -0.4977 | -0.6011 | 0.6194 | 0.1033 | -123.2879 | -108.4866 | -2.5514 | -2.5582 |
| 0.6306 | 1.9642 | 11400 | 0.6550 | -0.4978 | -0.6010 | 0.6183 | 0.1032 | -123.2786 | -108.4874 | -2.5514 | -2.5583 |
| 0.6532 | 1.9814 | 11500 | 0.6549 | -0.4977 | -0.6012 | 0.6206 | 0.1035 | -123.3012 | -108.4803 | -2.5513 | -2.5581 |
| 0.6257 | 1.9986 | 11600 | 0.6549 | -0.4976 | -0.6011 | 0.6194 | 0.1035 | -123.2918 | -108.4708 | -2.5511 | -2.5579 |


### Framework versions

- Transformers 4.41.2
- Pytorch 2.1.2
- Datasets 2.19.2
- Tokenizers 0.19.1

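These pins translate directly into an environment spec, e.g. `pip install transformers==4.41.2 torch==2.1.2 datasets==2.19.2 tokenizers==0.19.1`; note that the TRL version used for the DPO training itself is not recorded in this card.
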
all_results.json ADDED
@@ -0,0 +1,9 @@
{
  "epoch": 2.0,
  "total_flos": 0.0,
  "train_loss": 0.6539956355489261,
  "train_runtime": 95357.4081,
  "train_samples": 92858,
  "train_samples_per_second": 1.948,
  "train_steps_per_second": 0.122
}
generation_config.json ADDED
@@ -0,0 +1,7 @@
{
  "bos_token_id": 1,
  "eos_token_id": 2,
  "max_length": 2048,
  "pad_token_id": 0,
  "transformers_version": "4.41.2"
}
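The generation config only pins the special-token ids and a 2048-token context, so the checkpoint loads with stock `transformers`. A minimal usage sketch follows; the prompt format is a placeholder and should match whatever template the SFT stage used:

```python
# Minimal inference sketch; the prompt format below is a placeholder.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "martimfasantos/tinyllama-1.1b-sum-dpo-full_LR1e-7_2epochs_old"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(repo, torch_dtype=torch.bfloat16)

prompt = "Summarize the following text:\n<your text here>\n\nSummary:"
inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=128, do_sample=False)

# Decode only the newly generated tokens, not the echoed prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:],
                       skip_special_tokens=True))
```
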
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:d368db4da5fa70dd7c4777111fe2d65c973fc86ec1e6f8b1288d5a1538bb7b6a
+ oid sha256:fdbf3263fb2039a5e0a8d52ab0752e4a7d556d8f68d14afc28a38601021050c1
 size 4400216536
runs/Jun14_10-39-33_poseidon/events.out.tfevents.1718361915.poseidon.1236766.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:3757e1afcdeec041b292f2cc2fd6a2530e04973f90fc96ba1d037edbec45e59d
- size 889599
+ oid sha256:97c70dbb1209ea195e3fddb0e543ff88016a513cfa6b3cede21cbf29ec094329
+ size 889953
train_results.json ADDED
@@ -0,0 +1,9 @@
{
  "epoch": 2.0,
  "total_flos": 0.0,
  "train_loss": 0.6539956355489261,
  "train_runtime": 95357.4081,
  "train_samples": 92858,
  "train_samples_per_second": 1.948,
  "train_steps_per_second": 0.122
}
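These run statistics are internally consistent with the README's hyperparameters; a quick check, using only values reported in this commit:

```python
# Sanity check: 92,858 samples * 2 epochs at an effective batch size of 16.
train_samples = 92858
num_epochs = 2.0
total_train_batch_size = 16          # from the hyperparameter list above
train_runtime = 95357.4081           # seconds

steps = train_samples * num_epochs / total_train_batch_size
print(round(steps))                                          # 11607; the log table ends at step 11600
print(round(train_samples * num_epochs / train_runtime, 3))  # 1.948 samples/s, as reported
print(round(steps / train_runtime, 3))                       # 0.122 steps/s, as reported
```
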
trainer_state.json ADDED
The diff for this file is too large to render.