# BERiT_2.0
This model is a fine-tuned version of roberta-base on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 4.3491
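
The card does not document the training objective, dataset, or a published repository ID. Since the base model is roberta-base, a masked-language-modeling setup is the most likely use; the snippet below is a minimal usage sketch under that assumption, with a placeholder `MODEL_ID` and an example sentence that are not part of this card.

```python
from transformers import pipeline

# Placeholder hub ID -- replace with the actual repository name of this checkpoint.
MODEL_ID = "your-username/BERiT_2.0"

# roberta-base is a masked language model, so the fine-tuned checkpoint should be
# usable with the fill-mask pipeline; <mask> is RoBERTa's default mask token.
fill_mask = pipeline("fill-mask", model=MODEL_ID)

for prediction in fill_mask("The quick brown fox <mask> over the lazy dog."):
    print(f"{prediction['token_str']!r}: {prediction['score']:.3f}")
```

For reference, if the reported value is the standard masked-LM cross-entropy, an evaluation loss of 4.3491 corresponds to a perplexity of roughly exp(4.3491) ≈ 77.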
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0008131878854370431
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW (torch implementation, `OptimizerNames.ADAMW_TORCH`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 75
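
These values map directly onto 🤗 Transformers `TrainingArguments`. The sketch below shows how they might have been set for a standard `Trainer`-based run; the output directory, logging cadence, and evaluation cadence are assumptions (the results table suggests evaluation every 500 steps), not documented settings.

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments matching the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="BERiT_2.0",          # assumption
    learning_rate=8.131878854370431e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",             # OptimizerNames.ADAMW_TORCH
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=75,
    eval_strategy="steps",           # eval loss is logged every 500 steps in the table below
    eval_steps=500,
    logging_steps=500,
)
```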
### Training results
Training Loss | Epoch | Step | Validation Loss |
---|---|---|---|
6.7675 | 0.3873 | 500 | 6.5915 |
6.5119 | 0.7746 | 1000 | 6.3912 |
6.378 | 1.1619 | 1500 | 6.3244 |
6.3103 | 1.5492 | 2000 | 6.2670 |
6.3097 | 1.9365 | 2500 | 6.2378 |
6.2799 | 2.3238 | 3000 | 6.2406 |
6.2717 | 2.7111 | 3500 | 6.2725 |
6.2694 | 3.0984 | 4000 | 6.2770 |
6.2595 | 3.4857 | 4500 | 6.2695 |
6.2386 | 3.8730 | 5000 | 6.2952 |
6.2393 | 4.2603 | 5500 | 6.2538 |
6.2442 | 4.6476 | 6000 | 6.2138 |
6.2322 | 5.0349 | 6500 | 6.2639 |
6.2028 | 5.4222 | 7000 | 6.3072 |
6.2306 | 5.8095 | 7500 | 6.2069 |
6.2027 | 6.1967 | 8000 | 6.2065 |
6.2 | 6.5840 | 8500 | 6.2331 |
6.1877 | 6.9713 | 9000 | 6.2548 |
6.181 | 7.3586 | 9500 | 6.2235 |
6.1608 | 7.7459 | 10000 | 6.2316 |
6.1933 | 8.1332 | 10500 | 6.2513 |
6.187 | 8.5205 | 11000 | 6.1930 |
6.1734 | 8.9078 | 11500 | 6.2105 |
6.167 | 9.2951 | 12000 | 6.2032 |
6.148 | 9.6824 | 12500 | 6.2129 |
6.1411 | 10.0697 | 13000 | 6.2228 |
6.131 | 10.4570 | 13500 | 6.2537 |
6.1321 | 10.8443 | 14000 | 6.2499 |
6.1412 | 11.2316 | 14500 | 6.2424 |
6.1389 | 11.6189 | 15000 | 6.1810 |
6.1265 | 12.0062 | 15500 | 6.1974 |
6.1345 | 12.3935 | 16000 | 6.2404 |
6.1072 | 12.7808 | 16500 | 6.1847 |
6.1213 | 13.1681 | 17000 | 6.1444 |
6.1016 | 13.5554 | 17500 | 6.1880 |
6.1306 | 13.9427 | 18000 | 6.1328 |
6.108 | 14.3300 | 18500 | 6.1761 |
6.1137 | 14.7173 | 19000 | 6.1504 |
6.0994 | 15.1046 | 19500 | 6.1977 |
6.1026 | 15.4919 | 20000 | 6.1830 |
6.0999 | 15.8792 | 20500 | 6.1811 |
6.0978 | 16.2665 | 21000 | 6.1263 |
6.0877 | 16.6538 | 21500 | 6.1645 |
6.0917 | 17.0411 | 22000 | 6.1742 |
6.0973 | 17.4284 | 22500 | 6.1636 |
6.1098 | 17.8156 | 23000 | 6.2017 |
6.0828 | 18.2029 | 23500 | 6.1007 |
6.0999 | 18.5902 | 24000 | 6.0879 |
6.0953 | 18.9775 | 24500 | 6.1521 |
6.079 | 19.3648 | 25000 | 6.1391 |
6.0682 | 19.7521 | 25500 | 6.0962 |
6.058 | 20.1394 | 26000 | 6.0719 |
6.0643 | 20.5267 | 26500 | 6.1114 |
6.0498 | 20.9140 | 27000 | 6.1111 |
6.0665 | 21.3013 | 27500 | 6.1200 |
6.0825 | 21.6886 | 28000 | 6.0961 |
6.0369 | 22.0759 | 28500 | 6.1578 |
6.0512 | 22.4632 | 29000 | 6.0876 |
6.026 | 22.8505 | 29500 | 6.1211 |
6.0558 | 23.2378 | 30000 | 6.0837 |
6.0466 | 23.6251 | 30500 | 6.0552 |
6.0202 | 24.0124 | 31000 | 6.0906 |
6.0019 | 24.3997 | 31500 | 6.0580 |
6.0352 | 24.7870 | 32000 | 6.0521 |
5.9983 | 25.1743 | 32500 | 6.0701 |
6.0367 | 25.5616 | 33000 | 6.0859 |
6.0183 | 25.9489 | 33500 | 6.1353 |
5.9726 | 26.3362 | 34000 | 6.0918 |
5.982 | 26.7235 | 34500 | 6.0434 |
6.0261 | 27.1108 | 35000 | 6.0038 |
5.9818 | 27.4981 | 35500 | 6.0328 |
5.9659 | 27.8854 | 36000 | 6.0672 |
5.9835 | 28.2727 | 36500 | 6.0334 |
5.98 | 28.6600 | 37000 | 6.0673 |
5.9756 | 29.0473 | 37500 | 5.9969 |
5.979 | 29.4345 | 38000 | 6.0067 |
5.9728 | 29.8218 | 38500 | 6.0297 |
5.9596 | 30.2091 | 39000 | 5.9682 |
5.9866 | 30.5964 | 39500 | 6.0026 |
5.975 | 30.9837 | 40000 | 5.9987 |
5.9678 | 31.3710 | 40500 | 5.9919 |
5.9676 | 31.7583 | 41000 | 5.9807 |
5.9294 | 32.1456 | 41500 | 5.9629 |
5.9465 | 32.5329 | 42000 | 5.9608 |
5.9554 | 32.9202 | 42500 | 5.9522 |
5.9042 | 33.3075 | 43000 | 5.9674 |
5.9359 | 33.6948 | 43500 | 5.9959 |
5.9339 | 34.0821 | 44000 | 5.9914 |
5.9215 | 34.4694 | 44500 | 5.9134 |
5.8901 | 34.8567 | 45000 | 5.9219 |
5.9134 | 35.2440 | 45500 | 5.9305 |
5.9086 | 35.6313 | 46000 | 5.9433 |
5.9051 | 36.0186 | 46500 | 5.8672 |
5.8991 | 36.4059 | 47000 | 5.8599 |
5.8789 | 36.7932 | 47500 | 5.8966 |
5.892 | 37.1805 | 48000 | 5.8956 |
5.8591 | 37.5678 | 48500 | 5.8597 |
5.8855 | 37.9551 | 49000 | 5.8776 |
5.856 | 38.3424 | 49500 | 5.9281 |
5.838 | 38.7297 | 50000 | 5.8091 |
5.8556 | 39.1170 | 50500 | 5.7789 |
5.8527 | 39.5043 | 51000 | 5.7454 |
5.8172 | 39.8916 | 51500 | 5.7894 |
5.8249 | 40.2789 | 52000 | 5.7938 |
5.809 | 40.6662 | 52500 | 5.7688 |
5.8152 | 41.0534 | 53000 | 5.7233 |
5.7961 | 41.4407 | 53500 | 5.6899 |
5.7767 | 41.8280 | 54000 | 5.7165 |
5.7582 | 42.2153 | 54500 | 5.7639 |
5.7893 | 42.6026 | 55000 | 5.6803 |
5.7365 | 42.9899 | 55500 | 5.6790 |
5.7365 | 43.3772 | 56000 | 5.6499 |
5.7569 | 43.7645 | 56500 | 5.6215 |
5.7321 | 44.1518 | 57000 | 5.6148 |
5.7186 | 44.5391 | 57500 | 5.5600 |
5.701 | 44.9264 | 58000 | 5.5373 |
5.7009 | 45.3137 | 58500 | 5.5664 |
5.7 | 45.7010 | 59000 | 5.5163 |
5.677 | 46.0883 | 59500 | 5.4210 |
5.6673 | 46.4756 | 60000 | 5.3903 |
5.6297 | 46.8629 | 60500 | 5.3785 |
5.6222 | 47.2502 | 61000 | 5.3162 |
5.6181 | 47.6375 | 61500 | 5.2644 |
5.5784 | 48.0248 | 62000 | 5.2543 |
5.5799 | 48.4121 | 62500 | 5.2034 |
5.5509 | 48.7994 | 63000 | 5.1793 |
5.5665 | 49.1867 | 63500 | 5.1611 |
5.5418 | 49.5740 | 64000 | 5.1162 |
5.5094 | 49.9613 | 64500 | 5.0998 |
5.4983 | 50.3486 | 65000 | 5.0847 |
5.488 | 50.7359 | 65500 | 5.0962 |
5.4842 | 51.1232 | 66000 | 5.0385 |
5.4456 | 51.5105 | 66500 | 5.0509 |
5.4167 | 51.8978 | 67000 | 4.9671 |
5.4094 | 52.2851 | 67500 | 4.9199 |
5.4044 | 52.6723 | 68000 | 4.9520 |
5.3853 | 53.0596 | 68500 | 4.9233 |
5.388 | 53.4469 | 69000 | 4.8602 |
5.3735 | 53.8342 | 69500 | 4.8504 |
5.3755 | 54.2215 | 70000 | 4.8019 |
5.3352 | 54.6088 | 70500 | 4.8239 |
5.3469 | 54.9961 | 71000 | 4.8391 |
5.3198 | 55.3834 | 71500 | 4.7593 |
5.2901 | 55.7707 | 72000 | 4.7801 |
5.2921 | 56.1580 | 72500 | 4.7699 |
5.2942 | 56.5453 | 73000 | 4.7290 |
5.2615 | 56.9326 | 73500 | 4.7631 |
5.2719 | 57.3199 | 74000 | 4.7217 |
5.2724 | 57.7072 | 74500 | 4.7176 |
5.256 | 58.0945 | 75000 | 4.6826 |
5.2318 | 58.4818 | 75500 | 4.6467 |
5.2152 | 58.8691 | 76000 | 4.6603 |
5.2301 | 59.2564 | 76500 | 4.6464 |
5.2057 | 59.6437 | 77000 | 4.7027 |
5.2073 | 60.0310 | 77500 | 4.6243 |
5.1845 | 60.4183 | 78000 | 4.6194 |
5.1919 | 60.8056 | 78500 | 4.5877 |
5.1805 | 61.1929 | 79000 | 4.5641 |
5.1723 | 61.5802 | 79500 | 4.5644 |
5.1805 | 61.9675 | 80000 | 4.5706 |
5.1738 | 62.3548 | 80500 | 4.5631 |
5.1586 | 62.7421 | 81000 | 4.5652 |
5.1666 | 63.1294 | 81500 | 4.5658 |
5.1412 | 63.5167 | 82000 | 4.5540 |
5.1336 | 63.9040 | 82500 | 4.4969 |
5.1561 | 64.2912 | 83000 | 4.5313 |
5.1223 | 64.6785 | 83500 | 4.5508 |
5.1426 | 65.0658 | 84000 | 4.6013 |
5.1124 | 65.4531 | 84500 | 4.4723 |
5.1187 | 65.8404 | 85000 | 4.5116 |
5.1162 | 66.2277 | 85500 | 4.4991 |
5.0953 | 66.6150 | 86000 | 4.4395 |
5.1166 | 67.0023 | 86500 | 4.4828 |
5.0869 | 67.3896 | 87000 | 4.4650 |
5.0936 | 67.7769 | 87500 | 4.4560 |
5.0963 | 68.1642 | 88000 | 4.4724 |
5.1117 | 68.5515 | 88500 | 4.4756 |
5.0679 | 68.9388 | 89000 | 4.4347 |
5.0803 | 69.3261 | 89500 | 4.4484 |
5.0786 | 69.7134 | 90000 | 4.4027 |
5.0483 | 70.1007 | 90500 | 4.3975 |
5.0678 | 70.4880 | 91000 | 4.4025 |
5.053 | 70.8753 | 91500 | 4.4227 |
5.068 | 71.2626 | 92000 | 4.4204 |
5.0643 | 71.6499 | 92500 | 4.3634 |
5.0812 | 72.0372 | 93000 | 4.4053 |
5.0543 | 72.4245 | 93500 | 4.4241 |
5.0564 | 72.8118 | 94000 | 4.3448 |
5.0486 | 73.1991 | 94500 | 4.4480 |
5.0508 | 73.5864 | 95000 | 4.3254 |
5.0298 | 73.9737 | 95500 | 4.4201 |
5.0585 | 74.3610 | 96000 | 4.4283 |
5.0454 | 74.7483 | 96500 | 4.3491 |
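
Both training and validation loss fall steadily across all 75 epochs, with the steepest improvement after roughly epoch 45. One quick way to visualize such a run is to plot the logged history; the sketch below assumes a checkpoint directory containing the `trainer_state.json` that `Trainer` writes (the path shown is an assumption).

```python
import json
import matplotlib.pyplot as plt

# trainer_state.json is written by Trainer alongside each checkpoint;
# its "log_history" list mixes training-loss and eval-loss entries.
with open("checkpoint-96500/trainer_state.json") as f:  # path is an assumption
    history = json.load(f)["log_history"]

train = [(e["step"], e["loss"]) for e in history if "loss" in e]
evals = [(e["step"], e["eval_loss"]) for e in history if "eval_loss" in e]

plt.plot(*zip(*train), label="training loss")
plt.plot(*zip(*evals), label="validation loss")
plt.xlabel("step")
plt.ylabel("loss")
plt.legend()
plt.show()
```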
### Framework versions
- Transformers 4.47.1
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0