---
library_name: transformers
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: lge_tests_prelim
  results: []
---
# lge_tests_prelim
This model is a fine-tuned version of an unspecified base model on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0033
- Accuracy: 1.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0005
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 1
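The learning-rate behaviour these settings imply can be sketched in plain Python: linear warmup over the first 10% of steps up to 5e-4, then cosine decay toward zero. This is a hand-rolled approximation of the schedule's shape, not the exact `transformers` implementation, and the total-step count (15600) is taken from the last logged step in the results table below.

```python
import math

def lr_at_step(step, total_steps, base_lr=5e-4, warmup_ratio=0.1):
    """Approximate LR for a linear-warmup + cosine-decay schedule
    (lr_scheduler_type=cosine, lr_scheduler_warmup_ratio=0.1)."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        # Linear ramp from 0 to base_lr over the warmup phase.
        return base_lr * step / max(1, warmup_steps)
    # Cosine decay from base_lr down to 0 over the remaining steps.
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

total = 15600                      # last optimizer step logged below
peak = lr_at_step(1560, total)     # end of warmup: full 5e-4
final = lr_at_step(total, total)   # decayed to ~0 at the end of the epoch
```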
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
No log | 0 | 0 | 2.6145 | 0.0 |
2.5809 | 0.0064 | 100 | 2.5687 | 0.0 |
2.523 | 0.0128 | 200 | 2.5229 | 0.0 |
2.4811 | 0.0192 | 300 | 2.4679 | 0.0 |
2.4303 | 0.0256 | 400 | 2.4110 | 0.0 |
2.3619 | 0.0320 | 500 | 2.3606 | 0.0 |
2.3345 | 0.0384 | 600 | 2.3222 | 0.0 |
2.3092 | 0.0448 | 700 | 2.2921 | 0.0 |
2.2821 | 0.0512 | 800 | 2.2657 | 0.0 |
2.2333 | 0.0576 | 900 | 2.2310 | 0.0 |
2.177 | 0.0640 | 1000 | 2.1707 | 0.0 |
2.1536 | 0.0704 | 1100 | 2.1321 | 0.0 |
2.1367 | 0.0768 | 1200 | 2.0913 | 0.0 |
2.0166 | 0.0832 | 1300 | 2.0268 | 0.0 |
1.9742 | 0.0896 | 1400 | 2.0546 | 0.0 |
1.9483 | 0.0960 | 1500 | 1.9994 | 0.0 |
1.8687 | 0.1024 | 1600 | 1.8746 | 0.0 |
1.8801 | 0.1088 | 1700 | 1.9225 | 0.0 |
1.884 | 0.1152 | 1800 | 1.9165 | 0.0 |
1.8477 | 0.1216 | 1900 | 1.7852 | 0.0 |
1.8183 | 0.1280 | 2000 | 1.7833 | 0.0 |
1.8369 | 0.1344 | 2100 | 1.9553 | 0.0 |
1.8137 | 0.1408 | 2200 | 1.7512 | 0.0 |
1.6711 | 0.1472 | 2300 | 1.7676 | 0.01 |
1.663 | 0.1536 | 2400 | 1.7389 | 0.0 |
1.7879 | 0.1600 | 2500 | 1.6750 | 0.005 |
1.69 | 0.1664 | 2600 | 1.6753 | 0.005 |
1.6681 | 0.1728 | 2700 | 1.7350 | 0.0 |
1.7412 | 0.1792 | 2800 | 1.6032 | 0.01 |
1.5453 | 0.1856 | 2900 | 1.6210 | 0.005 |
1.5741 | 0.1920 | 3000 | 1.6635 | 0.0 |
1.5371 | 0.1984 | 3100 | 1.6253 | 0.0 |
1.6883 | 0.2048 | 3200 | 1.5333 | 0.005 |
1.4715 | 0.2112 | 3300 | 1.6502 | 0.005 |
1.4137 | 0.2176 | 3400 | 1.4267 | 0.0 |
1.4928 | 0.2240 | 3500 | 1.4612 | 0.0 |
1.3538 | 0.2304 | 3600 | 1.3609 | 0.015 |
1.341 | 0.2368 | 3700 | 1.3231 | 0.015 |
1.3125 | 0.2432 | 3800 | 1.3416 | 0.0 |
1.6622 | 0.2496 | 3900 | 1.4710 | 0.005 |
1.5242 | 0.2560 | 4000 | 1.4332 | 0.005 |
1.2997 | 0.2625 | 4100 | 1.3173 | 0.01 |
1.2837 | 0.2689 | 4200 | 1.3149 | 0.005 |
1.2307 | 0.2753 | 4300 | 1.1461 | 0.015 |
1.5046 | 0.2817 | 4400 | 1.3055 | 0.005 |
1.2688 | 0.2881 | 4500 | 1.1161 | 0.01 |
1.1872 | 0.2945 | 4600 | 1.1132 | 0.01 |
1.1344 | 0.3009 | 4700 | 1.0692 | 0.01 |
1.2026 | 0.3073 | 4800 | 1.0552 | 0.0 |
1.0938 | 0.3137 | 4900 | 1.0710 | 0.02 |
1.0049 | 0.3201 | 5000 | 0.9988 | 0.005 |
1.1265 | 0.3265 | 5100 | 0.9553 | 0.03 |
0.9829 | 0.3329 | 5200 | 0.9911 | 0.01 |
0.9873 | 0.3393 | 5300 | 0.9368 | 0.02 |
0.9269 | 0.3457 | 5400 | 0.8815 | 0.02 |
0.9027 | 0.3521 | 5500 | 0.9123 | 0.01 |
0.8419 | 0.3585 | 5600 | 0.9692 | 0.02 |
0.9754 | 0.3649 | 5700 | 0.9221 | 0.04 |
0.8729 | 0.3713 | 5800 | 0.9506 | 0.045 |
0.7891 | 0.3777 | 5900 | 0.7808 | 0.125 |
0.7072 | 0.3841 | 6000 | 0.6781 | 0.17 |
0.6546 | 0.3905 | 6100 | 0.6591 | 0.18 |
0.5607 | 0.3969 | 6200 | 0.5789 | 0.3 |
0.5397 | 0.4033 | 6300 | 0.4997 | 0.445 |
0.5981 | 0.4097 | 6400 | 0.4789 | 0.475 |
0.4037 | 0.4161 | 6500 | 0.5675 | 0.245 |
0.4213 | 0.4225 | 6600 | 0.3815 | 0.63 |
0.4639 | 0.4289 | 6700 | 0.3542 | 0.59 |
0.3786 | 0.4353 | 6800 | 0.3166 | 0.625 |
0.5791 | 0.4417 | 6900 | 0.8131 | 0.13 |
0.4567 | 0.4481 | 7000 | 0.2814 | 0.65 |
0.4709 | 0.4545 | 7100 | 0.6059 | 0.16 |
0.2642 | 0.4609 | 7200 | 0.3014 | 0.53 |
0.3518 | 0.4673 | 7300 | 0.2250 | 0.66 |
0.2309 | 0.4737 | 7400 | 0.1933 | 0.75 |
0.2686 | 0.4801 | 7500 | 0.2457 | 0.54 |
0.2142 | 0.4865 | 7600 | 0.2393 | 0.625 |
0.1771 | 0.4929 | 7700 | 0.2440 | 0.565 |
0.1637 | 0.4993 | 7800 | 0.1620 | 0.775 |
0.2961 | 0.5057 | 7900 | 0.5910 | 0.12 |
0.1414 | 0.5121 | 8000 | 0.1640 | 0.74 |
0.1106 | 0.5185 | 8100 | 0.1175 | 0.855 |
0.1494 | 0.5249 | 8200 | 0.1550 | 0.725 |
0.1337 | 0.5313 | 8300 | 0.1139 | 0.85 |
0.1713 | 0.5377 | 8400 | 0.1009 | 0.86 |
0.1294 | 0.5441 | 8500 | 0.1391 | 0.755 |
0.1582 | 0.5505 | 8600 | 0.0950 | 0.86 |
0.0931 | 0.5569 | 8700 | 0.0985 | 0.845 |
0.0663 | 0.5633 | 8800 | 0.1735 | 0.635 |
0.1151 | 0.5697 | 8900 | 0.1516 | 0.69 |
0.1891 | 0.5761 | 9000 | 0.0983 | 0.8 |
0.1057 | 0.5825 | 9100 | 0.0902 | 0.85 |
0.1255 | 0.5889 | 9200 | 0.0935 | 0.825 |
0.1474 | 0.5953 | 9300 | 0.0715 | 0.89 |
0.1108 | 0.6017 | 9400 | 0.1197 | 0.78 |
0.1694 | 0.6081 | 9500 | 0.2394 | 0.485 |
0.0989 | 0.6145 | 9600 | 0.0985 | 0.83 |
0.1155 | 0.6209 | 9700 | 0.0745 | 0.88 |
0.2256 | 0.6273 | 9800 | 0.1757 | 0.63 |
0.1155 | 0.6337 | 9900 | 0.1612 | 0.6 |
0.0529 | 0.6401 | 10000 | 0.0762 | 0.85 |
0.0928 | 0.6465 | 10100 | 0.0647 | 0.875 |
0.0858 | 0.6529 | 10200 | 0.1147 | 0.735 |
0.0486 | 0.6593 | 10300 | 0.0699 | 0.85 |
0.1232 | 0.6657 | 10400 | 0.0697 | 0.87 |
0.0504 | 0.6721 | 10500 | 0.0576 | 0.9 |
0.0307 | 0.6785 | 10600 | 0.0409 | 0.935 |
0.0489 | 0.6849 | 10700 | 0.0815 | 0.835 |
0.0388 | 0.6913 | 10800 | 0.0256 | 0.97 |
0.0296 | 0.6977 | 10900 | 0.0586 | 0.865 |
0.0444 | 0.7041 | 11000 | 0.0278 | 0.96 |
0.0251 | 0.7105 | 11100 | 0.0280 | 0.95 |
0.0489 | 0.7169 | 11200 | 0.0504 | 0.895 |
0.0264 | 0.7233 | 11300 | 0.0315 | 0.945 |
0.0293 | 0.7297 | 11400 | 0.0254 | 0.955 |
0.0143 | 0.7361 | 11500 | 0.0211 | 0.955 |
0.0288 | 0.7425 | 11600 | 0.0614 | 0.855 |
0.0278 | 0.7489 | 11700 | 0.0228 | 0.965 |
0.034 | 0.7553 | 11800 | 0.0175 | 0.975 |
0.0408 | 0.7617 | 11900 | 0.0374 | 0.93 |
0.0255 | 0.7681 | 12000 | 0.0453 | 0.9 |
0.0175 | 0.7745 | 12100 | 0.0229 | 0.965 |
0.014 | 0.7809 | 12200 | 0.0112 | 0.995 |
0.0213 | 0.7874 | 12300 | 0.0238 | 0.965 |
0.0082 | 0.7938 | 12400 | 0.0110 | 0.985 |
0.0211 | 0.8002 | 12500 | 0.0120 | 0.985 |
0.0111 | 0.8066 | 12600 | 0.0117 | 0.98 |
0.0074 | 0.8130 | 12700 | 0.0136 | 0.965 |
0.0108 | 0.8194 | 12800 | 0.0083 | 0.995 |
0.013 | 0.8258 | 12900 | 0.0098 | 0.99 |
0.0076 | 0.8322 | 13000 | 0.0074 | 0.995 |
0.0084 | 0.8386 | 13100 | 0.0106 | 0.98 |
0.0119 | 0.8450 | 13200 | 0.0068 | 0.995 |
0.0059 | 0.8514 | 13300 | 0.0079 | 0.98 |
0.0064 | 0.8578 | 13400 | 0.0067 | 0.99 |
0.0048 | 0.8642 | 13500 | 0.0059 | 0.995 |
0.0043 | 0.8706 | 13600 | 0.0044 | 1.0 |
0.007 | 0.8770 | 13700 | 0.0088 | 0.985 |
0.0043 | 0.8834 | 13800 | 0.0042 | 1.0 |
0.003 | 0.8898 | 13900 | 0.0060 | 0.995 |
0.0037 | 0.8962 | 14000 | 0.0052 | 0.99 |
0.0064 | 0.9026 | 14100 | 0.0089 | 0.985 |
0.0029 | 0.9090 | 14200 | 0.0039 | 1.0 |
0.0054 | 0.9154 | 14300 | 0.0037 | 1.0 |
0.0031 | 0.9218 | 14400 | 0.0037 | 1.0 |
0.0031 | 0.9282 | 14500 | 0.0035 | 1.0 |
0.0039 | 0.9346 | 14600 | 0.0036 | 1.0 |
0.0028 | 0.9410 | 14700 | 0.0039 | 1.0 |
0.0027 | 0.9474 | 14800 | 0.0033 | 1.0 |
0.0027 | 0.9538 | 14900 | 0.0031 | 1.0 |
0.0037 | 0.9602 | 15000 | 0.0032 | 1.0 |
0.0026 | 0.9666 | 15100 | 0.0031 | 1.0 |
0.0025 | 0.9730 | 15200 | 0.0033 | 1.0 |
0.0027 | 0.9794 | 15300 | 0.0031 | 1.0 |
0.0033 | 0.9858 | 15400 | 0.0034 | 1.0 |
0.0025 | 0.9922 | 15500 | 0.0033 | 1.0 |
0.0025 | 0.9986 | 15600 | 0.0033 | 1.0 |
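As a back-of-envelope check inferred from the logged values above (not a figure stated in the card): the last logged step, 15600, falls at epoch 0.9986, which implies roughly 15,600 optimizer steps per epoch and, at a train batch size of 64, on the order of a million training examples.

```python
# Rough sanity check derived from the last row of the results table.
steps_per_epoch = 15600 / 0.9986           # last logged step / its epoch fraction
examples_per_epoch = steps_per_epoch * 64  # train_batch_size = 64

print(round(steps_per_epoch))     # ~15622 steps per epoch
print(round(examples_per_epoch))  # ~1e6 training examples
```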
### Framework versions
- Transformers 4.46.0
- PyTorch 2.5.1
- Datasets 3.1.0
- Tokenizers 0.20.1