
finetuned-marktextepoch-n600

This model is a fine-tuned version of leokai/finetuned-marktextepoch-n500 on an unspecified dataset (the source card lists it as None). It achieves the following results on the evaluation set:

  • Loss: 2.6814
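
If this loss is a cross-entropy in nats (as with the standard language-modeling objectives in Transformers), it corresponds to a perplexity of exp(2.6814) ≈ 14.6.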

Model description

More information needed

Intended uses & limitations

More information needed
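
In the meantime, the checkpoint can at least be loaded for inspection. Below is a minimal sketch assuming the repo id implied by this card's title; the task head is not documented, so the generic `AutoModel` class is used rather than a task-specific one:

```python
from transformers import AutoModel, AutoTokenizer

# The task head is not documented on this card, so the generic AutoModel
# class is used; swap in the task-specific class (e.g. AutoModelForMaskedLM
# or AutoModelForCausalLM) once the intended task is known.
model_id = "leokai/finetuned-marktextepoch-n600"  # repo id inferred from the card title
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

inputs = tokenizer("Example input text.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # contextual hidden states for the input
```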

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training; a reproduction sketch follows the list:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 182
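
The card does not document the dataset, model class, or data collator, so the run cannot be reproduced end to end, but the listed hyperparameters translate directly into a Hugging Face `TrainingArguments` object. A minimal sketch, assuming the standard `Trainer` API and a hypothetical output directory:

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters listed above. Only the arguments themselves
# are reproduced here; pass them to a Trainer together with your own model
# and train/eval datasets.
training_args = TrainingArguments(
    output_dir="finetuned-marktextepoch-n600",  # hypothetical output path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=182,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # the card reports validation loss once per epoch
)
```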

Training results

| Training Loss | Epoch | Step   | Validation Loss |
|:-------------:|:-----:|:------:|:---------------:|
| 0.5332        | 1.0   | 1606   | 2.5256          |
| 0.5315        | 2.0   | 3212   | 2.4835          |
| 0.5181        | 3.0   | 4818   | 2.5471          |
| 0.5318        | 4.0   | 6424   | 2.5213          |
| 0.5398        | 5.0   | 8030   | 2.5408          |
| 0.5474        | 6.0   | 9636   | 2.5557          |
| 0.541         | 7.0   | 11242  | 2.5415          |
| 0.529         | 8.0   | 12848  | 2.5729          |
| 0.5294        | 9.0   | 14454  | 2.5533          |
| 0.5404        | 10.0  | 16060  | 2.5414          |
| 0.5359        | 11.0  | 17666  | 2.5316          |
| 0.5295        | 12.0  | 19272  | 2.5985          |
| 0.5319        | 13.0  | 20878  | 2.5644          |
| 0.5492        | 14.0  | 22484  | 2.5594          |
| 0.5403        | 15.0  | 24090  | 2.5830          |
| 0.526         | 16.0  | 25696  | 2.5999          |
| 0.5308        | 17.0  | 27302  | 2.5629          |
| 0.516         | 18.0  | 28908  | 2.6015          |
| 0.525         | 19.0  | 30514  | 2.5726          |
| 0.5238        | 20.0  | 32120  | 2.5656          |
| 0.5244        | 21.0  | 33726  | 2.5961          |
| 0.531         | 22.0  | 35332  | 2.5809          |
| 0.517         | 23.0  | 36938  | 2.5688          |
| 0.5334        | 24.0  | 38544  | 2.5828          |
| 0.505         | 25.0  | 40150  | 2.5861          |
| 0.5136        | 26.0  | 41756  | 2.6044          |
| 0.5228        | 27.0  | 43362  | 2.6000          |
| 0.5066        | 28.0  | 44968  | 2.5971          |
| 0.5183        | 29.0  | 46574  | 2.6240          |
| 0.5076        | 30.0  | 48180  | 2.6201          |
| 0.5059        | 31.0  | 49786  | 2.5746          |
| 0.5033        | 32.0  | 51392  | 2.6229          |
| 0.5041        | 33.0  | 52998  | 2.6086          |
| 0.5132        | 34.0  | 54604  | 2.6115          |
| 0.5007        | 35.0  | 56210  | 2.5865          |
| 0.4947        | 36.0  | 57816  | 2.6159          |
| 0.4997        | 37.0  | 59422  | 2.6072          |
| 0.4988        | 38.0  | 61028  | 2.5931          |
| 0.5001        | 39.0  | 62634  | 2.5956          |
| 0.5027        | 40.0  | 64240  | 2.6444          |
| 0.4969        | 41.0  | 65846  | 2.6194          |
| 0.4916        | 42.0  | 67452  | 2.6394          |
| 0.4986        | 43.0  | 69058  | 2.6401          |
| 0.5           | 44.0  | 70664  | 2.6164          |
| 0.4848        | 45.0  | 72270  | 2.6479          |
| 0.4926        | 46.0  | 73876  | 2.6519          |
| 0.492         | 47.0  | 75482  | 2.6359          |
| 0.4939        | 48.0  | 77088  | 2.6211          |
| 0.4914        | 49.0  | 78694  | 2.6536          |
| 0.4743        | 50.0  | 80300  | 2.6434          |
| 0.4787        | 51.0  | 81906  | 2.6396          |
| 0.4686        | 52.0  | 83512  | 2.6337          |
| 0.4775        | 53.0  | 85118  | 2.6352          |
| 0.4844        | 54.0  | 86724  | 2.6382          |
| 0.4802        | 55.0  | 88330  | 2.6473          |
| 0.4799        | 56.0  | 89936  | 2.6284          |
| 0.4749        | 57.0  | 91542  | 2.6371          |
| 0.4779        | 58.0  | 93148  | 2.6264          |
| 0.4727        | 59.0  | 94754  | 2.6506          |
| 0.4875        | 60.0  | 96360  | 2.6677          |
| 0.4695        | 61.0  | 97966  | 2.6507          |
| 0.4612        | 62.0  | 99572  | 2.6600          |
| 0.4658        | 63.0  | 101178 | 2.6442          |
| 0.4737        | 64.0  | 102784 | 2.6593          |
| 0.47          | 65.0  | 104390 | 2.6451          |
| 0.4658        | 66.0  | 105996 | 2.6493          |
| 0.4634        | 67.0  | 107602 | 2.6795          |
| 0.4713        | 68.0  | 109208 | 2.6392          |
| 0.4771        | 69.0  | 110814 | 2.6633          |
| 0.4704        | 70.0  | 112420 | 2.6273          |
| 0.458         | 71.0  | 114026 | 2.6426          |
| 0.4577        | 72.0  | 115632 | 2.6652          |
| 0.4585        | 73.0  | 117238 | 2.6609          |
| 0.4567        | 74.0  | 118844 | 2.6285          |
| 0.4524        | 75.0  | 120450 | 2.6860          |
| 0.4615        | 76.0  | 122056 | 2.7033          |
| 0.4725        | 77.0  | 123662 | 2.6877          |
| 0.4621        | 78.0  | 125268 | 2.6343          |
| 0.4555        | 79.0  | 126874 | 2.6664          |
| 0.4485        | 80.0  | 128480 | 2.6650          |
| 0.4508        | 81.0  | 130086 | 2.6777          |
| 0.4475        | 82.0  | 131692 | 2.6759          |
| 0.4432        | 83.0  | 133298 | 2.6711          |
| 0.4541        | 84.0  | 134904 | 2.6905          |
| 0.444         | 85.0  | 136510 | 2.6699          |
| 0.4428        | 86.0  | 138116 | 2.6737          |
| 0.4436        | 87.0  | 139722 | 2.6536          |
| 0.4522        | 88.0  | 141328 | 2.6504          |
| 0.4632        | 89.0  | 142934 | 2.6697          |
| 0.4514        | 90.0  | 144540 | 2.6854          |
| 0.4369        | 91.0  | 146146 | 2.6804          |
| 0.4324        | 92.0  | 147752 | 2.7011          |
| 0.4436        | 93.0  | 149358 | 2.7145          |
| 0.4317        | 94.0  | 150964 | 2.6880          |
| 0.4468        | 95.0  | 152570 | 2.6784          |
| 0.4364        | 96.0  | 154176 | 2.7050          |
| 0.4505        | 97.0  | 155782 | 2.7214          |
| 0.4273        | 98.0  | 157388 | 2.6843          |
| 0.4374        | 99.0  | 158994 | 2.7047          |
| 0.4436        | 100.0 | 160600 | 2.6934          |
| 0.4399        | 101.0 | 162206 | 2.6913          |
| 0.4273        | 102.0 | 163812 | 2.6949          |
| 0.4334        | 103.0 | 165418 | 2.6628          |
| 0.4277        | 104.0 | 167024 | 2.7170          |
| 0.439         | 105.0 | 168630 | 2.6752          |
| 0.4418        | 106.0 | 170236 | 2.6832          |
| 0.4278        | 107.0 | 171842 | 2.6386          |
| 0.4226        | 108.0 | 173448 | 2.6946          |
| 0.4255        | 109.0 | 175054 | 2.6911          |
| 0.4349        | 110.0 | 176660 | 2.7073          |
| 0.4259        | 111.0 | 178266 | 2.7048          |
| 0.4328        | 112.0 | 179872 | 2.7105          |
| 0.4242        | 113.0 | 181478 | 2.6897          |
| 0.4228        | 114.0 | 183084 | 2.6921          |
| 0.4227        | 115.0 | 184690 | 2.6833          |
| 0.4192        | 116.0 | 186296 | 2.6483          |
| 0.4381        | 117.0 | 187902 | 2.6690          |
| 0.425         | 118.0 | 189508 | 2.6866          |
| 0.4273        | 119.0 | 191114 | 2.6892          |
| 0.4201        | 120.0 | 192720 | 2.7128          |
| 0.4252        | 121.0 | 194326 | 2.6883          |
| 0.423         | 122.0 | 195932 | 2.6766          |
| 0.4371        | 123.0 | 197538 | 2.7092          |
| 0.4363        | 124.0 | 199144 | 2.7084          |
| 0.4315        | 125.0 | 200750 | 2.7321          |
| 0.4367        | 126.0 | 202356 | 2.7005          |
| 0.4114        | 127.0 | 203962 | 2.6878          |
| 0.4025        | 128.0 | 205568 | 2.7100          |
| 0.4376        | 129.0 | 207174 | 2.7073          |
| 0.4201        | 130.0 | 208780 | 2.7064          |
| 0.4248        | 131.0 | 210386 | 2.6755          |
| 0.4333        | 132.0 | 211992 | 2.6884          |
| 0.4178        | 133.0 | 213598 | 2.6688          |
| 0.433         | 134.0 | 215204 | 2.6911          |
| 0.4145        | 135.0 | 216810 | 2.7116          |
| 0.4163        | 136.0 | 218416 | 2.6867          |
| 0.4203        | 137.0 | 220022 | 2.7109          |
| 0.4164        | 138.0 | 221628 | 2.7031          |
| 0.4252        | 139.0 | 223234 | 2.6656          |
| 0.4302        | 140.0 | 224840 | 2.7018          |
| 0.4205        | 141.0 | 226446 | 2.6912          |
| 0.4055        | 142.0 | 228052 | 2.7107          |
| 0.4204        | 143.0 | 229658 | 2.7236          |
| 0.4104        | 144.0 | 231264 | 2.6931          |
| 0.4146        | 145.0 | 232870 | 2.7160          |
| 0.4113        | 146.0 | 234476 | 2.7116          |
| 0.4375        | 147.0 | 236082 | 2.6680          |
| 0.4135        | 148.0 | 237688 | 2.6984          |
| 0.4198        | 149.0 | 239294 | 2.6823          |
| 0.4154        | 150.0 | 240900 | 2.7031          |
| 0.4159        | 151.0 | 242506 | 2.7000          |
| 0.4104        | 152.0 | 244112 | 2.6974          |
| 0.4283        | 153.0 | 245718 | 2.6649          |
| 0.4046        | 154.0 | 247324 | 2.6989          |
| 0.4174        | 155.0 | 248930 | 2.6774          |
| 0.4199        | 156.0 | 250536 | 2.6943          |
| 0.421         | 157.0 | 252142 | 2.6728          |
| 0.4106        | 158.0 | 253748 | 2.6836          |
| 0.4081        | 159.0 | 255354 | 2.6946          |
| 0.4233        | 160.0 | 256960 | 2.6992          |
| 0.4183        | 161.0 | 258566 | 2.6585          |
| 0.4213        | 162.0 | 260172 | 2.6761          |
| 0.4259        | 163.0 | 261778 | 2.7186          |
| 0.4157        | 164.0 | 263384 | 2.7150          |
| 0.4257        | 165.0 | 264990 | 2.7004          |
| 0.4251        | 166.0 | 266596 | 2.6728          |
| 0.4228        | 167.0 | 268202 | 2.6831          |
| 0.4233        | 168.0 | 269808 | 2.6781          |
| 0.418         | 169.0 | 271414 | 2.6598          |
| 0.4263        | 170.0 | 273020 | 2.6930          |
| 0.4104        | 171.0 | 274626 | 2.7045          |
| 0.4213        | 172.0 | 276232 | 2.6979          |
| 0.419         | 173.0 | 277838 | 2.6726          |
| 0.4273        | 174.0 | 279444 | 2.6631          |
| 0.4189        | 175.0 | 281050 | 2.6802          |
| 0.4228        | 176.0 | 282656 | 2.6872          |
| 0.431         | 177.0 | 284262 | 2.6677          |
| 0.4363        | 178.0 | 285868 | 2.6710          |
| 0.4145        | 179.0 | 287474 | 2.6654          |
| 0.4256        | 180.0 | 289080 | 2.6802          |
| 0.4277        | 181.0 | 290686 | 2.6698          |
| 0.4249        | 182.0 | 292292 | 2.6814          |

Framework versions

  • Transformers 4.21.1
  • Pytorch 1.12.0+cu113
  • Datasets 2.4.0
  • Tokenizers 0.12.1
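
To reproduce this environment, pin the versions above, e.g. `pip install transformers==4.21.1 datasets==2.4.0 tokenizers==0.12.1` together with the CUDA 11.3 build of PyTorch 1.12.0.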