---
tags:
  - generated_from_trainer
model-index:
  - name: chemical-bert-uncased-finetuned-cust
    results: []
---

# chemical-bert-uncased-finetuned-cust

This model is a fine-tuned version of [recobo/chemical-bert-uncased](https://huggingface.co/recobo/chemical-bert-uncased) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.7104
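
Since recobo/chemical-bert-uncased is a BERT-style masked-language model, the fine-tuned checkpoint can presumably be queried through the `fill-mask` pipeline. A minimal sketch, assuming the hypothetical repo id `shafin/chemical-bert-uncased-finetuned-cust` and an intact masked-language-modeling head:

```python
from transformers import pipeline

# Hypothetical repo id; point this at wherever the checkpoint is hosted.
fill_mask = pipeline(
    "fill-mask",
    model="shafin/chemical-bert-uncased-finetuned-cust",
)

# [MASK] is the mask token for this uncased BERT tokenizer.
for pred in fill_mask("Sodium chloride is an ionic [MASK]."):
    print(pred["token_str"], round(pred["score"], 4))
```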

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them follows the list):

- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 200
- mixed_precision_training: Native AMP
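
A minimal sketch of how these settings map onto `transformers.TrainingArguments`; the output directory is a placeholder, and the Adam betas/epsilon listed above are the library defaults, so they need no explicit arguments:

```python
from transformers import TrainingArguments

# Sketch reproducing the reported hyperparameters; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="chemical-bert-uncased-finetuned-cust",
    learning_rate=2e-05,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the defaults.
    lr_scheduler_type="linear",
    num_train_epochs=200,
    fp16=True,  # mixed_precision_training: Native AMP
)
```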

## Training results

| Training Loss | Epoch | Step  | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 3.5876        | 1.0   | 63    | 2.7997          |
| 2.7843        | 2.0   | 126   | 2.3734          |
| 2.418         | 3.0   | 189   | 2.1510          |
| 2.2247        | 4.0   | 252   | 1.9822          |
| 2.062         | 5.0   | 315   | 1.8463          |
| 1.9875        | 6.0   | 378   | 1.8293          |
| 1.9034        | 7.0   | 441   | 1.7666          |
| 1.7818        | 8.0   | 504   | 1.6783          |
| 1.7131        | 9.0   | 567   | 1.5754          |
| 1.6793        | 10.0  | 630   | 1.5480          |
| 1.5773        | 11.0  | 693   | 1.4568          |
| 1.5391        | 12.0  | 756   | 1.5101          |
| 1.5049        | 13.0  | 819   | 1.4340          |
| 1.4476        | 14.0  | 882   | 1.4046          |
| 1.4032        | 15.0  | 945   | 1.3593          |
| 1.395         | 16.0  | 1008  | 1.3689          |
| 1.3353        | 17.0  | 1071  | 1.3350          |
| 1.3122        | 18.0  | 1134  | 1.2863          |
| 1.3036        | 19.0  | 1197  | 1.3690          |
| 1.2644        | 20.0  | 1260  | 1.1904          |
| 1.222         | 21.0  | 1323  | 1.1986          |
| 1.2091        | 22.0  | 1386  | 1.1650          |
| 1.2007        | 23.0  | 1449  | 1.1949          |
| 1.1456        | 24.0  | 1512  | 1.1649          |
| 1.1426        | 25.0  | 1575  | 1.1498          |
| 1.0883        | 26.0  | 1638  | 1.1489          |
| 1.0915        | 27.0  | 1701  | 1.1179          |
| 1.0635        | 28.0  | 1764  | 1.0726          |
| 1.0899        | 29.0  | 1827  | 1.1107          |
| 1.0251        | 30.0  | 1890  | 1.0944          |
| 1.0387        | 31.0  | 1953  | 1.0488          |
| 1.0037        | 32.0  | 2016  | 1.0679          |
| 1.0101        | 33.0  | 2079  | 1.0272          |
| 0.9595        | 34.0  | 2142  | 1.0158          |
| 0.9661        | 35.0  | 2205  | 1.0316          |
| 0.9535        | 36.0  | 2268  | 1.0086          |
| 0.9269        | 37.0  | 2331  | 1.0221          |
| 0.9395        | 38.0  | 2394  | 0.9626          |
| 0.9105        | 39.0  | 2457  | 0.9903          |
| 0.8888        | 40.0  | 2520  | 0.9892          |
| 0.9316        | 41.0  | 2583  | 0.9786          |
| 0.8804        | 42.0  | 2646  | 0.9938          |
| 0.8589        | 43.0  | 2709  | 1.0105          |
| 0.8573        | 44.0  | 2772  | 0.9729          |
| 0.8566        | 45.0  | 2835  | 0.9972          |
| 0.8392        | 46.0  | 2898  | 1.0085          |
| 0.8363        | 47.0  | 2961  | 0.9336          |
| 0.8184        | 48.0  | 3024  | 0.9886          |
| 0.7964        | 49.0  | 3087  | 0.9661          |
| 0.8025        | 50.0  | 3150  | 0.8956          |
| 0.8156        | 51.0  | 3213  | 0.9415          |
| 0.7906        | 52.0  | 3276  | 0.9381          |
| 0.7783        | 53.0  | 3339  | 0.9445          |
| 0.7696        | 54.0  | 3402  | 0.8859          |
| 0.763         | 55.0  | 3465  | 0.8851          |
| 0.7638        | 56.0  | 3528  | 0.9128          |
| 0.7576        | 57.0  | 3591  | 0.8629          |
| 0.757         | 58.0  | 3654  | 0.8917          |
| 0.7232        | 59.0  | 3717  | 0.8956          |
| 0.7327        | 60.0  | 3780  | 0.8727          |
| 0.7321        | 61.0  | 3843  | 0.8558          |
| 0.7131        | 62.0  | 3906  | 0.8876          |
| 0.696         | 63.0  | 3969  | 0.8872          |
| 0.6996        | 64.0  | 4032  | 0.7758          |
| 0.6807        | 65.0  | 4095  | 0.8657          |
| 0.6899        | 66.0  | 4158  | 0.8813          |
| 0.6873        | 67.0  | 4221  | 0.8488          |
| 0.6681        | 68.0  | 4284  | 0.8865          |
| 0.6758        | 69.0  | 4347  | 0.8447          |
| 0.6626        | 70.0  | 4410  | 0.8421          |
| 0.6535        | 71.0  | 4473  | 0.8313          |
| 0.6505        | 72.0  | 4536  | 0.8636          |
| 0.6654        | 73.0  | 4599  | 0.8433          |
| 0.6363        | 74.0  | 4662  | 0.7666          |
| 0.6395        | 75.0  | 4725  | 0.8882          |
| 0.6206        | 76.0  | 4788  | 0.8409          |
| 0.6365        | 77.0  | 4851  | 0.8807          |
| 0.6325        | 78.0  | 4914  | 0.8012          |
| 0.6142        | 79.0  | 4977  | 0.7705          |
| 0.6108        | 80.0  | 5040  | 0.8270          |
| 0.62          | 81.0  | 5103  | 0.8552          |
| 0.6188        | 82.0  | 5166  | 0.8377          |
| 0.6024        | 83.0  | 5229  | 0.7985          |
| 0.631         | 84.0  | 5292  | 0.8352          |
| 0.5871        | 85.0  | 5355  | 0.8086          |
| 0.6014        | 86.0  | 5418  | 0.8129          |
| 0.5842        | 87.0  | 5481  | 0.8649          |
| 0.5837        | 88.0  | 5544  | 0.8269          |
| 0.5958        | 89.0  | 5607  | 0.8407          |
| 0.564         | 90.0  | 5670  | 0.7906          |
| 0.5748        | 91.0  | 5733  | 0.7393          |
| 0.5918        | 92.0  | 5796  | 0.8445          |
| 0.5682        | 93.0  | 5859  | 0.8073          |
| 0.5497        | 94.0  | 5922  | 0.8165          |
| 0.5606        | 95.0  | 5985  | 0.7638          |
| 0.5593        | 96.0  | 6048  | 0.7929          |
| 0.5556        | 97.0  | 6111  | 0.7991          |
| 0.5604        | 98.0  | 6174  | 0.7417          |
| 0.5503        | 99.0  | 6237  | 0.8070          |
| 0.5561        | 100.0 | 6300  | 0.7845          |
| 0.5344        | 101.0 | 6363  | 0.7933          |
| 0.5209        | 102.0 | 6426  | 0.7741          |
| 0.5337        | 103.0 | 6489  | 0.7760          |
| 0.5437        | 104.0 | 6552  | 0.7634          |
| 0.5165        | 105.0 | 6615  | 0.7543          |
| 0.5343        | 106.0 | 6678  | 0.7661          |
| 0.5155        | 107.0 | 6741  | 0.7953          |
| 0.512         | 108.0 | 6804  | 0.8253          |
| 0.5259        | 109.0 | 6867  | 0.7570          |
| 0.5045        | 110.0 | 6930  | 0.7977          |
| 0.5115        | 111.0 | 6993  | 0.7598          |
| 0.5134        | 112.0 | 7056  | 0.7680          |
| 0.5076        | 113.0 | 7119  | 0.7696          |
| 0.5126        | 114.0 | 7182  | 0.7451          |
| 0.4963        | 115.0 | 7245  | 0.7923          |
| 0.5032        | 116.0 | 7308  | 0.7842          |
| 0.5137        | 117.0 | 7371  | 0.7239          |
| 0.488         | 118.0 | 7434  | 0.8188          |
| 0.4938        | 119.0 | 7497  | 0.7479          |
| 0.4866        | 120.0 | 7560  | 0.7761          |
| 0.4901        | 121.0 | 7623  | 0.7930          |
| 0.4877        | 122.0 | 7686  | 0.7733          |
| 0.4858        | 123.0 | 7749  | 0.7492          |
| 0.4813        | 124.0 | 7812  | 0.7645          |
| 0.4817        | 125.0 | 7875  | 0.7938          |
| 0.4822        | 126.0 | 7938  | 0.7253          |
| 0.4771        | 127.0 | 8001  | 0.7481          |
| 0.4769        | 128.0 | 8064  | 0.7402          |
| 0.4666        | 129.0 | 8127  | 0.7993          |
| 0.474         | 130.0 | 8190  | 0.7653          |
| 0.4718        | 131.0 | 8253  | 0.7524          |
| 0.4682        | 132.0 | 8316  | 0.7129          |
| 0.4698        | 133.0 | 8379  | 0.7806          |
| 0.4669        | 134.0 | 8442  | 0.7237          |
| 0.4401        | 135.0 | 8505  | 0.7185          |
| 0.4656        | 136.0 | 8568  | 0.7542          |
| 0.4569        | 137.0 | 8631  | 0.7412          |
| 0.4751        | 138.0 | 8694  | 0.7740          |
| 0.4474        | 139.0 | 8757  | 0.7636          |
| 0.4652        | 140.0 | 8820  | 0.7958          |
| 0.4539        | 141.0 | 8883  | 0.7410          |
| 0.4452        | 142.0 | 8946  | 0.7652          |
| 0.4516        | 143.0 | 9009  | 0.7337          |
| 0.4423        | 144.0 | 9072  | 0.7601          |
| 0.4542        | 145.0 | 9135  | 0.7692          |
| 0.4328        | 146.0 | 9198  | 0.7528          |
| 0.4503        | 147.0 | 9261  | 0.7673          |
| 0.4416        | 148.0 | 9324  | 0.7193          |
| 0.447         | 149.0 | 9387  | 0.7517          |
| 0.4434        | 150.0 | 9450  | 0.7241          |
| 0.4374        | 151.0 | 9513  | 0.7281          |
| 0.4334        | 152.0 | 9576  | 0.7150          |
| 0.4209        | 153.0 | 9639  | 0.7531          |
| 0.4405        | 154.0 | 9702  | 0.7252          |
| 0.4384        | 155.0 | 9765  | 0.7367          |
| 0.4265        | 156.0 | 9828  | 0.7111          |
| 0.4386        | 157.0 | 9891  | 0.7215          |
| 0.4276        | 158.0 | 9954  | 0.7119          |
| 0.4289        | 159.0 | 10017 | 0.7587          |
| 0.4415        | 160.0 | 10080 | 0.7935          |
| 0.4315        | 161.0 | 10143 | 0.7574          |
| 0.4227        | 162.0 | 10206 | 0.7296          |
| 0.4352        | 163.0 | 10269 | 0.7145          |
| 0.4108        | 164.0 | 10332 | 0.7133          |
| 0.433         | 165.0 | 10395 | 0.7369          |
| 0.4336        | 166.0 | 10458 | 0.7471          |
| 0.4016        | 167.0 | 10521 | 0.7329          |
| 0.4164        | 168.0 | 10584 | 0.7331          |
| 0.4182        | 169.0 | 10647 | 0.7449          |
| 0.4136        | 170.0 | 10710 | 0.7365          |
| 0.4183        | 171.0 | 10773 | 0.7248          |
| 0.4225        | 172.0 | 10836 | 0.7346          |
| 0.4294        | 173.0 | 10899 | 0.7099          |
| 0.4113        | 174.0 | 10962 | 0.7264          |
| 0.4216        | 175.0 | 11025 | 0.6822          |
| 0.4208        | 176.0 | 11088 | 0.7198          |
| 0.407         | 177.0 | 11151 | 0.7266          |
| 0.4164        | 178.0 | 11214 | 0.7466          |
| 0.4112        | 179.0 | 11277 | 0.7409          |
| 0.4067        | 180.0 | 11340 | 0.7058          |
| 0.4297        | 181.0 | 11403 | 0.6918          |
| 0.4137        | 182.0 | 11466 | 0.7432          |
| 0.4102        | 183.0 | 11529 | 0.7272          |
| 0.4184        | 184.0 | 11592 | 0.7309          |
| 0.4049        | 185.0 | 11655 | 0.7215          |
| 0.4097        | 186.0 | 11718 | 0.7375          |
| 0.419         | 187.0 | 11781 | 0.7575          |
| 0.4122        | 188.0 | 11844 | 0.7481          |
| 0.4089        | 189.0 | 11907 | 0.7790          |
| 0.4094        | 190.0 | 11970 | 0.7547          |
| 0.4107        | 191.0 | 12033 | 0.7390          |
| 0.4044        | 192.0 | 12096 | 0.7472          |
| 0.4065        | 193.0 | 12159 | 0.7283          |
| 0.4172        | 194.0 | 12222 | 0.7112          |
| 0.4124        | 195.0 | 12285 | 0.7470          |
| 0.4026        | 196.0 | 12348 | 0.7067          |
| 0.4179        | 197.0 | 12411 | 0.7259          |
| 0.4027        | 198.0 | 12474 | 0.7328          |
| 0.4101        | 199.0 | 12537 | 0.6891          |
| 0.3969        | 200.0 | 12600 | 0.7104          |
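
The validation loss falls from 2.7997 after the first epoch to 0.7104 after epoch 200. If, as in most BERT fine-tuning runs of this kind, the reported loss is mean cross-entropy over masked tokens, it converts to perplexity via exp(loss); a quick check of the final value:

```python
import math

# Perplexity = exp(mean cross-entropy loss); this assumes the reported
# validation loss is an MLM cross-entropy, which the card does not state.
print(math.exp(0.7104))  # ~2.03
```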

## Framework versions

- Transformers 4.24.0
- Pytorch 1.12.1+cu113
- Datasets 2.6.1
- Tokenizers 0.13.2