
chemical-bert-uncased-finetuned-cust-c2

This model is a fine-tuned version of shafin/chemical-bert-uncased-finetuned-cust on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5768
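
Since the base model is a BERT-style masked-language model, the fine-tuned checkpoint can be queried with the fill-mask pipeline. The snippet below is a minimal usage sketch, not part of this card: the repository id shafin/chemical-bert-uncased-finetuned-cust-c2 and the example sentence are assumptions.

```python
from transformers import pipeline

# Assumed repository id; adjust if the model is published under a different namespace.
fill_mask = pipeline(
    "fill-mask",
    model="shafin/chemical-bert-uncased-finetuned-cust-c2",
)

# The tokenizer uses the standard BERT mask token, [MASK].
predictions = fill_mask("Sodium chloride is a common [MASK] compound.")
for p in predictions:
    print(p["token_str"], round(p["score"], 4))
```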

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):

  • learning_rate: 2e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 200
  • mixed_precision_training: Native AMP
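
For reference, the sketch below shows how these hyperparameters might map onto a Hugging Face TrainingArguments object. Only the values listed above come from this card; the masked-language-modelling head, the per-epoch evaluation strategy, and the output directory are assumptions, and the (unspecified) training corpus is not included.

```python
from transformers import AutoModelForMaskedLM, AutoTokenizer, TrainingArguments

base = "shafin/chemical-bert-uncased-finetuned-cust"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForMaskedLM.from_pretrained(base)  # assumption: MLM fine-tuning objective

# Hyperparameters as listed above; everything else is left at Trainer defaults.
args = TrainingArguments(
    output_dir="chemical-bert-uncased-finetuned-cust-c2",
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=200,
    fp16=True,                    # "Native AMP" mixed precision; requires a CUDA device
    evaluation_strategy="epoch",  # assumption: matches the per-epoch results table below
)

# These arguments would then be passed to a Trainer together with the
# (unspecified) tokenized corpus and a DataCollatorForLanguageModeling.
```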

Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 1.9422 | 1.0 | 63 | 1.6236 |
| 1.6662 | 2.0 | 126 | 1.5136 |
| 1.5299 | 3.0 | 189 | 1.4435 |
| 1.4542 | 4.0 | 252 | 1.2997 |
| 1.374 | 5.0 | 315 | 1.2431 |
| 1.2944 | 6.0 | 378 | 1.1990 |
| 1.2439 | 7.0 | 441 | 1.1733 |
| 1.2304 | 8.0 | 504 | 1.1494 |
| 1.1495 | 9.0 | 567 | 1.1410 |
| 1.1325 | 10.0 | 630 | 1.1208 |
| 1.0798 | 11.0 | 693 | 1.0691 |
| 1.074 | 12.0 | 756 | 1.0918 |
| 1.0422 | 13.0 | 819 | 1.0823 |
| 1.0124 | 14.0 | 882 | 1.0101 |
| 1.0172 | 15.0 | 945 | 0.9742 |
| 0.9821 | 16.0 | 1008 | 0.9740 |
| 0.9347 | 17.0 | 1071 | 0.9711 |
| 0.9193 | 18.0 | 1134 | 0.9291 |
| 0.9229 | 19.0 | 1197 | 0.9317 |
| 0.8751 | 20.0 | 1260 | 0.9331 |
| 0.8914 | 21.0 | 1323 | 0.9137 |
| 0.8686 | 22.0 | 1386 | 0.9209 |
| 0.8482 | 23.0 | 1449 | 0.8724 |
| 0.8201 | 24.0 | 1512 | 0.8512 |
| 0.8131 | 25.0 | 1575 | 0.8753 |
| 0.8123 | 26.0 | 1638 | 0.8651 |
| 0.8046 | 27.0 | 1701 | 0.8374 |
| 0.7668 | 28.0 | 1764 | 0.8981 |
| 0.7732 | 29.0 | 1827 | 0.8691 |
| 0.7567 | 30.0 | 1890 | 0.7845 |
| 0.7465 | 31.0 | 1953 | 0.8493 |
| 0.7451 | 32.0 | 2016 | 0.8270 |
| 0.7211 | 33.0 | 2079 | 0.8148 |
| 0.7006 | 34.0 | 2142 | 0.8163 |
| 0.7107 | 35.0 | 2205 | 0.7866 |
| 0.6889 | 36.0 | 2268 | 0.7712 |
| 0.674 | 37.0 | 2331 | 0.7762 |
| 0.6847 | 38.0 | 2394 | 0.7583 |
| 0.6639 | 39.0 | 2457 | 0.7800 |
| 0.6615 | 40.0 | 2520 | 0.8270 |
| 0.6566 | 41.0 | 2583 | 0.7851 |
| 0.6364 | 42.0 | 2646 | 0.7645 |
| 0.6261 | 43.0 | 2709 | 0.7044 |
| 0.6338 | 44.0 | 2772 | 0.7952 |
| 0.6315 | 45.0 | 2835 | 0.7439 |
| 0.6122 | 46.0 | 2898 | 0.7566 |
| 0.5941 | 47.0 | 2961 | 0.7124 |
| 0.6076 | 48.0 | 3024 | 0.7591 |
| 0.59 | 49.0 | 3087 | 0.7473 |
| 0.5838 | 50.0 | 3150 | 0.6961 |
| 0.5931 | 51.0 | 3213 | 0.7604 |
| 0.5847 | 52.0 | 3276 | 0.7260 |
| 0.5691 | 53.0 | 3339 | 0.7309 |
| 0.5778 | 54.0 | 3402 | 0.7200 |
| 0.5464 | 55.0 | 3465 | 0.7014 |
| 0.5592 | 56.0 | 3528 | 0.7567 |
| 0.555 | 57.0 | 3591 | 0.7062 |
| 0.5436 | 58.0 | 3654 | 0.7284 |
| 0.5328 | 59.0 | 3717 | 0.6896 |
| 0.5397 | 60.0 | 3780 | 0.7041 |
| 0.5263 | 61.0 | 3843 | 0.7029 |
| 0.5181 | 62.0 | 3906 | 0.7223 |
| 0.5166 | 63.0 | 3969 | 0.7043 |
| 0.5066 | 64.0 | 4032 | 0.6723 |
| 0.5115 | 65.0 | 4095 | 0.6871 |
| 0.4956 | 66.0 | 4158 | 0.6818 |
| 0.5006 | 67.0 | 4221 | 0.7075 |
| 0.4837 | 68.0 | 4284 | 0.6686 |
| 0.4874 | 69.0 | 4347 | 0.6943 |
| 0.4808 | 70.0 | 4410 | 0.6584 |
| 0.4775 | 71.0 | 4473 | 0.6954 |
| 0.4776 | 72.0 | 4536 | 0.6741 |
| 0.4773 | 73.0 | 4599 | 0.6591 |
| 0.4699 | 74.0 | 4662 | 0.7000 |
| 0.4779 | 75.0 | 4725 | 0.6829 |
| 0.4543 | 76.0 | 4788 | 0.6839 |
| 0.4641 | 77.0 | 4851 | 0.6444 |
| 0.4495 | 78.0 | 4914 | 0.6604 |
| 0.4489 | 79.0 | 4977 | 0.6713 |
| 0.4394 | 80.0 | 5040 | 0.6905 |
| 0.4461 | 81.0 | 5103 | 0.6879 |
| 0.4386 | 82.0 | 5166 | 0.6458 |
| 0.4529 | 83.0 | 5229 | 0.6306 |
| 0.4261 | 84.0 | 5292 | 0.6291 |
| 0.4306 | 85.0 | 5355 | 0.6518 |
| 0.4428 | 86.0 | 5418 | 0.6456 |
| 0.4336 | 87.0 | 5481 | 0.6686 |
| 0.4105 | 88.0 | 5544 | 0.6735 |
| 0.4281 | 89.0 | 5607 | 0.6645 |
| 0.4172 | 90.0 | 5670 | 0.6527 |
| 0.4037 | 91.0 | 5733 | 0.6004 |
| 0.4137 | 92.0 | 5796 | 0.6643 |
| 0.4135 | 93.0 | 5859 | 0.6783 |
| 0.3988 | 94.0 | 5922 | 0.6687 |
| 0.4172 | 95.0 | 5985 | 0.6486 |
| 0.3819 | 96.0 | 6048 | 0.6466 |
| 0.3938 | 97.0 | 6111 | 0.5946 |
| 0.4053 | 98.0 | 6174 | 0.6146 |
| 0.3988 | 99.0 | 6237 | 0.6166 |
| 0.3798 | 100.0 | 6300 | 0.6383 |
| 0.386 | 101.0 | 6363 | 0.6631 |
| 0.3962 | 102.0 | 6426 | 0.6298 |
| 0.399 | 103.0 | 6489 | 0.6251 |
| 0.3851 | 104.0 | 6552 | 0.6339 |
| 0.3767 | 105.0 | 6615 | 0.6610 |
| 0.3756 | 106.0 | 6678 | 0.6292 |
| 0.375 | 107.0 | 6741 | 0.6201 |
| 0.3648 | 108.0 | 6804 | 0.6384 |
| 0.3664 | 109.0 | 6867 | 0.6046 |
| 0.3679 | 110.0 | 6930 | 0.6169 |
| 0.368 | 111.0 | 6993 | 0.6450 |
| 0.3605 | 112.0 | 7056 | 0.6518 |
| 0.3675 | 113.0 | 7119 | 0.6082 |
| 0.3559 | 114.0 | 7182 | 0.6232 |
| 0.3563 | 115.0 | 7245 | 0.6438 |
| 0.3664 | 116.0 | 7308 | 0.6381 |
| 0.3662 | 117.0 | 7371 | 0.6412 |
| 0.3596 | 118.0 | 7434 | 0.6631 |
| 0.3447 | 119.0 | 7497 | 0.6065 |
| 0.3421 | 120.0 | 7560 | 0.6072 |
| 0.347 | 121.0 | 7623 | 0.5787 |
| 0.3474 | 122.0 | 7686 | 0.6343 |
| 0.3426 | 123.0 | 7749 | 0.6114 |
| 0.3418 | 124.0 | 7812 | 0.6084 |
| 0.3485 | 125.0 | 7875 | 0.6188 |
| 0.3411 | 126.0 | 7938 | 0.6112 |
| 0.3371 | 127.0 | 8001 | 0.5991 |
| 0.3353 | 128.0 | 8064 | 0.5861 |
| 0.3318 | 129.0 | 8127 | 0.6419 |
| 0.3417 | 130.0 | 8190 | 0.6272 |
| 0.3235 | 131.0 | 8253 | 0.6293 |
| 0.3363 | 132.0 | 8316 | 0.6017 |
| 0.3358 | 133.0 | 8379 | 0.5816 |
| 0.3273 | 134.0 | 8442 | 0.6384 |
| 0.3277 | 135.0 | 8505 | 0.6063 |
| 0.3336 | 136.0 | 8568 | 0.6482 |
| 0.3205 | 137.0 | 8631 | 0.6428 |
| 0.3136 | 138.0 | 8694 | 0.6322 |
| 0.3212 | 139.0 | 8757 | 0.6218 |
| 0.3275 | 140.0 | 8820 | 0.6328 |
| 0.3227 | 141.0 | 8883 | 0.6406 |
| 0.3166 | 142.0 | 8946 | 0.6317 |
| 0.3111 | 143.0 | 9009 | 0.6308 |
| 0.309 | 144.0 | 9072 | 0.5972 |
| 0.316 | 145.0 | 9135 | 0.6229 |
| 0.3163 | 146.0 | 9198 | 0.6244 |
| 0.3125 | 147.0 | 9261 | 0.6195 |
| 0.3164 | 148.0 | 9324 | 0.5676 |
| 0.3151 | 149.0 | 9387 | 0.6225 |
| 0.3014 | 150.0 | 9450 | 0.6044 |
| 0.3106 | 151.0 | 9513 | 0.6262 |
| 0.3065 | 152.0 | 9576 | 0.5927 |
| 0.2982 | 153.0 | 9639 | 0.6402 |
| 0.3054 | 154.0 | 9702 | 0.6329 |
| 0.3172 | 155.0 | 9765 | 0.6227 |
| 0.3005 | 156.0 | 9828 | 0.5882 |
| 0.3174 | 157.0 | 9891 | 0.6049 |
| 0.3023 | 158.0 | 9954 | 0.5990 |
| 0.3013 | 159.0 | 10017 | 0.5909 |
| 0.3044 | 160.0 | 10080 | 0.6317 |
| 0.298 | 161.0 | 10143 | 0.6237 |
| 0.2984 | 162.0 | 10206 | 0.6074 |
| 0.3075 | 163.0 | 10269 | 0.5746 |
| 0.2921 | 164.0 | 10332 | 0.5633 |
| 0.3014 | 165.0 | 10395 | 0.6034 |
| 0.297 | 166.0 | 10458 | 0.6420 |
| 0.2936 | 167.0 | 10521 | 0.6206 |
| 0.2946 | 168.0 | 10584 | 0.5869 |
| 0.2923 | 169.0 | 10647 | 0.5898 |
| 0.2936 | 170.0 | 10710 | 0.5810 |
| 0.2968 | 171.0 | 10773 | 0.5888 |
| 0.2863 | 172.0 | 10836 | 0.6124 |
| 0.3038 | 173.0 | 10899 | 0.5823 |
| 0.2845 | 174.0 | 10962 | 0.6187 |
| 0.2847 | 175.0 | 11025 | 0.5749 |
| 0.2984 | 176.0 | 11088 | 0.5900 |
| 0.297 | 177.0 | 11151 | 0.6243 |
| 0.2914 | 178.0 | 11214 | 0.5839 |
| 0.2904 | 179.0 | 11277 | 0.6085 |
| 0.2946 | 180.0 | 11340 | 0.6257 |
| 0.2934 | 181.0 | 11403 | 0.5918 |
| 0.2858 | 182.0 | 11466 | 0.6072 |
| 0.2912 | 183.0 | 11529 | 0.6394 |
| 0.2771 | 184.0 | 11592 | 0.5962 |
| 0.289 | 185.0 | 11655 | 0.6039 |
| 0.2801 | 186.0 | 11718 | 0.5819 |
| 0.2875 | 187.0 | 11781 | 0.6264 |
| 0.2875 | 188.0 | 11844 | 0.6156 |
| 0.2853 | 189.0 | 11907 | 0.5968 |
| 0.2874 | 190.0 | 11970 | 0.6028 |
| 0.2844 | 191.0 | 12033 | 0.5767 |
| 0.2855 | 192.0 | 12096 | 0.6124 |
| 0.2879 | 193.0 | 12159 | 0.5856 |
| 0.2801 | 194.0 | 12222 | 0.6163 |
| 0.2902 | 195.0 | 12285 | 0.5939 |
| 0.2879 | 196.0 | 12348 | 0.5780 |
| 0.2946 | 197.0 | 12411 | 0.6052 |
| 0.2801 | 198.0 | 12474 | 0.6251 |
| 0.287 | 199.0 | 12537 | 0.5839 |
| 0.2864 | 200.0 | 12600 | 0.5768 |
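
If the reported losses are mean cross-entropy values over the masked tokens (the usual Trainer convention for masked-language modelling), they can be converted to perplexities with a simple exponential; the final validation loss of 0.5768 corresponds to a perplexity of roughly 1.78.

```python
import math

# Assumes the reported loss is the mean cross-entropy over masked tokens.
final_validation_loss = 0.5768
perplexity = math.exp(final_validation_loss)
print(f"Validation perplexity: {perplexity:.2f}")  # ≈ 1.78
```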

Framework versions

  • Transformers 4.24.0
  • Pytorch 1.12.1+cu113
  • Datasets 2.6.1
  • Tokenizers 0.13.2