
esm2_t130_150M-lora-classifier_2024-04-26_00-25-40

This model is a LoRA adapter (PEFT) fine-tuned from facebook/esm2_t30_150M_UR50D for sequence classification on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.6470
  • Accuracy: 0.8887
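Since the card omits a usage example, below is a minimal, hedged sketch of loading this adapter on top of the base checkpoint with PEFT. The adapter id is taken from the title above; the label count and the example sequence are assumptions, not values from this card.

```python
# Hedged sketch only: how one might load this LoRA adapter for inference.
# The adapter id comes from the card title; num_labels and the example
# sequence are assumptions, since the card does not state them.
import torch
from peft import PeftModel
from transformers import AutoModelForSequenceClassification, AutoTokenizer

BASE_ID = "facebook/esm2_t30_150M_UR50D"
ADAPTER_ID = "esm2_t130_150M-lora-classifier_2024-04-26_00-25-40"  # assumed repo path

tokenizer = AutoTokenizer.from_pretrained(BASE_ID)
base_model = AutoModelForSequenceClassification.from_pretrained(
    BASE_ID, num_labels=2  # num_labels is an assumption
)
model = PeftModel.from_pretrained(base_model, ADAPTER_ID)
model.eval()

sequence = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"  # arbitrary example protein sequence
inputs = tokenizer(sequence, return_tensors="pt")
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1)
print(probs)
```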

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged configuration sketch follows the list):

  • learning_rate: 0.0005701568055793089
  • train_batch_size: 28
  • eval_batch_size: 28
  • seed: 8893
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • num_epochs: 300
  • mixed_precision_training: Native AMP
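As a rough reconstruction, the sketch below wires these values into a PEFT + Trainer setup. Only the hyperparameter values listed above come from this card; the LoRA rank, alpha, dropout, target modules, label count, and datasets are illustrative assumptions, since the card does not report them.

```python
# Hedged reconstruction of the training setup from the hyperparameters above.
# LoRA settings, num_labels, and the datasets are assumptions (not on this card).
from peft import LoraConfig, get_peft_model
from transformers import (
    AutoModelForSequenceClassification,
    Trainer,
    TrainingArguments,
)

BASE_ID = "facebook/esm2_t30_150M_UR50D"
model = AutoModelForSequenceClassification.from_pretrained(
    BASE_ID, num_labels=2  # num_labels is an assumption
)

lora_config = LoraConfig(
    task_type="SEQ_CLS",
    r=8,               # assumption: rank is not reported on the card
    lora_alpha=16,     # assumption
    lora_dropout=0.1,  # assumption
    target_modules=["query", "key", "value"],  # assumption: ESM attention projections
)
model = get_peft_model(model, lora_config)

args = TrainingArguments(
    output_dir="esm2-lora-classifier",
    learning_rate=0.0005701568055793089,
    per_device_train_batch_size=28,
    per_device_eval_batch_size=28,
    seed=8893,
    lr_scheduler_type="cosine",
    num_train_epochs=300,
    fp16=True,                    # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",  # matches the per-epoch eval rows below
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the library defaults.
)

# trainer = Trainer(model=model, args=args,
#                   train_dataset=...,  # dataset unspecified on this card
#                   eval_dataset=...)
# trainer.train()
```

Note that with a cosine schedule stretched over 300 epochs and no early stopping, training runs long past the best validation loss (0.2912 at epoch 18 in the table below), consistent with the final loss of 1.6470 reported above.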

Training results

Training Loss Epoch Step Validation Loss Accuracy
0.7096 1.0 55 0.6718 0.6055
0.6769 2.0 110 0.6739 0.6055
0.579 3.0 165 0.6608 0.6484
0.5726 4.0 220 0.5777 0.7109
0.6381 5.0 275 0.5020 0.7676
0.183 6.0 330 0.3725 0.8320
0.3701 7.0 385 0.3508 0.8535
0.2147 8.0 440 0.3191 0.8711
0.1654 9.0 495 0.3036 0.875
0.1581 10.0 550 0.3761 0.8516
0.3459 11.0 605 0.3746 0.8594
0.3325 12.0 660 0.3025 0.8867
0.1237 13.0 715 0.2983 0.8770
0.5167 14.0 770 0.3044 0.8887
0.3541 15.0 825 0.2927 0.8906
0.0378 16.0 880 0.3669 0.8906
0.062 17.0 935 0.3298 0.8887
0.1695 18.0 990 0.2912 0.9004
0.0444 19.0 1045 0.3034 0.9004
0.1794 20.0 1100 0.3641 0.8828
0.0634 21.0 1155 0.3521 0.8867
0.0446 22.0 1210 0.3438 0.8887
0.0266 23.0 1265 0.4553 0.8867
0.2637 24.0 1320 0.4715 0.8867
0.159 25.0 1375 0.4323 0.8945
0.2401 26.0 1430 0.6019 0.8809
0.1317 27.0 1485 0.5549 0.8906
0.1223 28.0 1540 0.4819 0.8926
0.0015 29.0 1595 0.6432 0.8711
0.0007 30.0 1650 0.6480 0.8926
0.0774 31.0 1705 0.7596 0.8926
0.1262 32.0 1760 0.7614 0.8809
0.034 33.0 1815 0.7392 0.8789
0.0021 34.0 1870 0.9068 0.8848
0.0003 35.0 1925 0.8724 0.8711
0.0001 36.0 1980 0.9483 0.8867
0.0127 37.0 2035 0.9638 0.8828
0.0001 38.0 2090 0.9105 0.8926
0.0001 39.0 2145 0.9231 0.8809
0.0008 40.0 2200 1.0224 0.8867
0.0001 41.0 2255 1.0666 0.8848
0.0002 42.0 2310 1.1028 0.8848
0.0 43.0 2365 0.9653 0.8906
0.0006 44.0 2420 1.1108 0.8848
0.0001 45.0 2475 1.2919 0.8730
0.0002 46.0 2530 1.0834 0.8926
0.0002 47.0 2585 1.1240 0.8887
0.0135 48.0 2640 1.1466 0.8887
0.0008 49.0 2695 1.2674 0.8691
0.0 50.0 2750 1.1311 0.8887
0.0086 51.0 2805 1.0957 0.8887
0.0 52.0 2860 1.1336 0.8789
0.0007 53.0 2915 1.1494 0.875
0.0002 54.0 2970 1.0790 0.8848
0.0002 55.0 3025 1.1489 0.8809
0.0 56.0 3080 1.1479 0.8867
0.0022 57.0 3135 1.2092 0.8848
0.2415 58.0 3190 1.2060 0.8848
0.7813 59.0 3245 1.3750 0.8613
0.0 60.0 3300 1.1202 0.875
0.0 61.0 3355 1.0502 0.8848
0.0 62.0 3410 1.3270 0.8730
0.0015 63.0 3465 1.0082 0.875
0.0002 64.0 3520 0.9724 0.8867
0.0014 65.0 3575 1.0862 0.8770
0.0002 66.0 3630 1.1366 0.8730
0.1868 67.0 3685 1.1838 0.8770
0.0004 68.0 3740 1.2073 0.875
0.0007 69.0 3795 1.1793 0.8770
0.0 70.0 3850 1.2262 0.8652
0.2838 71.0 3905 1.2415 0.875
0.0 72.0 3960 1.2346 0.8770
0.0041 73.0 4015 1.0830 0.8789
0.0055 74.0 4070 1.0731 0.8867
0.0 75.0 4125 1.4096 0.8652
0.0034 76.0 4180 1.1142 0.8711
0.0 77.0 4235 1.0250 0.8848
0.0002 78.0 4290 1.0700 0.8691
0.0009 79.0 4345 0.9032 0.8789
0.0001 80.0 4400 1.0556 0.8730
0.0001 81.0 4455 1.0740 0.8770
0.0002 82.0 4510 1.2571 0.8691
0.0 83.0 4565 1.2007 0.8809
0.0 84.0 4620 1.2515 0.875
0.0001 85.0 4675 1.0750 0.8828
0.0006 86.0 4730 1.3016 0.8730
0.0001 87.0 4785 1.2393 0.8809
0.0 88.0 4840 1.2232 0.8848
0.0003 89.0 4895 1.2187 0.8789
0.0 90.0 4950 1.2328 0.8730
0.0 91.0 5005 1.3026 0.8848
0.0 92.0 5060 1.3152 0.8770
0.0 93.0 5115 1.4069 0.875
0.0 94.0 5170 1.3988 0.8770
0.0 95.0 5225 1.3675 0.8594
0.0 96.0 5280 1.3366 0.8770
0.0003 97.0 5335 1.2140 0.8848
0.0 98.0 5390 1.3585 0.8711
0.0 99.0 5445 1.1665 0.8672
0.0 100.0 5500 1.0947 0.8809
0.0099 101.0 5555 1.2993 0.8730
0.0 102.0 5610 1.3578 0.8789
0.0 103.0 5665 1.3596 0.8867
0.0006 104.0 5720 1.3164 0.8848
0.0 105.0 5775 1.4100 0.8770
0.0 106.0 5830 1.3459 0.875
0.0005 107.0 5885 1.3783 0.8809
0.0 108.0 5940 1.2698 0.8770
0.0 109.0 5995 1.3933 0.8848
0.0 110.0 6050 1.3813 0.8809
0.0 111.0 6105 1.5747 0.875
0.0001 112.0 6160 1.3368 0.8867
0.0486 113.0 6215 1.3833 0.8828
0.1476 114.0 6270 1.4943 0.8828
0.0002 115.0 6325 1.4725 0.8789
0.0 116.0 6380 1.4614 0.875
0.0047 117.0 6435 1.6313 0.8770
0.0 118.0 6490 1.4459 0.8848
0.0026 119.0 6545 1.4150 0.8730
0.0 120.0 6600 1.6055 0.8555
0.0001 121.0 6655 1.3710 0.8789
0.3319 122.0 6710 1.3940 0.8867
0.0001 123.0 6765 1.2486 0.875
0.0002 124.0 6820 1.2946 0.8711
0.0 125.0 6875 1.2341 0.8711
0.0 126.0 6930 1.1418 0.8887
0.0 127.0 6985 1.0713 0.8926
0.0001 128.0 7040 1.1391 0.8613
0.1624 129.0 7095 1.2195 0.8789
0.0 130.0 7150 1.1576 0.8770
0.0001 131.0 7205 1.2939 0.8730
0.0 132.0 7260 1.1568 0.8867
0.0 133.0 7315 1.2117 0.8848
0.0 134.0 7370 1.1264 0.8926
0.0 135.0 7425 1.1675 0.8848
0.0 136.0 7480 1.1983 0.8828
0.0 137.0 7535 1.2666 0.8770
0.0001 138.0 7590 1.1287 0.8848
0.0 139.0 7645 1.0505 0.8848
0.0 140.0 7700 1.1770 0.8770
0.0 141.0 7755 1.1749 0.8906
0.0 142.0 7810 1.1311 0.8711
0.0 143.0 7865 1.1114 0.8652
0.0 144.0 7920 1.1419 0.8691
0.0 145.0 7975 1.1666 0.8691
0.0 146.0 8030 1.1712 0.8711
0.0 147.0 8085 1.1831 0.8711
0.0 148.0 8140 1.1799 0.8711
0.0 149.0 8195 1.1876 0.8711
0.0 150.0 8250 1.1884 0.8730
0.0 151.0 8305 1.2389 0.8730
0.0 152.0 8360 1.3622 0.875
0.0 153.0 8415 1.2604 0.8789
0.0 154.0 8470 1.3336 0.875
0.0 155.0 8525 1.3496 0.8809
0.0 156.0 8580 1.3882 0.8555
0.1815 157.0 8635 1.3679 0.8789
0.288 158.0 8690 1.3804 0.8691
0.0 159.0 8745 1.2980 0.8770
0.0 160.0 8800 1.4075 0.8789
0.0 161.0 8855 1.4231 0.8789
0.0 162.0 8910 1.4730 0.875
0.0019 163.0 8965 1.5861 0.8672
0.0 164.0 9020 1.4080 0.8809
0.0005 165.0 9075 1.5852 0.8711
0.0 166.0 9130 1.5370 0.875
0.0 167.0 9185 1.5288 0.875
0.0 168.0 9240 1.5516 0.8711
0.0 169.0 9295 1.5268 0.8730
0.0 170.0 9350 1.5061 0.8672
0.0 171.0 9405 1.4843 0.875
0.0 172.0 9460 1.5478 0.8633
0.0 173.0 9515 1.4753 0.8730
0.0 174.0 9570 1.6709 0.8730
0.0 175.0 9625 1.6663 0.875
0.0 176.0 9680 1.6980 0.8672
0.0 177.0 9735 1.5563 0.8770
0.0 178.0 9790 1.6146 0.875
0.0 179.0 9845 1.5599 0.8770
0.0 180.0 9900 1.5558 0.8789
0.0 181.0 9955 1.8485 0.8633
0.0 182.0 10010 1.7223 0.8789
0.0 183.0 10065 1.7169 0.875
0.0 184.0 10120 1.7125 0.8711
0.0 185.0 10175 1.7065 0.8711
0.0 186.0 10230 1.7748 0.8730
0.0 187.0 10285 1.6861 0.8789
0.0 188.0 10340 1.7325 0.8887
0.0 189.0 10395 1.7658 0.8828
0.0 190.0 10450 1.7649 0.8809
0.0 191.0 10505 1.7555 0.8828
0.0162 192.0 10560 1.8313 0.8691
0.0001 193.0 10615 1.8314 0.8574
0.0 194.0 10670 1.7706 0.8672
0.0 195.0 10725 1.6568 0.8730
0.0 196.0 10780 1.6568 0.8770
0.0 197.0 10835 1.6185 0.8848
0.0 198.0 10890 1.6133 0.8848
0.0 199.0 10945 1.6129 0.8848
0.0 200.0 11000 1.6121 0.8848
0.0 201.0 11055 1.6104 0.8828
0.0 202.0 11110 1.6075 0.8828
0.0 203.0 11165 1.6153 0.8867
0.0 204.0 11220 1.6339 0.8828
0.0 205.0 11275 1.6164 0.8867
0.0 206.0 11330 1.6114 0.8848
0.0 207.0 11385 1.6122 0.8867
0.0 208.0 11440 1.6079 0.8867
0.0 209.0 11495 1.6132 0.8867
0.0 210.0 11550 1.6141 0.8867
0.0 211.0 11605 1.6122 0.8867
0.0 212.0 11660 1.6070 0.8867
0.0 213.0 11715 1.6010 0.8867
0.0 214.0 11770 1.6562 0.8789
0.0005 215.0 11825 1.6297 0.8887
0.0 216.0 11880 1.6070 0.8809
0.0 217.0 11935 1.6750 0.8770
0.0 218.0 11990 1.6822 0.8730
0.0 219.0 12045 1.6819 0.8730
0.0 220.0 12100 1.6846 0.8770
0.0 221.0 12155 1.6827 0.875
0.0 222.0 12210 1.6822 0.875
0.0 223.0 12265 1.6780 0.8770
0.0 224.0 12320 1.6813 0.8770
0.0 225.0 12375 1.6770 0.8770
0.0 226.0 12430 1.6878 0.8789
0.0 227.0 12485 1.8890 0.8672
0.0 228.0 12540 1.6978 0.8828
0.0 229.0 12595 1.6945 0.8867
0.0 230.0 12650 1.6960 0.8848
0.0 231.0 12705 1.6972 0.8867
0.0 232.0 12760 1.6929 0.8867
0.0 233.0 12815 1.6911 0.8848
0.0 234.0 12870 1.6887 0.8867
0.0 235.0 12925 1.6999 0.8848
0.0 236.0 12980 1.7000 0.8848
0.0 237.0 13035 1.6877 0.8867
0.0 238.0 13090 1.6858 0.8867
0.0 239.0 13145 1.6859 0.8867
0.0 240.0 13200 1.6842 0.8867
0.0 241.0 13255 1.6829 0.8867
0.0 242.0 13310 1.6800 0.8867
0.0 243.0 13365 1.6870 0.8848
0.0 244.0 13420 1.6856 0.8848
0.0 245.0 13475 1.6831 0.8848
0.0 246.0 13530 1.6864 0.8828
0.0 247.0 13585 1.6896 0.8828
0.0 248.0 13640 1.6900 0.8828
0.0 249.0 13695 1.6906 0.8848
0.0 250.0 13750 1.6928 0.8828
0.0 251.0 13805 1.6943 0.8828
0.0 252.0 13860 1.6902 0.8789
0.0 253.0 13915 1.6638 0.8887
0.0 254.0 13970 1.6632 0.8867
0.0 255.0 14025 1.6627 0.8867
0.0 256.0 14080 1.6631 0.8867
0.0 257.0 14135 1.6626 0.8867
0.0 258.0 14190 1.6629 0.8867
0.0 259.0 14245 1.6617 0.8867
0.0 260.0 14300 1.6606 0.8867
0.0 261.0 14355 1.6598 0.8867
0.0 262.0 14410 1.6559 0.8867
0.0 263.0 14465 1.6564 0.8867
0.0 264.0 14520 1.6555 0.8867
0.0 265.0 14575 1.6588 0.8867
0.0 266.0 14630 1.6565 0.8867
0.0 267.0 14685 1.6558 0.8867
0.0 268.0 14740 1.6564 0.8848
0.0 269.0 14795 1.6578 0.8848
0.0 270.0 14850 1.6566 0.8848
0.0 271.0 14905 1.6560 0.8867
0.0 272.0 14960 1.6587 0.8848
0.0 273.0 15015 1.6575 0.8867
0.0 274.0 15070 1.6575 0.8848
0.0 275.0 15125 1.6570 0.8867
0.0 276.0 15180 1.6586 0.8848
0.0 277.0 15235 1.6572 0.8887
0.0 278.0 15290 1.6577 0.8848
0.0 279.0 15345 1.6570 0.8867
0.0 280.0 15400 1.6567 0.8887
0.0 281.0 15455 1.6548 0.8887
0.0 282.0 15510 1.6558 0.8867
0.0 283.0 15565 1.6505 0.8887
0.0 284.0 15620 1.6515 0.8887
0.0 285.0 15675 1.6513 0.8887
0.0 286.0 15730 1.6456 0.8887
0.0 287.0 15785 1.6471 0.8887
0.0 288.0 15840 1.6451 0.8887
0.0 289.0 15895 1.6468 0.8887
0.0 290.0 15950 1.6470 0.8887
0.0 291.0 16005 1.6448 0.8887
0.0 292.0 16060 1.6478 0.8887
0.0 293.0 16115 1.6475 0.8887
0.0 294.0 16170 1.6471 0.8887
0.0 295.0 16225 1.6476 0.8887
0.0 296.0 16280 1.6475 0.8887
0.0 297.0 16335 1.6460 0.8887
0.0 298.0 16390 1.6471 0.8887
0.0 299.0 16445 1.6469 0.8887
0.0 300.0 16500 1.6470 0.8887

Framework versions

  • PEFT 0.10.0
  • Transformers 4.39.3
  • PyTorch 2.2.1
  • Datasets 2.16.1
  • Tokenizers 0.15.2