
resnet101_rvl-cdip-cnn_rvl_cdip-NK1000_hint_rand

This model is a fine-tuned version of bdpc/resnet101_rvl-cdip on an unspecified dataset (the model name suggests an RVL-CDIP NK1000 subset). It achieves the following results on the evaluation set:

  • Loss: 19.9624
  • Accuracy: 0.765
  • Brier Loss: 0.3910
  • NLL: 2.2998
  • F1 Micro: 0.765
  • F1 Macro: 0.7641
  • ECE: 0.1669
  • AURC: 0.0775
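The calibration metrics above (Brier loss, NLL, ECE, AURC) are not defined in the card. For reference, here is a minimal NumPy sketch of their usual definitions; the function names and the bin count for ECE are illustrative assumptions, not taken from the actual evaluation code:

```python
import numpy as np

def brier_loss(probs, labels):
    # Mean squared error between predicted probabilities and one-hot labels.
    onehot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - onehot) ** 2, axis=1))

def nll(probs, labels, eps=1e-12):
    # Average negative log-likelihood of the true class.
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + eps))

def ece(probs, labels, n_bins=15):
    # Expected calibration error: confidence-vs-accuracy gap, averaged
    # over equal-width confidence bins weighted by bin size.
    conf = probs.max(axis=1)
    correct = (probs.argmax(axis=1) == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    err = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            err += mask.mean() * abs(correct[mask].mean() - conf[mask].mean())
    return err

def aurc(probs, labels):
    # Area under the risk-coverage curve: average error rate as samples
    # are admitted in order of decreasing confidence.
    conf = probs.max(axis=1)
    errors = (probs.argmax(axis=1) != labels).astype(float)
    order = np.argsort(-conf)  # most confident first
    risks = np.cumsum(errors[order]) / np.arange(1, len(labels) + 1)
    return risks.mean()
```

Lower is better for all four; a well-calibrated model has ECE near zero.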

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
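With lr_scheduler_type linear and warmup_ratio 0.1, the learning rate ramps up linearly over the first 10% of training, then decays linearly to zero. A sketch of that shape in plain Python, assuming 12500 total steps (50 epochs × 250 steps per epoch, per the results table); the function name is illustrative:

```python
def linear_warmup_lr(step, base_lr=1e-4, total_steps=12500, warmup_ratio=0.1):
    # Linear warmup to base_lr over the first warmup_ratio of training,
    # then linear decay to zero over the remaining steps.
    warmup_steps = int(total_steps * warmup_ratio)  # 1250 steps here
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```

The peak learning rate of 1e-4 is reached at step 1250 and the schedule hits zero exactly at step 12500.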

Training results

Training Loss Epoch Step Validation Loss Accuracy Brier Loss NLL F1 Micro F1 Macro ECE AURC
No log 1.0 250 26.8546 0.2102 0.8652 3.4225 0.2102 0.1897 0.0659 0.6137
26.8891 2.0 500 25.9803 0.2845 0.8381 3.4360 0.2845 0.2457 0.0762 0.5573
26.8891 3.0 750 25.8252 0.337 0.8092 3.6483 0.337 0.3168 0.1091 0.4988
25.1656 4.0 1000 24.9957 0.4088 0.7588 3.0528 0.4088 0.3859 0.1283 0.4163
25.1656 5.0 1250 24.2209 0.5517 0.5964 2.6661 0.5517 0.5510 0.0726 0.2489
23.8526 6.0 1500 23.4086 0.5915 0.5431 2.5360 0.5915 0.5840 0.0549 0.1994
23.8526 7.0 1750 23.0800 0.625 0.5049 2.4839 0.625 0.6268 0.0608 0.1668
22.7511 8.0 2000 22.9512 0.6573 0.4660 2.4202 0.6573 0.6564 0.0565 0.1407
22.7511 9.0 2250 22.7991 0.6783 0.4509 2.4137 0.6783 0.6749 0.0634 0.1315
21.9881 10.0 2500 22.8533 0.6352 0.5056 2.6549 0.6352 0.6231 0.1085 0.1554
21.9881 11.0 2750 22.7499 0.669 0.4673 2.5291 0.669 0.6642 0.1053 0.1347
21.391 12.0 3000 22.6520 0.6757 0.4767 2.5038 0.6757 0.6745 0.1204 0.1355
21.391 13.0 3250 22.4767 0.6737 0.4850 2.6030 0.6737 0.6718 0.1385 0.1380
20.9347 14.0 3500 22.3023 0.6767 0.4832 2.5438 0.6767 0.6770 0.1594 0.1301
20.9347 15.0 3750 22.1482 0.693 0.4666 2.5622 0.693 0.6913 0.1581 0.1209
20.5776 16.0 4000 22.1655 0.6943 0.4849 2.5685 0.6943 0.6994 0.1766 0.1288
20.5776 17.0 4250 22.0213 0.686 0.4922 2.6576 0.686 0.6925 0.1749 0.1250
20.2836 18.0 4500 21.5434 0.7023 0.4560 2.5508 0.7023 0.7018 0.1720 0.1146
20.2836 19.0 4750 21.7105 0.715 0.4501 2.5953 0.715 0.7128 0.1738 0.1083
20.0339 20.0 5000 21.6301 0.7057 0.4645 2.6131 0.7057 0.7033 0.1701 0.1153
20.0339 21.0 5250 21.9130 0.7007 0.4825 2.7114 0.7007 0.6989 0.1917 0.1231
19.8026 22.0 5500 21.7975 0.713 0.4702 2.7340 0.713 0.7117 0.1879 0.1148
19.8026 23.0 5750 21.5577 0.7173 0.4621 2.7138 0.7173 0.7100 0.1931 0.1072
19.6001 24.0 6000 21.2486 0.722 0.4491 2.5651 0.722 0.7214 0.1853 0.1045
19.6001 25.0 6250 21.0363 0.7348 0.4344 2.4688 0.7348 0.7364 0.1780 0.0974
19.4158 26.0 6500 21.3527 0.728 0.4495 2.7492 0.728 0.7219 0.1864 0.1005
19.4158 27.0 6750 20.8258 0.7355 0.4339 2.4375 0.7355 0.7352 0.1838 0.0943
19.2585 28.0 7000 21.0491 0.729 0.4465 2.6324 0.729 0.7245 0.1953 0.1010
19.2585 29.0 7250 20.7774 0.7425 0.4283 2.4694 0.7425 0.7410 0.1799 0.0908
19.1051 30.0 7500 20.6908 0.741 0.4311 2.4924 0.7410 0.7405 0.1890 0.0888
19.1051 31.0 7750 20.8242 0.743 0.4264 2.5098 0.743 0.7407 0.1826 0.0903
18.9722 32.0 8000 20.6257 0.7435 0.4288 2.4740 0.7435 0.7432 0.1865 0.0872
18.9722 33.0 8250 20.6265 0.745 0.4289 2.4552 0.745 0.7435 0.1862 0.0929
18.854 34.0 8500 20.4251 0.7505 0.4124 2.4631 0.7505 0.7513 0.1790 0.0845
18.854 35.0 8750 20.4164 0.741 0.4278 2.3888 0.7410 0.7402 0.1859 0.0889
18.7477 36.0 9000 20.3432 0.751 0.4184 2.4020 0.751 0.7485 0.1800 0.0850
18.7477 37.0 9250 20.4310 0.7555 0.4154 2.4639 0.7555 0.7528 0.1759 0.0842
18.6548 38.0 9500 20.1987 0.7542 0.4111 2.2921 0.7542 0.7542 0.1792 0.0815
18.6548 39.0 9750 20.2326 0.7562 0.4017 2.3536 0.7562 0.7537 0.1767 0.0829
18.5776 40.0 10000 20.1571 0.7575 0.3985 2.3405 0.7575 0.7568 0.1703 0.0811
18.5776 41.0 10250 20.1580 0.7625 0.3962 2.3855 0.7625 0.7621 0.1713 0.0814
18.5133 42.0 10500 20.0952 0.7572 0.4038 2.3600 0.7572 0.7563 0.1768 0.0794
18.5133 43.0 10750 20.1483 0.7575 0.4008 2.3713 0.7575 0.7564 0.1755 0.0820
18.4613 44.0 11000 20.0749 0.762 0.3992 2.3372 0.762 0.7618 0.1720 0.0795
18.4613 45.0 11250 20.0664 0.7578 0.4035 2.3570 0.7577 0.7566 0.1769 0.0795
18.4218 46.0 11500 19.9611 0.7622 0.3946 2.3399 0.7622 0.7617 0.1674 0.0784
18.4218 47.0 11750 19.9678 0.7632 0.3907 2.3011 0.7632 0.7635 0.1692 0.0772
18.3945 48.0 12000 19.9950 0.763 0.3910 2.2773 0.763 0.7616 0.1695 0.0775
18.3945 49.0 12250 20.0013 0.7625 0.3911 2.2875 0.7625 0.7618 0.1705 0.0777
18.3792 50.0 12500 19.9624 0.765 0.3910 2.2998 0.765 0.7641 0.1669 0.0775

Framework versions

  • Transformers 4.26.1
  • Pytorch 1.13.1.post200
  • Datasets 2.9.0
  • Tokenizers 0.13.2