SignBart-ASL-2730

This model is a fine-tuned version of an unspecified base model (the base model is not named in the card), trained on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0663
  • Accuracy: 0.7522
  • Precision: 0.7748
  • Recall: 0.7522
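
The card does not state how precision and recall are averaged across classes, but Recall equals Accuracy in every row of the results below, which is consistent with weighted averaging (weighted-average recall reduces to plain accuracy). A minimal, dependency-free sketch of that relationship, using made-up toy labels:

```python
from collections import Counter

def weighted_precision_recall(y_true, y_pred):
    """Accuracy plus support-weighted precision and recall over classes.

    With weighted averaging, recall equals accuracy, matching the
    identical Accuracy and Recall columns in the results table.
    """
    n = len(y_true)
    support = Counter(y_true)
    prec = rec = 0.0
    for c in support:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        pred_c = sum(1 for p in y_pred if p == c)
        prec += (support[c] / n) * (tp / pred_c if pred_c else 0.0)
        rec += (support[c] / n) * (tp / support[c])
    accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / n
    return accuracy, prec, rec

# Toy example (not from the model's data): recall comes out equal to accuracy.
acc, p, r = weighted_precision_recall([0, 0, 1, 1, 2], [0, 1, 1, 1, 2])
```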

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0002
  • train_batch_size: 256
  • eval_batch_size: 256
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • num_epochs: 1000
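
The cosine schedule above decays the learning rate from 0.0002 toward zero over the scheduled horizon. A minimal sketch of that decay, assuming no warmup (the card lists none) and 316 optimizer steps per epoch, as implied by the Step column of the results table (316 at epoch 1, 632 at epoch 2):

```python
import math

LEARNING_RATE = 2e-4    # from the card
NUM_EPOCHS = 1000       # scheduled horizon; training logs stop at epoch 87
STEPS_PER_EPOCH = 316   # inferred from the Step column below

def cosine_lr(step, total_steps, base_lr=LEARNING_RATE):
    """Cosine decay from base_lr to 0 over total_steps, no warmup assumed."""
    progress = min(step / total_steps, 1.0)
    return 0.5 * base_lr * (1.0 + math.cos(math.pi * progress))

total = NUM_EPOCHS * STEPS_PER_EPOCH
# Because the schedule spans 1000 epochs but logs end at epoch 87,
# training stopped while the learning rate was still near its peak.
lr_at_epoch_87 = cosine_lr(87 * STEPS_PER_EPOCH, total)
```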

Training results

Training Loss Epoch Step Validation Loss Accuracy Precision Recall
7.9504 1.0 316 7.5910 0.0015 0.0000 0.0015
7.3419 2.0 632 6.8014 0.0120 0.0012 0.0120
6.7466 3.0 948 6.2123 0.0324 0.0085 0.0324
6.2532 4.0 1264 5.7091 0.0574 0.0246 0.0574
5.8427 5.0 1580 5.2449 0.1003 0.0630 0.1003
5.4563 6.0 1896 4.8599 0.1334 0.0990 0.1334
5.1075 7.0 2212 4.4728 0.1887 0.1664 0.1887
4.8626 8.0 2528 4.1315 0.2347 0.2114 0.2347
4.5077 9.0 2844 3.8084 0.2820 0.2686 0.2820
4.3056 10.0 3160 3.5184 0.3342 0.3262 0.3342
4.0527 11.0 3476 3.2573 0.3712 0.3816 0.3712
3.9347 12.0 3792 3.0366 0.4050 0.4086 0.4050
3.6534 13.0 4108 2.7821 0.4473 0.4689 0.4473
3.4758 14.0 4424 2.6075 0.4692 0.4862 0.4692
3.3033 15.0 4740 2.4175 0.5104 0.5303 0.5104
3.1366 16.0 5056 2.2547 0.5277 0.5506 0.5277
2.9912 17.0 5372 2.1465 0.5388 0.5582 0.5388
2.9211 18.0 5688 2.0410 0.5608 0.5852 0.5608
2.8221 19.0 6004 1.9460 0.5741 0.5967 0.5741
2.728 20.0 6320 1.8412 0.5935 0.6158 0.5935
2.6948 21.0 6636 1.7873 0.6009 0.6276 0.6009
2.4883 22.0 6952 1.7102 0.6154 0.6438 0.6154
2.517 23.0 7268 1.6517 0.6253 0.6496 0.6253
2.4333 24.0 7584 1.6176 0.6298 0.6548 0.6298
2.353 25.0 7900 1.5679 0.6401 0.6688 0.6401
2.2862 26.0 8216 1.5282 0.6463 0.6745 0.6463
2.3052 27.0 8532 1.4695 0.6574 0.6824 0.6574
2.1156 28.0 8848 1.4635 0.6595 0.6846 0.6595
2.126 29.0 9164 1.4364 0.6598 0.6859 0.6598
2.1957 30.0 9480 1.4060 0.6706 0.6948 0.6706
2.0033 31.0 9796 1.3580 0.6820 0.7072 0.6820
2.0278 32.0 10112 1.3474 0.6809 0.7038 0.6809
2.0128 33.0 10428 1.3391 0.6837 0.7113 0.6837
2.0176 34.0 10744 1.3162 0.6871 0.7125 0.6871
1.9308 35.0 11060 1.2937 0.6901 0.7150 0.6901
2.0256 36.0 11376 1.2763 0.6960 0.7251 0.6960
1.8052 37.0 11692 1.2545 0.7003 0.7252 0.7003
1.8055 38.0 12008 1.2543 0.6990 0.7256 0.6990
1.8889 39.0 12324 1.2371 0.7026 0.7292 0.7026
1.8205 40.0 12640 1.2237 0.7053 0.7289 0.7053
1.7679 41.0 12956 1.1919 0.7129 0.7368 0.7129
1.7532 42.0 13272 1.1966 0.7101 0.7334 0.7101
1.7701 43.0 13588 1.1882 0.7133 0.7379 0.7133
1.7272 44.0 13904 1.1786 0.7179 0.7425 0.7179
1.6642 45.0 14220 1.1660 0.7199 0.7445 0.7199
1.6463 46.0 14536 1.1609 0.7197 0.7431 0.7197
1.6629 47.0 14852 1.1418 0.7228 0.7481 0.7228
1.6612 48.0 15168 1.1496 0.7228 0.7462 0.7228
1.6032 49.0 15484 1.1317 0.7280 0.7503 0.7280
1.5918 50.0 15800 1.1229 0.7315 0.7571 0.7315
1.5362 51.0 16116 1.1341 0.7269 0.7510 0.7269
1.6448 52.0 16432 1.1116 0.7335 0.7572 0.7335
1.4685 53.0 16748 1.1258 0.7312 0.7549 0.7312
1.6576 54.0 17064 1.1095 0.7327 0.7572 0.7327
1.6566 55.0 17380 1.1104 0.7330 0.7553 0.7330
1.4888 56.0 17696 1.1040 0.7331 0.7535 0.7331
1.4835 57.0 18012 1.0956 0.7387 0.7606 0.7387
1.5224 58.0 18328 1.0980 0.7407 0.7634 0.7407
1.5324 59.0 18644 1.1112 0.7336 0.7577 0.7336
1.5162 60.0 18960 1.0971 0.7354 0.7619 0.7354
1.4942 61.0 19276 1.0899 0.7436 0.7672 0.7436
1.4682 62.0 19592 1.0822 0.7429 0.7671 0.7429
1.5173 63.0 19908 1.0962 0.7403 0.7626 0.7403
1.4293 64.0 20224 1.0856 0.7412 0.7651 0.7412
1.4511 65.0 20540 1.0834 0.7422 0.7671 0.7422
1.5094 66.0 20856 1.0783 0.7457 0.7697 0.7457
1.3379 67.0 21172 1.0804 0.7469 0.7713 0.7469
1.375 68.0 21488 1.0876 0.7438 0.7680 0.7438
1.3573 69.0 21804 1.0713 0.7474 0.7733 0.7474
1.2803 70.0 22120 1.0862 0.7429 0.7681 0.7429
1.331 71.0 22436 1.0800 0.7466 0.7722 0.7466
1.3896 72.0 22752 1.0616 0.7491 0.7732 0.7491
1.3718 73.0 23068 1.0678 0.7513 0.7762 0.7513
1.3099 74.0 23384 1.0672 0.7541 0.7770 0.7541
1.2683 75.0 23700 1.0631 0.7502 0.7743 0.7502
1.2928 76.0 24016 1.0622 0.7505 0.7739 0.7505
1.3567 77.0 24332 1.0507 0.7531 0.7754 0.7531
1.3422 78.0 24648 1.0574 0.7498 0.7728 0.7498
1.2405 79.0 24964 1.0649 0.7544 0.7786 0.7544
1.281 80.0 25280 1.0635 0.7525 0.7753 0.7525
1.3208 81.0 25596 1.0596 0.7540 0.7775 0.7540
1.3675 82.0 25912 1.0506 0.7545 0.7776 0.7545
1.3179 83.0 26228 1.0621 0.7524 0.7755 0.7524
1.2691 84.0 26544 1.0617 0.7535 0.7752 0.7535
1.3058 85.0 26860 1.0624 0.7548 0.7775 0.7548
1.2305 86.0 27176 1.0663 0.7546 0.7766 0.7546
1.2468 87.0 27492 1.0663 0.7522 0.7748 0.7522
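
Note that the final epoch (loss 1.0663, accuracy 0.7522) is not the best checkpoint: validation loss bottoms out at 1.0506 around epoch 82. A small sketch of selecting the best epoch from the logged history, using the last rows of the table above:

```python
# (epoch, validation loss) pairs copied from the final rows of the table.
history = [
    (77, 1.0507), (78, 1.0574), (79, 1.0649), (80, 1.0635),
    (81, 1.0596), (82, 1.0506), (83, 1.0621), (84, 1.0617),
    (85, 1.0624), (86, 1.0663), (87, 1.0663),
]

# Pick the epoch with the lowest validation loss.
best_epoch, best_loss = min(history, key=lambda pair: pair[1])
```

If checkpoints were saved per epoch during this run, the epoch-82 checkpoint would be the one to keep.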

Framework versions

  • Transformers 4.45.1
  • Pytorch 2.4.0
  • Datasets 3.0.1
  • Tokenizers 0.20.0
Model details

  • Model size: 4.55M parameters
  • Tensor type: F32
  • Format: Safetensors