# math_question_grade_detection_v12-15-24
This model is a fine-tuned version of [allenai/scibert_scivocab_uncased](https://huggingface.co/allenai/scibert_scivocab_uncased) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.8318
- Accuracy: 0.7810
- Precision: 0.7846
- Recall: 0.7810
- F1: 0.7777
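For quick experimentation, the model can be loaded through the standard `transformers` sequence-classification API. The sketch below is illustrative rather than an official usage snippet: the Hub repo id `nzm97/math_question_grade_detection_v12-15-24` is taken from this card, while the example question and the assumption that `id2label` maps class ids to grade levels are assumptions.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Repo id as published on the Hub; labels are assumed to be grade levels.
model_id = "nzm97/math_question_grade_detection_v12-15-24"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

question = "Solve for x: 2x + 3 = 11."  # illustrative input
inputs = tokenizer(question, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

pred_id = logits.argmax(dim=-1).item()
# id2label comes from the model config; assumed to name a grade level.
print(model.config.id2label[pred_id])
```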
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a hedged `Trainer` reconstruction follows this list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- training_steps: 4000
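The training script itself is not published. The following is a hedged reconstruction of how the hyperparameters above map onto `transformers.TrainingArguments` and `Trainer` in Transformers 4.46. The dataset and the number of labels are placeholders, since the training data is not documented, and the 50-step `eval_steps` and 500-step `logging_steps` are inferred from the results table below.

```python
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Placeholder: the card does not document the label set; 13 stands in for K-12.
NUM_GRADE_LEVELS = 13

tokenizer = AutoTokenizer.from_pretrained("allenai/scibert_scivocab_uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "allenai/scibert_scivocab_uncased", num_labels=NUM_GRADE_LEVELS
)

# Tiny illustrative stand-in for the (undocumented) real train/eval splits.
raw = Dataset.from_dict({
    "text": ["What is 2 + 2?", "Differentiate f(x) = x^2."],
    "label": [0, 11],
})
ds = raw.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True,
)

args = TrainingArguments(
    output_dir="math_question_grade_detection_v12-15-24",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",         # AdamW with default betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    max_steps=4000,              # "training_steps" above
    eval_strategy="steps",
    eval_steps=50,               # evaluation cadence seen in the results table
    logging_steps=500,           # training-loss cadence seen in the results table
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=ds,            # placeholder: real dataset not documented
    eval_dataset=ds,             # placeholder: real dataset not documented
    processing_class=tokenizer,
)
trainer.train()
```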
### Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
---|---|---|---|---|---|---|---|
No log | 0.0855 | 50 | 3.1015 | 0.1873 | 0.1326 | 0.1873 | 0.1319 |
No log | 0.1709 | 100 | 2.7556 | 0.2498 | 0.3005 | 0.2498 | 0.1722 |
No log | 0.2564 | 150 | 2.4127 | 0.3689 | 0.3262 | 0.3689 | 0.2958 |
No log | 0.3419 | 200 | 2.1657 | 0.4102 | 0.3757 | 0.4102 | 0.3555 |
No log | 0.4274 | 250 | 1.9785 | 0.4659 | 0.3911 | 0.4659 | 0.4013 |
No log | 0.5128 | 300 | 1.8232 | 0.5014 | 0.4838 | 0.5014 | 0.4474 |
No log | 0.5983 | 350 | 1.7162 | 0.5139 | 0.4598 | 0.5139 | 0.4605 |
No log | 0.6838 | 400 | 1.6189 | 0.5274 | 0.5116 | 0.5274 | 0.4710 |
No log | 0.7692 | 450 | 1.5503 | 0.5408 | 0.5207 | 0.5408 | 0.4957 |
2.1698 | 0.8547 | 500 | 1.5007 | 0.5552 | 0.5385 | 0.5552 | 0.5075 |
2.1698 | 0.9402 | 550 | 1.4402 | 0.5514 | 0.5148 | 0.5514 | 0.5036 |
2.1698 | 1.0256 | 600 | 1.4091 | 0.5812 | 0.5586 | 0.5812 | 0.5472 |
2.1698 | 1.1111 | 650 | 1.3141 | 0.6042 | 0.5869 | 0.6042 | 0.5613 |
2.1698 | 1.1966 | 700 | 1.2728 | 0.6167 | 0.6115 | 0.6167 | 0.5899 |
2.1698 | 1.2821 | 750 | 1.2387 | 0.6119 | 0.5836 | 0.6119 | 0.5788 |
2.1698 | 1.3675 | 800 | 1.1677 | 0.6436 | 0.6098 | 0.6436 | 0.6134 |
2.1698 | 1.4530 | 850 | 1.1942 | 0.6378 | 0.6137 | 0.6378 | 0.6068 |
2.1698 | 1.5385 | 900 | 1.1496 | 0.6551 | 0.6176 | 0.6551 | 0.6252 |
2.1698 | 1.6239 | 950 | 1.0941 | 0.6513 | 0.6178 | 0.6513 | 0.6200 |
1.1604 | 1.7094 | 1000 | 1.0942 | 0.6561 | 0.6194 | 0.6561 | 0.6241 |
1.1604 | 1.7949 | 1050 | 1.0372 | 0.6734 | 0.6388 | 0.6734 | 0.6466 |
1.1604 | 1.8803 | 1100 | 1.0042 | 0.6926 | 0.6532 | 0.6926 | 0.6637 |
1.1604 | 1.9658 | 1150 | 1.0170 | 0.6763 | 0.6470 | 0.6763 | 0.6508 |
1.1604 | 2.0513 | 1200 | 1.0328 | 0.6801 | 0.6412 | 0.6801 | 0.6507 |
1.1604 | 2.1368 | 1250 | 0.9747 | 0.6849 | 0.6691 | 0.6849 | 0.6608 |
1.1604 | 2.2222 | 1300 | 0.9740 | 0.6840 | 0.6913 | 0.6840 | 0.6654 |
1.1604 | 2.3077 | 1350 | 0.9806 | 0.6984 | 0.7010 | 0.6984 | 0.6793 |
1.1604 | 2.3932 | 1400 | 0.9327 | 0.7157 | 0.7259 | 0.7157 | 0.6943 |
1.1604 | 2.4786 | 1450 | 0.9040 | 0.7109 | 0.7128 | 0.7109 | 0.6961 |
0.8113 | 2.5641 | 1500 | 0.9086 | 0.7243 | 0.7340 | 0.7243 | 0.7086 |
0.8113 | 2.6496 | 1550 | 0.9190 | 0.7099 | 0.7230 | 0.7099 | 0.6925 |
0.8113 | 2.7350 | 1600 | 0.9221 | 0.7099 | 0.7067 | 0.7099 | 0.6913 |
0.8113 | 2.8205 | 1650 | 0.9628 | 0.7070 | 0.7256 | 0.7070 | 0.6914 |
0.8113 | 2.9060 | 1700 | 0.9023 | 0.7166 | 0.7246 | 0.7166 | 0.6991 |
0.8113 | 2.9915 | 1750 | 0.9015 | 0.7243 | 0.7205 | 0.7243 | 0.7083 |
0.8113 | 3.0769 | 1800 | 0.8744 | 0.7368 | 0.7546 | 0.7368 | 0.7257 |
0.8113 | 3.1624 | 1850 | 0.8957 | 0.7301 | 0.7385 | 0.7301 | 0.7158 |
0.8113 | 3.2479 | 1900 | 0.8821 | 0.7176 | 0.7236 | 0.7176 | 0.7040 |
0.8113 | 3.3333 | 1950 | 0.8755 | 0.7291 | 0.7392 | 0.7291 | 0.7153 |
0.5683 | 3.4188 | 2000 | 0.8881 | 0.7358 | 0.7385 | 0.7358 | 0.7233 |
0.5683 | 3.5043 | 2050 | 0.8840 | 0.7291 | 0.7427 | 0.7291 | 0.7210 |
0.5683 | 3.5897 | 2100 | 0.8520 | 0.7281 | 0.7382 | 0.7281 | 0.7172 |
0.5683 | 3.6752 | 2150 | 0.8350 | 0.7349 | 0.7422 | 0.7349 | 0.7237 |
0.5683 | 3.7607 | 2200 | 0.8230 | 0.7435 | 0.7394 | 0.7435 | 0.7289 |
0.5683 | 3.8462 | 2250 | 0.8454 | 0.7474 | 0.7513 | 0.7474 | 0.7384 |
0.5683 | 3.9316 | 2300 | 0.8278 | 0.7541 | 0.7573 | 0.7541 | 0.7454 |
0.5683 | 4.0171 | 2350 | 0.8252 | 0.7589 | 0.7618 | 0.7589 | 0.7520 |
0.5683 | 4.1026 | 2400 | 0.8250 | 0.7608 | 0.7584 | 0.7608 | 0.7525 |
0.5683 | 4.1880 | 2450 | 0.8608 | 0.7493 | 0.7549 | 0.7493 | 0.7425 |
0.3936 | 4.2735 | 2500 | 0.8701 | 0.7435 | 0.7434 | 0.7435 | 0.7322 |
0.3936 | 4.3590 | 2550 | 0.8313 | 0.7541 | 0.7591 | 0.7541 | 0.7476 |
0.3936 | 4.4444 | 2600 | 0.8121 | 0.7627 | 0.7677 | 0.7627 | 0.7572 |
0.3936 | 4.5299 | 2650 | 0.8364 | 0.7627 | 0.7641 | 0.7627 | 0.7579 |
0.3936 | 4.6154 | 2700 | 0.8454 | 0.7656 | 0.7737 | 0.7656 | 0.7622 |
0.3936 | 4.7009 | 2750 | 0.8403 | 0.7646 | 0.7716 | 0.7646 | 0.7591 |
0.3936 | 4.7863 | 2800 | 0.8026 | 0.7714 | 0.7810 | 0.7714 | 0.7683 |
0.3936 | 4.8718 | 2850 | 0.8119 | 0.7819 | 0.7866 | 0.7819 | 0.7765 |
0.3936 | 4.9573 | 2900 | 0.7979 | 0.7723 | 0.7771 | 0.7723 | 0.7693 |
0.3936 | 5.0427 | 2950 | 0.8000 | 0.7656 | 0.7696 | 0.7656 | 0.7594 |
0.277 | 5.1282 | 3000 | 0.8160 | 0.7656 | 0.7754 | 0.7656 | 0.7610 |
0.277 | 5.2137 | 3050 | 0.8320 | 0.7589 | 0.7620 | 0.7589 | 0.7535 |
0.277 | 5.2991 | 3100 | 0.8460 | 0.7608 | 0.7665 | 0.7608 | 0.7559 |
0.277 | 5.3846 | 3150 | 0.8424 | 0.7608 | 0.7692 | 0.7608 | 0.7570 |
0.277 | 5.4701 | 3200 | 0.8251 | 0.7637 | 0.7698 | 0.7637 | 0.7588 |
0.277 | 5.5556 | 3250 | 0.8343 | 0.7743 | 0.7778 | 0.7743 | 0.7694 |
0.277 | 5.6410 | 3300 | 0.8514 | 0.7714 | 0.7776 | 0.7714 | 0.7674 |
0.277 | 5.7265 | 3350 | 0.8347 | 0.7695 | 0.7784 | 0.7695 | 0.7661 |
0.277 | 5.8120 | 3400 | 0.8309 | 0.7685 | 0.7716 | 0.7685 | 0.7651 |
0.277 | 5.8974 | 3450 | 0.8116 | 0.7704 | 0.7789 | 0.7704 | 0.7682 |
0.1904 | 5.9829 | 3500 | 0.8127 | 0.7819 | 0.7838 | 0.7819 | 0.7779 |
0.1904 | 6.0684 | 3550 | 0.8193 | 0.7781 | 0.7780 | 0.7781 | 0.7732 |
0.1904 | 6.1538 | 3600 | 0.8268 | 0.7819 | 0.7872 | 0.7819 | 0.7787 |
0.1904 | 6.2393 | 3650 | 0.8217 | 0.7829 | 0.7863 | 0.7829 | 0.7793 |
0.1904 | 6.3248 | 3700 | 0.8345 | 0.7781 | 0.7816 | 0.7781 | 0.7741 |
0.1904 | 6.4103 | 3750 | 0.8293 | 0.7839 | 0.7895 | 0.7839 | 0.7808 |
0.1904 | 6.4957 | 3800 | 0.8248 | 0.7800 | 0.7838 | 0.7800 | 0.7769 |
0.1904 | 6.5812 | 3850 | 0.8255 | 0.7762 | 0.7820 | 0.7762 | 0.7732 |
0.1904 | 6.6667 | 3900 | 0.8292 | 0.7791 | 0.7843 | 0.7791 | 0.7763 |
0.1904 | 6.7521 | 3950 | 0.8304 | 0.7819 | 0.7864 | 0.7819 | 0.7788 |
0.1479 | 6.8376 | 4000 | 0.8318 | 0.7810 | 0.7846 | 0.7810 | 0.7777 |
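The metric code is not included in the card, but one detail constrains it: Recall equals Accuracy at every checkpoint above, which is exactly what weighted-average recall reduces to (summing per-class recall weighted by class frequency gives total correct over total examples). A plausible `compute_metrics` reconstruction, assuming scikit-learn and weighted averaging, is sketched below.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Assumed metric function: weighted-average precision/recall/F1.

    With average="weighted", recall equals plain accuracy, matching the
    identical Accuracy and Recall columns in the table above.
    """
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```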
### Framework versions

- Transformers 4.46.3
- PyTorch 2.4.0
- Datasets 3.1.0
- Tokenizers 0.20.3