# fine-tuned-DatasetQAS-IDK-MRC-with-indobert-large-p2-with-ITTL-without-freeze-LR-1e-05

This model is a fine-tuned version of [indobenchmark/indobert-large-p2](https://huggingface.co/indobenchmark/indobert-large-p2) on the IDK-MRC dataset (per the model name; the auto-generated card did not record the dataset). It achieves the following results on the evaluation set:
- Loss: 1.4144
- Exact Match: 54.9738
- F1: 61.7773
- Precision: 63.1273
- Recall: 66.0715
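The card does not include the evaluation script, but Exact Match and token-level F1/precision/recall for extractive QA are conventionally computed SQuAD-style, by comparing the predicted answer span against the gold answer. A minimal sketch (the actual script may also strip punctuation and articles before comparing):

```python
from collections import Counter

def exact_match(prediction: str, truth: str) -> float:
    """1.0 only when the normalized answer strings are identical."""
    return float(prediction.strip().lower() == truth.strip().lower())

def token_f1(prediction: str, truth: str):
    """Token-overlap precision, recall, and F1 (SQuAD-style)."""
    pred_tokens = prediction.strip().lower().split()
    true_tokens = truth.strip().lower().split()
    # Multiset intersection counts each shared token at most min(count) times
    common = Counter(pred_tokens) & Counter(true_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0, 0.0, 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(true_tokens)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1
```

The reported numbers would then be these per-example scores averaged over the evaluation set and scaled to percentages.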
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 16
### Training results
| Training Loss | Epoch | Step | Validation Loss | Exact Match | F1 | Precision | Recall |
|---|---|---|---|---|---|---|---|
| 3.4922 | 0.49 | 73 | 2.1228 | 17.8010 | 26.7821 | 24.6611 | 46.2405 |
| 2.3015 | 0.99 | 146 | 1.7236 | 31.9372 | 39.3632 | 39.3151 | 50.4983 |
| 1.5627 | 1.49 | 219 | 1.3562 | 43.9791 | 50.3363 | 50.6305 | 59.6967 |
| 1.3703 | 1.98 | 292 | 1.3352 | 43.0628 | 50.9390 | 51.7207 | 58.2556 |
| 1.0433 | 2.48 | 365 | 1.2210 | 46.7277 | 54.3203 | 55.5971 | 60.2780 |
| 1.0456 | 2.97 | 438 | 1.1553 | 50.3927 | 58.4862 | 59.5577 | 65.4513 |
| 0.8656 | 3.47 | 511 | 1.1815 | 50.3927 | 57.6228 | 58.5436 | 62.8284 |
| 0.8838 | 3.97 | 584 | 1.2030 | 49.0838 | 56.4395 | 57.7457 | 61.5960 |
| 0.6994 | 4.47 | 657 | 1.1820 | 51.9634 | 59.1479 | 59.9674 | 64.7123 |
| 0.7335 | 4.96 | 730 | 1.1825 | 52.6178 | 60.0014 | 61.3988 | 64.7995 |
| 0.596 | 5.46 | 803 | 1.2962 | 52.2251 | 59.6942 | 61.1135 | 63.7633 |
| 0.6165 | 5.95 | 876 | 1.2169 | 53.0105 | 60.3582 | 61.5312 | 65.2088 |
| 0.5917 | 6.45 | 949 | 1.3939 | 53.0105 | 60.1105 | 61.5127 | 64.4837 |
| 0.5275 | 6.95 | 1022 | 1.3169 | 54.8429 | 62.1060 | 63.5898 | 66.5208 |
| 0.5058 | 7.45 | 1095 | 1.3237 | 55.6283 | 62.4607 | 63.7170 | 67.3387 |
| 0.4651 | 7.94 | 1168 | 1.3677 | 53.0105 | 59.7708 | 60.9283 | 64.5730 |
| 0.4616 | 8.44 | 1241 | 1.4120 | 57.4607 | 63.9364 | 65.2036 | 67.4919 |
| 0.4053 | 8.93 | 1314 | 1.3799 | 56.2827 | 62.8043 | 63.9601 | 66.7283 |
| 0.4061 | 9.43 | 1387 | 1.4736 | 55.7592 | 62.3147 | 63.7404 | 66.0129 |
| 0.4037 | 9.93 | 1460 | 1.4144 | 54.9738 | 61.7773 | 63.1273 | 66.0715 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1+cu117
- Datasets 2.2.0
- Tokenizers 0.13.2