# roberta-large-ner-ghtk-cs-6-label-new-data-3090-11Sep-1
This model is a fine-tuned version of [FacebookAI/xlm-roberta-large](https://huggingface.co/FacebookAI/xlm-roberta-large) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.1990
| Label | Precision | Recall | F1 | Support |
|---|---|---|---|---|
| Tk | 0.8228 | 0.5603 | 0.6667 | 116 |
| Gày | 0.7692 | 0.8824 | 0.8219 | 34 |
| Gày trừu tượng | 0.9167 | 0.9242 | 0.9204 | 488 |
| Ã đơn | 0.8413 | 0.8621 | 0.8516 | 203 |
| Đt | 0.9330 | 0.9841 | 0.9579 | 878 |
| Đt trừu tượng | 0.8502 | 0.9013 | 0.8750 | 233 |
- Overall Precision: 0.9016
- Overall Recall: 0.9196
- Overall F1: 0.9105
- Overall Accuracy: 0.9689
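As a quick sanity check, the overall F1 above is the harmonic mean of the overall precision and recall (a minimal sketch; the values are taken from this card):

```python
# Overall F1 is the harmonic mean of overall precision and recall,
# micro-averaged over entity spans. Values copied from the card above.
precision = 0.9016
recall = 0.9196

f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # → 0.9105, matching the reported Overall F1
```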
## Model description
More information needed
## Intended uses & limitations
More information needed
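Since no usage documentation is provided, here is a hypothetical inference sketch. The model id comes from this card; the pipeline settings (`aggregation_strategy="simple"`) are assumptions, not something the author documented:

```python
# Hypothetical usage sketch for this NER model (requires `transformers`).
# The import is deferred into the function so nothing is downloaded at import time.
MODEL_ID = "Kudod/roberta-large-ner-ghtk-cs-6-label-new-data-3090-11Sep-1"

def load_ner():
    from transformers import pipeline
    # aggregation_strategy="simple" merges word pieces into whole entity spans
    # (an assumed, common choice; not documented by the author).
    return pipeline("token-classification", model=MODEL_ID,
                    aggregation_strategy="simple")

# ner = load_ner()
# ner("...")  # returns a list of dicts with 'entity_group', 'word', 'score'
```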
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2.5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
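These settings map onto a standard `transformers` `Trainer` configuration; the sketch below shows the correspondence (this is an assumption about the setup, not the author's actual training script, and the output directory name is a placeholder):

```python
# Hyperparameters from this card, keyed by their transformers.TrainingArguments
# names (sketch only; not the author's actual script).
hyperparams = {
    "learning_rate": 2.5e-5,
    "per_device_train_batch_size": 8,
    "per_device_eval_batch_size": 8,
    "seed": 42,
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-8,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 10,
}

# Hypothetical usage (requires `transformers`; model/datasets not shown here):
# from transformers import TrainingArguments, Trainer
# args = TrainingArguments(output_dir="out", **hyperparams)
# Trainer(model=model, args=args, train_dataset=..., eval_dataset=...).train()
```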
### Training results
Per-label cells give precision / recall / F1; support per label is fixed across epochs (Tk 116, Gày 34, Gày trừu tượng 488, Ã đơn 203, Đt 878, Đt trừu tượng 233).

| Training Loss | Epoch | Step | Validation Loss | Tk | Gày | Gày trừu tượng | Ã đơn | Đt | Đt trừu tượng | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 467 | 0.1481 | 1.0000 / 0.0431 / 0.0826 | 0.5263 / 0.8824 / 0.6593 | 0.8645 / 0.8627 / 0.8636 | 0.7545 / 0.8325 / 0.7916 | 0.8653 / 0.9875 / 0.9223 | 0.8465 / 0.7339 / 0.7862 | 0.8412 | 0.8519 | 0.8465 | 0.9507 |
| 0.2706 | 2.0 | 934 | 0.1772 | 0.4627 / 0.2672 / 0.3388 | 0.5625 / 0.7941 / 0.6585 | 0.8574 / 0.9242 / 0.8895 | 0.8043 / 0.3645 / 0.5017 | 0.8914 / 0.9909 / 0.9385 | 0.6782 / 0.8412 / 0.7510 | 0.8253 | 0.8448 | 0.8349 | 0.9411 |
| 0.1019 | 3.0 | 1401 | 0.1537 | 0.8400 / 0.1810 / 0.2979 | 0.6304 / 0.8529 / 0.7250 | 0.8383 / 0.9139 / 0.8745 | 0.8408 / 0.8325 / 0.8366 | 0.8826 / 0.9761 / 0.9270 | 0.8246 / 0.8069 / 0.8156 | 0.8537 | 0.8760 | 0.8647 | 0.9577 |
| 0.0747 | 4.0 | 1868 | 0.1361 | 0.7059 / 0.3103 / 0.4311 | 0.7561 / 0.9118 / 0.8267 | 0.8976 / 0.9160 / 0.9067 | 0.8907 / 0.8030 / 0.8446 | 0.9031 / 0.9875 / 0.9434 | 0.6450 / 0.9356 / 0.7636 | 0.8508 | 0.9027 | 0.8760 | 0.9610 |
| 0.0481 | 5.0 | 2335 | 0.1297 | 0.7429 / 0.4483 / 0.5591 | 0.7692 / 0.8824 / 0.8219 | 0.8934 / 0.9098 / 0.9015 | 0.9010 / 0.8522 / 0.8759 | 0.8815 / 1.0000 / 0.9370 | 0.8477 / 0.8841 / 0.8655 | 0.8753 | 0.9134 | 0.8940 | 0.9670 |
| 0.0367 | 6.0 | 2802 | 0.1397 | 0.8710 / 0.4655 / 0.6067 | 0.6829 / 0.8235 / 0.7467 | 0.8990 / 0.9119 / 0.9054 | 0.8872 / 0.8522 / 0.8693 | 0.9269 / 0.9818 / 0.9535 | 0.8015 / 0.9185 / 0.8560 | 0.8925 | 0.9098 | 0.9011 | 0.9681 |
| 0.0228 | 7.0 | 3269 | 0.1445 | 0.8372 / 0.6207 / 0.7129 | 0.7805 / 0.9412 / 0.8533 | 0.9071 / 0.9201 / 0.9135 | 0.8271 / 0.8719 / 0.8489 | 0.9315 / 0.9909 / 0.9603 | 0.8708 / 0.8970 / 0.8837 | 0.9000 | 0.9267 | 0.9132 | 0.9692 |
| 0.0162 | 8.0 | 3736 | 0.1854 | 0.9014 / 0.5517 / 0.6845 | 0.7222 / 0.7647 / 0.7429 | 0.9239 / 0.9201 / 0.9220 | 0.8294 / 0.8621 / 0.8454 | 0.9128 / 0.9897 / 0.9497 | 0.8546 / 0.8326 / 0.8435 | 0.8961 | 0.9103 | 0.9032 | 0.9672 |
| 0.0084 | 9.0 | 4203 | 0.1853 | 0.8144 / 0.6810 / 0.7418 | 0.7436 / 0.8529 / 0.7945 | 0.9130 / 0.9242 / 0.9185 | 0.8333 / 0.8621 / 0.8475 | 0.9492 / 0.9795 / 0.9641 | 0.8242 / 0.9056 / 0.8630 | 0.9016 | 0.9247 | 0.9130 | 0.9691 |
| 0.0057 | 10.0 | 4670 | 0.1990 | 0.8228 / 0.5603 / 0.6667 | 0.7692 / 0.8824 / 0.8219 | 0.9167 / 0.9242 / 0.9204 | 0.8413 / 0.8621 / 0.8516 | 0.9330 / 0.9841 / 0.9579 | 0.8502 / 0.9013 / 0.8750 | 0.9016 | 0.9196 | 0.9105 | 0.9689 |
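The overall scores are micro-averages over all entity spans. Assuming seqeval-style evaluation (the usual choice for Trainer-generated NER cards like this one), a predicted span counts as a true positive only on an exact label-and-boundary match; a toy sketch on made-up spans:

```python
# Micro-averaged entity-level precision/recall/F1, seqeval-style (assumption:
# exact label + boundary match; the spans below are illustrative, not real data).
gold = {("Đt", 0, 2), ("Gày", 5, 6), ("Tk", 8, 9)}    # (label, start, end)
pred = {("Đt", 0, 2), ("Gày", 5, 6), ("Tk", 10, 11)}  # last span mislocated

tp = len(gold & pred)            # exact matches only
precision = tp / len(pred)       # 2/3: one predicted span is wrong
recall = tp / len(gold)          # 2/3: one gold span is missed
f1 = 2 * precision * recall / (precision + recall)
print(round(precision, 4), round(recall, 4), round(f1, 4))  # → 0.6667 0.6667 0.6667
```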
### Framework versions
- Transformers 4.44.0
- Pytorch 2.3.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
### Model tree for Kudod/roberta-large-ner-ghtk-cs-6-label-new-data-3090-11Sep-1

- Base model: [FacebookAI/xlm-roberta-large](https://huggingface.co/FacebookAI/xlm-roberta-large)