
roberta-large-ner-ghtk-cs-add-label-new-data-3090-6Sep-1

This model was trained on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2725
  • Tk: precision 0.8137, recall 0.7155, F1 0.7615 (support: 116)
  • Gày: precision 0.7500, recall 0.8824, F1 0.8108 (support: 34)
  • Gày trừu tượng: precision 0.9091, recall 0.9221, F1 0.9156 (support: 488)
  • Iờ: precision 0.6047, recall 0.6842, F1 0.6420 (support: 38)
  • Ã đơn: precision 0.8827, recall 0.8522, F1 0.8672 (support: 203)
  • Đt: precision 0.9274, recall 0.9897, F1 0.9576 (support: 878)
  • Đt trừu tượng: precision 0.7937, recall 0.8584, F1 0.8247 (support: 233)
  • Overall Precision: 0.8867
  • Overall Recall: 0.9201
  • Overall F1: 0.9031
  • Overall Accuracy: 0.9630
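
Each per-label F1 above is the harmonic mean of that label's precision and recall. A small sanity check in Python, using two of the reported precision/recall pairs (label names kept as they appear in the card):

```python
def f1(precision, recall):
    """Harmonic mean of precision and recall (the F1 score)."""
    return 2 * precision * recall / (precision + recall)

# Precision/recall pairs as reported for the final epoch.
print(round(f1(0.8137254901960784, 0.7155172413793104), 4))  # Tk -> 0.7615
print(round(f1(0.927427961579509, 0.989749430523918), 4))    # Đt -> 0.9576
```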

Model description

More information needed

Intended uses & limitations

More information needed
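
Pending official documentation, a minimal loading sketch for a token-classification checkpoint like this one. The repository id is assumed from the card title (the real id may include a namespace prefix), and the input sentence is an illustrative example only:

```python
from transformers import pipeline

# Repo id assumed from the card title; replace with the actual hub path.
ner = pipeline(
    "token-classification",
    model="roberta-large-ner-ghtk-cs-add-label-new-data-3090-6Sep-1",
    aggregation_strategy="simple",  # merge sub-word pieces into entity spans
)

# Illustrative Vietnamese input (the label set suggests dates/times/phone fields).
print(ner("Em giao hàng vào 9 giờ sáng ngày 15/08 nhé."))
```

Running this requires the checkpoint to be available locally or on the Hugging Face Hub.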

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2.5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
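
The hyperparameters above map directly onto the Hugging Face `TrainingArguments` API; a sketch (the `output_dir` value is a placeholder, not from the card):

```python
from transformers import TrainingArguments

# Mirrors the listed hyperparameters; output_dir is a placeholder.
args = TrainingArguments(
    output_dir="out",
    learning_rate=2.5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```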

Training results

Per-label cells list precision / recall / F1; supports are constant across epochs (Tk 116, Gày 34, Gày trừu tượng 488, Iờ 38, Ã đơn 203, Đt 878, Đt trừu tượng 233).

| Training Loss | Epoch | Step | Validation Loss | Tk | Gày | Gày trừu tượng | Iờ | Ã đơn | Đt | Đt trừu tượng | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.0911 | 1.0 | 531 | 0.1990 | 0.8250/0.8534/0.8390 | 0.6522/0.8824/0.7500 | 0.8799/0.9160/0.8976 | 0.4828/0.7368/0.5833 | 0.9167/0.7586/0.8302 | 0.9449/0.9954/0.9695 | 0.6497/0.8755/0.7459 | 0.8583 | 0.9226 | 0.8893 | 0.9583 |
| 0.0516 | 2.0 | 1062 | 0.1639 | 0.8947/0.7328/0.8057 | 0.7317/0.8824/0.8000 | 0.9049/0.9160/0.9104 | 0.6170/0.7632/0.6824 | 0.8848/0.8325/0.8579 | 0.9480/0.9761/0.9618 | 0.8507/0.7339/0.7880 | 0.9062 | 0.8985 | 0.9023 | 0.9647 |
| 0.0387 | 3.0 | 1593 | 0.1922 | 0.8980/0.7586/0.8224 | 0.7027/0.7647/0.7324 | 0.9072/0.9016/0.9044 | 0.7143/0.2632/0.3846 | 0.8718/0.8374/0.8543 | 0.9443/0.9852/0.9643 | 0.7418/0.8755/0.8031 | 0.8926 | 0.9060 | 0.8993 | 0.9610 |
| 0.0305 | 4.0 | 2124 | 0.1799 | 0.8750/0.6638/0.7549 | 0.7250/0.8529/0.7838 | 0.8867/0.9303/0.9080 | 0.6531/0.8421/0.7356 | 0.8964/0.8522/0.8737 | 0.9558/0.9852/0.9703 | 0.8354/0.8712/0.8529 | 0.9030 | 0.9211 | 0.9119 | 0.9655 |
| 0.024 | 5.0 | 2655 | 0.2310 | 0.7757/0.7155/0.7444 | 0.6667/0.9412/0.7805 | 0.9257/0.8934/0.9093 | 0.6111/0.8684/0.7174 | 0.8544/0.8670/0.8606 | 0.9294/0.9897/0.9586 | 0.7882/0.8627/0.8238 | 0.8815 | 0.9196 | 0.9001 | 0.9622 |
| 0.0149 | 6.0 | 3186 | 0.2271 | 0.8295/0.6293/0.7157 | 0.7500/0.8824/0.8108 | 0.9203/0.8996/0.9098 | 0.6842/0.6842/0.6842 | 0.8153/0.8916/0.8518 | 0.9324/0.9897/0.9602 | 0.7915/0.7983/0.7949 | 0.8878 | 0.9065 | 0.8971 | 0.9621 |
| 0.0136 | 7.0 | 3717 | 0.2599 | 0.8230/0.8017/0.8122 | 0.7442/0.9412/0.8312 | 0.8961/0.9365/0.9158 | 0.5682/0.6579/0.6098 | 0.8593/0.8424/0.8507 | 0.9516/0.9852/0.9681 | 0.8382/0.8670/0.8523 | 0.8961 | 0.9271 | 0.9113 | 0.9636 |
| 0.0056 | 8.0 | 4248 | 0.2680 | 0.8230/0.8017/0.8122 | 0.7619/0.9412/0.8421 | 0.9193/0.9098/0.9145 | 0.6190/0.6842/0.6500 | 0.8507/0.8424/0.8465 | 0.9412/0.9852/0.9627 | 0.7704/0.8927/0.8270 | 0.8884 | 0.9241 | 0.9059 | 0.9619 |
| 0.0064 | 9.0 | 4779 | 0.2708 | 0.8469/0.7155/0.7757 | 0.7561/0.9118/0.8267 | 0.9024/0.9283/0.9152 | 0.6053/0.6053/0.6053 | 0.8872/0.8522/0.8693 | 0.9274/0.9897/0.9576 | 0.7846/0.8755/0.8276 | 0.8865 | 0.9226 | 0.9042 | 0.9632 |
| 0.0033 | 10.0 | 5310 | 0.2725 | 0.8137/0.7155/0.7615 | 0.7500/0.8824/0.8108 | 0.9091/0.9221/0.9156 | 0.6047/0.6842/0.6420 | 0.8827/0.8522/0.8672 | 0.9274/0.9897/0.9576 | 0.7937/0.8584/0.8247 | 0.8867 | 0.9201 | 0.9031 | 0.9630 |

Framework versions

  • Transformers 4.44.0
  • Pytorch 2.3.1+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1
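
To reproduce the environment, the listed versions can be pinned (the card reports PyTorch 2.3.1 with the cu121 CUDA build; the plain `torch==2.3.1` wheel is the CPU/default variant):

```shell
pip install "transformers==4.44.0" "torch==2.3.1" "datasets==2.19.1" "tokenizers==0.19.1"
```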
Safetensors

  • Model size: 559M params
  • Tensor type: F32