---
license: mit
base_model: FacebookAI/xlm-roberta-large
tags:
- generated_from_trainer
model-index:
- name: facebook-roberta-large-finetuned-ner-vlsp2021-3090-14June
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# facebook-roberta-large-finetuned-ner-vlsp2021-3090-14June

This model is a fine-tuned version of [FacebookAI/xlm-roberta-large](https://huggingface.co/FacebookAI/xlm-roberta-large) on a VLSP 2021 named-entity recognition dataset (the exact dataset was not recorded by the Trainer).
It achieves the following results on the evaluation set (a sketch of how such per-entity scores can be computed follows the list):
- Loss: 0.0999
- Datetime: {'precision': 0.8815399802566634, 'recall': 0.8912175648702595, 'f1': 0.8863523573200993, 'number': 1002}
- Address: {'precision': 0.7941176470588235, 'recall': 0.9310344827586207, 'f1': 0.8571428571428571, 'number': 29}
- Person: {'precision': 0.9600840336134454, 'recall': 0.9626119010005266, 'f1': 0.9613463055482514, 'number': 1899}
- Persontype: {'precision': 0.7473684210526316, 'recall': 0.7266081871345029, 'f1': 0.736842105263158, 'number': 684}
- Phonenumber: {'precision': 0.8888888888888888, 'recall': 0.8888888888888888, 'f1': 0.8888888888888888, 'number': 9}
- Miscellaneous: {'precision': 0.5126582278481012, 'recall': 0.5094339622641509, 'f1': 0.5110410094637223, 'number': 159}
- Email: {'precision': 1.0, 'recall': 0.9803921568627451, 'f1': 0.99009900990099, 'number': 51}
- Location: {'precision': 0.8736842105263158, 'recall': 0.8931591083781706, 'f1': 0.8833143291524135, 'number': 1301}
- IP: {'precision': 0.75, 'recall': 0.8181818181818182, 'f1': 0.7826086956521738, 'number': 11}
- URL: {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 15}
- Product: {'precision': 0.7059773828756059, 'recall': 0.6992, 'f1': 0.702572347266881, 'number': 625}
- Overall Precision: 0.8619
- Overall Recall: 0.8655
- Overall F1: 0.8637
- Overall Accuracy: 0.9810

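Each per-entity dictionary reports entity-level precision, recall, F1, and support (`number`). The sketch below shows how metrics in this format can be computed with the `evaluate` library's seqeval metric; the tag sequences are illustrative placeholders, not outputs of this model.

```python
import evaluate

# seqeval returns overall scores plus one dict per entity type, matching the
# {'precision', 'recall', 'f1', 'number'} structure listed above.
seqeval = evaluate.load("seqeval")

# Illustrative IOB2-tagged sequences (not produced by this model).
references = [["B-PERSON", "I-PERSON", "O", "B-LOCATION"]]
predictions = [["B-PERSON", "I-PERSON", "O", "O"]]

results = seqeval.compute(predictions=predictions, references=references)
print(results["overall_f1"])
print(results["PERSON"])  # {'precision': ..., 'recall': ..., 'f1': ..., 'number': ...}
```
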
## Model description

More information needed

## Intended uses & limitations

More information needed

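This checkpoint is intended for Vietnamese named-entity recognition via token classification. Below is a minimal inference sketch, assuming the model is hosted under the repository id `Kudod/facebook-roberta-large-finetuned-ner-vlsp2021-3090-14June` (inferred from the commit author and model name; adjust the id to your own copy if different):

```python
from transformers import pipeline

# Minimal inference sketch; the repository id below is an assumption.
ner = pipeline(
    "token-classification",
    model="Kudod/facebook-roberta-large-finetuned-ner-vlsp2021-3090-14June",
    aggregation_strategy="simple",  # merge sub-word pieces into whole entity spans
)

print(ner("Nguyễn Văn A sống tại Hà Nội."))  # illustrative Vietnamese sentence
```

`aggregation_strategy="simple"` groups word pieces back into entity spans; omit it to inspect raw per-token predictions.
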
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5

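For orientation, the hyperparameters above map onto `TrainingArguments` roughly as in the sketch below. This is not the original training script: the output directory is assumed from the model name, per-epoch evaluation is inferred from the validation rows in the results table, and dataset preparation, label mapping, and the `Trainer` call are omitted.

```python
from transformers import TrainingArguments

# Sketch of the listed hyperparameters as TrainingArguments (Transformers 4.40.x).
# Data loading and the Trainer itself are intentionally left out.
training_args = TrainingArguments(
    output_dir="facebook-roberta-large-finetuned-ner-vlsp2021-3090-14June",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    num_train_epochs=5,
    lr_scheduler_type="linear",
    seed=42,
    adam_beta1=0.9,       # Adam betas and epsilon as listed above
    adam_beta2=0.999,     # (these are also the library defaults)
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # inferred from the per-epoch validation results
)
```
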
### Training results

| Training Loss | Epoch | Step | Validation Loss | Datetime | Address | Person | Persontype | Phonenumber | Miscellaneous | Email | Location | IP | URL | Product | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------:|:------:|:----------:|:-----------:|:-------------:|:-----:|:--------:|:--:|:---:|:-------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 0.097 | 1.0 | 3263 | 0.0814 | {'precision': 0.8521825396825397, 'recall': 0.8572854291417166, 'f1': 0.854726368159204, 'number': 1002} | {'precision': 0.574468085106383, 'recall': 0.9310344827586207, 'f1': 0.7105263157894737, 'number': 29} | {'precision': 0.9606681034482759, 'recall': 0.9389152185360716, 'f1': 0.9496671105193077, 'number': 1899} | {'precision': 0.7643097643097643, 'recall': 0.6637426900584795, 'f1': 0.7104851330203442, 'number': 684} | {'precision': 0.7272727272727273, 'recall': 0.8888888888888888, 'f1': 0.7999999999999999, 'number': 9} | {'precision': 0.4517766497461929, 'recall': 0.559748427672956, 'f1': 0.5, 'number': 159} | {'precision': 1.0, 'recall': 0.9411764705882353, 'f1': 0.9696969696969697, 'number': 51} | {'precision': 0.8556390977443609, 'recall': 0.8747117601844735, 'f1': 0.8650703154694033, 'number': 1301} | {'precision': 0.6666666666666666, 'recall': 0.7272727272727273, 'f1': 0.6956521739130435, 'number': 11} | {'precision': 0.7222222222222222, 'recall': 0.8666666666666667, 'f1': 0.7878787878787877, 'number': 15} | {'precision': 0.5155555555555555, 'recall': 0.5568, 'f1': 0.5353846153846155, 'number': 625} | 0.8238 | 0.8254 | 0.8246 | 0.9769 |
| 0.0596 | 2.0 | 6526 | 0.0905 | {'precision': 0.8613569321533924, 'recall': 0.874251497005988, 'f1': 0.8677563150074294, 'number': 1002} | {'precision': 0.5111111111111111, 'recall': 0.7931034482758621, 'f1': 0.6216216216216216, 'number': 29} | {'precision': 0.967741935483871, 'recall': 0.9478672985781991, 'f1': 0.9577015163607343, 'number': 1899} | {'precision': 0.8167770419426048, 'recall': 0.5409356725146199, 'f1': 0.6508355321020229, 'number': 684} | {'precision': 0.8, 'recall': 0.8888888888888888, 'f1': 0.8421052631578948, 'number': 9} | {'precision': 0.51875, 'recall': 0.5220125786163522, 'f1': 0.5203761755485894, 'number': 159} | {'precision': 1.0, 'recall': 0.9607843137254902, 'f1': 0.98, 'number': 51} | {'precision': 0.8402323892519971, 'recall': 0.889315910837817, 'f1': 0.8640776699029126, 'number': 1301} | {'precision': 0.5833333333333334, 'recall': 0.6363636363636364, 'f1': 0.6086956521739131, 'number': 11} | {'precision': 0.5555555555555556, 'recall': 0.6666666666666666, 'f1': 0.606060606060606, 'number': 15} | {'precision': 0.6890595009596929, 'recall': 0.5744, 'f1': 0.6265270506108203, 'number': 625} | 0.8587 | 0.8197 | 0.8388 | 0.9779 |
| 0.0395 | 3.0 | 9789 | 0.0885 | {'precision': 0.8682877406281662, 'recall': 0.8552894211576846, 'f1': 0.8617395676219206, 'number': 1002} | {'precision': 0.6, 'recall': 0.8275862068965517, 'f1': 0.6956521739130435, 'number': 29} | {'precision': 0.9590643274853801, 'recall': 0.9499736703528173, 'f1': 0.9544973544973545, 'number': 1899} | {'precision': 0.7365930599369085, 'recall': 0.6827485380116959, 'f1': 0.7086494688922609, 'number': 684} | {'precision': 0.8, 'recall': 0.8888888888888888, 'f1': 0.8421052631578948, 'number': 9} | {'precision': 0.5212121212121212, 'recall': 0.5408805031446541, 'f1': 0.5308641975308641, 'number': 159} | {'precision': 1.0, 'recall': 0.9607843137254902, 'f1': 0.98, 'number': 51} | {'precision': 0.8641881638846738, 'recall': 0.8754803996925442, 'f1': 0.8697976326842306, 'number': 1301} | {'precision': 0.75, 'recall': 0.8181818181818182, 'f1': 0.7826086956521738, 'number': 11} | {'precision': 0.7647058823529411, 'recall': 0.8666666666666667, 'f1': 0.8125, 'number': 15} | {'precision': 0.6871880199667221, 'recall': 0.6608, 'f1': 0.6737357259380099, 'number': 625} | 0.8521 | 0.8417 | 0.8469 | 0.9794 |
| 0.0248 | 4.0 | 13052 | 0.0953 | {'precision': 0.8802395209580839, 'recall': 0.8802395209580839, 'f1': 0.8802395209580839, 'number': 1002} | {'precision': 0.7428571428571429, 'recall': 0.896551724137931, 'f1': 0.8125, 'number': 29} | {'precision': 0.9623741388447271, 'recall': 0.956292785676672, 'f1': 0.9593238246170102, 'number': 1899} | {'precision': 0.7728758169934641, 'recall': 0.6915204678362573, 'f1': 0.7299382716049383, 'number': 684} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 9} | {'precision': 0.4857142857142857, 'recall': 0.5345911949685535, 'f1': 0.5089820359281437, 'number': 159} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 51} | {'precision': 0.8614591009579956, 'recall': 0.8985395849346657, 'f1': 0.8796087283671934, 'number': 1301} | {'precision': 0.8181818181818182, 'recall': 0.8181818181818182, 'f1': 0.8181818181818182, 'number': 11} | {'precision': 0.9333333333333333, 'recall': 0.9333333333333333, 'f1': 0.9333333333333333, 'number': 15} | {'precision': 0.7482876712328768, 'recall': 0.6992, 'f1': 0.7229114971050455, 'number': 625} | 0.8663 | 0.8593 | 0.8628 | 0.9806 |
| 0.0161 | 5.0 | 16315 | 0.0999 | {'precision': 0.8815399802566634, 'recall': 0.8912175648702595, 'f1': 0.8863523573200993, 'number': 1002} | {'precision': 0.7941176470588235, 'recall': 0.9310344827586207, 'f1': 0.8571428571428571, 'number': 29} | {'precision': 0.9600840336134454, 'recall': 0.9626119010005266, 'f1': 0.9613463055482514, 'number': 1899} | {'precision': 0.7473684210526316, 'recall': 0.7266081871345029, 'f1': 0.736842105263158, 'number': 684} | {'precision': 0.8888888888888888, 'recall': 0.8888888888888888, 'f1': 0.8888888888888888, 'number': 9} | {'precision': 0.5126582278481012, 'recall': 0.5094339622641509, 'f1': 0.5110410094637223, 'number': 159} | {'precision': 1.0, 'recall': 0.9803921568627451, 'f1': 0.99009900990099, 'number': 51} | {'precision': 0.8736842105263158, 'recall': 0.8931591083781706, 'f1': 0.8833143291524135, 'number': 1301} | {'precision': 0.75, 'recall': 0.8181818181818182, 'f1': 0.7826086956521738, 'number': 11} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 15} | {'precision': 0.7059773828756059, 'recall': 0.6992, 'f1': 0.702572347266881, 'number': 625} | 0.8619 | 0.8655 | 0.8637 | 0.9810 |

### Framework versions

- Transformers 4.40.2
- Pytorch 2.3.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1