Proccyon committed
Commit: d540d82
Parent: bbb67e8

Training complete

Files changed (1): README.md (+78 -5)
README.md CHANGED
@@ -3,6 +3,11 @@ license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
+ metrics:
+ - precision
+ - recall
+ - f1
+ - accuracy
model-index:
- name: bert-finetuned-ner
  results: []
@@ -14,6 +19,72 @@ should probably proofread and complete it, then remove this comment. -->
# bert-finetuned-ner

This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.0209
+ - Precision: 0.8249
+ - Recall: 0.8825
+ - F1: 0.8527
+ - Accuracy: 0.9946
+ - B-location-precision: 0.9446
+ - B-location-recall: 0.9653
+ - B-location-f1: 0.9549
+ - I-location-precision: 0.9358
+ - I-location-recall: 0.9745
+ - I-location-f1: 0.9548
+ - B-group-precision: 0.8819
+ - B-group-recall: 0.8485
+ - B-group-f1: 0.8649
+ - I-group-precision: 0.8879
+ - I-group-recall: 0.8358
+ - I-group-f1: 0.8610
+ - B-corporation-precision: 0.8475
+ - B-corporation-recall: 0.8552
+ - B-corporation-f1: 0.8514
+ - I-corporation-precision: 0.8158
+ - I-corporation-recall: 0.7294
+ - I-corporation-f1: 0.7702
+ - B-person-precision: 0.9583
+ - B-person-recall: 0.9742
+ - B-person-f1: 0.9662
+ - I-person-precision: 0.9596
+ - I-person-recall: 0.95
+ - I-person-f1: 0.9548
+ - B-creative-work-precision: 0.8102
+ - B-creative-work-recall: 0.7929
+ - B-creative-work-f1: 0.8014
+ - I-creative-work-precision: 0.8131
+ - I-creative-work-recall: 0.8354
+ - I-creative-work-f1: 0.8241
+ - B-product-precision: 0.8682
+ - B-product-recall: 0.7887
+ - B-product-f1: 0.8266
+ - I-product-precision: 0.8862
+ - I-product-recall: 0.8886
+ - I-product-f1: 0.8874
+ - Corporation-precision: 0.6972
+ - Corporation-recall: 0.7919
+ - Corporation-f1: 0.7415
+ - Corporation-number: 221
+ - Creative-work-precision: 0.6433
+ - Creative-work-recall: 0.7214
+ - Creative-work-f1: 0.6801
+ - Creative-work-number: 140
+ - Group-precision: 0.7465
+ - Group-recall: 0.8144
+ - Group-f1: 0.7790
+ - Group-number: 264
+ - Location-precision: 0.9026
+ - Location-recall: 0.9471
+ - Location-f1: 0.9243
+ - Location-number: 548
+ - Person-precision: 0.9101
+ - Person-recall: 0.9515
+ - Person-f1: 0.9304
+ - Person-number: 660
+ - Product-precision: 0.6908
+ - Product-recall: 0.7394
+ - Product-f1: 0.7143
+ - Product-number: 142

## Model description

@@ -32,19 +103,21 @@ More information needed
### Training hyperparameters

The following hyperparameters were used during training:
- - learning_rate: 2e-05
- - train_batch_size: 8
- - eval_batch_size: 8
+ - learning_rate: 0.0002
+ - train_batch_size: 32
+ - eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- - num_epochs: 1
+ - num_epochs: 3

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | B-location-precision | B-location-recall | B-location-f1 | I-location-precision | I-location-recall | I-location-f1 | B-group-precision | B-group-recall | B-group-f1 | I-group-precision | I-group-recall | I-group-f1 | B-corporation-precision | B-corporation-recall | B-corporation-f1 | I-corporation-precision | I-corporation-recall | I-corporation-f1 | B-person-precision | B-person-recall | B-person-f1 | I-person-precision | I-person-recall | I-person-f1 | B-creative-work-precision | B-creative-work-recall | B-creative-work-f1 | I-creative-work-precision | I-creative-work-recall | I-creative-work-f1 | B-product-precision | B-product-recall | B-product-f1 | I-product-precision | I-product-recall | I-product-f1 | Corporation-precision | Corporation-recall | Corporation-f1 | Corporation-number | Creative-work-precision | Creative-work-recall | Creative-work-f1 | Creative-work-number | Group-precision | Group-recall | Group-f1 | Group-number | Location-precision | Location-recall | Location-f1 | Location-number | Person-precision | Person-recall | Person-f1 | Person-number | Product-precision | Product-recall | Product-f1 | Product-number |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|:--------------------:|:-----------------:|:-------------:|:--------------------:|:-----------------:|:-------------:|:-----------------:|:--------------:|:----------:|:-----------------:|:--------------:|:----------:|:-----------------------:|:--------------------:|:----------------:|:-----------------------:|:--------------------:|:----------------:|:------------------:|:---------------:|:-----------:|:------------------:|:---------------:|:-----------:|:-------------------------:|:----------------------:|:------------------:|:-------------------------:|:----------------------:|:------------------:|:-------------------:|:----------------:|:------------:|:-------------------:|:----------------:|:------------:|:---------------------:|:------------------:|:--------------:|:------------------:|:-----------------------:|:--------------------:|:----------------:|:--------------------:|:---------------:|:------------:|:--------:|:------------:|:------------------:|:---------------:|:-----------:|:---------------:|:----------------:|:-------------:|:---------:|:-------------:|:-----------------:|:--------------:|:----------:|:--------------:|
- | No log | 1.0 | 425 | 0.1275 | 0.5171 | 0.3838 | 0.4406 | 0.9687 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.0 | 0.0 | 0.0 | 221 | 0.0 | 0.0 | 0.0 | 140 | 0.0 | 0.0 | 0.0 | 264 | 0.4054 | 0.4690 | 0.4349 | 548 | 0.6234 | 0.7348 | 0.6745 | 660 | 0.2963 | 0.1127 | 0.1633 | 142 |
+ | No log | 1.0 | 107 | 0.1175 | 0.5693 | 0.4076 | 0.4751 | 0.9701 | 0.6320 | 0.7646 | 0.6920 | 0.7752 | 0.3929 | 0.5215 | 1.0 | 0.0114 | 0.0225 | 0.6667 | 0.0176 | 0.0343 | 0.9787 | 0.2081 | 0.3433 | nan | 0.0 | nan | 0.8123 | 0.7409 | 0.7750 | 0.9117 | 0.555 | 0.6900 | nan | 0.0 | nan | nan | 0.0 | nan | nan | 0.0 | nan | nan | 0.0 | nan | 0.9787 | 0.2081 | 0.3433 | 221 | 0.0 | 0.0 | 0.0 | 140 | 0.3333 | 0.0152 | 0.0290 | 264 | 0.4682 | 0.6040 | 0.5275 | 548 | 0.6543 | 0.6424 | 0.6483 | 660 | 0.0 | 0.0 | 0.0 | 142 |
+ | No log | 2.0 | 214 | 0.0411 | 0.6931 | 0.7489 | 0.7199 | 0.9886 | 0.8194 | 0.9270 | 0.8699 | 0.8701 | 0.9214 | 0.8950 | 0.7919 | 0.5909 | 0.6768 | 0.6897 | 0.7625 | 0.7242 | 0.8297 | 0.6833 | 0.7494 | 0.8548 | 0.3118 | 0.4569 | 0.9139 | 0.9485 | 0.9309 | 0.8996 | 0.9075 | 0.9035 | 0.7541 | 0.3286 | 0.4577 | 0.7952 | 0.5091 | 0.6208 | 0.7407 | 0.5634 | 0.64 | 0.6740 | 0.8315 | 0.7445 | 0.6515 | 0.5837 | 0.6158 | 221 | 0.2941 | 0.2143 | 0.2479 | 140 | 0.5051 | 0.5682 | 0.5348 | 264 | 0.7617 | 0.8923 | 0.8218 | 548 | 0.8470 | 0.9227 | 0.8832 | 660 | 0.4091 | 0.5070 | 0.4528 | 142 |
+ | No log | 3.0 | 321 | 0.0209 | 0.8249 | 0.8825 | 0.8527 | 0.9946 | 0.9446 | 0.9653 | 0.9549 | 0.9358 | 0.9745 | 0.9548 | 0.8819 | 0.8485 | 0.8649 | 0.8879 | 0.8358 | 0.8610 | 0.8475 | 0.8552 | 0.8514 | 0.8158 | 0.7294 | 0.7702 | 0.9583 | 0.9742 | 0.9662 | 0.9596 | 0.95 | 0.9548 | 0.8102 | 0.7929 | 0.8014 | 0.8131 | 0.8354 | 0.8241 | 0.8682 | 0.7887 | 0.8266 | 0.8862 | 0.8886 | 0.8874 | 0.6972 | 0.7919 | 0.7415 | 221 | 0.6433 | 0.7214 | 0.6801 | 140 | 0.7465 | 0.8144 | 0.7790 | 264 | 0.9026 | 0.9471 | 0.9243 | 548 | 0.9101 | 0.9515 | 0.9304 | 660 | 0.6908 | 0.7394 | 0.7143 | 142 |


  ### Framework versions
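
The card is tagged `generated_from_trainer`, so the hyperparameter block above maps directly onto `transformers.TrainingArguments`. A minimal sketch of the post-commit configuration; the output directory and the evaluation strategy are assumptions, and the Adam betas/epsilon and linear scheduler listed in the card are the library defaults:

```python
from transformers import TrainingArguments

# Reconstruction of the updated hyperparameters from the card.
training_args = TrainingArguments(
    output_dir="bert-finetuned-ner",  # ASSUMPTION: placeholder path, not stated in the diff
    learning_rate=2e-4,               # raised from 2e-05 in this commit
    per_device_train_batch_size=32,   # raised from 8
    per_device_eval_batch_size=32,    # raised from 8
    num_train_epochs=3,               # raised from 1
    seed=42,
    lr_scheduler_type="linear",       # Adam betas=(0.9, 0.999), eps=1e-08 are the defaults
    eval_strategy="epoch",            # ASSUMPTION: one validation row per epoch in the table
                                      # (named evaluation_strategy on older transformers releases)
)
```

The step counts in the results table are consistent with the batch-size change: roughly 3,400 training examples give 425 steps per epoch at batch size 8 and 107 steps per epoch at batch size 32.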
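The entity-level metric blocks (Corporation-precision through Product-number) have the shape of `seqeval` scores, the usual metric backend in Trainer token-classification runs; the `*-number` fields are support counts. A hedged sketch of how such numbers are produced, assuming seqeval is the backend (the card does not name it), with made-up label sequences:

```python
# Entity-level precision/recall/F1/support in the style of the card's metrics.
# ASSUMPTION: seqeval is the metric backend; the label sequences are illustrative.
from seqeval.metrics import classification_report, f1_score, precision_score, recall_score

y_true = [["B-location", "I-location", "O", "B-person", "O"]]
y_pred = [["B-location", "I-location", "O", "O", "O"]]

print(precision_score(y_true, y_pred))  # overall precision
print(recall_score(y_true, y_pred))     # overall recall
print(f1_score(y_true, y_pred))         # overall F1
# Per-entity precision/recall/F1 and support ("*-number" in the card):
print(classification_report(y_true, y_pred))
```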
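With the updated card in place, the checkpoint can be exercised end to end. A minimal inference sketch, assuming the model is published under the committer's namespace as `Proccyon/bert-finetuned-ner` (the diff names the model but not the hub repo id):

```python
# Minimal NER inference sketch for the fine-tuned checkpoint.
# ASSUMPTION: the repo id "Proccyon/bert-finetuned-ner" is inferred from the
# committer name and model name; it is not stated in the diff.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="Proccyon/bert-finetuned-ner",
    aggregation_strategy="simple",  # merge B-*/I-* pieces into whole entities
)

for entity in ner("The Mona Lisa hangs in the Louvre in Paris."):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```

`aggregation_strategy="simple"` folds the per-tag B-*/I-* predictions reported in the card into span-level entities, matching the entity-level rows (Corporation, Creative-work, Group, Location, Person, Product).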