---
license: apache-2.0
base_model: bert-base-cased
tags:
- generated_from_trainer
datasets:
- wnut_17
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: bert-finetuned-ner
results:
- task:
name: Token Classification
type: token-classification
dataset:
name: wnut_17
type: wnut_17
config: wnut_17
split: test
args: wnut_17
metrics:
- name: Precision
type: precision
value: 0.5254237288135594
- name: Recall
type: recall
value: 0.3160333642261353
- name: F1
type: f1
value: 0.3946759259259259
- name: Accuracy
type: accuracy
value: 0.9350753768844221
---
# bert-finetuned-ner
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on the wnut_17 dataset.
It achieves the following results on the evaluation set (the wnut_17 test split):
- Loss: 0.4362
- Precision: 0.5254
- Recall: 0.3160
- F1: 0.3947
- Accuracy: 0.9351

Per-entity results (entity spans):

| Entity | Precision | Recall | F1 |
|:--|:--:|:--:|:--:|
| Corporation | 0.1833 | 0.1667 | 0.1746 |
| Creative-work | 0.4308 | 0.1972 | 0.2705 |
| Group | 0.3467 | 0.1576 | 0.2167 |
| Location | 0.5500 | 0.4400 | 0.4889 |
| Person | 0.8008 | 0.4592 | 0.5837 |
| Product | 0.1566 | 0.1024 | 0.1238 |

Per-tag results (B-/I- labels):

| Tag | Precision | Recall | F1 |
|:--|:--:|:--:|:--:|
| B-corporation | 0.3256 | 0.2121 | 0.2569 |
| B-creative-work | 0.7600 | 0.2676 | 0.3958 |
| B-group | 0.5179 | 0.1758 | 0.2624 |
| B-location | 0.6792 | 0.4800 | 0.5625 |
| B-person | 0.8615 | 0.4639 | 0.6030 |
| B-product | 0.4468 | 0.1654 | 0.2414 |
| I-corporation | 0.2889 | 0.2364 | 0.2600 |
| I-creative-work | 0.4500 | 0.2093 | 0.2857 |
| I-group | 0.2549 | 0.1150 | 0.1585 |
| I-location | 0.5606 | 0.3895 | 0.4596 |
| I-person | 0.7564 | 0.3512 | 0.4797 |
| I-product | 0.1972 | 0.1157 | 0.1458 |
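
The per-entity numbers above are in the style of seqeval span evaluation, where an entity counts as correct only if its full span and type match. A minimal sketch of computing such metrics with the `evaluate` library, assuming seqeval semantics; the prediction and reference sequences here are placeholders, not model output:

```python
import evaluate

seqeval = evaluate.load("seqeval")

# Placeholder BIO tag sequences, one list per sentence; real usage would
# convert model logits to tag strings first.
predictions = [["O", "B-person", "I-person", "O", "B-location"]]
references = [["O", "B-person", "I-person", "O", "B-location"]]

results = seqeval.compute(predictions=predictions, references=references)
print(results["overall_precision"], results["overall_recall"], results["overall_f1"])
print(results["person"])  # per-entity dict: precision, recall, f1, number
```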
## Model description
A [bert-base-cased](https://huggingface.co/bert-base-cased) encoder with a token classification head, fine-tuned for named entity recognition on wnut_17. It predicts BIO tags over six entity types: corporation, creative-work, group, location, person, and product.
## Intended uses & limitations
Intended for NER over short, noisy, user-generated text, the domain wnut_17 targets. Precision is reasonable for person and location mentions, but overall recall is low (≈0.32): the model misses many entities, particularly products, groups, and corporations, so it suits high-precision extraction better than exhaustive tagging.
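
A minimal inference sketch using the `pipeline` API. The hub id `csNoHug/bert-finetuned-ner` is assumed from this repository; substitute a local checkpoint path if needed:

```python
from transformers import pipeline

# Hub id assumed from this repository; replace with a local path if needed.
ner = pipeline(
    "token-classification",
    model="csNoHug/bert-finetuned-ner",
    aggregation_strategy="simple",  # merge B-/I- word pieces into entity spans
)

print(ner("Nintendo announced a new Zelda game in Kyoto."))
# -> list of dicts with entity_group, score, word, start, end
```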
## Training and evaluation data
Trained and evaluated on [wnut_17](https://huggingface.co/datasets/wnut_17), the WNUT 2017 shared task corpus for novel and emerging entity recognition, drawn from user-generated text and annotated with the six entity types listed above. Per the model-index metadata, the reported metrics are on the test split.
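
A short sketch of loading the dataset with the `datasets` library (version listed under Framework versions below):

```python
from datasets import load_dataset

wnut = load_dataset("wnut_17")
print(wnut)  # DatasetDict with train / validation / test splits

# ner_tags holds integer label ids; recover the BIO tag names:
label_names = wnut["train"].features["ner_tags"].feature.names
print(label_names)
# e.g. ['O', 'B-corporation', 'I-corporation', 'B-creative-work', ...]
```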
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (mirrored in the `TrainingArguments` sketch after the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
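
A hedged reconstruction of the settings above as `TrainingArguments`; `output_dir` and anything not in the list are placeholders, not values from the original run:

```python
from transformers import TrainingArguments

# Reconstructed from the hyperparameter list above; output_dir is a placeholder.
args = TrainingArguments(
    output_dir="bert-finetuned-ner",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=3,
)
```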
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | Corporation Precision | Corporation Recall | Corporation F1 | Creative-work Precision | Creative-work Recall | Creative-work F1 | Group Precision | Group Recall | Group F1 | Location Precision | Location Recall | Location F1 | Person Precision | Person Recall | Person F1 | Product Precision | Product Recall | Product F1 | B-corporation Precision | B-corporation Recall | B-corporation F1 | B-creative-work Precision | B-creative-work Recall | B-creative-work F1 | B-group Precision | B-group Recall | B-group F1 | B-location Precision | B-location Recall | B-location F1 | B-person Precision | B-person Recall | B-person F1 | B-product Precision | B-product Recall | B-product F1 | I-corporation Precision | I-corporation Recall | I-corporation F1 | I-creative-work Precision | I-creative-work Recall | I-creative-work F1 | I-group Precision | I-group Recall | I-group F1 | I-location Precision | I-location Recall | I-location F1 | I-person Precision | I-person Recall | I-person F1 | I-product Precision | I-product Recall | I-product F1 |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|:---------------------:|:------------------:|:--------------:|:-----------------------:|:--------------------:|:----------------:|:---------------:|:------------:|:--------:|:------------------:|:---------------:|:-----------:|:----------------:|:-------------:|:---------:|:-----------------:|:--------------:|:----------:|:-----------------------:|:--------------------:|:----------------:|:-------------------------:|:----------------------:|:------------------:|:-----------------:|:--------------:|:----------:|:--------------------:|:-----------------:|:-------------:|:------------------:|:---------------:|:-----------:|:-------------------:|:----------------:|:------------:|:-----------------------:|:--------------------:|:----------------:|:-------------------------:|:----------------------:|:------------------:|:-----------------:|:--------------:|:----------:|:--------------------:|:-----------------:|:-------------:|:------------------:|:---------------:|:-----------:|:-------------------:|:----------------:|:------------:|
| No log | 1.0 | 425 | 0.3879 | 0.5038 | 0.2484 | 0.3327 | 0.9296 | 0.0714 | 0.0455 | 0.0556 | 0.1429 | 0.0070 | 0.0134 | 0.1667 | 0.0909 | 0.1176 | 0.4583 | 0.3667 | 0.4074 | 0.7569 | 0.4499 | 0.5643 | 0.0556 | 0.0079 | 0.0138 | 0.3333 | 0.1364 | 0.1935 | 1.0 | 0.0282 | 0.0548 | 0.4722 | 0.1030 | 0.1692 | 0.6162 | 0.4067 | 0.4900 | 0.9037 | 0.4592 | 0.6090 | 0.5 | 0.0157 | 0.0305 | 0.1111 | 0.0545 | 0.0732 | 0.5 | 0.0155 | 0.0301 | 0.12 | 0.0796 | 0.0957 | 0.4595 | 0.3579 | 0.4024 | 0.7108 | 0.3512 | 0.4701 | 0.125 | 0.0165 | 0.0292 |
| 0.196 | 2.0 | 850 | 0.4338 | 0.5712 | 0.2864 | 0.3815 | 0.9328 | 0.2174 | 0.2273 | 0.2222 | 0.4762 | 0.1408 | 0.2174 | 0.35 | 0.0848 | 0.1366 | 0.5727 | 0.42 | 0.4846 | 0.7992 | 0.4452 | 0.5719 | 0.1463 | 0.0472 | 0.0714 | 0.3208 | 0.2576 | 0.2857 | 0.8065 | 0.1761 | 0.2890 | 0.6 | 0.0909 | 0.1579 | 0.7216 | 0.4667 | 0.5668 | 0.8807 | 0.4476 | 0.5935 | 0.6522 | 0.1181 | 0.2 | 0.2917 | 0.2545 | 0.2718 | 0.6 | 0.1860 | 0.2840 | 0.2857 | 0.0708 | 0.1135 | 0.5625 | 0.3789 | 0.4528 | 0.7566 | 0.3423 | 0.4713 | 0.1765 | 0.0496 | 0.0774 |
| 0.0785 | 3.0 | 1275 | 0.4362 | 0.5254 | 0.3160 | 0.3947 | 0.9351 | 0.1833 | 0.1667 | 0.1746 | 0.4308 | 0.1972 | 0.2705 | 0.3467 | 0.1576 | 0.2167 | 0.55 | 0.44 | 0.4889 | 0.8008 | 0.4592 | 0.5837 | 0.1566 | 0.1024 | 0.1238 | 0.3256 | 0.2121 | 0.2569 | 0.76 | 0.2676 | 0.3958 | 0.5179 | 0.1758 | 0.2624 | 0.6792 | 0.48 | 0.5625 | 0.8615 | 0.4639 | 0.6030 | 0.4468 | 0.1654 | 0.2414 | 0.2889 | 0.2364 | 0.26 | 0.45 | 0.2093 | 0.2857 | 0.2549 | 0.1150 | 0.1585 | 0.5606 | 0.3895 | 0.4596 | 0.7564 | 0.3512 | 0.4797 | 0.1972 | 0.1157 | 0.1458 |
### Framework versions
- Transformers 4.35.0
- PyTorch 2.1.0+cu118
- Datasets 2.14.6
- Tokenizers 0.14.1