# en-multinerd-ner-more-training
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unspecified dataset (given the model name and the entity types reported below, presumably the English portion of MultiNERD). It achieves the following results on the evaluation set:
- Loss: 0.0563
- Precision: 0.8942
- Recall: 0.9050
- F1: 0.8996
- Accuracy: 0.9836
Per-entity-type metrics (Per = person, Org = organization, Loc = location, Dis = disease, Anim = animal):
- Per-precision: 0.9952
- Per-recall: 0.9965
- Per-f1: 0.9959
- Org-precision: 0.9412
- Org-recall: 0.9474
- Org-f1: 0.9443
- Loc-precision: 0.9691
- Loc-recall: 0.9752
- Loc-f1: 0.9721
- Dis-precision: 0.7157
- Dis-recall: 0.7440
- Dis-f1: 0.7295
- Anim-precision: 0.6870
- Anim-recall: 0.7240
- Anim-f1: 0.7050
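
To try the model, here is a minimal inference sketch using the `transformers` pipeline. The repo id below is hypothetical; substitute the actual Hub path this model is published under.

```python
# Minimal inference sketch; the repo id is a placeholder, not the real Hub path.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="your-username/en-multinerd-ner-more-training",  # hypothetical repo id
    aggregation_strategy="simple",  # merge sub-word pieces into whole entity spans
)

print(ner("The red fox was spotted near the Eiffel Tower in Paris."))
```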
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
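
For reference, a sketch of the `TrainingArguments` that would match the list above. The `output_dir` is an assumption, as is the per-epoch evaluation strategy (inferred from the per-epoch rows in the results table below); the Adam betas and epsilon listed are the `transformers` defaults.

```python
# A hedged reconstruction of the training configuration from the list above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="en-multinerd-ner-more-training",  # hypothetical output path
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=4,
    evaluation_strategy="epoch",  # assumption: matches the per-epoch results table
)
```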
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | Per-precision | Per-recall | Per-f1 | Org-precision | Org-recall | Org-f1 | Loc-precision | Loc-recall | Loc-f1 | Dis-precision | Dis-recall | Dis-f1 | Anim-precision | Anim-recall | Anim-f1 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.0447 | 1.0 | 8205 | 0.0524 | 0.8583 | 0.9010 | 0.8791 | 0.9808 | 0.9939 | 0.9959 | 0.9949 | 0.8846 | 0.9547 | 0.9183 | 0.9626 | 0.9727 | 0.9676 | 0.6472 | 0.7457 | 0.6930 | 0.6607 | 0.7522 | 0.7035 |
| 0.0309 | 2.0 | 16410 | 0.0503 | 0.8800 | 0.9028 | 0.8913 | 0.9825 | 0.9940 | 0.9965 | 0.9953 | 0.9304 | 0.9474 | 0.9388 | 0.9640 | 0.9712 | 0.9676 | 0.7072 | 0.7369 | 0.7218 | 0.6765 | 0.7469 | 0.7100 |
| 0.0209 | 3.0 | 24615 | 0.0520 | 0.8908 | 0.9042 | 0.8975 | 0.9836 | 0.9953 | 0.9967 | 0.9960 | 0.9365 | 0.9478 | 0.9421 | 0.9680 | 0.9750 | 0.9715 | 0.7163 | 0.7268 | 0.7215 | 0.7013 | 0.7434 | 0.7217 |
| 0.0138 | 4.0 | 32820 | 0.0563 | 0.8942 | 0.9050 | 0.8996 | 0.9836 | 0.9952 | 0.9965 | 0.9959 | 0.9412 | 0.9474 | 0.9443 | 0.9691 | 0.9752 | 0.9721 | 0.7157 | 0.7440 | 0.7295 | 0.6870 | 0.7240 | 0.7050 |
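
The overall and per-type scores in the table have the shape of seqeval output. Below is a hedged sketch of a `compute_metrics` function that would produce the overall numbers; the label list is an assumption, derived as an IOB2 tagging of the five entity types reported (PER, ORG, LOC, DIS, ANIM).

```python
# A hedged sketch of seqeval-based metric computation; the label list is
# an assumption based on the entity types reported in this card.
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")
label_list = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG",
              "B-LOC", "I-LOC", "B-DIS", "I-DIS", "B-ANIM", "I-ANIM"]

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    # Drop positions labeled -100 (special tokens and padding are masked out).
    true_preds = [
        [label_list[p] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    results = seqeval.compute(predictions=true_preds, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```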
### Framework versions
- Transformers 4.36.1
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0