---
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
model-index:
- name: bert-finetuned-ner
  results: []
---

# bert-finetuned-ner

This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset.
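
The per-entity scores in the results table below (corporation, creative-work, group, location, person, product) indicate a token-classification (NER) head. As a minimal usage sketch, assuming the checkpoint is published under the repository id `Proccyon/bert-finetuned-ner` (an assumption, not a confirmed id), the model can be loaded with the `transformers` pipeline:

```python
from transformers import pipeline

# NOTE: the repository id below is an assumption; replace it with the actual
# hub id or a local checkpoint path.
ner = pipeline(
    "token-classification",
    model="Proccyon/bert-finetuned-ner",
    aggregation_strategy="simple",  # merge B-/I- sub-tokens into whole entities
)

print(ner("Barack Obama visited the Eiffel Tower in Paris."))
# Each prediction is a dict with entity_group, score, word, start, and end.
```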

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a minimal `TrainingArguments` sketch follows the list):

- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
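
The list above maps directly onto `transformers` `TrainingArguments`. A minimal sketch follows; the `output_dir` value is a placeholder and not part of the original card:

```python
from transformers import TrainingArguments

# Values mirror the hyperparameter list above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="bert-finetuned-ner",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=1,
)
```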

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 1.0   | 425  | 0.1275          | 0.5171    | 0.3838 | 0.4406 | 0.9687   |

Per-entity validation scores at epoch 1.0:

| Entity        | Precision | Recall | F1     | Number |
|:--------------|:---------:|:------:|:------:|:------:|
| Corporation   | 0.0       | 0.0    | 0.0    | 221    |
| Creative-work | 0.0       | 0.0    | 0.0    | 140    |
| Group         | 0.0       | 0.0    | 0.0    | 264    |
| Location      | 0.4054    | 0.4690 | 0.4349 | 548    |
| Person        | 0.6234    | 0.7348 | 0.6745 | 660    |
| Product       | 0.2963    | 0.1127 | 0.1633 | 142    |

The per-tag B-/I- precision, recall, and F1 columns (for location, group, corporation, person, creative-work, and product) were all logged as nan for this run and are omitted from the tables above.
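
The overall and per-entity precision, recall, and F1 above are the kind of scores produced by the seqeval metric over BIO-tagged predictions. A minimal sketch of computing them with the `evaluate` library is shown below; the example sequences are illustrative and not taken from the training script:

```python
import evaluate

seqeval = evaluate.load("seqeval")

# Illustrative BIO-tagged sequences; the real numbers come from the validation split.
predictions = [["O", "B-person", "I-person", "O", "B-location"]]
references = [["O", "B-person", "I-person", "O", "B-location"]]

results = seqeval.compute(predictions=predictions, references=references)
# results holds overall_precision / overall_recall / overall_f1 / overall_accuracy,
# plus one dict per entity type, e.g. results["person"]["f1"] and results["person"]["number"].
print(results["overall_f1"], results["person"])
```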

### Framework versions

- Transformers 4.35.0
- PyTorch 2.1.0+cpu
- Datasets 2.14.6
- Tokenizers 0.14.1