---
license: apache-2.0
tags:
  - generated_from_trainer
metrics:
  - precision
  - recall
  - f1
  - accuracy
model_index:
  - name: chinese-address-ner
    results:
      - task:
          name: Token Classification
          type: token-classification
        metric:
          name: Accuracy
          type: accuracy
          value: 0.9852459016393442
---

# chinese-address-ner

This model is a fine-tuned version of [hfl/chinese-roberta-wwm-ext](https://huggingface.co/hfl/chinese-roberta-wwm-ext) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.0999
- Precision: 0.9739
- Recall: 0.9849
- F1: 0.9794
- Accuracy: 0.9852
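
The card does not include a usage example. Below is a minimal sketch, assuming the model is published under the author's namespace and exposes the standard `transformers` token-classification interface; the repo id and the example address are assumptions, not part of the original card:

```python
# Minimal usage sketch (not from the original card).
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

model_id = "jiaqianjing/chinese-address-ner"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# Token-classification (NER) pipeline; "simple" aggregation merges
# sub-word pieces into whole entity spans.
ner = pipeline(
    "ner",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

# Example Chinese address (illustrative only).
print(ner("北京市海淀区中关村大街1号"))
```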

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 50
- eval_batch_size: 50
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
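
These settings correspond roughly to the `TrainingArguments` sketch below. This is an assumed reconstruction from the list above; the original training script is not part of this card, and `output_dir` is a placeholder:

```python
# Assumed reconstruction of the training configuration; not the author's script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="chinese-address-ner",  # placeholder output directory
    learning_rate=2e-5,
    per_device_train_batch_size=50,
    per_device_eval_batch_size=50,
    seed=42,
    num_train_epochs=1,
    lr_scheduler_type="linear",
    adam_beta1=0.9,                    # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```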

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.0656        | 0.14  | 1    | 0.1061          | 0.9665    | 0.9811 | 0.9738 | 0.9844   |
| 0.1305        | 0.29  | 2    | 0.1096          | 0.9630    | 0.9811 | 0.9720 | 0.9836   |
| 0.1009        | 0.43  | 3    | 0.0999          | 0.9739    | 0.9849 | 0.9794 | 0.9852   |
| 0.0844        | 0.57  | 4    | 0.0911          | 0.9739    | 0.9849 | 0.9794 | 0.9852   |
| 0.0773        | 0.71  | 5    | 0.0858          | 0.9703    | 0.9849 | 0.9775 | 0.9852   |
| 0.0997        | 0.86  | 6    | 0.0815          | 0.9739    | 0.9849 | 0.9794 | 0.9861   |
| 0.0904        | 1.0   | 7    | 0.0795          | 0.9739    | 0.9849 | 0.9794 | 0.9861   |
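
As a sanity check, the reported F1 is the harmonic mean of the reported precision and recall; for the evaluation-set values above:

$$
F_1 = \frac{2PR}{P + R} = \frac{2 \times 0.9739 \times 0.9849}{0.9739 + 0.9849} \approx 0.9794
$$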

### Framework versions

- Transformers 4.8.2
- Pytorch 1.7.0
- Datasets 1.9.0
- Tokenizers 0.10.3