---
license: apache-2.0
tags:
  - generated_from_trainer
metrics:
  - precision
  - recall
  - f1
  - accuracy
model_index:
  - name: chinese-address-ner
    results:
      - task:
          name: Token Classification
          type: token-classification
        metric:
          name: Accuracy
          type: accuracy
          value: 0.975825946817083
base_model: hfl/chinese-roberta-wwm-ext
---

chinese-address-ner

This model is a fine-tuned version of hfl/chinese-roberta-wwm-ext on an unknown dataset. It achieves the following results on the evaluation set (a sketch of how these metrics are computed follows the list):

  • Loss: 0.1080
  • Precision: 0.9664
  • Recall: 0.9774
  • F1: 0.9719
  • Accuracy: 0.9758
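
These are the usual entity-level token-classification metrics reported by generated_from_trainer cards, typically computed with seqeval. A minimal sketch, assuming the seqeval package; the gold/predicted tag sequences below are made up for illustration using the card's A1/A2 scheme:

```python
from seqeval.metrics import accuracy_score, f1_score, precision_score, recall_score

# Toy gold and predicted BIO tag sequences (illustrative only, not real model output).
y_true = [["B-A1", "I-A1", "B-A2", "I-A2", "O"]]
y_pred = [["B-A1", "I-A1", "B-A2", "O", "O"]]

# Precision/recall/F1 are entity-level (a span counts only if fully matched);
# accuracy is per-token.
print("precision:", precision_score(y_true, y_pred))
print("recall:   ", recall_score(y_true, y_pred))
print("f1:       ", f1_score(y_true, y_pred))
print("accuracy: ", accuracy_score(y_true, y_pred))
```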

Model description

Given a string of Chinese address text, e.g. from a shipping label: 北京市海淀区西北旺东路10号院(马连洼街道西北旺社区东北方向), the model extracts the address components by administrative level (7 levels in total) and returns a class for each token; a usage sketch follows the table below. The class meanings are:

| Returned class | BIO scheme | Meaning |
| --- | --- | --- |
| LABEL_0 | O | Ignored (non-address) text |
| LABEL_1 | B-A1 | Level-1 address (first token) |
| LABEL_2 | I-A1 | Level-1 address (remaining tokens) |
| ... | ... | ... |
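
For reference, a minimal inference sketch using the transformers token-classification pipeline. The model path is a placeholder, and the LABEL_n → BIO mapping simply extends the table's visible pattern (LABEL_1 = B-A1, LABEL_2 = I-A1, ...) across all 7 levels, which is an assumption rather than something the card states:

```python
from transformers import pipeline

# Placeholder path: point this at the actual checkpoint (local dir or Hub repo id).
ner = pipeline("token-classification", model="path/to/chinese-address-ner")

def label_to_bio(label: str) -> str:
    """Map LABEL_n to a BIO tag, assuming the table's pattern continues:
    LABEL_0 -> O, LABEL_1 -> B-A1, LABEL_2 -> I-A1, LABEL_3 -> B-A2, ..."""
    n = int(label.split("_")[1])
    if n == 0:
        return "O"
    prefix = "B" if n % 2 == 1 else "I"
    return f"{prefix}-A{(n + 1) // 2}"

address = "北京市海淀区西北旺东路10号院(马连洼街道西北旺社区东北方向)"
for tok in ner(address):
    # Each entry carries the token text and its predicted class, e.g. LABEL_1 (B-A1).
    print(tok["word"], label_to_bio(tok["entity"]), round(tok["score"], 4))
```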


Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 50
  • eval_batch_size: 50
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
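
A minimal sketch of transformers TrainingArguments matching the values above; output_dir is a placeholder, and the Adam settings need no explicit flags because they are the Trainer defaults:

```python
from transformers import TrainingArguments

# output_dir is a placeholder. Adam with betas=(0.9, 0.999) and epsilon=1e-08
# is the default optimizer of transformers' Trainer, so it is not set here.
args = TrainingArguments(
    output_dir="chinese-address-ner",
    learning_rate=2e-5,
    per_device_train_batch_size=50,
    per_device_eval_batch_size=50,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
)
```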

Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 2.5055 | 1.0 | 7 | 1.6719 | 0.1977 | 0.2604 | 0.2248 | 0.5649 |
| 1.837 | 2.0 | 14 | 1.0719 | 0.4676 | 0.6 | 0.5256 | 0.7421 |
| 1.0661 | 3.0 | 21 | 0.7306 | 0.6266 | 0.7472 | 0.6816 | 0.8106 |
| 0.8373 | 4.0 | 28 | 0.5197 | 0.6456 | 0.8113 | 0.7191 | 0.8614 |
| 0.522 | 5.0 | 35 | 0.3830 | 0.7667 | 0.8679 | 0.8142 | 0.9001 |
| 0.4295 | 6.0 | 42 | 0.3104 | 0.8138 | 0.8906 | 0.8505 | 0.9178 |
| 0.3483 | 7.0 | 49 | 0.2453 | 0.8462 | 0.9132 | 0.8784 | 0.9404 |
| 0.2471 | 8.0 | 56 | 0.2081 | 0.8403 | 0.9132 | 0.8752 | 0.9428 |
| 0.2299 | 9.0 | 63 | 0.1979 | 0.8419 | 0.9245 | 0.8813 | 0.9420 |
| 0.1761 | 10.0 | 70 | 0.1823 | 0.8830 | 0.9396 | 0.9104 | 0.9500 |
| 0.1434 | 11.0 | 77 | 0.1480 | 0.9036 | 0.9547 | 0.9284 | 0.9629 |
| 0.134 | 12.0 | 84 | 0.1341 | 0.9173 | 0.9623 | 0.9392 | 0.9678 |
| 0.128 | 13.0 | 91 | 0.1365 | 0.9375 | 0.9623 | 0.9497 | 0.9694 |
| 0.0824 | 14.0 | 98 | 0.1159 | 0.9557 | 0.9774 | 0.9664 | 0.9734 |
| 0.0744 | 15.0 | 105 | 0.1092 | 0.9591 | 0.9736 | 0.9663 | 0.9766 |
| 0.0569 | 16.0 | 112 | 0.1117 | 0.9556 | 0.9736 | 0.9645 | 0.9742 |
| 0.0559 | 17.0 | 119 | 0.1040 | 0.9628 | 0.9774 | 0.9700 | 0.9790 |
| 0.0456 | 18.0 | 126 | 0.1052 | 0.9593 | 0.9774 | 0.9682 | 0.9782 |
| 0.0405 | 19.0 | 133 | 0.1133 | 0.9590 | 0.9698 | 0.9644 | 0.9718 |
| 0.0315 | 20.0 | 140 | 0.1060 | 0.9591 | 0.9736 | 0.9663 | 0.9750 |
| 0.0262 | 21.0 | 147 | 0.1087 | 0.9554 | 0.9698 | 0.9625 | 0.9718 |
| 0.0338 | 22.0 | 154 | 0.1183 | 0.9625 | 0.9698 | 0.9662 | 0.9726 |
| 0.0225 | 23.0 | 161 | 0.1080 | 0.9664 | 0.9774 | 0.9719 | 0.9758 |
| 0.028 | 24.0 | 168 | 0.1057 | 0.9591 | 0.9736 | 0.9663 | 0.9742 |
| 0.0202 | 25.0 | 175 | 0.1062 | 0.9628 | 0.9774 | 0.9700 | 0.9766 |
| 0.0168 | 26.0 | 182 | 0.1097 | 0.9664 | 0.9774 | 0.9719 | 0.9758 |
| 0.0173 | 27.0 | 189 | 0.1093 | 0.9628 | 0.9774 | 0.9700 | 0.9774 |
| 0.0151 | 28.0 | 196 | 0.1162 | 0.9628 | 0.9774 | 0.9700 | 0.9766 |
| 0.0135 | 29.0 | 203 | 0.1126 | 0.9483 | 0.9698 | 0.9590 | 0.9758 |
| 0.0179 | 30.0 | 210 | 0.1100 | 0.9449 | 0.9698 | 0.9572 | 0.9774 |
| 0.0161 | 31.0 | 217 | 0.1098 | 0.9449 | 0.9698 | 0.9572 | 0.9766 |
| 0.0158 | 32.0 | 224 | 0.1191 | 0.9483 | 0.9698 | 0.9590 | 0.9734 |
| 0.0151 | 33.0 | 231 | 0.1058 | 0.9483 | 0.9698 | 0.9590 | 0.9750 |
| 0.0121 | 34.0 | 238 | 0.0990 | 0.9593 | 0.9774 | 0.9682 | 0.9790 |
| 0.0092 | 35.0 | 245 | 0.1128 | 0.9519 | 0.9698 | 0.9607 | 0.9774 |
| 0.0097 | 36.0 | 252 | 0.1181 | 0.9627 | 0.9736 | 0.9681 | 0.9766 |
| 0.0118 | 37.0 | 259 | 0.1185 | 0.9591 | 0.9736 | 0.9663 | 0.9782 |
| 0.0118 | 38.0 | 266 | 0.1021 | 0.9557 | 0.9774 | 0.9664 | 0.9823 |
| 0.0099 | 39.0 | 273 | 0.1000 | 0.9559 | 0.9811 | 0.9683 | 0.9815 |
| 0.0102 | 40.0 | 280 | 0.1025 | 0.9559 | 0.9811 | 0.9683 | 0.9815 |
| 0.0068 | 41.0 | 287 | 0.1080 | 0.9522 | 0.9774 | 0.9646 | 0.9807 |
| 0.0105 | 42.0 | 294 | 0.1157 | 0.9449 | 0.9698 | 0.9572 | 0.9766 |
| 0.0083 | 43.0 | 301 | 0.1207 | 0.9380 | 0.9698 | 0.9536 | 0.9766 |
| 0.0077 | 44.0 | 308 | 0.1208 | 0.9483 | 0.9698 | 0.9590 | 0.9766 |
| 0.0077 | 45.0 | 315 | 0.1176 | 0.9483 | 0.9698 | 0.9590 | 0.9774 |
| 0.0071 | 46.0 | 322 | 0.1137 | 0.9483 | 0.9698 | 0.9590 | 0.9790 |
| 0.0075 | 47.0 | 329 | 0.1144 | 0.9483 | 0.9698 | 0.9590 | 0.9782 |
| 0.0084 | 48.0 | 336 | 0.1198 | 0.9483 | 0.9698 | 0.9590 | 0.9766 |
| 0.0103 | 49.0 | 343 | 0.1217 | 0.9519 | 0.9698 | 0.9607 | 0.9766 |
| 0.0087 | 50.0 | 350 | 0.1230 | 0.9519 | 0.9698 | 0.9607 | 0.9766 |

Framework versions

  • Transformers 4.8.2
  • Pytorch 1.8.0
  • Datasets 1.9.0
  • Tokenizers 0.10.3