---
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: bert-base-chinese-david-ner
  results: []
---

# bert-base-chinese-david-ner

This model is a fine-tuned version of [bert-base-chinese](https://huggingface.co/bert-base-chinese) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0588
- Precision: 0.9447
- Recall: 0.9479
- F1: 0.9463
- Accuracy: 0.9861

An illustrative inference sketch is included at the end of this card.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch mirroring these values appears at the end of this card):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 4

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.9539 | 0.09 | 100  | 0.4089 | 0.2768 | 0.2401 | 0.2571 | 0.8587 |
| 0.2552 | 0.18 | 200  | 0.1580 | 0.7110 | 0.7746 | 0.7415 | 0.9493 |
| 0.1586 | 0.27 | 300  | 0.1327 | 0.7357 | 0.8448 | 0.7865 | 0.9569 |
| 0.1322 | 0.35 | 400  | 0.1132 | 0.7820 | 0.8652 | 0.8215 | 0.9637 |
| 0.1171 | 0.44 | 500  | 0.1135 | 0.8522 | 0.8618 | 0.8570 | 0.9681 |
| 0.1    | 0.53 | 600  | 0.1008 | 0.8280 | 0.8720 | 0.8494 | 0.9685 |
| 0.0925 | 0.62 | 700  | 0.0983 | 0.8519 | 0.8924 | 0.8717 | 0.9730 |
| 0.0932 | 0.71 | 800  | 0.0720 | 0.8775 | 0.9003 | 0.8888 | 0.9771 |
| 0.0846 | 0.8  | 900  | 0.0754 | 0.8879 | 0.9060 | 0.8969 | 0.9766 |
| 0.0719 | 0.88 | 1000 | 0.0834 | 0.8713 | 0.8969 | 0.8839 | 0.9766 |
| 0.0854 | 0.97 | 1100 | 0.0710 | 0.8970 | 0.9173 | 0.9071 | 0.9795 |
| 0.0582 | 1.06 | 1200 | 0.0900 | 0.8760 | 0.8958 | 0.8858 | 0.9736 |
| 0.0487 | 1.15 | 1300 | 0.0880 | 0.9027 | 0.9241 | 0.9133 | 0.9795 |
| 0.0554 | 1.24 | 1400 | 0.0612 | 0.9132 | 0.9298 | 0.9214 | 0.9822 |
| 0.0423 | 1.33 | 1500 | 0.0686 | 0.8958 | 0.9253 | 0.9103 | 0.9805 |
| 0.0532 | 1.41 | 1600 | 0.0663 | 0.9070 | 0.9275 | 0.9171 | 0.9812 |
| 0.0433 | 1.5  | 1700 | 0.0644 | 0.9270 | 0.9343 | 0.9306 | 0.9833 |
| 0.0486 | 1.59 | 1800 | 0.0613 | 0.9099 | 0.9264 | 0.9181 | 0.9820 |
| 0.0528 | 1.68 | 1900 | 0.0601 | 0.9201 | 0.9264 | 0.9233 | 0.9842 |
| 0.0382 | 1.77 | 2000 | 0.0667 | 0.9172 | 0.9287 | 0.9229 | 0.9835 |
| 0.0517 | 1.86 | 2100 | 0.0607 | 0.9260 | 0.9354 | 0.9307 | 0.9835 |
| 0.0455 | 1.94 | 2200 | 0.0591 | 0.9147 | 0.9354 | 0.9250 | 0.9830 |
| 0.0377 | 2.03 | 2300 | 0.0679 | 0.9238 | 0.9332 | 0.9285 | 0.9828 |
| 0.0239 | 2.12 | 2400 | 0.0604 | 0.9246 | 0.9445 | 0.9345 | 0.9851 |
| 0.0237 | 2.21 | 2500 | 0.0700 | 0.9233 | 0.9411 | 0.9321 | 0.9838 |
| 0.0233 | 2.3  | 2600 | 0.0639 | 0.9201 | 0.9388 | 0.9294 | 0.9835 |
| 0.0196 | 2.39 | 2700 | 0.0589 | 0.9352 | 0.9479 | 0.9415 | 0.9864 |
| 0.0259 | 2.47 | 2800 | 0.0617 | 0.9337 | 0.9411 | 0.9374 | 0.9856 |
| 0.0244 | 2.56 | 2900 | 0.0609 | 0.9379 | 0.9411 | 0.9395 | 0.9855 |
| 0.0231 | 2.65 | 3000 | 0.0653 | 0.9383 | 0.9479 | 0.9431 | 0.9859 |
| 0.0326 | 2.74 | 3100 | 0.0588 | 0.9447 | 0.9479 | 0.9463 | 0.9861 |
| 0.0313 | 2.83 | 3200 | 0.0552 | 0.9446 | 0.9456 | 0.9451 | 0.9871 |
| 0.0227 | 2.92 | 3300 | 0.0517 | 0.9394 | 0.9479 | 0.9436 | 0.9871 |
| 0.0244 | 3.0  | 3400 | 0.0588 | 0.9259 | 0.9479 | 0.9368 | 0.9855 |
| 0.0205 | 3.09 | 3500 | 0.0607 | 0.9224 | 0.9422 | 0.9322 | 0.9857 |
| 0.0181 | 3.18 | 3600 | 0.0601 | 0.9266 | 0.9434 | 0.9349 | 0.9856 |
| 0.0097 | 3.27 | 3700 | 0.0649 | 0.9360 | 0.9434 | 0.9397 | 0.9854 |
| 0.0137 | 3.36 | 3800 | 0.0662 | 0.9372 | 0.9468 | 0.9420 | 0.9851 |
| 0.0131 | 3.45 | 3900 | 0.0657 | 0.9353 | 0.9502 | 0.9427 | 0.9858 |
| 0.0119 | 3.53 | 4000 | 0.0639 | 0.9373 | 0.9479 | 0.9426 | 0.9860 |
| 0.0189 | 3.62 | 4100 | 0.0625 | 0.9371 | 0.9456 | 0.9414 | 0.9858 |
| 0.0179 | 3.71 | 4200 | 0.0609 | 0.9385 | 0.9502 | 0.9443 | 0.9860 |
| 0.0111 | 3.8  | 4300 | 0.0609 | 0.9362 | 0.9479 | 0.9420 | 0.9864 |
| 0.0102 | 3.89 | 4400 | 0.0607 | 0.9383 | 0.9479 | 0.9431 | 0.9860 |
| 0.0166 | 3.98 | 4500 | 0.0606 | 0.9395 | 0.9490 | 0.9442 | 0.9859 |

### Framework versions

- Transformers 4.29.0.dev0
- Pytorch 1.10.1+cu113
- Datasets 2.11.0
- Tokenizers 0.13.3
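
This card does not include a usage snippet, so below is a minimal, illustrative sketch of token-classification inference with the `transformers` pipeline. The model id `bert-base-chinese-david-ner` (local path or Hub repo) and the example sentence are assumptions; the entity label set is not documented in this card.

```python
from transformers import pipeline

# Assumption: the fine-tuned checkpoint is available locally or on the Hub
# under the name "bert-base-chinese-david-ner"; adjust the path/id as needed.
ner = pipeline(
    "token-classification",
    model="bert-base-chinese-david-ner",
    aggregation_strategy="simple",  # merge sub-word pieces into entity spans
)

# Illustrative Chinese sentence; the actual entity types depend on the
# (undocumented) label set used for fine-tuning.
print(ner("王小明在北京大学读书。"))
```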
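
For reference, the hyperparameters listed under "Training hyperparameters" could be expressed as a `TrainingArguments` configuration roughly as follows. This is a minimal sketch under stated assumptions, not the author's actual training script: the label count, dataset variables, and output directory are placeholders, and the listed Adam betas and epsilon match the `transformers` defaults, so they are left implicit.

```python
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Sketch only: the label count and datasets are placeholders, since the
# training data behind this card is not documented.
base_model = "bert-base-chinese"
num_labels = 7  # assumption, e.g. O plus B-/I- tags for PER, ORG, LOC

tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForTokenClassification.from_pretrained(base_model, num_labels=num_labels)

args = TrainingArguments(
    output_dir="bert-base-chinese-david-ner",  # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=4,
)

# trainer = Trainer(
#     model=model,
#     args=args,
#     train_dataset=train_dataset,  # placeholder: tokenized, label-aligned NER dataset
#     eval_dataset=eval_dataset,    # placeholder
#     tokenizer=tokenizer,
# )
# trainer.train()
```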