---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: bert-finetuned-ner
  results: []
---

# bert-finetuned-ner

This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.9029

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
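For reference, these settings correspond roughly to the `TrainingArguments` sketch below. This is a minimal reconstruction, not the exact training script: the dataset, label set, and output directory are not recorded in this card, so the values marked as placeholders are assumptions. The Adam betas and epsilon listed above are the `Trainer` defaults and need not be set explicitly.

```python
# Minimal sketch of a Trainer setup matching the hyperparameters above.
# Values marked "placeholder" are assumptions not recorded in this card.
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

checkpoint = "bert-base-cased"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForTokenClassification.from_pretrained(
    checkpoint,
    num_labels=9,  # placeholder: the card does not state the tag set size
)

args = TrainingArguments(
    output_dir="bert-finetuned-ner",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    evaluation_strategy="epoch",  # consistent with the per-epoch losses below
)

# trainer = Trainer(
#     model=model,
#     args=args,
#     train_dataset=...,  # not recorded in this card
#     eval_dataset=...,   # not recorded in this card
#     tokenizer=tokenizer,
# )
# trainer.train()
```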
### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log        | 1.0   | 71   | 8.8951          |
| No log        | 2.0   | 142  | 8.8532          |
| No log        | 3.0   | 213  | 8.7305          |
| No log        | 4.0   | 284  | 8.5186          |
| No log        | 5.0   | 355  | 8.3212          |
| No log        | 6.0   | 426  | 8.1309          |
| No log        | 7.0   | 497  | 7.9420          |
| 8.6328        | 8.0   | 568  | 7.7689          |
| 8.6328        | 9.0   | 639  | 7.6050          |
| 8.6328        | 10.0  | 710  | 7.4483          |
| 8.6328        | 11.0  | 781  | 7.2902          |
| 8.6328        | 12.0  | 852  | 7.1421          |
| 8.6328        | 13.0  | 923  | 6.9918          |
| 8.6328        | 14.0  | 994  | 6.8501          |
| 7.5715        | 15.0  | 1065 | 6.7115          |
| 7.5715        | 16.0  | 1136 | 6.5774          |
| 7.5715        | 17.0  | 1207 | 6.4431          |
| 7.5715        | 18.0  | 1278 | 6.3216          |
| 7.5715        | 19.0  | 1349 | 6.1973          |
| 7.5715        | 20.0  | 1420 | 6.0735          |
| 7.5715        | 21.0  | 1491 | 5.9483          |
| 6.6466        | 22.0  | 1562 | 5.8385          |
| 6.6466        | 23.0  | 1633 | 5.7199          |
| 6.6466        | 24.0  | 1704 | 5.6047          |
| 6.6466        | 25.0  | 1775 | 5.5008          |
| 6.6466        | 26.0  | 1846 | 5.3816          |
| 6.6466        | 27.0  | 1917 | 5.2802          |
| 6.6466        | 28.0  | 1988 | 5.1727          |
| 5.8483        | 29.0  | 2059 | 5.0741          |
| 5.8483        | 30.0  | 2130 | 4.9724          |
| 5.8483        | 31.0  | 2201 | 4.8772          |
| 5.8483        | 32.0  | 2272 | 4.7812          |
| 5.8483        | 33.0  | 2343 | 4.6847          |
| 5.8483        | 34.0  | 2414 | 4.5938          |
| 5.8483        | 35.0  | 2485 | 4.5092          |
| 5.1726        | 36.0  | 2556 | 4.4214          |
| 5.1726        | 37.0  | 2627 | 4.3330          |
| 5.1726        | 38.0  | 2698 | 4.2479          |
| 5.1726        | 39.0  | 2769 | 4.1682          |
| 5.1726        | 40.0  | 2840 | 4.0822          |
| 5.1726        | 41.0  | 2911 | 4.0075          |
| 5.1726        | 42.0  | 2982 | 3.9296          |
| 4.5503        | 43.0  | 3053 | 3.8511          |
| 4.5503        | 44.0  | 3124 | 3.7752          |
| 4.5503        | 45.0  | 3195 | 3.7009          |
| 4.5503        | 46.0  | 3266 | 3.6335          |
| 4.5503        | 47.0  | 3337 | 3.5647          |
| 4.5503        | 48.0  | 3408 | 3.4937          |
| 4.5503        | 49.0  | 3479 | 3.4242          |
| 4.009         | 50.0  | 3550 | 3.3637          |
| 4.009         | 51.0  | 3621 | 3.3031          |
| 4.009         | 52.0  | 3692 | 3.2395          |
| 4.009         | 53.0  | 3763 | 3.1766          |
| 4.009         | 54.0  | 3834 | 3.1246          |
| 4.009         | 55.0  | 3905 | 3.0673          |
| 4.009         | 56.0  | 3976 | 3.0090          |
| 3.5477        | 57.0  | 4047 | 2.9559          |
| 3.5477        | 58.0  | 4118 | 2.9098          |
| 3.5477        | 59.0  | 4189 | 2.8603          |
| 3.5477        | 60.0  | 4260 | 2.8132          |
| 3.5477        | 61.0  | 4331 | 2.7637          |
| 3.5477        | 62.0  | 4402 | 2.7219          |
| 3.5477        | 63.0  | 4473 | 2.6734          |
| 3.1822        | 64.0  | 4544 | 2.6286          |
| 3.1822        | 65.0  | 4615 | 2.5867          |
| 3.1822        | 66.0  | 4686 | 2.5491          |
| 3.1822        | 67.0  | 4757 | 2.5076          |
| 3.1822        | 68.0  | 4828 | 2.4720          |
| 3.1822        | 69.0  | 4899 | 2.4349          |
| 3.1822        | 70.0  | 4970 | 2.4026          |
| 2.8647        | 71.0  | 5041 | 2.3661          |
| 2.8647        | 72.0  | 5112 | 2.3343          |
| 2.8647        | 73.0  | 5183 | 2.3081          |
| 2.8647        | 74.0  | 5254 | 2.2751          |
| 2.8647        | 75.0  | 5325 | 2.2461          |
| 2.8647        | 76.0  | 5396 | 2.2165          |
| 2.8647        | 77.0  | 5467 | 2.1938          |
| 2.6306        | 78.0  | 5538 | 2.1681          |
| 2.6306        | 79.0  | 5609 | 2.1438          |
| 2.6306        | 80.0  | 5680 | 2.1232          |
| 2.6306        | 81.0  | 5751 | 2.1022          |
| 2.6306        | 82.0  | 5822 | 2.0800          |
| 2.6306        | 83.0  | 5893 | 2.0605          |
| 2.6306        | 84.0  | 5964 | 2.0427          |
| 2.4436        | 85.0  | 6035 | 2.0273          |
| 2.4436        | 86.0  | 6106 | 2.0109          |
| 2.4436        | 87.0  | 6177 | 1.9975          |
| 2.4436        | 88.0  | 6248 | 1.9827          |
| 2.4436        | 89.0  | 6319 | 1.9700          |
| 2.4436        | 90.0  | 6390 | 1.9584          |
| 2.4436        | 91.0  | 6461 | 1.9502          |
| 2.3168        | 92.0  | 6532 | 1.9394          |
| 2.3168        | 93.0  | 6603 | 1.9307          |
| 2.3168        | 94.0  | 6674 | 1.9221          |
| 2.3168        | 95.0  | 6745 | 1.9167          |
| 2.3168        | 96.0  | 6816 | 1.9122          |
| 2.3168        | 97.0  | 6887 | 1.9078          |
| 2.3168        | 98.0  | 6958 | 1.9055          |
| 2.2563        | 99.0  | 7029 | 1.9039          |
| 2.2563        | 100.0 | 7100 | 1.9029          |

### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1+cu116
- Datasets 2.10.1
- Tokenizers 0.13.2
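## How to use

The snippet below is a minimal sketch of loading this checkpoint with the `token-classification` pipeline. The repository id is a placeholder (adjust it to wherever the weights are actually hosted), and given the final validation loss of 1.9029 reported above, prediction quality has not been established.

```python
# Minimal inference sketch; the model id below is a placeholder and should
# point at the repository where this checkpoint is actually hosted.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="bert-finetuned-ner",      # placeholder repository id
    aggregation_strategy="simple",   # merge sub-word pieces into whole entities
)

print(ner("Hugging Face is based in New York City."))
```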