---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: distilbert-base-uncased-finetuned-ner
  results: []
---

# distilbert-base-uncased-finetuned-ner

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1877
- Precision: 0.9674
- Recall: 0.9649
- F1: 0.9662
- Accuracy: 0.9754

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.1465        | 1.0   | 4367  | 0.1766          | 0.9258    | 0.9374 | 0.9316 | 0.9498   |
| 0.0851        | 2.0   | 8734  | 0.1438          | 0.9421    | 0.9499 | 0.9460 | 0.9635   |
| 0.0474        | 3.0   | 13101 | 0.1626          | 0.9507    | 0.9499 | 0.9503 | 0.9695   |
| 0.0316        | 4.0   | 17468 | 0.1539          | 0.9535    | 0.9591 | 0.9563 | 0.9693   |
| 0.0197        | 5.0   | 21835 | 0.1689          | 0.9568    | 0.9616 | 0.9592 | 0.9664   |
| 0.0143        | 6.0   | 26202 | 0.1967          | 0.9632    | 0.9608 | 0.9620 | 0.9689   |
| 0.0111        | 7.0   | 30569 | 0.1872          | 0.9552    | 0.9608 | 0.9580 | 0.9677   |
| 0.0158        | 8.0   | 34936 | 0.1921          | 0.9619    | 0.9683 | 0.9651 | 0.9733   |
| 0.0066        | 9.0   | 39303 | 0.2145          | 0.9594    | 0.9674 | 0.9634 | 0.9704   |
| 0.0056        | 10.0  | 43670 | 0.1856          | 0.9617    | 0.9633 | 0.9625 | 0.9738   |
| 0.0031        | 11.0  | 48037 | 0.1561          | 0.9657    | 0.9649 | 0.9653 | 0.9792   |
| 0.0029        | 12.0  | 52404 | 0.1850          | 0.9683    | 0.9683 | 0.9683 | 0.9788   |
| 0.0034        | 13.0  | 56771 | 0.2072          | 0.9640    | 0.9616 | 0.9628 | 0.9723   |
| 0.0015        | 14.0  | 61138 | 0.2165          | 0.9641    | 0.9641 | 0.9641 | 0.9719   |
| 0.0011        | 15.0  | 65505 | 0.1877          | 0.9674    | 0.9649 | 0.9662 | 0.9754   |

### Framework versions

- Transformers 4.28.0
- Pytorch 2.0.1
- Datasets 2.12.0
- Tokenizers 0.13.3
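
The card does not yet document usage, but since this is a DistilBERT token-classification (NER) checkpoint, a minimal inference sketch with the 🤗 Transformers pipeline might look like the following. The model identifier below is a placeholder for wherever this checkpoint is hosted, and the entity label set depends on the (unknown) training dataset.

```python
from transformers import pipeline

# Placeholder model id: replace with the actual repo id or local path of this checkpoint.
model_id = "your-username/distilbert-base-uncased-finetuned-ner"

# Token-classification pipeline; aggregation_strategy="simple" merges sub-word
# pieces back into whole-entity spans.
ner = pipeline("token-classification", model=model_id, aggregation_strategy="simple")

print(ner("Hugging Face is based in New York City."))
# -> list of dicts with entity_group, score, word, start, end
#    (the actual labels depend on the dataset this model was fine-tuned on)
```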
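
As an illustration of how the hyperparameters listed above map onto the 🤗 Transformers `TrainingArguments` API (4.28-era), a hedged sketch is shown below. The output directory and the per-epoch evaluation strategy are assumptions, not part of this card.

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments matching the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="distilbert-base-uncased-finetuned-ner",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=15,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # assumed from the per-epoch results table
)
```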