---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: distilBERT_without_preprocessing_grid_search
  results: []
---

# distilBERT_without_preprocessing_grid_search

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6731
- Precision: 0.8400
- Recall: 0.8427
- F1: 0.8407
- Accuracy: 0.8779

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 1.0   | 257  | 0.6542          | 0.7446    | 0.8052 | 0.7657 | 0.8350   |
| 0.8635        | 2.0   | 514  | 0.5548          | 0.7961    | 0.8277 | 0.8056 | 0.8540   |
| 0.8635        | 3.0   | 771  | 0.4839          | 0.7912    | 0.8427 | 0.8115 | 0.8589   |
| 0.3097        | 4.0   | 1028 | 0.5256          | 0.8148    | 0.8544 | 0.8315 | 0.8667   |
| 0.3097        | 5.0   | 1285 | 0.5657          | 0.8346    | 0.8494 | 0.8413 | 0.8764   |
| 0.1839        | 6.0   | 1542 | 0.6005          | 0.8208    | 0.8430 | 0.8304 | 0.8710   |
| 0.1839        | 7.0   | 1799 | 0.6580          | 0.8319    | 0.8349 | 0.8314 | 0.8706   |
| 0.1254        | 8.0   | 2056 | 0.6348          | 0.8342    | 0.8515 | 0.8423 | 0.8774   |
| 0.1254        | 9.0   | 2313 | 0.6601          | 0.8314    | 0.8394 | 0.8348 | 0.8745   |
| 0.0935        | 10.0  | 2570 | 0.6731          | 0.8400    | 0.8427 | 0.8407 | 0.8779   |

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3
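
### Reproducing the training setup

For readers who want to reproduce this configuration, the hyperparameters listed above map onto a 🤗 `Trainer` setup roughly as follows. This is a minimal sketch, not the original training script: the task head (`AutoModelForTokenClassification`), `num_labels`, and the `train_dataset`/`eval_dataset` objects are assumptions, since the card does not document the data or the task.

```python
from transformers import (
    AutoModelForTokenClassification,  # assumed task; the card does not state it
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForTokenClassification.from_pretrained(
    "distilbert-base-uncased",
    num_labels=num_labels,  # placeholder: the label count is not documented
)

# Mirrors the hyperparameters listed under "Training hyperparameters".
args = TrainingArguments(
    output_dir="distilBERT_without_preprocessing_grid_search",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    evaluation_strategy="epoch",  # the results table reports metrics once per epoch
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,  # placeholder: training data is not documented
    eval_dataset=eval_dataset,    # placeholder: evaluation data is not documented
    tokenizer=tokenizer,
)
trainer.train()
```

Note that the Adam betas/epsilon and the linear scheduler are the `transformers` defaults; they are spelled out here only to match the list above.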
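
## Example usage

Once the checkpoint is published, it can be loaded for inference along these lines. Both the `token-classification` task and the model id are assumptions, inferred from the metric set (precision/recall/F1/accuracy is the usual seqeval combination) and the model name; adjust them to the actual task and repository path.

```python
from transformers import pipeline

# Task and model id are assumptions; prefix the model id with the user or
# organization namespace under which the checkpoint is actually hosted.
classifier = pipeline(
    "token-classification",
    model="distilBERT_without_preprocessing_grid_search",
    aggregation_strategy="simple",  # merge sub-word pieces into whole-word predictions
)

print(classifier("Hugging Face was founded in New York City."))
```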