---
license: apache-2.0
base_model: google-bert/bert-base-cased
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: results
  results: []
---

# results

This model is a fine-tuned version of [google-bert/bert-base-cased](https://huggingface.co/google-bert/bert-base-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0462
- Accuracy: 0.565

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 15
- num_epochs: 3

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.6966        | 0.12  | 30   | 1.6482          | 0.205    |
| 1.6111        | 0.24  | 60   | 1.5500          | 0.312    |
| 1.5634        | 0.36  | 90   | 1.4253          | 0.391    |
| 1.4389        | 0.48  | 120  | 1.2866          | 0.429    |
| 1.3507        | 0.6   | 150  | 1.2292          | 0.466    |
| 1.3072        | 0.72  | 180  | 1.2251          | 0.435    |
| 1.2346        | 0.84  | 210  | 1.3498          | 0.412    |
| 1.3884        | 0.96  | 240  | 1.1693          | 0.515    |
| 1.0748        | 1.08  | 270  | 1.2255          | 0.474    |
| 1.02          | 1.2   | 300  | 1.2691          | 0.475    |
| 1.0354        | 1.32  | 330  | 1.1937          | 0.48     |
| 1.0622        | 1.44  | 360  | 1.1304          | 0.512    |
| 1.0289        | 1.56  | 390  | 1.2823          | 0.465    |
| 1.1433        | 1.68  | 420  | 1.0603          | 0.527    |
| 1.0125        | 1.8   | 450  | 1.0753          | 0.522    |
| 0.8716        | 1.92  | 480  | 1.0901          | 0.532    |
| 0.8761        | 2.04  | 510  | 1.0462          | 0.565    |
| 0.6857        | 2.16  | 540  | 1.0626          | 0.555    |
| 0.7674        | 2.28  | 570  | 1.0799          | 0.545    |
| 0.6676        | 2.4   | 600  | 1.0843          | 0.546    |
| 0.6254        | 2.52  | 630  | 1.1148          | 0.551    |
| 0.6813        | 2.64  | 660  | 1.1227          | 0.553    |
| 0.7043        | 2.76  | 690  | 1.1267          | 0.558    |
| 0.4643        | 2.88  | 720  | 1.1227          | 0.551    |
| 0.6665        | 3.0   | 750  | 1.1222          | 0.557    |

### Framework versions

- Transformers 4.42.3
- Pytorch 2.3.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1