---
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: scenario-KD-SCR-DIV2-data-glue-qnli-model-bert-base-uncased-run-1
  results: []
---

# scenario-KD-SCR-DIV2-data-glue-qnli-model-bert-base-uncased-run-1

This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the GLUE QNLI dataset.
It achieves the following results on the evaluation set:
- Loss: 1.7514
- Accuracy: 0.8627

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an illustrative `TrainingArguments` sketch of this configuration appears at the end of this card):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 6969

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 2.4263        | 1.0   | 3273  | 1.6907          | 0.8545   |
| 1.7748        | 2.0   | 6547  | 1.8491          | 0.8499   |
| 1.1414        | 3.0   | 9820  | 1.9422          | 0.8545   |
| 0.8965        | 4.0   | 13094 | 1.7533          | 0.8552   |
| 0.7756        | 5.0   | 16367 | 1.7103          | 0.8570   |
| 0.6527        | 6.0   | 19641 | 1.6665          | 0.8569   |
| 0.6056        | 7.0   | 22914 | 1.5879          | 0.8620   |
| 0.5559        | 8.0   | 26188 | 1.6570          | 0.8618   |
| 0.5154        | 9.0   | 29461 | 1.5519          | 0.8658   |
| 0.4752        | 10.0  | 32735 | 1.6905          | 0.8612   |
| 0.4581        | 11.0  | 36008 | 1.6075          | 0.8644   |
| 0.4322        | 12.0  | 39282 | 1.6963          | 0.8614   |
| 0.3969        | 13.0  | 42555 | 1.6467          | 0.8660   |
| 0.393         | 14.0  | 45829 | 1.6735          | 0.8680   |
| 0.3651        | 15.0  | 49102 | 1.7631          | 0.8614   |
| 0.3464        | 16.0  | 52376 | 1.7957          | 0.8645   |
| 0.3455        | 17.0  | 55649 | 1.7008          | 0.8680   |
| 0.3276        | 18.0  | 58923 | 1.7183          | 0.8669   |
| 0.3239        | 19.0  | 62196 | 1.7514          | 0.8627   |

### Framework versions

- Transformers 4.36.2
- Pytorch 2.1.2
- Datasets 2.16.0
- Tokenizers 0.15.0
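
### Configuration sketch

The hyperparameter list under "Training hyperparameters" maps roughly onto the following `TrainingArguments`. This is an illustrative reconstruction, not the original training script; the output directory, evaluation strategy, and save strategy are assumptions.

```python
from transformers import TrainingArguments

# Sketch of the configuration listed under "Training hyperparameters".
# output_dir, evaluation_strategy, and save_strategy are assumptions, not taken from the card.
training_args = TrainingArguments(
    output_dir="scenario-KD-SCR-DIV2-data-glue-qnli-model-bert-base-uncased-run-1",
    learning_rate=5e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=2,   # gives the total train batch size of 32
    num_train_epochs=6969,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    evaluation_strategy="epoch",     # assumption: per-epoch evaluation, matching the results table
    save_strategy="epoch",           # assumption
)
```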
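
### Usage sketch

For inference, the checkpoint can be loaded with the standard `transformers` sequence-classification API. QNLI is a question/sentence pair task (does the sentence answer the question?). The repository id below is assumed to match the model name above; adjust it to wherever the weights are actually hosted.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed repo id: replace with the actual path where this checkpoint is hosted.
model_id = "scenario-KD-SCR-DIV2-data-glue-qnli-model-bert-base-uncased-run-1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# QNLI pairs a question with a candidate answer sentence.
question = "What is the capital of France?"
sentence = "Paris is the capital and most populous city of France."

inputs = tokenizer(question, sentence, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# Label names may be generic (LABEL_0 / LABEL_1) unless id2label was set during training.
predicted_class = logits.argmax(dim=-1).item()
print(model.config.id2label[predicted_class])
```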