---
language:
- en
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
- f1
model-index:
- name: first_try
  results:
  - task:
      name: Text Classification
      type: text-classification
    dataset:
      name: GLUE QQP
      type: glue
      config: qqp
      split: validation
      args: qqp
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.9094978976007915
    - name: F1
      type: f1
      value: 0.8781916841439461
---

# first_try

This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the GLUE QQP dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2975
- Accuracy: 0.9095
- F1: 0.8782
- Combined Score: 0.8938

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 6
- mixed_precision_training: Native AMP
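If you want to reproduce this setup with the `Trainer` API, the sketch below shows one plausible mapping of the listed hyperparameters onto `transformers.TrainingArguments`. The `output_dir` and `evaluation_strategy` values are assumptions, not recorded in this card, and the model/dataset wiring around the arguments is left out.

```python
from transformers import TrainingArguments

# A hypothetical reconstruction of the settings listed above; output_dir and
# evaluation_strategy are assumptions, not recorded in this card.
training_args = TrainingArguments(
    output_dir="first_try",
    learning_rate=2e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=128,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=6,
    fp16=True,  # corresponds to "Native AMP" mixed-precision training
    evaluation_strategy="epoch",
)
```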
### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy | F1     | Combined Score |   |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:------:|:--------------:|:--|
| 0.3347        | 1.0   | 11371 | 0.2781          | 0.8986   | 0.8645 | 0.8816         | {0: 320, 1: 256, 2: 320, 3: 192, 4: 256, 5: 256, 6: 192, 7: 256, 8: 64, 9: 192, 10: 192, 11: 512, 12: 1675, 13: 1666, 14: 1787, 15: 1791, 16: 1772, 17: 1751, 18: 1709, 19: 1590, 20: 1320, 21: 762, 22: 348, 23: 115} |
| 0.3347        | 1.0   | 11371 | 0.2633          | 0.9022   | 0.8709 | 0.8865         | {0: 768, 1: 768, 2: 768, 3: 768, 4: 768, 5: 768, 6: 768, 7: 768, 8: 768, 9: 768, 10: 768, 11: 768, 12: 3072, 13: 3072, 14: 3072, 15: 3072, 16: 3072, 17: 3072, 18: 3072, 19: 3072, 20: 3072, 21: 3072, 22: 3072, 23: 3072} |
| 0.1664        | 2.0   | 22742 | 0.2724          | 0.9048   | 0.8736 | 0.8892         | {0: 320, 1: 256, 2: 320, 3: 192, 4: 256, 5: 256, 6: 192, 7: 256, 8: 64, 9: 192, 10: 192, 11: 512, 12: 1675, 13: 1666, 14: 1787, 15: 1791, 16: 1772, 17: 1751, 18: 1709, 19: 1590, 20: 1320, 21: 762, 22: 348, 23: 115} |
| 0.1664        | 2.0   | 22742 | 0.2665          | 0.9106   | 0.8809 | 0.8958         | {0: 768, 1: 768, 2: 768, 3: 768, 4: 768, 5: 768, 6: 768, 7: 768, 8: 768, 9: 768, 10: 768, 11: 768, 12: 3072, 13: 3072, 14: 3072, 15: 3072, 16: 3072, 17: 3072, 18: 3072, 19: 3072, 20: 3072, 21: 3072, 22: 3072, 23: 3072} |
| 0.092         | 3.0   | 34113 | 0.2872          | 0.9094   | 0.8786 | 0.8940         | {0: 320, 1: 256, 2: 320, 3: 192, 4: 256, 5: 256, 6: 192, 7: 256, 8: 64, 9: 192, 10: 192, 11: 512, 12: 1675, 13: 1666, 14: 1787, 15: 1791, 16: 1772, 17: 1751, 18: 1709, 19: 1590, 20: 1320, 21: 762, 22: 348, 23: 115} |
| 0.092         | 3.0   | 34113 | 0.2708          | 0.9141   | 0.8846 | 0.8994         | {0: 768, 1: 768, 2: 768, 3: 768, 4: 768, 5: 768, 6: 768, 7: 768, 8: 768, 9: 768, 10: 768, 11: 768, 12: 3072, 13: 3072, 14: 3072, 15: 3072, 16: 3072, 17: 3072, 18: 3072, 19: 3072, 20: 3072, 21: 3072, 22: 3072, 23: 3072} |
| 0.0693        | 4.0   | 45484 | 0.2966          | 0.9088   | 0.8771 | 0.8930         | {0: 320, 1: 256, 2: 320, 3: 192, 4: 256, 5: 256, 6: 192, 7: 256, 8: 64, 9: 192, 10: 192, 11: 512, 12: 1675, 13: 1666, 14: 1787, 15: 1791, 16: 1772, 17: 1751, 18: 1709, 19: 1590, 20: 1320, 21: 762, 22: 348, 23: 115} |
| 0.0693        | 4.0   | 45484 | 0.2779          | 0.9144   | 0.8846 | 0.8995         | {0: 768, 1: 768, 2: 768, 3: 768, 4: 768, 5: 768, 6: 768, 7: 768, 8: 768, 9: 768, 10: 768, 11: 768, 12: 3072, 13: 3072, 14: 3072, 15: 3072, 16: 3072, 17: 3072, 18: 3072, 19: 3072, 20: 3072, 21: 3072, 22: 3072, 23: 3072} |

### Framework versions

- Transformers 4.29.1
- Pytorch 1.12.1
- Datasets 2.13.1
- Tokenizers 0.13.3
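## How to use

A minimal inference sketch for the fine-tuned checkpoint follows. The `model_id` value is an assumption (point it at this repository's hub id or a local checkout); QQP is a question-pair paraphrase task, so the model takes two questions and predicts whether they are duplicates.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# "first_try" is assumed to be this repository's id or a local checkpoint path.
model_id = "first_try"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# QQP is a question-pair task: score whether the two questions are duplicates.
question1 = "How do I learn Python quickly?"
question2 = "What is the fastest way to learn Python?"
inputs = tokenizer(question1, question2, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# GLUE QQP label convention: 0 = not duplicate, 1 = duplicate.
prediction = logits.argmax(dim=-1).item()
print("duplicate" if prediction == 1 else "not duplicate")
```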