---
language:
- en
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
- f1
model-index:
- name: mobilebert_sa_GLUE_Experiment_mrpc_256
  results:
  - task:
      name: Text Classification
      type: text-classification
    dataset:
      name: GLUE MRPC
      type: glue
      config: mrpc
      split: validation
      args: mrpc
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.6911764705882353
    - name: F1
      type: f1
      value: 0.7947882736156351
---

# mobilebert_sa_GLUE_Experiment_mrpc_256

This model is a fine-tuned version of [google/mobilebert-uncased](https://huggingface.co/google/mobilebert-uncased) on the GLUE MRPC dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6111
- Accuracy: 0.6912
- F1: 0.7948
- Combined Score: 0.7430

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 128
- eval_batch_size: 128
- seed: 10
- distributed_type: multi-GPU
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     | Combined Score |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:--------------:|
| 0.6431        | 1.0   | 29   | 0.6261          | 0.6838   | 0.8122 | 0.7480         |
| 0.6296        | 2.0   | 58   | 0.6235          | 0.6838   | 0.8122 | 0.7480         |
| 0.6306        | 3.0   | 87   | 0.6237          | 0.6838   | 0.8122 | 0.7480         |
| 0.6297        | 4.0   | 116  | 0.6238          | 0.6838   | 0.8122 | 0.7480         |
| 0.6276        | 5.0   | 145  | 0.6207          | 0.6838   | 0.8122 | 0.7480         |
| 0.6197        | 6.0   | 174  | 0.6213          | 0.6838   | 0.8122 | 0.7480         |
| 0.6065        | 7.0   | 203  | 0.6284          | 0.6912   | 0.8043 | 0.7478         |
| 0.5258        | 8.0   | 232  | 0.6111          | 0.6912   | 0.7948 | 0.7430         |
| 0.4596        | 9.0   | 261  | 0.6506          | 0.7034   | 0.8052 | 0.7543         |
| 0.3953        | 10.0  | 290  | 0.7271          | 0.7034   | 0.7932 | 0.7483         |
| 0.3426        | 11.0  | 319  | 0.9509          | 0.6740   | 0.7542 | 0.7141         |
| 0.2821        | 12.0  | 348  | 1.0021          | 0.6863   | 0.7808 | 0.7335         |
| 0.2177        | 13.0  | 377  | 1.0359          | 0.6691   | 0.7676 | 0.7184         |

### Framework versions

- Transformers 4.26.0
- Pytorch 1.14.0a0+410ce96
- Datasets 2.8.0
- Tokenizers 0.13.2
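
The card lists the hyperparameters but not the training script itself. The sketch below shows one plausible way those settings map onto a `Trainer` run with the framework versions above; it is an assumption, not the exact script used for this checkpoint. In particular, the `max_length=256` choice (suggested by the `_256` suffix in the model name), the combined score being the mean of accuracy and F1, and the fact that the logged run stopped after 13 of the 50 configured epochs (not reproduced here) are all inferred rather than stated in the card.

```python
# Sketch only: an assumed reconstruction of the training setup described above.
import numpy as np
import evaluate
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

raw = load_dataset("glue", "mrpc")
tokenizer = AutoTokenizer.from_pretrained("google/mobilebert-uncased")

def preprocess(batch):
    # MRPC is a sentence-pair task; max_length=256 is an assumption from the model name.
    return tokenizer(batch["sentence1"], batch["sentence2"],
                     truncation=True, max_length=256)

tokenized = raw.map(preprocess, batched=True)
model = AutoModelForSequenceClassification.from_pretrained(
    "google/mobilebert-uncased", num_labels=2)

accuracy = evaluate.load("accuracy")
f1 = evaluate.load("f1")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    acc = accuracy.compute(predictions=preds, references=labels)["accuracy"]
    f1_score = f1.compute(predictions=preds, references=labels)["f1"]
    # Combined score assumed to be the simple mean of accuracy and F1.
    return {"accuracy": acc, "f1": f1_score, "combined_score": (acc + f1_score) / 2}

args = TrainingArguments(
    output_dir="mobilebert_sa_GLUE_Experiment_mrpc_256",
    learning_rate=5e-05,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    seed=10,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    evaluation_strategy="epoch",
    save_strategy="epoch",
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    tokenizer=tokenizer,
    compute_metrics=compute_metrics,
)
trainer.train()
```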
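
For completeness, a minimal inference sketch for the resulting checkpoint. The checkpoint path is an assumption (a local directory or Hub repository named as above); MRPC label 1 corresponds to "equivalent" (paraphrase).

```python
# Sketch only: loading the fine-tuned checkpoint for paraphrase classification.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

checkpoint = "mobilebert_sa_GLUE_Experiment_mrpc_256"  # assumed local path or Hub repo id

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)

sent1 = "The company reported strong quarterly earnings."
sent2 = "Quarterly earnings at the company were strong."

inputs = tokenizer(sent1, sent2, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(dim=-1).item()
print("paraphrase" if pred == 1 else "not paraphrase")
```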