---
library_name: transformers
language:
- en
base_model: gokulsrinivasagan/bert_tiny_lda_20_v1_book
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
- f1
model-index:
- name: bert_tiny_lda_20_v1_book_mrpc
  results:
  - task:
      name: Text Classification
      type: text-classification
    dataset:
      name: GLUE MRPC
      type: glue
      args: mrpc
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.7132352941176471
    - name: F1
      type: f1
      value: 0.8186046511627908
---

# bert_tiny_lda_20_v1_book_mrpc

This model is a fine-tuned version of [gokulsrinivasagan/bert_tiny_lda_20_v1_book](https://huggingface.co/gokulsrinivasagan/bert_tiny_lda_20_v1_book) on the GLUE MRPC dataset. It achieves the following results on the evaluation set:
- Loss: 0.5678
- Accuracy: 0.7132
- F1: 0.8186
- Combined Score: 0.7659 (the mean of accuracy and F1)

## Model description

This checkpoint adds a sequence-classification head to [gokulsrinivasagan/bert_tiny_lda_20_v1_book](https://huggingface.co/gokulsrinivasagan/bert_tiny_lda_20_v1_book) and fine-tunes it for binary paraphrase classification on sentence pairs.

## Intended uses & limitations

The model is intended for MRPC-style paraphrase detection on English sentence pairs. Given the small base model and a validation accuracy of 0.7132, it is better suited to experimentation and benchmarking than to production use.

## Training and evaluation data

The model was fine-tuned and evaluated on the MRPC (Microsoft Research Paraphrase Corpus) task of the GLUE benchmark: sentence pairs labeled as paraphrases (1) or non-paraphrases (0). The 15 optimization steps per epoch at batch size 256 are consistent with MRPC's roughly 3.7k training pairs.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch at the end of this card):
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     | Combined Score |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:--------------:|
| 0.6236        | 1.0   | 15   | 0.5941          | 0.6936   | 0.8056 | 0.7496         |
| 0.5876        | 2.0   | 30   | 0.5767          | 0.7083   | 0.8194 | 0.7639         |
| 0.5497        | 3.0   | 45   | 0.5678          | 0.7132   | 0.8186 | 0.7659         |
| 0.503         | 4.0   | 60   | 0.5999          | 0.7157   | 0.8182 | 0.7669         |
| 0.4554        | 5.0   | 75   | 0.5997          | 0.7157   | 0.8193 | 0.7675         |
| 0.3873        | 6.0   | 90   | 0.6210          | 0.7034   | 0.7763 | 0.7399         |
| 0.3092        | 7.0   | 105  | 0.7192          | 0.7304   | 0.8308 | 0.7806         |
| 0.2575        | 8.0   | 120  | 0.7418          | 0.6593   | 0.7203 | 0.6898         |

Although training was configured for 50 epochs, the log ends at epoch 8, which is consistent with early stopping. The results reported at the top of this card match the epoch-3 checkpoint, which has the lowest validation loss.

### Framework versions

- Transformers 4.46.3
- PyTorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
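
## How to use

The original card does not include a usage snippet. The following is a minimal sketch that assumes the checkpoint is published under this card's model-index name (`gokulsrinivasagan/bert_tiny_lda_20_v1_book_mrpc`) and loads it through the standard `transformers` sequence-classification API; the example sentences are illustrative only:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed repo id, taken from the model-index name on this card.
model_id = "gokulsrinivasagan/bert_tiny_lda_20_v1_book_mrpc"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# MRPC is a sentence-pair task: is the second sentence a paraphrase of the first?
sentence1 = "The company said profits rose sharply last quarter."
sentence2 = "Quarterly profits increased significantly, the company said."

inputs = tokenizer(sentence1, sentence2, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# GLUE MRPC labels: 0 = not a paraphrase, 1 = paraphrase.
pred = logits.argmax(dim=-1).item()
print("paraphrase" if pred == 1 else "not a paraphrase")
```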
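
## Reproducing the training configuration

As a rough sketch, the hyperparameters listed above map onto a `TrainingArguments` object as follows. The `output_dir`, the evaluation/save strategies, and the best-model settings are assumptions (the reported results matching the epoch-3 checkpoint suggests best-model selection with early stopping), not values stated on the original card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert_tiny_lda_20_v1_book_mrpc",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=256,
    per_device_eval_batch_size=256,
    seed=10,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    eval_strategy="epoch",            # the log reports validation metrics once per epoch
    save_strategy="epoch",            # assumed; required for best-model selection
    load_best_model_at_end=True,      # assumed: reported results match the epoch-3 checkpoint
    metric_for_best_model="eval_loss",
    greater_is_better=False,
)
```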
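
## Computing the metrics

The accuracy/F1 pair, and their mean reported as "Combined Score", can be reproduced with the `evaluate` library; a small sketch with dummy predictions:

```python
import evaluate

# The GLUE/MRPC metric reports both accuracy and F1.
metric = evaluate.load("glue", "mrpc")
results = metric.compute(predictions=[1, 0, 1], references=[1, 0, 0])

# This card's "Combined Score" is the mean of the two.
combined = (results["accuracy"] + results["f1"]) / 2
print(results, combined)
```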