---
license: apache-2.0
library_name: peft
tags:
- generated_from_trainer
base_model: google/flan-t5-base
model-index:
- name: flan-t5-base-AR-LORA-V1
  results: []
---

# flan-t5-base-AR-LORA-V1

This model is a LoRA fine-tuned version of [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7342
- Exact Match: 30.9591
- Gen Len: 3.6062

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch using these values appears below, after the framework versions):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Exact Match | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:-----------:|:-------:|
| 0.8221        | 1.0   | 4246  | 0.8269          | 20.5823     | 3.8789  |
| 0.7459        | 2.0   | 8492  | 0.7917          | 26.3525     | 3.6663  |
| 0.7727        | 3.0   | 12738 | 0.7775          | 30.8587     | 3.4217  |
| 0.758         | 4.0   | 16984 | 0.7652          | 26.5946     | 3.7380  |
| 0.7465        | 5.0   | 21230 | 0.7551          | 28.2601     | 3.6873  |
| 0.6693        | 6.0   | 25476 | 0.7524          | 30.0201     | 3.6045  |
| 0.6364        | 7.0   | 29722 | 0.7529          | 28.7208     | 3.6432  |
| 0.6907        | 8.0   | 33968 | 0.7474          | 29.9965     | 3.6177  |
| 0.8167        | 9.0   | 38214 | 0.7400          | 30.528      | 3.5908  |
| 0.7631        | 10.0  | 42460 | 0.7407          | 31.095      | 3.5620  |
| 0.7106        | 11.0  | 46706 | 0.7374          | 31.3962     | 3.5518  |
| 0.7018        | 12.0  | 50952 | 0.7383          | 30.4394     | 3.6223  |
| 0.6446        | 13.0  | 55198 | 0.7360          | 29.4354     | 3.6789  |
| 0.7872        | 14.0  | 59444 | 0.7355          | 30.2209     | 3.6359  |
| 0.8111        | 15.0  | 63690 | 0.7364          | 30.2622     | 3.6182  |
| 0.7027        | 16.0  | 67936 | 0.7346          | 29.8429     | 3.6637  |
| 0.7077        | 17.0  | 72182 | 0.7371          | 30.5398     | 3.6260  |
| 0.7806        | 18.0  | 76428 | 0.7342          | 31.03       | 3.5911  |
| 0.762         | 19.0  | 80674 | 0.7354          | 31.095      | 3.6043  |
| 0.6805        | 20.0  | 84920 | 0.7342          | 30.9591     | 3.6062  |

### Framework versions

- PEFT 0.11.1
- Transformers 4.41.2
- Pytorch 2.2.1
- Datasets 2.19.1
- Tokenizers 0.19.1
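
### Training configuration sketch

The LoRA settings (rank, alpha, dropout, target modules) are not recorded in this card, so the values below are illustrative placeholders; only the hyperparameters listed above are taken from the card. A minimal sketch of how they might map onto a `peft`/`transformers` setup:

```python
from peft import LoraConfig, TaskType, get_peft_model
from transformers import AutoModelForSeq2SeqLM, Seq2SeqTrainingArguments

# Base model named in this card.
base_model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-base")

# LoRA rank/alpha/dropout are NOT reported in the card; the values below are
# illustrative placeholders only.
lora_config = LoraConfig(
    task_type=TaskType.SEQ_2_SEQ_LM,
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
)
model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()

# These arguments mirror the hyperparameters listed above; Adam with
# betas=(0.9, 0.999) and epsilon=1e-08 corresponds to the Trainer's default
# optimizer settings.
training_args = Seq2SeqTrainingArguments(
    output_dir="flan-t5-base-AR-LORA-V1",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    evaluation_strategy="epoch",
    predict_with_generate=True,
)
```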
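
### Evaluation metrics sketch

"Exact Match" is reported on a 0-100 scale and "Gen Len" is typically the average length of the generated sequences in tokens. The exact post-processing used for this model is not documented; the sketch below shows one common way such metrics are computed with the `evaluate` library and a `compute_metrics` callback, and is an assumption rather than the card author's code:

```python
import evaluate
import numpy as np
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-base")
exact_match = evaluate.load("exact_match")

def compute_metrics(eval_preds):
    preds, labels = eval_preds
    decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
    # Replace the -100 used for loss masking before decoding the labels.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)

    em = exact_match.compute(predictions=decoded_preds, references=decoded_labels)
    # Average number of non-pad tokens per generated sequence ("Gen Len").
    gen_len = np.mean([np.count_nonzero(p != tokenizer.pad_token_id) for p in preds])
    # Scale exact match to 0-100 to match the values reported above.
    return {"exact_match": em["exact_match"] * 100, "gen_len": gen_len}
```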
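
## How to use

A minimal inference sketch, assuming the LoRA adapter weights are published on the Hub under this model's name (substitute the actual repo id if it differs):

```python
from peft import AutoPeftModelForSeq2SeqLM
from transformers import AutoTokenizer

# Hypothetical repo id based on this card's model name.
adapter_id = "flan-t5-base-AR-LORA-V1"

# Loads google/flan-t5-base and attaches the LoRA adapter weights on top.
model = AutoPeftModelForSeq2SeqLM.from_pretrained(adapter_id)
tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-base")

inputs = tokenizer("your input text here", return_tensors="pt")
# Generated outputs are short (Gen Len is around 3-4 tokens on the eval set).
outputs = model.generate(**inputs, max_new_tokens=10)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```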