---
license: apache-2.0
tags:
- generated_from_trainer
base_model: google/t5-efficient-tiny
datasets:
- generator
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: salt_language_Classification
  results:
  - task:
      type: text-classification
      name: Text Classification
    dataset:
      name: generator
      type: generator
      config: default
      split: train
      args: default
    metrics:
    - type: accuracy
      value: 0.9781586021505376
      name: Accuracy
    - type: precision
      value: 0.9786579334649282
      name: Precision
    - type: recall
      value: 0.9781586021505376
      name: Recall
    - type: f1
      value: 0.97818824673623
      name: F1
---

# salt_language_Classification

This model is a fine-tuned version of [google/t5-efficient-tiny](https://huggingface.co/google/t5-efficient-tiny) on the generator dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0615
- Accuracy: 0.9782
- Precision: 0.9787
- Recall: 0.9782
- F1: 0.9782

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10
- training_steps: 20000
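
For orientation, here is a minimal sketch of how these hyperparameters would map onto `transformers.TrainingArguments` and `Trainer`. It is illustrative only: the card does not state whether the checkpoint uses a classification head (`T5ForSequenceClassification`) or a text-to-text setup, the number of language labels is undocumented, and the toy dataset below stands in for the unspecified "generator" data. Adam with betas=(0.9, 0.999) and epsilon=1e-08 and the linear warmup schedule are the `Trainer` defaults, so they need no extra flags; the 500-step evaluation cadence matches the results table below.

```python
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

NUM_LABELS = 2  # placeholder: the actual number of language labels is not documented

tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-tiny")
# Assumption: a T5 encoder-decoder with a sequence-classification head.
model = AutoModelForSequenceClassification.from_pretrained(
    "google/t5-efficient-tiny", num_labels=NUM_LABELS
)

# Toy stand-in data; the real "generator" training data is not described on this card.
raw = Dataset.from_dict({"text": ["first example", "second example"], "label": [0, 1]})
encoded = raw.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=64),
    batched=True,
)

# Mirrors the hyperparameters listed above.
args = TrainingArguments(
    output_dir="salt_language_Classification",
    learning_rate=1e-3,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=10,
    max_steps=20_000,
    eval_strategy="steps",
    eval_steps=500,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded,
    eval_dataset=encoded,  # placeholder: the real evaluation split is undocumented
    tokenizer=tokenizer,   # enables dynamic padding via the default data collator
)
trainer.train()
```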
### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy | Precision | Recall | F1     |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:---------:|:------:|:------:|
| 0.2011        | 0.025 | 500   | 0.4979          | 0.8733   | 0.9001    | 0.8733 | 0.8714 |
| 0.234         | 0.05  | 1000  | 0.1886          | 0.9345   | 0.9354    | 0.9345 | 0.9345 |
| 0.2083        | 0.075 | 1500  | 0.1833          | 0.9328   | 0.9391    | 0.9328 | 0.9328 |
| 0.1838        | 0.1   | 2000  | 0.1457          | 0.9476   | 0.9479    | 0.9476 | 0.9475 |
| 0.1737        | 0.125 | 2500  | 0.1659          | 0.9409   | 0.9438    | 0.9409 | 0.9411 |
| 0.1591        | 0.15  | 3000  | 0.1450          | 0.9516   | 0.9524    | 0.9516 | 0.9517 |
| 0.1571        | 0.175 | 3500  | 0.1351          | 0.9459   | 0.9485    | 0.9459 | 0.9461 |
| 0.1513        | 0.2   | 4000  | 0.1510          | 0.9456   | 0.9515    | 0.9456 | 0.9460 |
| 0.1439        | 0.225 | 4500  | 0.1339          | 0.9546   | 0.9578    | 0.9546 | 0.9547 |
| 0.1394        | 0.25  | 5000  | 0.1052          | 0.9657   | 0.9658    | 0.9657 | 0.9656 |
| 0.1472        | 0.275 | 5500  | 0.1088          | 0.9610   | 0.9629    | 0.9610 | 0.9609 |
| 0.1385        | 0.3   | 6000  | 0.0792          | 0.9694   | 0.9696    | 0.9694 | 0.9694 |
| 0.1349        | 0.325 | 6500  | 0.1063          | 0.9610   | 0.9632    | 0.9610 | 0.9613 |
| 0.1215        | 0.35  | 7000  | 0.0855          | 0.9688   | 0.9694    | 0.9688 | 0.9687 |
| 0.133         | 0.375 | 7500  | 0.1049          | 0.9630   | 0.9640    | 0.9630 | 0.9630 |
| 0.1226        | 0.4   | 8000  | 0.0938          | 0.9667   | 0.9675    | 0.9667 | 0.9667 |
| 0.1222        | 0.425 | 8500  | 0.1134          | 0.9570   | 0.9604    | 0.9570 | 0.9573 |
| 0.1165        | 0.45  | 9000  | 0.0997          | 0.9688   | 0.9697    | 0.9688 | 0.9687 |
| 0.1174        | 0.475 | 9500  | 0.1002          | 0.9661   | 0.9680    | 0.9661 | 0.9659 |
| 0.1165        | 0.5   | 10000 | 0.0807          | 0.9728   | 0.9728    | 0.9728 | 0.9728 |
| 0.1065        | 0.525 | 10500 | 0.0750          | 0.9745   | 0.9754    | 0.9745 | 0.9746 |
| 0.1089        | 0.55  | 11000 | 0.0896          | 0.9688   | 0.9703    | 0.9688 | 0.9689 |
| 0.1125        | 0.575 | 11500 | 0.0632          | 0.9782   | 0.9787    | 0.9782 | 0.9782 |
| 0.11          | 0.6   | 12000 | 0.0775          | 0.9691   | 0.9708    | 0.9691 | 0.9692 |
| 0.1028        | 0.625 | 12500 | 0.0833          | 0.9698   | 0.9708    | 0.9698 | 0.9698 |
| 0.1052        | 0.65  | 13000 | 0.0663          | 0.9751   | 0.9755    | 0.9751 | 0.9751 |
| 0.1068        | 0.675 | 13500 | 0.0648          | 0.9772   | 0.9774    | 0.9772 | 0.9772 |
| 0.1029        | 0.7   | 14000 | 0.0962          | 0.9688   | 0.9706    | 0.9688 | 0.9689 |
| 0.1014        | 0.725 | 14500 | 0.0686          | 0.9772   | 0.9775    | 0.9772 | 0.9771 |
| 0.0978        | 0.75  | 15000 | 0.0802          | 0.9745   | 0.9752    | 0.9745 | 0.9745 |
| 0.095         | 0.775 | 15500 | 0.0646          | 0.9758   | 0.9763    | 0.9758 | 0.9758 |
| 0.0996        | 0.8   | 16000 | 0.0711          | 0.9758   | 0.9761    | 0.9758 | 0.9758 |
| 0.0967        | 0.825 | 16500 | 0.0683          | 0.9761   | 0.9768    | 0.9761 | 0.9761 |
| 0.0939        | 0.85  | 17000 | 0.0572          | 0.9792   | 0.9795    | 0.9792 | 0.9791 |
| 0.0966        | 0.875 | 17500 | 0.0527          | 0.9792   | 0.9794    | 0.9792 | 0.9791 |
| 0.0925        | 0.9   | 18000 | 0.0581          | 0.9798   | 0.9802    | 0.9798 | 0.9799 |
| 0.0945        | 0.925 | 18500 | 0.0693          | 0.9768   | 0.9776    | 0.9768 | 0.9768 |
| 0.0923        | 0.95  | 19000 | 0.0615          | 0.9785   | 0.9790    | 0.9785 | 0.9785 |
| 0.0896        | 0.975 | 19500 | 0.0643          | 0.9758   | 0.9766    | 0.9758 | 0.9758 |
| 0.0979        | 1.0   | 20000 | 0.0619          | 0.9765   | 0.9770    | 0.9765 | 0.9765 |

### Framework versions

- Transformers 4.41.1
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
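
## How to use

Assuming the checkpoint loads as a standard sequence-classification model, as the `text-classification` task tag suggests (the card itself does not confirm this), inference can be sketched with the `pipeline` API. The model id below is a placeholder: point it at the actual Hub repo or a local directory containing this checkpoint.

```python
from transformers import pipeline

# Placeholder model id: replace with the actual Hub repo id or local path.
classifier = pipeline("text-classification", model="salt_language_Classification")

print(classifier("Some input text to classify"))
# e.g. [{'label': 'LABEL_0', 'score': 0.99}]; label names depend on the id2label
# mapping saved with the checkpoint, which this card does not document.
```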