---
license: apache-2.0
base_model: google-t5/t5-base
tags:
  - generated_from_trainer
metrics:
  - accuracy
  - f1
model-index:
  - name: t5_es_weight_2_1
    results: []
---

# t5_es_weight_2_1

This model is a fine-tuned version of google-t5/t5-base on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.0241
- Accuracy: 0.997
- F1: 0.9972

## Model description

More information needed

## Intended uses & limitations

More information needed
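
The downstream task is not documented in this card. As a non-authoritative sketch only: the accuracy/F1 metrics suggest a classification objective, so if the checkpoint was exported with a sequence-classification head, it could be loaded along these lines (the repository id and the meaning of the predicted label are assumptions, not documented here):

```python
# Hypothetical usage sketch: assumes the checkpoint carries a sequence-classification
# head and that the repository id below is the published one. Verify both before relying on it.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "CatBarks/t5_es_BCE_weight_2_1"  # assumed repo id; adjust if the checkpoint lives elsewhere
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

inputs = tokenizer("Example input sentence.", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
predicted_class = logits.argmax(dim=-1).item()  # label semantics are not documented in this card
print(predicted_class)
```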

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto `TrainingArguments` follows the list):

- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 64
- total_train_batch_size: 4096
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 100
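
As a rough illustration of how the values above map onto `transformers.TrainingArguments` (assuming a single device, so 64 × 64 gradient accumulation yields the listed total batch size of 4096; the output directory is a placeholder and the dataset/model wiring is omitted):

```python
# Sketch only: reproduces the hyperparameters listed above with the Trainer API.
# The actual training script is not documented in this card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="t5_es_weight_2_1",      # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=64,      # 64 x 64 = 4096 effective examples per update
    num_train_epochs=100,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```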

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:------:|
| 0.7035        | 6.8817  | 50   | 0.6738          | 0.7045   | 0.7288 |
| 0.6463        | 13.7634 | 100  | 0.5114          | 0.8975   | 0.9015 |
| 0.2909        | 20.6452 | 150  | 0.0785          | 0.977    | 0.9783 |
| 0.0595        | 27.5269 | 200  | 0.0455          | 0.987    | 0.9878 |
| 0.0286        | 34.4086 | 250  | 0.0283          | 0.992    | 0.9925 |
| 0.0158        | 41.2903 | 300  | 0.0219          | 0.9945   | 0.9948 |
| 0.0086        | 48.1720 | 350  | 0.0180          | 0.996    | 0.9962 |
| 0.0048        | 55.0538 | 400  | 0.0172          | 0.9955   | 0.9958 |
| 0.0031        | 61.9355 | 450  | 0.0223          | 0.9955   | 0.9958 |
| 0.002         | 68.8172 | 500  | 0.0199          | 0.9955   | 0.9958 |
| 0.0012        | 75.6989 | 550  | 0.0201          | 0.9965   | 0.9967 |
| 0.0008        | 82.5806 | 600  | 0.0190          | 0.997    | 0.9972 |
| 0.0008        | 89.4624 | 650  | 0.0205          | 0.997    | 0.9972 |
| 0.0007        | 96.3441 | 700  | 0.0241          | 0.997    | 0.9972 |
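
How Accuracy and F1 were computed is not documented. A typical `compute_metrics` hook for a single-label classifier, using scikit-learn and assuming binary F1, looks like this sketch:

```python
# Illustrative only: a common way to report accuracy and F1 with the Trainer API.
# The actual metric code for this run is not documented; binary averaging is an assumption.
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1_score(labels, preds),  # assumes binary labels; use average="macro" otherwise
    }
```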

### Framework versions

- Transformers 4.40.0
- PyTorch 2.4.1+cu121
- Datasets 3.1.0
- Tokenizers 0.19.1