---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: flan-t5-base-da-multiwoz2.0_80-loss-ep100
  results: []
---
# flan-t5-base-da-multiwoz2.0_80-loss-ep100
This model is a fine-tuned version of [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) on an unspecified dataset (the model name suggests dialogue-act data from MultiWOZ 2.0). It achieves the following results on the evaluation set (a minimal usage sketch follows the metrics):
- Loss: 0.4747
- Accuracy: 33.008
- Num: 7358
- Gen Len: 16.5684
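
As a usage sketch only: the Hub namespace below is a placeholder (this card does not state where the model is published), and the example input is illustrative since the expected prompt format is undocumented.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Placeholder repo id: replace "your-namespace" with the actual namespace.
model_id = "your-namespace/flan-t5-base-da-multiwoz2.0_80-loss-ep100"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Illustrative input; the prompt format used during fine-tuning is not documented here.
inputs = tokenizer("Hello, I need a cheap hotel in the centre.", return_tensors="pt")
outputs = model.generate(**inputs, max_length=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```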
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 60
- eval_batch_size: 400
- seed: 1799
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
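
As a hedged sketch of how these hyperparameters might map onto `Seq2SeqTrainingArguments` in Transformers 4.18.0: the output directory, evaluation cadence, and single-GPU assumption are placeholders, not the original training script.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch only: mirrors the hyperparameters listed above.
# Assumes a single GPU, so per-device batch sizes equal the listed batch sizes.
training_args = Seq2SeqTrainingArguments(
    output_dir="flan-t5-base-da-multiwoz2.0_80-loss-ep100",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=60,
    per_device_eval_batch_size=400,
    seed=1799,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,            # Adam settings as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    predict_with_generate=True,  # needed for generation-based metrics like Gen Len
    evaluation_strategy="steps",  # assumption: the results table logs every 200 steps
    eval_steps=200,
    save_steps=200,
)
```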
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Num  | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----:|:-------:|
| 1.3481        | 20.0  | 200  | 0.5537          | 26.9473  | 7358 | 15.6079 |
| 0.564         | 40.0  | 400  | 0.4902          | 31.3033  | 7358 | 16.6331 |
| 0.4647        | 60.0  | 600  | 0.4747          | 33.008   | 7358 | 16.5684 |
| 0.4171        | 80.0  | 800  | 0.4818          | 33.1704  | 7358 | 16.5529 |
| 0.4009        | 100.0 | 1000 | 0.4793          | 33.533   | 7358 | 16.4897 |
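
Note that the headline metrics above (Loss 0.4747, Accuracy 33.008) match the epoch-60 row, which has the lowest validation loss; this suggests the reported checkpoint was selected by validation loss, as the `loss` tag in the model name also hints.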
### Framework versions
- Transformers 4.18.0
- Pytorch 1.10.0+cu111
- Datasets 2.5.1
- Tokenizers 0.12.1