distilgpt_new3_0015

This model was trained from scratch on an unknown dataset. After the final epoch it achieves the following results:

  • Train Loss: 2.5302
  • Validation Loss: 2.4153
  • Epoch: 14

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
  • training_precision: float32
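As a minimal sketch, the optimizer settings above can be reconstructed in code. The `AdamWeightDecay` class ships with the TensorFlow side of `transformers`; the guarded import below is only so the sketch still runs in an environment without TensorFlow installed, and is an assumption about how the card's training script was set up, not the script itself:

```python
# Hyperparameters copied verbatim from the optimizer dict above.
HPARAMS = {
    "learning_rate": 2e-05,
    "beta_1": 0.9,
    "beta_2": 0.999,
    "epsilon": 1e-07,
    "weight_decay_rate": 0.01,
}

def build_optimizer(hparams=HPARAMS):
    """Construct an AdamWeightDecay optimizer if transformers/TF are available."""
    try:
        # Requires TensorFlow; part of transformers' TF optimization utilities.
        from transformers import AdamWeightDecay
    except ImportError:
        # No TF in this environment: return None so the sketch degrades gracefully.
        return None
    return AdamWeightDecay(**hparams)

optimizer = build_optimizer()
```

The resulting object could then be passed to `model.compile(optimizer=...)` in a Keras training loop, which matches the float32 training precision noted above (Keras' default).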

Training results

Train Loss  Validation Loss  Epoch
2.5407      2.4254           0
2.5399      2.4247           1
2.5391      2.4238           2
2.5383      2.4232           3
2.5375      2.4210           4
2.5368      2.4210           5
2.5361      2.4197           6
2.5353      2.4193           7
2.5345      2.4191           8
2.5339      2.4177           9
2.5332      2.4188           10
2.5324      2.4160           11
2.5317      2.4164           12
2.5309      2.4145           13
2.5302      2.4153           14
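Read row by row, the table shows a slow, nearly linear descent. A short script (values transcribed from the results above) makes the overall trend explicit:

```python
# (train_loss, val_loss) per epoch, transcribed from the results table above.
history = [
    (2.5407, 2.4254), (2.5399, 2.4247), (2.5391, 2.4238), (2.5383, 2.4232),
    (2.5375, 2.4210), (2.5368, 2.4210), (2.5361, 2.4197), (2.5353, 2.4193),
    (2.5345, 2.4191), (2.5339, 2.4177), (2.5332, 2.4188), (2.5324, 2.4160),
    (2.5317, 2.4164), (2.5309, 2.4145), (2.5302, 2.4153),
]

# Total improvement from the first to the last epoch.
train_drop = history[0][0] - history[-1][0]
val_drop = history[0][1] - history[-1][1]
print(f"train loss improved by {train_drop:.4f} over {len(history)} epochs")
print(f"val loss improved by {val_drop:.4f} over {len(history)} epochs")
```

Note that the best validation loss in the table (2.4145) occurs at epoch 13, one epoch before training stopped; the final epoch's validation loss is slightly higher.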

Framework versions

  • Transformers 4.20.1
  • TensorFlow 2.8.2
  • Datasets 2.3.2
  • Tokenizers 0.12.1
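To reproduce this environment, the versions listed above can be pinned directly with pip (package names are the standard PyPI names for these libraries; this is a sketch of an install command, not one taken from the card):

```shell
# Pin the exact framework versions reported on this model card.
pip install "transformers==4.20.1" "tensorflow==2.8.2" "datasets==2.3.2" "tokenizers==0.12.1"
```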