---
tags:
  - generated_from_keras_callback
model-index:
  - name: distilgpt_new3_0055
    results: []
---

# distilgpt_new3_0055

This model was trained from scratch on an unknown dataset. It achieves the following results on the evaluation set:

- Train Loss: 2.5035
- Validation Loss: 2.3859
- Epoch: 54

## Model description

More information needed

## Intended uses & limitations

More information needed
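
This card does not yet document usage, so the following is a minimal, hedged sketch of loading the checkpoint for text generation with the TensorFlow classes matching the framework versions listed below. The Hub repo id `bigmorning/distilgpt_new3_0055` is assumed from this page and may need adjusting.

```python
from transformers import AutoTokenizer, TFAutoModelForCausalLM

# Assumed repo id; replace with the checkpoint's actual location.
repo_id = "bigmorning/distilgpt_new3_0055"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = TFAutoModelForCausalLM.from_pretrained(repo_id)

# Greedy decoding up to 40 tokens (prompt included).
inputs = tokenizer("Once upon a time", return_tensors="tf")
output_ids = model.generate(inputs["input_ids"], max_length=40)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```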

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch recreating the optimizer follows the list):

- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
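
As a sketch only: the optimizer dictionary above maps directly onto `AdamWeightDecay` from `transformers`. The model and data pipeline are not documented in this card, so the compile/fit steps are indicated only as comments.

```python
from transformers import AdamWeightDecay

# Recreate the optimizer with the hyperparameters listed above.
optimizer = AdamWeightDecay(
    learning_rate=2e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    weight_decay_rate=0.01,
)

# training_precision float32 is the Keras default, so no mixed-precision policy is set.
# With the (undocumented) model and datasets in hand, training for the 55 epochs shown
# below would look roughly like:
# model.compile(optimizer=optimizer)
# model.fit(train_set, validation_data=val_set, epochs=55)
```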

### Training results

| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 2.5407 | 2.4254 | 0 |
| 2.5399 | 2.4247 | 1 |
| 2.5391 | 2.4238 | 2 |
| 2.5383 | 2.4232 | 3 |
| 2.5375 | 2.4210 | 4 |
| 2.5368 | 2.4210 | 5 |
| 2.5361 | 2.4197 | 6 |
| 2.5353 | 2.4193 | 7 |
| 2.5345 | 2.4191 | 8 |
| 2.5339 | 2.4177 | 9 |
| 2.5332 | 2.4188 | 10 |
| 2.5324 | 2.4160 | 11 |
| 2.5317 | 2.4164 | 12 |
| 2.5309 | 2.4145 | 13 |
| 2.5302 | 2.4153 | 14 |
| 2.5295 | 2.4139 | 15 |
| 2.5288 | 2.4134 | 16 |
| 2.5282 | 2.4123 | 17 |
| 2.5274 | 2.4116 | 18 |
| 2.5267 | 2.4110 | 19 |
| 2.5259 | 2.4106 | 20 |
| 2.5251 | 2.4097 | 21 |
| 2.5244 | 2.4074 | 22 |
| 2.5238 | 2.4078 | 23 |
| 2.5232 | 2.4072 | 24 |
| 2.5223 | 2.4062 | 25 |
| 2.5217 | 2.4054 | 26 |
| 2.5211 | 2.4057 | 27 |
| 2.5204 | 2.4044 | 28 |
| 2.5197 | 2.4026 | 29 |
| 2.5189 | 2.4017 | 30 |
| 2.5182 | 2.4026 | 31 |
| 2.5176 | 2.4012 | 32 |
| 2.5168 | 2.4013 | 33 |
| 2.5161 | 2.3990 | 34 |
| 2.5154 | 2.3999 | 35 |
| 2.5149 | 2.3978 | 36 |
| 2.5142 | 2.3981 | 37 |
| 2.5135 | 2.3981 | 38 |
| 2.5130 | 2.3972 | 39 |
| 2.5123 | 2.3957 | 40 |
| 2.5116 | 2.3940 | 41 |
| 2.5108 | 2.3933 | 42 |
| 2.5103 | 2.3927 | 43 |
| 2.5095 | 2.3923 | 44 |
| 2.5090 | 2.3918 | 45 |
| 2.5083 | 2.3914 | 46 |
| 2.5078 | 2.3905 | 47 |
| 2.5070 | 2.3888 | 48 |
| 2.5062 | 2.3894 | 49 |
| 2.5058 | 2.3898 | 50 |
| 2.5051 | 2.3868 | 51 |
| 2.5045 | 2.3873 | 52 |
| 2.5041 | 2.3872 | 53 |
| 2.5035 | 2.3859 | 54 |

### Framework versions

- Transformers 4.20.1
- TensorFlow 2.8.2
- Datasets 2.3.2
- Tokenizers 0.12.1