---
tags:
- generated_from_keras_callback
model-index:
- name: distilgpt_new3_0075
  results: []
---

<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->

# distilgpt_new3_0075

This model was trained from scratch on an unknown dataset.
It achieves the following results at the final training epoch:
- Train Loss: 2.4912
- Validation Loss: 2.3729
- Epoch: 74 (the last of 75 epochs; per-epoch results are tabulated below)

## Model description

More information needed

## Intended uses & limitations

More information needed
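
No usage guidance is provided, so the following is a minimal, hypothetical loading sketch rather than the authors' recommended usage. It assumes the checkpoint and its tokenizer are published on the Hugging Face Hub; the repo id below is a placeholder, since this card does not state the owning namespace.

```python
# Hypothetical usage sketch: not taken from the authors of this card.
from transformers import AutoTokenizer, TFAutoModelForCausalLM

# Assumption: replace with the real Hub path, e.g. "<namespace>/distilgpt_new3_0075".
repo_id = "distilgpt_new3_0075"

tokenizer = AutoTokenizer.from_pretrained(repo_id)  # assumes a tokenizer was uploaded
model = TFAutoModelForCausalLM.from_pretrained(repo_id)

inputs = tokenizer("Hello, my name is", return_tensors="tf")
outputs = model.generate(**inputs, max_length=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```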

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: AdamWeightDecay (learning_rate: 2e-05, beta_1: 0.9, beta_2: 0.999, epsilon: 1e-07, weight_decay_rate: 0.01, decay: 0.0, amsgrad: False)
- training_precision: float32
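
As a reading aid, the configuration above maps onto the `AdamWeightDecay` optimizer shipped with Transformers. The sketch below reconstructs it; the commented `model.compile`/`model.fit` lines are placeholders implied by the `generated_from_keras_callback` tag, not the authors' actual training script.

```python
# Minimal sketch of the optimizer configuration listed above.
import tensorflow as tf
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=2e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    weight_decay_rate=0.01,
    amsgrad=False,
)

tf.keras.mixed_precision.set_global_policy("float32")  # matches training_precision

# Placeholders for the unrecorded parts of the run (model and data pipelines unknown):
# model.compile(optimizer=optimizer)  # Transformers TF models supply their own loss
# model.fit(train_set, validation_data=eval_set, epochs=75)
```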

### Training results

| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 2.5407     | 2.4254          | 0     |
| 2.5399     | 2.4247          | 1     |
| 2.5391     | 2.4238          | 2     |
| 2.5383     | 2.4232          | 3     |
| 2.5375     | 2.4210          | 4     |
| 2.5368     | 2.4210          | 5     |
| 2.5361     | 2.4197          | 6     |
| 2.5353     | 2.4193          | 7     |
| 2.5345     | 2.4191          | 8     |
| 2.5339     | 2.4177          | 9     |
| 2.5332     | 2.4188          | 10    |
| 2.5324     | 2.4160          | 11    |
| 2.5317     | 2.4164          | 12    |
| 2.5309     | 2.4145          | 13    |
| 2.5302     | 2.4153          | 14    |
| 2.5295     | 2.4139          | 15    |
| 2.5288     | 2.4134          | 16    |
| 2.5282     | 2.4123          | 17    |
| 2.5274     | 2.4116          | 18    |
| 2.5267     | 2.4110          | 19    |
| 2.5259     | 2.4106          | 20    |
| 2.5251     | 2.4097          | 21    |
| 2.5244     | 2.4074          | 22    |
| 2.5238     | 2.4078          | 23    |
| 2.5232     | 2.4072          | 24    |
| 2.5223     | 2.4062          | 25    |
| 2.5217     | 2.4054          | 26    |
| 2.5211     | 2.4057          | 27    |
| 2.5204     | 2.4044          | 28    |
| 2.5197     | 2.4026          | 29    |
| 2.5189     | 2.4017          | 30    |
| 2.5182     | 2.4026          | 31    |
| 2.5176     | 2.4012          | 32    |
| 2.5168     | 2.4013          | 33    |
| 2.5161     | 2.3990          | 34    |
| 2.5154     | 2.3999          | 35    |
| 2.5149     | 2.3978          | 36    |
| 2.5142     | 2.3981          | 37    |
| 2.5135     | 2.3981          | 38    |
| 2.5130     | 2.3972          | 39    |
| 2.5123     | 2.3957          | 40    |
| 2.5116     | 2.3940          | 41    |
| 2.5108     | 2.3933          | 42    |
| 2.5103     | 2.3927          | 43    |
| 2.5095     | 2.3923          | 44    |
| 2.5090     | 2.3918          | 45    |
| 2.5083     | 2.3914          | 46    |
| 2.5078     | 2.3905          | 47    |
| 2.5070     | 2.3888          | 48    |
| 2.5062     | 2.3894          | 49    |
| 2.5058     | 2.3898          | 50    |
| 2.5051     | 2.3868          | 51    |
| 2.5045     | 2.3873          | 52    |
| 2.5041     | 2.3872          | 53    |
| 2.5035     | 2.3859          | 54    |
| 2.5027     | 2.3850          | 55    |
| 2.5020     | 2.3851          | 56    |
| 2.5016     | 2.3833          | 57    |
| 2.5009     | 2.3816          | 58    |
| 2.5002     | 2.3821          | 59    |
| 2.4995     | 2.3813          | 60    |
| 2.4990     | 2.3803          | 61    |
| 2.4984     | 2.3794          | 62    |
| 2.4977     | 2.3798          | 63    |
| 2.4971     | 2.3779          | 64    |
| 2.4964     | 2.3778          | 65    |
| 2.4959     | 2.3778          | 66    |
| 2.4954     | 2.3787          | 67    |
| 2.4947     | 2.3758          | 68    |
| 2.4942     | 2.3751          | 69    |
| 2.4935     | 2.3739          | 70    |
| 2.4929     | 2.3754          | 71    |
| 2.4923     | 2.3750          | 72    |
| 2.4918     | 2.3730          | 73    |
| 2.4912     | 2.3729          | 74    |


### Framework versions

- Transformers 4.20.1
- TensorFlow 2.8.2
- Datasets 2.3.2
- Tokenizers 0.12.1
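
When reproducing these results, it may help to confirm the environment matches the versions above; a small convenience check (not part of the generated card):

```python
# Print installed versions to compare against the pins listed above.
import datasets
import tensorflow
import tokenizers
import transformers

print("Transformers:", transformers.__version__)  # expected 4.20.1
print("TensorFlow:", tensorflow.__version__)      # expected 2.8.2
print("Datasets:", datasets.__version__)          # expected 2.3.2
print("Tokenizers:", tokenizers.__version__)      # expected 0.12.1
```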