# codeparrot-ds
This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 4.9260
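
For reference, a cross-entropy loss of 4.9260 corresponds to a perplexity of exp(4.9260) ≈ 138. A minimal usage sketch follows; the repo id `huggingface-course/codeparrot-ds` is an assumption, so substitute the actual Hub model id or a local checkpoint path:

```python
# Minimal sketch: generate Python code with the fine-tuned checkpoint.
# NOTE: the repo id below is an assumption; replace it with the actual
# model id on the Hub (or a local checkpoint directory).
from transformers import pipeline

generator = pipeline("text-generation", model="huggingface-course/codeparrot-ds")

prompt = "# create a pandas DataFrame from a dictionary\nimport pandas as pd\n"
print(generator(prompt, max_new_tokens=64)[0]["generated_text"])
```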
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 0.0005
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- num_epochs: 5
- mixed_precision_training: Native AMP
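
A hedged sketch of these settings expressed as `transformers.TrainingArguments`; the `output_dir` name is an assumption, and Adam's betas and epsilon listed above match the Trainer defaults, so they need no explicit flags:

```python
# Sketch of the hyperparameters above expressed as TrainingArguments.
# output_dir is an assumed name; adam_beta1/adam_beta2/adam_epsilon are
# left at their defaults, which match the values listed above.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="codeparrot-ds",       # assumed output directory
    learning_rate=5e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=8,    # effective batch size: 32 * 8 = 256
    lr_scheduler_type="cosine",
    warmup_steps=10,
    num_train_epochs=5,
    fp16=True,                        # "Native AMP" mixed precision
)
```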
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 9.4997        | 0.18  | 10   | 8.0307          |
| 7.1464        | 0.36  | 20   | 7.0639          |
| 6.7447        | 0.54  | 30   | 6.8856          |
| 6.4819        | 0.72  | 40   | 6.6801          |
| 6.2576        | 0.9   | 50   | 6.4703          |
| 6.0151        | 1.08  | 60   | 6.2236          |
| 5.7178        | 1.26  | 70   | 6.0277          |
| 5.5432        | 1.44  | 80   | 5.8780          |
| 5.4003        | 1.62  | 90   | 5.7475          |
| 5.2611        | 1.8   | 100  | 5.6555          |
| 5.1703        | 1.98  | 110  | 5.5344          |
| 5.0115        | 2.16  | 120  | 5.4731          |
| 4.9182        | 2.34  | 130  | 5.3787          |
| 4.8519        | 2.52  | 140  | 5.3300          |
| 4.7389        | 2.7   | 150  | 5.2398          |
| 4.6887        | 2.88  | 160  | 5.1947          |
| 4.5515        | 3.06  | 170  | 5.1424          |
| 4.4606        | 3.24  | 180  | 5.1048          |
| 4.4699        | 3.42  | 190  | 5.0667          |
| 4.3777        | 3.6   | 200  | 5.0200          |
| 4.3894        | 3.78  | 210  | 4.9892          |
| 4.3143        | 3.96  | 220  | 4.9688          |
| 4.2688        | 4.14  | 230  | 4.9515          |
| 4.2468        | 4.32  | 240  | 4.9410          |
| 4.2191        | 4.5   | 250  | 4.9304          |
| 4.2512        | 4.68  | 260  | 4.9270          |
| 4.217         | 4.86  | 270  | 4.9260          |
### Framework versions
- Transformers 4.38.2
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2