---
library_name: transformers
license: mit
base_model: gpt2
tags:
- generated_from_trainer
model-index:
- name: codeparrot-ds
  results: []
---
# codeparrot-ds

This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 5.1374
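As a quick usage illustration, here is a minimal sketch of loading the checkpoint for text generation with the `transformers` pipeline. The repo id `your-username/codeparrot-ds` is a placeholder assumption; substitute the actual Hub path or a local checkpoint directory.

```python
from transformers import pipeline

# Hypothetical repo id; replace with the real Hub path or local directory.
generator = pipeline("text-generation", model="your-username/codeparrot-ds")

prompt = "def mean(numbers):"
outputs = generator(prompt, max_new_tokens=64, num_return_sequences=1)
print(outputs[0]["generated_text"])
```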
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after the list):
- learning_rate: 0.0005
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- num_epochs: 5
- mixed_precision_training: Native AMP
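The sketch below shows how these values map onto `transformers.TrainingArguments`. The `output_dir` is an assumption, and the Adam beta/epsilon values listed above are the Trainer defaults, so they need no explicit arguments here.

```python
from transformers import TrainingArguments

# Hypothetical output_dir; all other values come from the list above.
# Note: 32 (train_batch_size) * 8 (gradient_accumulation_steps) = 256 total train batch size.
args = TrainingArguments(
    output_dir="codeparrot-ds",
    learning_rate=5e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=8,
    lr_scheduler_type="cosine",
    warmup_steps=10,
    num_train_epochs=5,
    fp16=True,  # "Native AMP" mixed-precision training
)
```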
### Training results
| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 9.5079        | 0.2186 | 10   | 8.1216          |
| 7.1151        | 0.4372 | 20   | 7.1928          |
| 6.6137        | 0.6557 | 30   | 6.8735          |
| 6.2329        | 0.8743 | 40   | 6.5471          |
| 5.8587        | 1.0929 | 50   | 6.2769          |
| 5.5239        | 1.3115 | 60   | 6.0781          |
| 5.3796        | 1.5301 | 70   | 5.9107          |
| 5.234         | 1.7486 | 80   | 5.7720          |
| 5.1009        | 1.9672 | 90   | 5.6632          |
| 4.8991        | 2.1858 | 100  | 5.5841          |
| 4.807         | 2.4044 | 110  | 5.5124          |
| 4.7083        | 2.6230 | 120  | 5.4414          |
| 4.6195        | 2.8415 | 130  | 5.3711          |
| 4.5424        | 3.0601 | 140  | 5.3166          |
| 4.3707        | 3.2787 | 150  | 5.2790          |
| 4.3599        | 3.4973 | 160  | 5.2316          |
| 4.3317        | 3.7158 | 170  | 5.1933          |
| 4.2907        | 3.9344 | 180  | 5.1684          |
| 4.21          | 4.1530 | 190  | 5.1599          |
| 4.2023        | 4.3716 | 200  | 5.1457          |
| 4.1793        | 4.5902 | 210  | 5.1398          |
| 4.1662        | 4.8087 | 220  | 5.1374          |
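For a causal language model, the reported cross-entropy loss translates to a perplexity of exp(5.1374), roughly 170, which can be checked directly:

```python
import math

# Perplexity is the exponential of the cross-entropy loss.
print(math.exp(5.1374))  # ~170.3
```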
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Datasets 3.0.0
- Tokenizers 0.19.1