# oop_and_text_pythia_410m
This model is a fine-tuned version of [EleutherAI/pythia-410m](https://huggingface.co/EleutherAI/pythia-410m) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 1.6151
- Accuracy: 0.2123
- Num Input Tokens Seen: 5873664
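For reference, assuming the reported loss is the standard mean token-level cross-entropy, a loss of 1.6151 corresponds to a perplexity of exp(1.6151) ≈ 5.03. These figures match the step-500 row of the training results table below.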
## Model description
More information needed
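Pending a proper description, the checkpoint can be loaded like any causal language model on the Hub. The following is a minimal sketch assuming the standard `transformers` text-generation workflow; the prompt is illustrative only, since the fine-tuning data is undocumented:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "gbemilekeonilude/oop_and_text_pythia_410m"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Illustrative prompt; the training data is not documented, so output quality is untested.
inputs = tokenizer("class Stack:", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```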
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a reconstruction sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- distributed_type: multi-GPU
- num_devices: 2
- total_train_batch_size: 8
- total_eval_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.03
- num_epochs: 3.0
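Assuming the run used the Hugging Face `Trainer` (consistent with the auto-generated card format), the hyperparameters above map onto `TrainingArguments` roughly as follows. This is a sketch, not the original training script; `output_dir` is a placeholder.

```python
from transformers import TrainingArguments

# Sketch reconstructing the listed hyperparameters; assumes Trainer-based fine-tuning.
training_args = TrainingArguments(
    output_dir="oop_and_text_pythia_410m",  # placeholder, not from the source
    learning_rate=2e-5,
    per_device_train_batch_size=4,  # 2 GPUs -> total train batch size 8
    per_device_eval_batch_size=4,   # 2 GPUs -> total eval batch size 8
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.03,
    num_train_epochs=3.0,
)
```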
### Training results
| Training Loss | Epoch  | Step | Validation Loss | Accuracy | Input Tokens Seen |
|:-------------:|:------:|:----:|:---------------:|:--------:|:-----------------:|
| No log        | 0      | 0    | 3.0231          | 0.2075   | 0                 |
| 1.8656        | 0.2092 | 50   | 2.0404          | 0.2358   | 409600            |
| 1.8788        | 0.4184 | 100  | 1.8193          | 0.2170   | 819200            |
| 1.7635        | 0.6276 | 150  | 1.6325          | 0.1887   | 1228800           |
| 1.6773        | 0.8368 | 200  | 1.6925          | 0.1887   | 1638400           |
| 1.6309        | 1.0460 | 250  | 1.6849          | 0.1934   | 2048000           |
| 1.5824        | 1.2552 | 300  | 1.8487          | 0.1840   | 2457600           |
| 1.8204        | 1.4644 | 350  | 1.6930          | 0.1887   | 2867200           |
| 1.6639        | 1.6736 | 400  | 1.6967          | 0.2123   | 3276800           |
| 1.5446        | 1.8828 | 450  | 1.6562          | 0.2217   | 3686400           |
| 1.569         | 2.0921 | 500  | 1.6151          | 0.2123   | 4096000           |
| 1.5797        | 2.3013 | 550  | 1.6244          | 0.2311   | 4505600           |
| 1.5543        | 2.5105 | 600  | 1.6461          | 0.2028   | 4915200           |
| 1.5691        | 2.7197 | 650  | 1.6240          | 0.2075   | 5324800           |
| 1.5852        | 2.9289 | 700  | 1.6227          | 0.2170   | 5734400           |
### Framework versions
- Transformers 4.43.2
- Pytorch 2.4.0
- Datasets 2.20.0
- Tokenizers 0.19.1
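To match this environment, the versions above can be pinned at install time, e.g. `pip install transformers==4.43.2 torch==2.4.0 datasets==2.20.0 tokenizers==0.19.1` (the exact PyTorch wheel additionally depends on your CUDA setup).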