# smolm-autoreg-bpe-counterfactual-babylm-random_removal-1e-3
This model was trained from scratch on the kanishka/counterfactual-babylm-random_removal dataset. It achieves the following results on the evaluation set:
- Loss: 3.4298
- Accuracy: 0.4089
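
A minimal usage sketch, assuming the checkpoint loads through the standard `transformers` auto classes (the model class, prompt, and generation settings here are illustrative, not taken from the training code):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: the Hub repo id matches the model name above.
model_id = "kanishka/smolm-autoreg-bpe-counterfactual-babylm-random_removal-1e-3"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Autoregressive generation from a short, illustrative prompt.
inputs = tokenizer("The children went to the", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```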
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a sketch of how they might map onto `TrainingArguments` follows the list):
- learning_rate: 0.001
- train_batch_size: 32
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 32000
- num_epochs: 20.0
- mixed_precision_training: Native AMP
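
A minimal sketch of how these settings might be expressed with the `transformers` `TrainingArguments` API; the output directory is a placeholder, and anything not listed above is assumed to stay at its library default rather than reflecting the original training script:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="smolm-autoreg-bpe-counterfactual-babylm-random_removal-1e-3",  # hypothetical path
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=32000,
    num_train_epochs=20.0,
    fp16=True,  # "Native AMP" mixed-precision training
)
```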
### Training results
| Training Loss | Epoch | Step   | Validation Loss | Accuracy |
|:-------------:|:-----:|:------:|:---------------:|:--------:|
| 3.6092        | 1.0   | 18586  | 3.7513          | 0.3603   |
| 3.3896        | 2.0   | 37172  | 3.5882          | 0.3803   |
| 3.2599        | 3.0   | 55758  | 3.4709          | 0.3917   |
| 3.177         | 4.0   | 74344  | 3.4183          | 0.3981   |
| 3.1287        | 5.0   | 92930  | 3.3982          | 0.4017   |
| 3.0841        | 6.0   | 111516 | 3.3841          | 0.4031   |
| 3.0483        | 7.0   | 130102 | 3.3494          | 0.4065   |
| 3.0156        | 8.0   | 148688 | 3.3597          | 0.4078   |
| 2.9911        | 9.0   | 167274 | 3.3719          | 0.4067   |
| 2.9616        | 10.0  | 185860 | 3.3717          | 0.4078   |
| 2.9384        | 11.0  | 204446 | 3.3679          | 0.4091   |
| 2.9133        | 12.0  | 223032 | 3.3673          | 0.4097   |
| 2.8923        | 13.0  | 241618 | 3.3885          | 0.4088   |
| 2.8781        | 14.0  | 260204 | 3.3873          | 0.4090   |
| 2.8563        | 15.0  | 278790 | 3.3848          | 0.4092   |
| 2.836         | 16.0  | 297376 | 3.3956          | 0.4094   |
| 2.8162        | 17.0  | 315962 | 3.4023          | 0.4091   |
| 2.7997        | 18.0  | 334548 | 3.4101          | 0.4093   |
| 2.7779        | 19.0  | 353134 | 3.4237          | 0.4090   |
| 2.7645        | 20.0  | 371720 | 3.4298          | 0.4089   |
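
As a point of reference (not reported in the original card), the validation losses above can be converted to perplexity with `exp(loss)`; for the final epoch this gives roughly 31:

```python
import math

final_val_loss = 3.4298          # epoch 20 validation loss from the table
perplexity = math.exp(final_val_loss)
print(f"{perplexity:.2f}")       # ≈ 30.87
```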
### Framework versions
- Transformers 4.37.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.1