---
library_name: transformers
tags:
- trl
- dpo
- generated_from_trainer
model-index:
- name: OpenELM-1_1B-SLiC
  results: []
---
# OpenELM-1_1B-SLiC
This model was trained (per the `trl`/`dpo` tags, with TRL's preference-optimization trainer) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.6883
- Rewards/chosen: -4.3438
- Rewards/rejected: -5.3438
- Rewards/accuracies: 0.7344
- Rewards/margins: 0.9922
- Logps/rejected: -824.0
- Logps/chosen: -752.0
- Logits/rejected: -8.75
- Logits/chosen: -10.0625
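The reward columns are DPO-style implicit rewards: β-scaled log-probability ratios of the policy against its reference model, so `Rewards/margins` is `Rewards/chosen` minus `Rewards/rejected`. As a minimal sketch (assuming TRL's hinge loss, the SLiC-style objective suggested by the model name), the per-pair loss relates to that margin as:

```python
def slic_hinge_loss(reward_chosen: float, reward_rejected: float, delta: float = 1.0) -> float:
    """SLiC-style hinge loss on beta-scaled implicit rewards.

    Pairs whose reward margin exceeds `delta` contribute zero loss;
    `delta=1.0` matches TRL's hinge formulation.
    """
    margin = reward_chosen - reward_rejected
    return max(0.0, delta - margin)

# Using the final evaluation rewards reported above: the average margin
# (~1.0) sits right at the hinge boundary.
print(slic_hinge_loss(-4.3438, -5.3438))  # → 0.0
```

The actual evaluation loss (0.6883) is the mean over many pairs, a fraction of which still violate the margin; this sketch only illustrates the per-pair quantity.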
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 16
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- total_eval_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
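The hyperparameters above can be mirrored in a TRL `DPOConfig`. This is a sketch only: argument names follow recent TRL/`transformers` releases and may differ slightly across versions, the output path is hypothetical, and `loss_type="hinge"` (TRL's SLiC-style objective) is assumed from the model name rather than stated in the card.

```python
from trl import DPOConfig

# Sketch mirroring the listed hyperparameters. With 4 devices, a
# per-device batch size of 8, and 2 accumulation steps, the effective
# train batch size is 4 * 8 * 2 = 64, matching total_train_batch_size.
training_args = DPOConfig(
    output_dir="OpenELM-1_1B-SLiC",   # hypothetical output path
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=2,
    num_train_epochs=3,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    seed=42,
    loss_type="hinge",                # SLiC-style hinge loss (assumed)
)
```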
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rewards/chosen | Rewards/rejected | Rewards/accuracies | Rewards/margins | Logps/rejected | Logps/chosen | Logits/rejected | Logits/chosen |
|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|
| 0.7634 | 0.1047 | 100 | 0.7878 | -0.7461 | -1.0312 | 0.6406 | 0.2832 | -392.0 | -392.0 | -12.9375 | -13.0625 |
| 0.7498 | 0.2093 | 200 | 0.7468 | -1.1719 | -1.5547 | 0.6719 | 0.3809 | -444.0 | -436.0 | -12.4375 | -12.75 |
| 0.8142 | 0.3140 | 300 | 0.7466 | -1.8594 | -2.2812 | 0.6914 | 0.4141 | -516.0 | -504.0 | -14.75 | -14.8125 |
| 0.7764 | 0.4186 | 400 | 0.7499 | -1.9688 | -2.4062 | 0.6699 | 0.4316 | -528.0 | -516.0 | -14.4375 | -14.5625 |
| 0.731 | 0.5233 | 500 | 0.7240 | -2.4219 | -2.8594 | 0.6914 | 0.4375 | -576.0 | -560.0 | -10.5 | -11.0 |
| 0.665 | 0.6279 | 600 | 0.7045 | -3.4062 | -4.0625 | 0.6973 | 0.6680 | -696.0 | -660.0 | -10.0625 | -10.75 |
| 0.6806 | 0.7326 | 700 | 0.6912 | -2.5156 | -3.1562 | 0.7070 | 0.6523 | -604.0 | -568.0 | -13.4375 | -13.875 |
| 0.6597 | 0.8373 | 800 | 0.7087 | -2.2969 | -2.8594 | 0.6777 | 0.5664 | -576.0 | -548.0 | -13.3125 | -13.5 |
| 0.7325 | 0.9419 | 900 | 0.6838 | -2.6875 | -3.3594 | 0.7090 | 0.6602 | -624.0 | -588.0 | -13.25 | -14.0 |
| 0.2677 | 1.0466 | 1000 | 0.6726 | -3.2344 | -4.0 | 0.7070 | 0.7734 | -688.0 | -640.0 | -11.0625 | -12.1875 |
| 0.2256 | 1.1512 | 1100 | 0.6992 | -3.5938 | -4.375 | 0.7090 | 0.7969 | -728.0 | -676.0 | -10.0625 | -11.125 |
| 0.1954 | 1.2559 | 1200 | 0.7033 | -3.4688 | -4.3125 | 0.7051 | 0.8477 | -720.0 | -664.0 | -10.125 | -11.3125 |
| 0.2289 | 1.3605 | 1300 | 0.6722 | -3.7344 | -4.5 | 0.7344 | 0.7852 | -740.0 | -692.0 | -9.9375 | -11.0 |
| 0.2227 | 1.4652 | 1400 | 0.6925 | -3.5781 | -4.3125 | 0.6953 | 0.7383 | -720.0 | -676.0 | -11.8125 | -12.5 |
| 0.1902 | 1.5699 | 1500 | 0.6758 | -4.1875 | -5.0312 | 0.7148 | 0.8320 | -792.0 | -736.0 | -11.125 | -12.0625 |
| 0.2192 | 1.6745 | 1600 | 0.6833 | -3.8438 | -4.625 | 0.7148 | 0.7695 | -748.0 | -704.0 | -12.875 | -13.625 |
| 0.2137 | 1.7792 | 1700 | 0.6734 | -3.9688 | -4.7812 | 0.7207 | 0.8008 | -764.0 | -716.0 | -11.0 | -11.9375 |
| 0.2001 | 1.8838 | 1800 | 0.6734 | -3.7344 | -4.5 | 0.7207 | 0.7617 | -740.0 | -692.0 | -11.3125 | -12.125 |
| 0.1713 | 1.9885 | 1900 | 0.6680 | -3.9375 | -4.8125 | 0.7383 | 0.8789 | -768.0 | -712.0 | -9.25 | -10.4375 |
| 0.0184 | 2.0931 | 2000 | 0.6845 | -3.8594 | -4.8125 | 0.7305 | 0.9453 | -768.0 | -704.0 | -9.875 | -11.0625 |
| 0.0313 | 2.1978 | 2100 | 0.6798 | -4.0 | -4.9688 | 0.7402 | 0.9570 | -784.0 | -720.0 | -10.125 | -11.25 |
| 0.0401 | 2.3025 | 2200 | 0.6865 | -4.1562 | -5.0938 | 0.7363 | 0.9492 | -800.0 | -732.0 | -9.375 | -10.6875 |
| 0.0211 | 2.4071 | 2300 | 0.6874 | -4.2188 | -5.2188 | 0.7383 | 1.0078 | -812.0 | -740.0 | -8.75 | -10.125 |
| 0.0239 | 2.5118 | 2400 | 0.6858 | -4.1562 | -5.125 | 0.7383 | 0.9766 | -800.0 | -736.0 | -8.875 | -10.1875 |
| 0.0188 | 2.6164 | 2500 | 0.6902 | -4.2812 | -5.25 | 0.7324 | 0.9883 | -816.0 | -744.0 | -8.8125 | -10.125 |
| 0.0145 | 2.7211 | 2600 | 0.6874 | -4.2812 | -5.2812 | 0.7383 | 0.9844 | -816.0 | -748.0 | -8.8125 | -10.125 |
| 0.0229 | 2.8257 | 2700 | 0.6883 | -4.3438 | -5.3438 | 0.7344 | 0.9922 | -824.0 | -752.0 | -8.75 | -10.0625 |
| 0.0298 | 2.9304 | 2800 | 0.6883 | -4.3438 | -5.3438 | 0.7344 | 0.9922 | -824.0 | -752.0 | -8.75 | -10.0625 |
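One consistency check on the final row: the margin column should equal the chosen rewards minus the rejected rewards. It nearly does; the small gap (0.9922 vs. 1.0) is consistent with each column being averaged and rounded to low precision independently before logging, not with any inconsistency in the metrics.

```python
# Final evaluation row from the table above.
rewards_chosen, rewards_rejected, rewards_margins = -4.3438, -5.3438, 0.9922

exact = rewards_chosen - rewards_rejected
print(round(exact, 4))  # → 1.0
# The logged margin differs by less than 0.01 from the exact difference.
assert abs(exact - rewards_margins) < 0.01
```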
### Framework versions
- Transformers 4.44.2
- Pytorch 2.3.0
- Datasets 3.0.0
- Tokenizers 0.19.1