---
base_model: microsoft/Phi-3-medium-4k-instruct
library_name: peft
license: mit
tags:
- trl
- sft
- generated_from_trainer
model-index:
- name: phi-3-medium-MoRA
  results: []
---

# phi-3-medium-MoRA

This model is a fine-tuned version of [microsoft/Phi-3-medium-4k-instruct](https://huggingface.co/microsoft/Phi-3-medium-4k-instruct) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.7627
## Model description
More information needed
## Intended uses & limitations
More information needed
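Because the card lists `library_name: peft` and `base_model: microsoft/Phi-3-medium-4k-instruct`, the weights are presumably a parameter-efficient adapter meant to be applied on top of the base model. The snippet below is a minimal loading sketch, not an official usage example: the adapter repo id is hypothetical, and it assumes the adapter was saved in a format that stock PEFT can load (MoRA-style adapters sometimes require the authors' patched PEFT fork).

```python
# Minimal loading sketch. Assumptions: the adapter repo id below is hypothetical,
# and the adapter was saved in a format standard PEFT can deserialize.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "microsoft/Phi-3-medium-4k-instruct"
adapter_id = "your-username/phi-3-medium-MoRA"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)
model = PeftModel.from_pretrained(base, adapter_id)

# Phi-3 chat format: <|user|> ... <|end|> ... <|assistant|>
prompt = "<|user|>\nSummarize what a PEFT adapter is.<|end|>\n<|assistant|>\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```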
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 0.0001
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10
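Given the `trl` and `sft` tags, these values map onto a standard Transformers/TRL training configuration roughly as sketched below. This is a reconstruction under assumptions rather than the original training script: the output directory is a placeholder and the MoRA adapter setup and dataset are not shown; only the numeric values come from this card.

```python
# Hedged sketch of a Trainer configuration matching the hyperparameters above.
# Only the numeric values are taken from this card; everything else is assumed.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="phi-3-medium-MoRA",   # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=2,    # effective train batch size: 2 * 2 = 4
    num_train_epochs=10,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    # The card reports Adam with betas=(0.9, 0.999) and epsilon=1e-08,
    # which matches the Trainer's default AdamW settings.
)
```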
### Training results
Training Loss | Epoch | Step | Validation Loss |
---|---|---|---|
1.0968 | 0.1118 | 2500 | 0.7399 |
1.0638 | 0.2237 | 5000 | 0.7234 |
1.0536 | 0.3355 | 7500 | 0.7156 |
1.0581 | 0.4473 | 10000 | 0.7128 |
1.0618 | 0.5592 | 12500 | 0.7125 |
1.0533 | 0.6710 | 15000 | 0.7131 |
1.0664 | 0.7828 | 17500 | 0.7133 |
1.0719 | 0.8947 | 20000 | 0.7160 |
1.0628 | 1.0065 | 22500 | 0.7210 |
0.9341 | 1.1183 | 25000 | 0.7241 |
0.9468 | 1.2301 | 27500 | 0.7235 |
0.9553 | 1.3420 | 30000 | 0.7271 |
0.9557 | 1.4538 | 32500 | 0.7242 |
0.9669 | 1.5656 | 35000 | 0.7244 |
0.9627 | 1.6775 | 37500 | 0.7220 |
0.963 | 1.7893 | 40000 | 0.7215 |
0.9493 | 1.9011 | 42500 | 0.7207 |
0.8938 | 2.0130 | 45000 | 0.7668 |
0.7061 | 2.1248 | 47500 | 0.7739 |
0.7105 | 2.2366 | 50000 | 0.7693 |
0.7046 | 2.3485 | 52500 | 0.7716 |
0.7241 | 2.4603 | 55000 | 0.7713 |
0.7273 | 2.5721 | 57500 | 0.7669 |
0.7443 | 2.6840 | 60000 | 0.7685 |
0.7457 | 2.7958 | 62500 | 0.7664 |
0.7436 | 2.9076 | 65000 | 0.7659 |
0.7525 | 3.0195 | 67500 | 0.8617 |
0.465 | 3.1313 | 70000 | 0.8682 |
0.4823 | 3.2431 | 72500 | 0.8798 |
0.4865 | 3.3550 | 75000 | 0.8763 |
0.4977 | 3.4668 | 77500 | 0.8613 |
0.5088 | 3.5786 | 80000 | 0.8627 |
0.5136 | 3.6904 | 82500 | 0.8681 |
0.5128 | 3.8023 | 85000 | 0.8486 |
0.525 | 3.9141 | 87500 | 0.8585 |
0.3967 | 4.0259 | 90000 | 0.9826 |
0.3016 | 4.1378 | 92500 | 0.9951 |
0.3167 | 4.2496 | 95000 | 1.0293 |
0.3179 | 4.3614 | 97500 | 0.9904 |
0.3292 | 4.4733 | 100000 | 0.9947 |
0.3346 | 4.5851 | 102500 | 0.9932 |
0.3405 | 4.6969 | 105000 | 0.9715 |
0.344 | 4.8088 | 107500 | 0.9974 |
0.3497 | 4.9206 | 110000 | 0.9929 |
0.3226 | 5.0324 | 112500 | 1.1483 |
0.2071 | 5.1443 | 115000 | 1.1669 |
0.2131 | 5.2561 | 117500 | 1.1275 |
0.2204 | 5.3679 | 120000 | 1.1513 |
0.222 | 5.4798 | 122500 | 1.1549 |
0.2287 | 5.5916 | 125000 | 1.1552 |
0.2315 | 5.7034 | 127500 | 1.1370 |
0.2359 | 5.8153 | 130000 | 1.1318 |
0.2362 | 5.9271 | 132500 | 1.1461 |
0.1611 | 6.0389 | 135000 | 1.2983 |
0.1527 | 6.1507 | 137500 | 1.3192 |
0.1593 | 6.2626 | 140000 | 1.3295 |
0.16 | 6.3744 | 142500 | 1.3048 |
0.1647 | 6.4862 | 145000 | 1.3161 |
0.1659 | 6.5981 | 147500 | 1.2908 |
0.1666 | 6.7099 | 150000 | 1.3202 |
0.1692 | 6.8217 | 152500 | 1.3039 |
0.1711 | 6.9336 | 155000 | 1.2895 |
0.1433 | 7.0454 | 157500 | 1.4769 |
0.122 | 7.1572 | 160000 | 1.4877 |
0.1225 | 7.2691 | 162500 | 1.4722 |
0.1261 | 7.3809 | 165000 | 1.4794 |
0.1262 | 7.4927 | 167500 | 1.4749 |
0.1274 | 7.6046 | 170000 | 1.4719 |
0.1287 | 7.7164 | 172500 | 1.4495 |
0.1298 | 7.8282 | 175000 | 1.4753 |
0.1304 | 7.9401 | 177500 | 1.4705 |
0.1 | 8.0519 | 180000 | 1.6185 |
0.1038 | 8.1637 | 182500 | 1.6353 |
0.1053 | 8.2756 | 185000 | 1.6272 |
0.1054 | 8.3874 | 187500 | 1.6138 |
0.1057 | 8.4992 | 190000 | 1.6226 |
0.1061 | 8.6110 | 192500 | 1.6407 |
0.1068 | 8.7229 | 195000 | 1.6334 |
0.1082 | 8.8347 | 197500 | 1.6358 |
0.1063 | 8.9465 | 200000 | 1.6325 |
0.0936 | 9.0584 | 202500 | 1.7572 |
0.091 | 9.1702 | 205000 | 1.7476 |
0.0932 | 9.2820 | 207500 | 1.7529 |
0.0932 | 9.3939 | 210000 | 1.7541 |
0.0935 | 9.5057 | 212500 | 1.7595 |
0.0931 | 9.6175 | 215000 | 1.7609 |
0.0937 | 9.7294 | 217500 | 1.7647 |
0.0922 | 9.8412 | 220000 | 1.7643 |
0.0925 | 9.9530 | 222500 | 1.7627 |
### Framework versions
- PEFT 0.9.0
- Transformers 4.42.4
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
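A quick way to check a local environment against these pins is sketched below (package names as published on PyPI; the CUDA suffix on the PyTorch version is ignored for the comparison):

```python
# Environment sanity check against the versions listed above.
import peft, transformers, torch, datasets, tokenizers

expected = {
    "peft": "0.9.0",
    "transformers": "4.42.4",
    "torch": "2.3.1",        # built with CUDA 12.1 ("+cu121")
    "datasets": "2.20.0",
    "tokenizers": "0.19.1",
}
installed = {
    "peft": peft.__version__,
    "transformers": transformers.__version__,
    "torch": torch.__version__.split("+")[0],
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    status = "OK" if installed[name] == want else f"mismatch (found {installed[name]})"
    print(f"{name}: expected {want} -> {status}")
```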