---
license: apache-2.0
library_name: peft
tags:
- unsloth
- generated_from_trainer
base_model: unsloth/mistral-7b-v0.3-bnb-4bit
model-index:
- name: mistral_7b_v_Magiccoder_evol_10k_qlora_ortho
  results: []
---

# mistral_7b_v_Magiccoder_evol_10k_qlora_ortho

This model is a fine-tuned version of [unsloth/mistral-7b-v0.3-bnb-4bit](https://huggingface.co/unsloth/mistral-7b-v0.3-bnb-4bit) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1813
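
This repository holds a PEFT (QLoRA) adapter rather than full model weights, so inference means loading the 4-bit base checkpoint first and applying the adapter on top. Below is a minimal sketch, assuming `bitsandbytes` and `accelerate` are installed; the adapter repo id is a placeholder, not a confirmed Hub path.

```python
# Minimal inference sketch (hedged): load the 4-bit base model, then apply
# this QLoRA adapter with PEFT. The adapter repo namespace is a placeholder.
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "unsloth/mistral-7b-v0.3-bnb-4bit"  # 4-bit base from the card metadata
adapter_id = "<your-namespace>/mistral_7b_v_Magiccoder_evol_10k_qlora_ortho"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(base, adapter_id)  # attach the LoRA adapter weights

prompt = "Write a Python function that checks whether a string is a palindrome."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```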

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged sketch of the equivalent `TrainingArguments` follows the list):
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 0.02 (a fractional value, likely intended as a 2% warmup ratio)
- num_epochs: 1
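
For reference, the list above maps onto a `transformers.TrainingArguments` configuration roughly as follows. This is a sketch under assumptions: the output directory, the evaluation cadence, and the reading of 0.02 as a warmup ratio are not recorded in this card. The LoRA config, dataset wiring, and trainer are likewise omitted because the card does not document them.

```python
# Hedged sketch of the listed hyperparameters as a TrainingArguments config.
# output_dir, eval/logging cadence, and the warmup_ratio reading are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mistral_7b_v_Magiccoder_evol_10k_qlora_ortho",  # assumed
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=8,  # 8 x 8 = total train batch size of 64
    num_train_epochs=1,
    seed=42,
    lr_scheduler_type="cosine",
    warmup_ratio=0.02,            # reading the listed 0.02 "warmup steps" as a ratio
    optim="adamw_torch",          # Adam with betas=(0.9, 0.999), epsilon=1e-8 (defaults)
    evaluation_strategy="steps",  # assumed from the every-4-steps results table below
    eval_steps=4,
    logging_steps=4,
)
```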

### Training results

| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 1.2034        | 0.0262 | 4    | 1.2458          |
| 1.1597        | 0.0523 | 8    | 1.2035          |
| 1.1977        | 0.0785 | 12   | 1.2045          |
| 1.1152        | 0.1047 | 16   | 1.2144          |
| 1.1623        | 0.1308 | 20   | 1.2207          |
| 1.0816        | 0.1570 | 24   | 1.1929          |
| 1.2421        | 0.1832 | 28   | 1.2018          |
| 1.1908        | 0.2093 | 32   | 1.2023          |
| 1.1187        | 0.2355 | 36   | 1.1926          |
| 1.2034        | 0.2617 | 40   | 1.1915          |
| 1.2092        | 0.2878 | 44   | 1.1850          |
| 1.1567        | 0.3140 | 48   | 1.2156          |
| 1.1722        | 0.3401 | 52   | 1.1912          |
| 1.162         | 0.3663 | 56   | 1.2044          |
| 1.1497        | 0.3925 | 60   | 1.1980          |
| 1.2205        | 0.4186 | 64   | 1.1945          |
| 1.0966        | 0.4448 | 68   | 1.1971          |
| 1.123         | 0.4710 | 72   | 1.1945          |
| 1.1222        | 0.4971 | 76   | 1.1951          |
| 1.2472        | 0.5233 | 80   | 1.2024          |
| 1.1078        | 0.5495 | 84   | 1.1941          |
| 1.1993        | 0.5756 | 88   | 1.2111          |
| 1.2313        | 0.6018 | 92   | 1.1870          |
| 1.2431        | 0.6280 | 96   | 1.2047          |
| 1.1563        | 0.6541 | 100  | 1.1774          |
| 1.169         | 0.6803 | 104  | 1.2005          |
| 1.1873        | 0.7065 | 108  | 1.1957          |
| 1.0478        | 0.7326 | 112  | 1.1760          |
| 1.1245        | 0.7588 | 116  | 1.1628          |
| 1.1261        | 0.7850 | 120  | 1.1827          |
| 1.1876        | 0.8111 | 124  | 1.1869          |
| 1.1743        | 0.8373 | 128  | 1.1761          |
| 1.1865        | 0.8635 | 132  | 1.1744          |
| 1.1202        | 0.8896 | 136  | 1.1768          |
| 1.2158        | 0.9158 | 140  | 1.1790          |
| 1.0798        | 0.9419 | 144  | 1.1802          |
| 1.0996        | 0.9681 | 148  | 1.1814          |
| 1.2424        | 0.9943 | 152  | 1.1813          |

### Framework versions

- PEFT 0.7.1
- Transformers 4.40.2
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
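
To reduce the chance of adapter/serialization mismatches, it can help to run against the same library versions the adapter was trained with. A small sketch that compares the local environment against the versions above:

```python
# Sketch: compare installed library versions against those listed on this card.
import datasets
import peft
import tokenizers
import torch
import transformers

trained_with = {
    "peft": "0.7.1",
    "transformers": "4.40.2",
    "torch": "2.3.0+cu121",
    "datasets": "2.19.1",
    "tokenizers": "0.19.1",
}
installed = {
    "peft": peft.__version__,
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, expected in trained_with.items():
    marker = "ok" if installed[name] == expected else "differs"
    print(f"{name}: installed {installed[name]}, trained with {expected} ({marker})")
```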