---
license: apache-2.0
base_model: Locutusque/TinyMistral-248M-v2.5
tags:
- generated_from_trainer
model-index:
- name: TinyMistral-FFT
results: []
---
[<img src="https://raw.githubusercontent.com/OpenAccess-AI-Collective/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/OpenAccess-AI-Collective/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.0`
```yaml
base_model: Locutusque/TinyMistral-248M-v2.5
model_type: MistralForCausalLM
is_mistral_derived_model: true
load_in_8bit: false
load_in_4bit: false
strict: false
dataset_processes: 20
datasets:
- path: epfl-llm/guidelines
type: completion
field: clean_text
- path: JeanKaddour/minipile
type: completion
field: text
dataset_prepared_path: TinyMistral-FFT-data
val_set_size: 0.001
output_dir: ./TinyMistral-FFT
sequence_len: 2048
sample_packing: false
pad_to_sequence_len: true
adapter:
lora_model_dir:
lora_r:
lora_alpha:
lora_dropout:
lora_target_linear:
lora_fan_in_fan_out:
# wandb configuration
wandb_project: TinyMistral-FFT
wandb_watch:
wandb_run_id:
wandb_log_model:
gradient_accumulation_steps: 2
micro_batch_size: 4
num_epochs: 1
optimizer: paged_adamw_32bit
lr_scheduler: constant
cosine_min_lr_ratio:
learning_rate: 0.00005
train_on_inputs: true
group_by_length: false
bf16: true
fp16: false
tf32: false
gradient_checkpointing: false
early_stopping_patience:
resume_from_checkpoint:
auto_resume_from_checkpoints: True
local_rank:
logging_steps: 1
xformers_attention:
flash_attention: true
flash_attn_cross_entropy: false
flash_attn_rms_norm: true
flash_attn_fuse_qkv: false
flash_attn_fuse_mlp: true
warmup_steps: 10
evals_per_epoch: 100
# eval_steps: 10
eval_table_size:
saves_per_epoch: 50
debug:
deepspeed: #deepspeed/zero2.json # multi-gpu only
weight_decay: 0
# tokens:
special_tokens:
bos_token: "<|bos|>"
eos_token: "<|endoftext|>"
unk_token: "<unk>"
```
</details><br>
# TinyMistral-FFT
This model is a fine-tuned version of [Locutusque/TinyMistral-248M-v2.5](https://huggingface.co/Locutusque/TinyMistral-248M-v2.5) on the [epfl-llm/guidelines](https://huggingface.co/datasets/epfl-llm/guidelines) and [JeanKaddour/minipile](https://huggingface.co/datasets/JeanKaddour/minipile) datasets.
It achieves the following results on the evaluation set:
- Loss: 2.9626
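For intuition, this cross-entropy loss corresponds to a token-level perplexity of roughly exp(2.9626) ≈ 19.3:

```python
import math

# Validation cross-entropy loss reported above
val_loss = 2.9626

# Perplexity is the exponential of the mean cross-entropy loss
perplexity = math.exp(val_loss)
print(f"{perplexity:.2f}")  # ~19.35
```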
## Model description
TinyMistral-FFT is a full-parameter fine-tune (no LoRA adapter is configured) of the 248M-parameter, Mistral-architecture base model [Locutusque/TinyMistral-248M-v2.5](https://huggingface.co/Locutusque/TinyMistral-248M-v2.5), trained with Axolotl on the completion-style corpora listed in the config above.
## Intended uses & limitations
More information needed
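For basic text generation, a minimal 🤗 Transformers sketch is shown below. The repository id `PocketDoc/TinyMistral-FFT` and the example prompt are assumptions, not confirmed by this card; substitute the actual model id if it differs. Note that the training config declares custom special tokens (`<|bos|>`, `<|endoftext|>`, `<unk>`).

```python
# Minimal inference sketch; the repo id below is an assumption based on this
# card's name -- replace it with the actual model id if it differs.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "PocketDoc/TinyMistral-FFT"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# The training config declares custom special tokens
print(tokenizer.bos_token, tokenizer.eos_token, tokenizer.unk_token)
# expected: <|bos|> <|endoftext|> <unk>

inputs = tokenizer("Clinical guidelines recommend", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```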
## Training and evaluation data
The model was trained on plain-text completions from [epfl-llm/guidelines](https://huggingface.co/datasets/epfl-llm/guidelines) (the `clean_text` field) and [JeanKaddour/minipile](https://huggingface.co/datasets/JeanKaddour/minipile) (the `text` field) at a sequence length of 2048 tokens. 0.1% of the data (`val_set_size: 0.001`) was held out as the evaluation set.
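As a hedged sketch, both corpora can be pulled directly from the Hub (assuming the default `train` splits and that any access terms on the dataset pages have been accepted); the field names match the `field` entries in the Axolotl config above.

```python
from datasets import load_dataset

# The two completion-style corpora listed in the Axolotl config above
guidelines = load_dataset("epfl-llm/guidelines", split="train")
minipile = load_dataset("JeanKaddour/minipile", split="train")

# The config reads the `clean_text` field from guidelines and `text` from minipile
print(guidelines[0]["clean_text"][:200])
print(minipile[0]["text"][:200])
```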
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a rough `TrainingArguments` equivalent is sketched after the list):
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- lr_scheduler_warmup_steps: 10
- num_epochs: 1
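Axolotl constructs the actual training arguments internally; the block below is only an approximate, hand-written `transformers.TrainingArguments` rendering of the values above, not the exact configuration that was run.

```python
from transformers import TrainingArguments

# Approximate equivalent of the hyperparameters listed above; treat this as an
# illustration only (Axolotl builds its own arguments internally).
args = TrainingArguments(
    output_dir="./TinyMistral-FFT",
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=2,   # effective total batch size of 8
    num_train_epochs=1,
    lr_scheduler_type="constant",
    warmup_steps=10,
    optim="paged_adamw_32bit",
    bf16=True,
    weight_decay=0.0,
    seed=42,
    logging_steps=1,
)
```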
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:------:|:---------------:|
| 4.5414 | 0.0 | 1 | 4.3416 |
| 4.4364 | 0.01 | 1973 | 3.6048 |
| 3.1588 | 0.02 | 3946 | 3.4869 |
| 3.1823 | 0.03 | 5919 | 3.4237 |
| 2.975 | 0.04 | 7892 | 3.3813 |
| 3.2737 | 0.05 | 9865 | 3.3476 |
| 3.7929 | 0.06 | 11838 | 3.3174 |
| 3.3775 | 0.07 | 13811 | 3.2947 |
| 3.6789 | 0.08 | 15784 | 3.2756 |
| 3.4811 | 0.09 | 17757 | 3.2590 |
| 3.3961 | 0.1 | 19730 | 3.2406 |
| 3.4742 | 0.11 | 21703 | 3.2255 |
| 3.5353 | 0.12 | 23676 | 3.2130 |
| 2.5729 | 0.13 | 25649 | 3.2018 |
| 3.0246 | 0.14 | 27622 | 3.1915 |
| 3.5242 | 0.15 | 29595 | 3.1814 |
| 2.6597 | 0.16 | 31568 | 3.1728 |
| 3.0312 | 0.17 | 33541 | 3.1635 |
| 3.2913 | 0.18 | 35514 | 3.1564 |
| 2.8945 | 0.19 | 37487 | 3.1487 |
| 3.2407 | 0.2 | 39460 | 3.1423 |
| 3.076 | 0.21 | 41433 | 3.1358 |
| 3.4588 | 0.22 | 43406 | 3.1298 |
| 3.1972 | 0.23 | 45379 | 3.1236 |
| 2.8544 | 0.24 | 47352 | 3.1182 |
| 2.949 | 0.25 | 49325 | 3.1116 |
| 3.7614 | 0.26 | 51298 | 3.1078 |
| 2.7729 | 0.27 | 53271 | 3.1022 |
| 3.371 | 0.28 | 55244 | 3.0972 |
| 3.1048 | 0.29 | 57217 | 3.0932 |
| 3.0419 | 0.3 | 59190 | 3.0877 |
| 3.0947 | 0.31 | 61163 | 3.0821 |
| 3.4587 | 0.32 | 63136 | 3.0783 |
| 2.8448 | 0.33 | 65109 | 3.0760 |
| 3.3145 | 0.34 | 67082 | 3.0711 |
| 3.1927 | 0.35 | 69055 | 3.0668 |
| 3.3117 | 0.36 | 71028 | 3.0643 |
| 3.2579 | 0.37 | 73001 | 3.0613 |
| 3.1899 | 0.38 | 74974 | 3.0597 |
| 3.0391 | 0.39 | 76947 | 3.0563 |
| 2.6476 | 0.4 | 78920 | 3.0542 |
| 2.9163 | 0.41 | 80893 | 3.0504 |
| 2.4931 | 0.42 | 82866 | 3.0489 |
| 3.3614 | 0.43 | 84839 | 3.0451 |
| 3.1546 | 0.44 | 86812 | 3.0416 |
| 2.8995 | 0.45 | 88785 | 3.0403 |
| 2.8657 | 0.46 | 90758 | 3.0370 |
| 3.4511 | 0.47 | 92731 | 3.0343 |
| 3.2269 | 0.48 | 94704 | 3.0323 |
| 2.6914 | 0.49 | 96677 | 3.0302 |
| 3.087 | 0.5 | 98650 | 3.0282 |
| 3.3036 | 0.51 | 100623 | 3.0266 |
| 3.2269 | 0.52 | 102596 | 3.0251 |
| 3.1237 | 0.53 | 104569 | 3.0223 |
| 2.9733 | 0.54 | 106542 | 3.0197 |
| 3.0594 | 0.55 | 108515 | 3.0186 |
| 2.9842 | 0.56 | 110488 | 3.0168 |
| 3.0986 | 0.57 | 112461 | 3.0158 |
| 3.0296 | 0.58 | 114434 | 3.0141 |
| 3.0091 | 0.59 | 116407 | 3.0139 |
| 2.7111 | 0.6 | 118380 | 3.0107 |
| 3.115 | 0.61 | 120353 | 3.0080 |
| 3.2585 | 0.62 | 122326 | 3.0063 |
| 3.0651 | 0.63 | 124299 | 3.0038 |
| 2.965 | 0.64 | 126272 | 3.0035 |
| 2.9165 | 0.65 | 128245 | 3.0023 |
| 2.8069 | 0.66 | 130218 | 3.0007 |
| 2.9818 | 0.67 | 132191 | 2.9995 |
| 2.8997 | 0.68 | 134164 | 2.9978 |
| 2.948 | 0.69 | 136137 | 2.9966 |
| 3.034 | 0.7 | 138110 | 2.9953 |
| 3.1774 | 0.71 | 140083 | 2.9936 |
| 3.3357 | 0.72 | 142056 | 2.9919 |
| 3.2333 | 0.73 | 144029 | 2.9897 |
| 3.1183 | 0.74 | 146002 | 2.9889 |
| 3.1148 | 0.75 | 147975 | 2.9887 |
| 2.8678 | 0.76 | 149948 | 2.9867 |
| 2.6597 | 0.77 | 151921 | 2.9850 |
| 3.1122 | 0.78 | 153894 | 2.9842 |
| 3.1959 | 0.79 | 155867 | 2.9825 |
| 2.8623 | 0.8 | 157840 | 2.9808 |
| 2.9416 | 0.81 | 159813 | 2.9809 |
| 3.0551 | 0.82 | 161786 | 2.9792 |
| 2.9538 | 0.83 | 163759 | 2.9777 |
| 2.8278 | 0.84 | 165732 | 2.9767 |
| 3.4942 | 0.85 | 167705 | 2.9762 |
| 2.838 | 0.86 | 169678 | 2.9740 |
| 3.0352 | 0.87 | 171651 | 2.9720 |
| 2.8865 | 0.88 | 173624 | 2.9724 |
| 3.0911 | 0.89 | 175597 | 2.9708 |
| 2.8237 | 0.9 | 177570 | 2.9703 |
| 2.9927 | 0.91 | 179543 | 2.9695 |
| 3.2014 | 0.92 | 181516 | 2.9680 |
| 2.3033 | 0.93 | 183489 | 2.9666 |
| 2.6264 | 0.94 | 185462 | 2.9668 |
| 3.1788 | 0.95 | 187435 | 2.9659 |
| 3.066 | 0.96 | 189408 | 2.9645 |
| 2.5523 | 0.97 | 191381 | 2.9640 |
| 2.4562 | 0.98 | 193354 | 2.9630 |
| 3.3801 | 0.99 | 195327 | 2.9626 |
### Framework versions
- Transformers 4.37.0
- Pytorch 2.0.1+cu117
- Datasets 2.15.0
- Tokenizers 0.15.0