---
license: other
base_model: liswei/OpenELM-270M-zh-cp
tags:
- llama-factory
- full
- generated_from_trainer
model-index:
- name: zh-sft
  results: []
---

# zh-sft

This model is a fine-tuned version of [liswei/OpenELM-270M-zh-cp](https://huggingface.co/liswei/OpenELM-270M-zh-cp) (local LLaMA-Factory checkpoint `saves/OpenELM-270M/zh-cp-galore`) on the en2tw-alignment-sft and TaiwanChat datasets.
It achieves the following results on the evaluation set:
- Loss: 1.5575

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (reproduced as a hedged `TrainingArguments` sketch at the end of this card):
- learning_rate: 0.0001
- train_batch_size: 2
- eval_batch_size: 4
- seed: 42
- distributed_type: multi-GPU
- num_devices: 2
- gradient_accumulation_steps: 16
- total_train_batch_size: 64
- total_eval_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 1000
- num_epochs: 3.0

### Training results

| Training Loss | Epoch  | Step  | Validation Loss |
|:-------------:|:------:|:-----:|:---------------:|
| 2.285         | 0.0588 | 500   | 2.2416          |
| 2.0921        | 0.1176 | 1000  | 2.1271          |
| 2.1212        | 0.1764 | 1500  | 2.0457          |
| 1.9794        | 0.2351 | 2000  | 1.9954          |
| 1.8983        | 0.2939 | 2500  | 1.9546          |
| 1.8976        | 0.3527 | 3000  | 1.9214          |
| 1.9345        | 0.4115 | 3500  | 1.8950          |
| 1.8782        | 0.4703 | 4000  | 1.8705          |
| 1.806         | 0.5291 | 4500  | 1.8493          |
| 1.8282        | 0.5878 | 5000  | 1.8275          |
| 1.7949        | 0.6466 | 5500  | 1.8115          |
| 1.7408        | 0.7054 | 6000  | 1.7943          |
| 1.6978        | 0.7642 | 6500  | 1.7782          |
| 1.7152        | 0.8230 | 7000  | 1.7644          |
| 1.7186        | 0.8818 | 7500  | 1.7511          |
| 1.6821        | 0.9406 | 8000  | 1.7357          |
| 1.6238        | 0.9993 | 8500  | 1.7211          |
| 1.4753        | 1.0581 | 9000  | 1.7177          |
| 1.4412        | 1.1169 | 9500  | 1.7048          |
| 1.4273        | 1.1757 | 10000 | 1.6991          |
| 1.4464        | 1.2345 | 10500 | 1.6840          |
| 1.4484        | 1.2933 | 11000 | 1.6749          |
| 1.4752        | 1.3520 | 11500 | 1.6666          |
| 1.4023        | 1.4108 | 12000 | 1.6602          |
| 1.3717        | 1.4696 | 12500 | 1.6467          |
| 1.411         | 1.5284 | 13000 | 1.6376          |
| 1.41          | 1.5872 | 13500 | 1.6298          |
| 1.4263        | 1.6460 | 14000 | 1.6193          |
| 1.3655        | 1.7048 | 14500 | 1.6108          |
| 1.3813        | 1.7635 | 15000 | 1.6027          |
| 1.3913        | 1.8223 | 15500 | 1.5948          |
| 1.4214        | 1.8811 | 16000 | 1.5872          |
| 1.3626        | 1.9399 | 16500 | 1.5810          |
| 1.4187        | 1.9987 | 17000 | 1.5737          |
| 1.154         | 2.0575 | 17500 | 1.5879          |
| 1.2142        | 2.1162 | 18000 | 1.5826          |
| 1.1634        | 2.1750 | 18500 | 1.5811          |
| 1.1774        | 2.2338 | 19000 | 1.5750          |
| 1.196         | 2.2926 | 19500 | 1.5732          |
| 1.1546        | 2.3514 | 20000 | 1.5697          |
| 1.1804        | 2.4102 | 20500 | 1.5666          |
| 1.1517        | 2.4690 | 21000 | 1.5646          |
| 1.1941        | 2.5277 | 21500 | 1.5633          |
| 1.1836        | 2.5865 | 22000 | 1.5611          |
| 1.1603        | 2.6453 | 22500 | 1.5599          |
| 1.2281        | 2.7041 | 23000 | 1.5588          |
| 1.1626        | 2.7629 | 23500 | 1.5578          |
| 1.077         | 2.8217 | 24000 | 1.5579          |
| 1.1677        | 2.8804 | 24500 | 1.5575          |
| 1.1624        | 2.9392 | 25000 | 1.5574          |
| 1.217         | 2.9980 | 25500 | 1.5575          |

### Framework versions

- Transformers 4.40.1
- Pytorch 2.3.0+cu118
- Datasets 2.18.0
- Tokenizers 0.19.1
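
### Configuration sketch

For convenience, the hyperparameters listed above can be expressed through the standard `transformers` `TrainingArguments` API. This is a minimal sketch, not the original launch configuration: the run was driven by LLaMA-Factory, and `output_dir` plus the eval cadence (every 500 steps, inferred from the results table) are assumptions.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="saves/OpenELM-270M/zh-sft",  # placeholder, not the original path
    learning_rate=1e-4,
    per_device_train_batch_size=2,   # train_batch_size above
    per_device_eval_batch_size=4,    # eval_batch_size above
    gradient_accumulation_steps=16,  # 2 per device x 2 GPUs x 16 = 64 total
    num_train_epochs=3.0,
    lr_scheduler_type="cosine",
    warmup_steps=1000,
    adam_beta1=0.9,                  # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    seed=42,
    evaluation_strategy="steps",     # validation loss logged every 500 steps
    eval_steps=500,
)
```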
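
### Usage sketch

A minimal inference example, assuming the model is published to the Hugging Face Hub. The repository id below is hypothetical (substitute the id this card is actually hosted under), and `trust_remote_code=True` is assumed because OpenELM checkpoints ship custom modeling code.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "liswei/zh-sft"  # hypothetical repository id for this fine-tune

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# "Hello, please introduce yourself in Traditional Chinese."
inputs = tokenizer("你好，請用繁體中文自我介紹。", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```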