---
license: apache-2.0
base_model: mistral-community/Mixtral-8x22B-v0.1
tags:
- generated_from_trainer
model-index:
- name: out
  results: []
---

[Built with Axolotl](https://github.com/OpenAccess-AI-Collective/axolotl)
See axolotl config

axolotl version: `0.4.0`
```yaml
base_model: mistral-community/Mixtral-8x22B-v0.1
model_type: AutoModelForCausalLM
tokenizer_type: LlamaTokenizer
trust_remote_code: true

load_in_8bit: false
load_in_4bit: false
strict: false

unfrozen_parameters:
- ^lm_head.weight$
- ^model.embed_tokens.weight$
- model.layers.0.self_attn.q_proj
- model.layers.1.self_attn.q_proj
- model.layers.2.self_attn.q_proj
- model.layers.22.self_attn.q_proj
- model.layers.27.self_attn.q_proj
- model.layers.28.self_attn.q_proj
- model.layers.13.self_attn.q_proj
- model.layers.21.self_attn.q_proj
- model.layers.24.self_attn.q_proj
- model.layers.14.self_attn.q_proj
- model.layers.15.self_attn.q_proj
- model.layers.11.self_attn.q_proj
- model.layers.20.self_attn.q_proj
- model.layers.23.self_attn.q_proj
- model.layers.30.self_attn.k_proj
- model.layers.31.self_attn.k_proj
- model.layers.25.self_attn.k_proj
- model.layers.23.self_attn.k_proj
- model.layers.27.self_attn.k_proj
- model.layers.26.self_attn.k_proj
- model.layers.29.self_attn.k_proj
- model.layers.28.self_attn.k_proj
- model.layers.24.self_attn.k_proj
- model.layers.16.self_attn.k_proj
- model.layers.19.self_attn.k_proj
- model.layers.22.self_attn.k_proj
- model.layers.20.self_attn.k_proj
- model.layers.6.self_attn.k_proj
- model.layers.22.self_attn.v_proj
- model.layers.29.self_attn.v_proj
- model.layers.31.self_attn.v_proj
- model.layers.5.self_attn.v_proj
- model.layers.8.self_attn.v_proj
- model.layers.4.self_attn.v_proj
- model.layers.25.self_attn.v_proj
- model.layers.30.self_attn.v_proj
- model.layers.17.self_attn.v_proj
- model.layers.23.self_attn.v_proj
- model.layers.28.self_attn.v_proj
- model.layers.9.self_attn.v_proj
- model.layers.26.self_attn.v_proj
- model.layers.27.self_attn.v_proj
- model.layers.20.self_attn.o_proj
- model.layers.19.self_attn.o_proj
- model.layers.16.self_attn.o_proj
- model.layers.13.self_attn.o_proj
- model.layers.18.self_attn.o_proj
- model.layers.17.self_attn.o_proj
- model.layers.12.self_attn.o_proj
- model.layers.15.self_attn.o_proj
- model.layers.14.self_attn.o_proj
- model.layers.22.self_attn.o_proj
- model.layers.23.self_attn.o_proj
- model.layers.21.self_attn.o_proj
- model.layers.10.self_attn.o_proj
- model.layers.0.self_attn.o_proj
- model.layers.0.block_sparse_moe.experts.0.w1
- model.layers.1.block_sparse_moe.experts.0.w1
- model.layers.2.block_sparse_moe.experts.0.w1
- model.layers.3.block_sparse_moe.experts.0.w1
- model.layers.4.block_sparse_moe.experts.0.w1
- model.layers.5.block_sparse_moe.experts.0.w1
- model.layers.6.block_sparse_moe.experts.0.w1
- model.layers.7.block_sparse_moe.experts.0.w1
- model.layers.8.block_sparse_moe.experts.0.w1
- model.layers.9.block_sparse_moe.experts.0.w1
- model.layers.10.block_sparse_moe.experts.0.w1
- model.layers.11.block_sparse_moe.experts.0.w1
- model.layers.12.block_sparse_moe.experts.0.w1
- model.layers.13.block_sparse_moe.experts.0.w1
- model.layers.0.block_sparse_moe.experts.0.w2
- model.layers.1.block_sparse_moe.experts.0.w2
- model.layers.2.block_sparse_moe.experts.0.w2
- model.layers.3.block_sparse_moe.experts.0.w2
- model.layers.4.block_sparse_moe.experts.0.w2
- model.layers.5.block_sparse_moe.experts.0.w2
- model.layers.6.block_sparse_moe.experts.0.w2
- model.layers.7.block_sparse_moe.experts.0.w2
- model.layers.8.block_sparse_moe.experts.0.w2
- model.layers.9.block_sparse_moe.experts.0.w2
- model.layers.10.block_sparse_moe.experts.0.w2
- model.layers.11.block_sparse_moe.experts.0.w2
- model.layers.12.block_sparse_moe.experts.0.w2
- model.layers.13.block_sparse_moe.experts.0.w2
- model.layers.0.block_sparse_moe.experts.0.w3
- model.layers.1.block_sparse_moe.experts.0.w3
- model.layers.2.block_sparse_moe.experts.0.w3
- model.layers.3.block_sparse_moe.experts.0.w3
- model.layers.4.block_sparse_moe.experts.0.w3
- model.layers.5.block_sparse_moe.experts.0.w3
- model.layers.6.block_sparse_moe.experts.0.w3
- model.layers.7.block_sparse_moe.experts.0.w3
- model.layers.8.block_sparse_moe.experts.0.w3
- model.layers.9.block_sparse_moe.experts.0.w3
- model.layers.10.block_sparse_moe.experts.0.w3
- model.layers.11.block_sparse_moe.experts.0.w3
- model.layers.12.block_sparse_moe.experts.0.w3
- model.layers.13.block_sparse_moe.experts.0.w3
- model.layers.0.block_sparse_moe.experts.1.w1
- model.layers.1.block_sparse_moe.experts.1.w1
- model.layers.2.block_sparse_moe.experts.1.w1
- model.layers.3.block_sparse_moe.experts.1.w1
- model.layers.4.block_sparse_moe.experts.1.w1
- model.layers.5.block_sparse_moe.experts.1.w1
- model.layers.6.block_sparse_moe.experts.1.w1
- model.layers.7.block_sparse_moe.experts.1.w1
- model.layers.8.block_sparse_moe.experts.1.w1
- model.layers.9.block_sparse_moe.experts.1.w1
- model.layers.10.block_sparse_moe.experts.1.w1
- model.layers.11.block_sparse_moe.experts.1.w1
- model.layers.12.block_sparse_moe.experts.1.w1
- model.layers.13.block_sparse_moe.experts.1.w1
- model.layers.40.block_sparse_moe.experts.1.w2
- model.layers.0.block_sparse_moe.experts.1.w2
- model.layers.1.block_sparse_moe.experts.1.w2
- model.layers.2.block_sparse_moe.experts.1.w2
- model.layers.3.block_sparse_moe.experts.1.w2
- model.layers.4.block_sparse_moe.experts.1.w2
- model.layers.5.block_sparse_moe.experts.1.w2
- model.layers.6.block_sparse_moe.experts.1.w2
- model.layers.7.block_sparse_moe.experts.1.w2
- model.layers.8.block_sparse_moe.experts.1.w2
- model.layers.9.block_sparse_moe.experts.1.w2
- model.layers.10.block_sparse_moe.experts.1.w2
- model.layers.11.block_sparse_moe.experts.1.w2
- model.layers.12.block_sparse_moe.experts.1.w2
- model.layers.5.block_sparse_moe.experts.1.w3
- model.layers.0.block_sparse_moe.experts.1.w3
- model.layers.1.block_sparse_moe.experts.1.w3
- model.layers.2.block_sparse_moe.experts.1.w3
- model.layers.3.block_sparse_moe.experts.1.w3
- model.layers.4.block_sparse_moe.experts.1.w3
- model.layers.6.block_sparse_moe.experts.1.w3
- model.layers.7.block_sparse_moe.experts.1.w3
- model.layers.8.block_sparse_moe.experts.1.w3
- model.layers.9.block_sparse_moe.experts.1.w3
- model.layers.10.block_sparse_moe.experts.1.w3
- model.layers.11.block_sparse_moe.experts.1.w3
- model.layers.12.block_sparse_moe.experts.1.w3
- model.layers.13.block_sparse_moe.experts.1.w3
- model.layers.1.block_sparse_moe.experts.2.w1
- model.layers.0.block_sparse_moe.experts.2.w1
- model.layers.2.block_sparse_moe.experts.2.w1
- model.layers.3.block_sparse_moe.experts.2.w1
- model.layers.4.block_sparse_moe.experts.2.w1
- model.layers.5.block_sparse_moe.experts.2.w1
- model.layers.6.block_sparse_moe.experts.2.w1
- model.layers.7.block_sparse_moe.experts.2.w1
- model.layers.8.block_sparse_moe.experts.2.w1
- model.layers.9.block_sparse_moe.experts.2.w1
- model.layers.10.block_sparse_moe.experts.2.w1
- model.layers.11.block_sparse_moe.experts.2.w1
- model.layers.12.block_sparse_moe.experts.2.w1
- model.layers.13.block_sparse_moe.experts.2.w1
- model.layers.1.block_sparse_moe.experts.2.w2
- model.layers.0.block_sparse_moe.experts.2.w2
- model.layers.2.block_sparse_moe.experts.2.w2
- model.layers.3.block_sparse_moe.experts.2.w2
- model.layers.4.block_sparse_moe.experts.2.w2
- model.layers.5.block_sparse_moe.experts.2.w2
- model.layers.6.block_sparse_moe.experts.2.w2
- model.layers.7.block_sparse_moe.experts.2.w2
- model.layers.8.block_sparse_moe.experts.2.w2
- model.layers.9.block_sparse_moe.experts.2.w2
- model.layers.10.block_sparse_moe.experts.2.w2
- model.layers.11.block_sparse_moe.experts.2.w2
- model.layers.12.block_sparse_moe.experts.2.w2
- model.layers.13.block_sparse_moe.experts.2.w2
- model.layers.1.block_sparse_moe.experts.2.w3
- model.layers.0.block_sparse_moe.experts.2.w3
- model.layers.2.block_sparse_moe.experts.2.w3
- model.layers.3.block_sparse_moe.experts.2.w3
- model.layers.4.block_sparse_moe.experts.2.w3
- model.layers.5.block_sparse_moe.experts.2.w3
- model.layers.6.block_sparse_moe.experts.2.w3
- model.layers.7.block_sparse_moe.experts.2.w3
- model.layers.8.block_sparse_moe.experts.2.w3
- model.layers.9.block_sparse_moe.experts.2.w3
- model.layers.10.block_sparse_moe.experts.2.w3
- model.layers.11.block_sparse_moe.experts.2.w3
- model.layers.12.block_sparse_moe.experts.2.w3
- model.layers.13.block_sparse_moe.experts.2.w3
- model.layers.2.block_sparse_moe.experts.3.w1
- model.layers.1.block_sparse_moe.experts.3.w1
- model.layers.0.block_sparse_moe.experts.3.w1
- model.layers.3.block_sparse_moe.experts.3.w1
- model.layers.4.block_sparse_moe.experts.3.w1
- model.layers.5.block_sparse_moe.experts.3.w1
- model.layers.6.block_sparse_moe.experts.3.w1
- model.layers.7.block_sparse_moe.experts.3.w1
- model.layers.8.block_sparse_moe.experts.3.w1
- model.layers.9.block_sparse_moe.experts.3.w1
- model.layers.10.block_sparse_moe.experts.3.w1
- model.layers.11.block_sparse_moe.experts.3.w1
- model.layers.12.block_sparse_moe.experts.3.w1
- model.layers.13.block_sparse_moe.experts.3.w1
- model.layers.2.block_sparse_moe.experts.3.w2
- model.layers.1.block_sparse_moe.experts.3.w2
- model.layers.0.block_sparse_moe.experts.3.w2
- model.layers.3.block_sparse_moe.experts.3.w2
- model.layers.4.block_sparse_moe.experts.3.w2
- model.layers.5.block_sparse_moe.experts.3.w2
- model.layers.6.block_sparse_moe.experts.3.w2
- model.layers.7.block_sparse_moe.experts.3.w2
- model.layers.8.block_sparse_moe.experts.3.w2
- model.layers.9.block_sparse_moe.experts.3.w2
- model.layers.10.block_sparse_moe.experts.3.w2
- model.layers.11.block_sparse_moe.experts.3.w2
- model.layers.12.block_sparse_moe.experts.3.w2
- model.layers.13.block_sparse_moe.experts.3.w2
- model.layers.2.block_sparse_moe.experts.3.w3
- model.layers.1.block_sparse_moe.experts.3.w3
- model.layers.0.block_sparse_moe.experts.3.w3
- model.layers.3.block_sparse_moe.experts.3.w3
- model.layers.4.block_sparse_moe.experts.3.w3
- model.layers.5.block_sparse_moe.experts.3.w3
- model.layers.6.block_sparse_moe.experts.3.w3
- model.layers.7.block_sparse_moe.experts.3.w3
- model.layers.8.block_sparse_moe.experts.3.w3
- model.layers.9.block_sparse_moe.experts.3.w3
- model.layers.10.block_sparse_moe.experts.3.w3
- model.layers.11.block_sparse_moe.experts.3.w3
- model.layers.12.block_sparse_moe.experts.3.w3
- model.layers.13.block_sparse_moe.experts.3.w3
- model.layers.3.block_sparse_moe.experts.4.w1
- model.layers.2.block_sparse_moe.experts.4.w1
- model.layers.1.block_sparse_moe.experts.4.w1
- model.layers.0.block_sparse_moe.experts.4.w1
- model.layers.4.block_sparse_moe.experts.4.w1
- model.layers.5.block_sparse_moe.experts.4.w1
- model.layers.6.block_sparse_moe.experts.4.w1
- model.layers.7.block_sparse_moe.experts.4.w1
- model.layers.8.block_sparse_moe.experts.4.w1
- model.layers.9.block_sparse_moe.experts.4.w1
- model.layers.10.block_sparse_moe.experts.4.w1
- model.layers.11.block_sparse_moe.experts.4.w1
- model.layers.12.block_sparse_moe.experts.4.w1
- model.layers.13.block_sparse_moe.experts.4.w1
- model.layers.2.block_sparse_moe.experts.4.w2
- model.layers.3.block_sparse_moe.experts.4.w2
- model.layers.1.block_sparse_moe.experts.4.w2
- model.layers.20.block_sparse_moe.experts.4.w2
- model.layers.0.block_sparse_moe.experts.4.w2
- model.layers.4.block_sparse_moe.experts.4.w2
- model.layers.5.block_sparse_moe.experts.4.w2
- model.layers.6.block_sparse_moe.experts.4.w2
- model.layers.7.block_sparse_moe.experts.4.w2
- model.layers.8.block_sparse_moe.experts.4.w2
- model.layers.9.block_sparse_moe.experts.4.w2
- model.layers.10.block_sparse_moe.experts.4.w2
- model.layers.11.block_sparse_moe.experts.4.w2
- model.layers.12.block_sparse_moe.experts.4.w2
- model.layers.3.block_sparse_moe.experts.4.w3
- model.layers.2.block_sparse_moe.experts.4.w3
- model.layers.1.block_sparse_moe.experts.4.w3
- model.layers.0.block_sparse_moe.experts.4.w3
- model.layers.4.block_sparse_moe.experts.4.w3
- model.layers.5.block_sparse_moe.experts.4.w3
- model.layers.6.block_sparse_moe.experts.4.w3
- model.layers.7.block_sparse_moe.experts.4.w3
- model.layers.8.block_sparse_moe.experts.4.w3
- model.layers.9.block_sparse_moe.experts.4.w3
- model.layers.10.block_sparse_moe.experts.4.w3
- model.layers.11.block_sparse_moe.experts.4.w3
- model.layers.12.block_sparse_moe.experts.4.w3
- model.layers.13.block_sparse_moe.experts.4.w3
- model.layers.4.block_sparse_moe.experts.5.w1
- model.layers.3.block_sparse_moe.experts.5.w1
- model.layers.2.block_sparse_moe.experts.5.w1
- model.layers.1.block_sparse_moe.experts.5.w1
- model.layers.0.block_sparse_moe.experts.5.w1
- model.layers.5.block_sparse_moe.experts.5.w1
- model.layers.6.block_sparse_moe.experts.5.w1
- model.layers.7.block_sparse_moe.experts.5.w1
- model.layers.8.block_sparse_moe.experts.5.w1
- model.layers.9.block_sparse_moe.experts.5.w1
- model.layers.10.block_sparse_moe.experts.5.w1
- model.layers.11.block_sparse_moe.experts.5.w1
- model.layers.12.block_sparse_moe.experts.5.w1
- model.layers.13.block_sparse_moe.experts.5.w1
- model.layers.4.block_sparse_moe.experts.5.w2
- model.layers.2.block_sparse_moe.experts.5.w2
- model.layers.3.block_sparse_moe.experts.5.w2
- model.layers.1.block_sparse_moe.experts.5.w2
- model.layers.0.block_sparse_moe.experts.5.w2
- model.layers.5.block_sparse_moe.experts.5.w2
- model.layers.6.block_sparse_moe.experts.5.w2
- model.layers.7.block_sparse_moe.experts.5.w2
- model.layers.8.block_sparse_moe.experts.5.w2
- model.layers.9.block_sparse_moe.experts.5.w2
- model.layers.10.block_sparse_moe.experts.5.w2
- model.layers.11.block_sparse_moe.experts.5.w2
- model.layers.12.block_sparse_moe.experts.5.w2
- model.layers.13.block_sparse_moe.experts.5.w2
- model.layers.4.block_sparse_moe.experts.5.w3
- model.layers.3.block_sparse_moe.experts.5.w3
- model.layers.2.block_sparse_moe.experts.5.w3
- model.layers.1.block_sparse_moe.experts.5.w3
- model.layers.0.block_sparse_moe.experts.5.w3
- model.layers.5.block_sparse_moe.experts.5.w3
- model.layers.6.block_sparse_moe.experts.5.w3
- model.layers.7.block_sparse_moe.experts.5.w3
- model.layers.8.block_sparse_moe.experts.5.w3
- model.layers.9.block_sparse_moe.experts.5.w3
- model.layers.10.block_sparse_moe.experts.5.w3
- model.layers.11.block_sparse_moe.experts.5.w3
- model.layers.12.block_sparse_moe.experts.5.w3
- model.layers.13.block_sparse_moe.experts.5.w3
- model.layers.5.block_sparse_moe.experts.6.w1
- model.layers.4.block_sparse_moe.experts.6.w1
- model.layers.3.block_sparse_moe.experts.6.w1
- model.layers.2.block_sparse_moe.experts.6.w1
- model.layers.1.block_sparse_moe.experts.6.w1
- model.layers.0.block_sparse_moe.experts.6.w1
- model.layers.6.block_sparse_moe.experts.6.w1
- model.layers.7.block_sparse_moe.experts.6.w1
- model.layers.8.block_sparse_moe.experts.6.w1
- model.layers.9.block_sparse_moe.experts.6.w1
- model.layers.10.block_sparse_moe.experts.6.w1
- model.layers.11.block_sparse_moe.experts.6.w1
- model.layers.12.block_sparse_moe.experts.6.w1
- model.layers.13.block_sparse_moe.experts.6.w1
- model.layers.5.block_sparse_moe.experts.6.w2
- model.layers.4.block_sparse_moe.experts.6.w2
- model.layers.2.block_sparse_moe.experts.6.w2
- model.layers.3.block_sparse_moe.experts.6.w2
- model.layers.1.block_sparse_moe.experts.6.w2
- model.layers.0.block_sparse_moe.experts.6.w2
- model.layers.6.block_sparse_moe.experts.6.w2
- model.layers.7.block_sparse_moe.experts.6.w2
- model.layers.8.block_sparse_moe.experts.6.w2
- model.layers.9.block_sparse_moe.experts.6.w2
- model.layers.10.block_sparse_moe.experts.6.w2
- model.layers.11.block_sparse_moe.experts.6.w2
- model.layers.12.block_sparse_moe.experts.6.w2
- model.layers.13.block_sparse_moe.experts.6.w2
- model.layers.5.block_sparse_moe.experts.6.w3
- model.layers.4.block_sparse_moe.experts.6.w3
- model.layers.3.block_sparse_moe.experts.6.w3
- model.layers.2.block_sparse_moe.experts.6.w3
- model.layers.1.block_sparse_moe.experts.6.w3
- model.layers.0.block_sparse_moe.experts.6.w3
- model.layers.6.block_sparse_moe.experts.6.w3
- model.layers.7.block_sparse_moe.experts.6.w3
- model.layers.8.block_sparse_moe.experts.6.w3
- model.layers.9.block_sparse_moe.experts.6.w3
- model.layers.10.block_sparse_moe.experts.6.w3
- model.layers.11.block_sparse_moe.experts.6.w3
- model.layers.12.block_sparse_moe.experts.6.w3
- model.layers.13.block_sparse_moe.experts.6.w3
- model.layers.5.block_sparse_moe.experts.7.w1
- model.layers.6.block_sparse_moe.experts.7.w1
- model.layers.3.block_sparse_moe.experts.7.w1
- model.layers.4.block_sparse_moe.experts.7.w1
- model.layers.2.block_sparse_moe.experts.7.w1
- model.layers.0.block_sparse_moe.experts.7.w1
- model.layers.7.block_sparse_moe.experts.7.w1
- model.layers.8.block_sparse_moe.experts.7.w1
- model.layers.9.block_sparse_moe.experts.7.w1
- model.layers.10.block_sparse_moe.experts.7.w1
- model.layers.11.block_sparse_moe.experts.7.w1
- model.layers.12.block_sparse_moe.experts.7.w1
- model.layers.13.block_sparse_moe.experts.7.w1
- model.layers.14.block_sparse_moe.experts.7.w1
- model.layers.6.block_sparse_moe.experts.7.w2
- model.layers.5.block_sparse_moe.experts.7.w2
- model.layers.4.block_sparse_moe.experts.7.w2
- model.layers.2.block_sparse_moe.experts.7.w2
- model.layers.3.block_sparse_moe.experts.7.w2
- model.layers.1.block_sparse_moe.experts.7.w2
- model.layers.0.block_sparse_moe.experts.7.w2
- model.layers.7.block_sparse_moe.experts.7.w2
- model.layers.8.block_sparse_moe.experts.7.w2
- model.layers.9.block_sparse_moe.experts.7.w2
- model.layers.10.block_sparse_moe.experts.7.w2
- model.layers.11.block_sparse_moe.experts.7.w2
- model.layers.12.block_sparse_moe.experts.7.w2
- model.layers.13.block_sparse_moe.experts.7.w2
- model.layers.6.block_sparse_moe.experts.7.w3
- model.layers.5.block_sparse_moe.experts.7.w3
- model.layers.4.block_sparse_moe.experts.7.w3
- model.layers.3.block_sparse_moe.experts.7.w3
- model.layers.2.block_sparse_moe.experts.7.w3
- model.layers.0.block_sparse_moe.experts.7.w3
- model.layers.7.block_sparse_moe.experts.7.w3
- model.layers.8.block_sparse_moe.experts.7.w3
- model.layers.9.block_sparse_moe.experts.7.w3
- model.layers.10.block_sparse_moe.experts.7.w3
- model.layers.11.block_sparse_moe.experts.7.w3
- model.layers.12.block_sparse_moe.experts.7.w3
- model.layers.13.block_sparse_moe.experts.7.w3
- model.layers.14.block_sparse_moe.experts.7.w3
# - model.layers.0.block_sparse_moe.gate
# - model.layers.1.block_sparse_moe.gate
# - model.layers.2.block_sparse_moe.gate
# - model.layers.3.block_sparse_moe.gate
# - model.layers.4.block_sparse_moe.gate
# - model.layers.5.block_sparse_moe.gate
# - model.layers.6.block_sparse_moe.gate
# - model.layers.7.block_sparse_moe.gate
# - model.layers.8.block_sparse_moe.gate
# - model.layers.9.block_sparse_moe.gate
# - model.layers.10.block_sparse_moe.gate
# - model.layers.11.block_sparse_moe.gate
# - model.layers.12.block_sparse_moe.gate
# - model.layers.13.block_sparse_moe.gate

model_config:
  output_router_logits: true

datasets:
  - path: /workspace/datasets/dolphin-2.9/dolphin201-sharegpt2.jsonl
    type: sharegpt
    conversation: chatml
  - path: /workspace/datasets/dolphin-2.9/Ultrachat200kunfiltered.jsonl
    type: sharegpt
    conversation: chatml
  - path: /workspace/datasets/dolphin-2.9/dolphin-coder-translate-sharegpt2.jsonl
    type: sharegpt
    conversation: chatml
  - path: /workspace/datasets/dolphin-2.9/dolphin-coder-codegen-sharegpt2.jsonl
    type: sharegpt
    conversation: chatml
  - path: /workspace/datasets/dolphin-2.9/m-a-p_Code-Feedback-sharegpt-unfiltered.jsonl
    type: sharegpt
    conversation: chatml
  - path: /workspace/datasets/dolphin-2.9/m-a-p_CodeFeedback-Filtered-Instruction-sharegpt-unfiltered.jsonl
    type: sharegpt
    conversation: chatml
  - path: /workspace/datasets/dolphin-2.9/not_samantha_norefusals.jsonl
    type: sharegpt
    conversation: chatml
  - path: /workspace/datasets/dolphin-2.9/Orca-Math-resort-unfiltered.jsonl
    type: sharegpt
    conversation: chatml
  - path: /workspace/datasets/dolphin-2.9/agent_instruct_react_unfiltered.jsonl
    type: sharegpt
    conversation: chatml
  - path: /workspace/datasets/dolphin-2.9/toolbench_instruct_j1s1_3k_unfiltered.jsonl
    type: sharegpt
    conversation: chatml
  - path: /workspace/datasets/dolphin-2.9/toolbench_negative_unfiltered.jsonl
    type: sharegpt
    conversation: chatml
  - path: /workspace/datasets/dolphin-2.9/toolbench_react_10p_unfiltered.jsonl
    type: sharegpt
    conversation: chatml
  - path: /workspace/datasets/dolphin-2.9/toolbench_tflan_cot_30p_unfiltered.jsonl
    type: sharegpt
    conversation: chatml
  - path: /workspace/datasets/dolphin-2.9/openhermes200k_unfiltered.jsonl
    type: sharegpt
    conversation: chatml
  - path: /workspace/datasets/dolphin-2.9/SystemConversations.jsonl
    type: sharegpt
    conversation: chatml

chat_template: chatml
dataset_prepared_path: thingy
val_set_size: 0.0002
output_dir: ./out

sequence_len: 4096
sample_packing: true
pad_to_sequence_len: true

gradient_accumulation_steps: 8
micro_batch_size: 4
num_epochs: 3
logging_steps: 1
optimizer: paged_adamw_8bit
lr_scheduler: cosine
learning_rate: 2.7e-5

wandb_project: dolphin-2.9-mixtral-8x22b
wandb_watch:
wandb_run_id:
wandb_log_model:

train_on_inputs: false
group_by_length: false
bf16: auto
fp16:
tf32: true

gradient_checkpointing: true
gradient_checkpointing_kwargs:
  use_reentrant: false
early_stopping_patience:
# resume_from_checkpoint: /home/ehartford/axolotl/out/checkpoint-316
local_rank:
logging_steps: 1
xformers_attention:
flash_attention: true

saves_per_epoch: 8
save_total_limit: 2
save_steps:
evals_per_epoch: 4
eval_sample_packing: false
debug:
deepspeed: deepspeed_configs/zero3_bf16_cpuoffload_params.json
weight_decay: 0.05
fsdp:
fsdp_config:
special_tokens:
  eos_token: "<|im_end|>"
tokens:
  - "<|im_start|>"
```
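
Note that this run does not train an adapter: the base weights are updated directly, with gradients enabled only for the modules matched by the `unfrozen_parameters` list above, while the MoE router weights (`block_sparse_moe.gate`) are left commented out and `output_router_logits: true` keeps the router's load-balancing loss active. As a rough illustration only (this is not Axolotl's actual implementation, and its exact pattern-matching rules may differ), such a list can be applied to a Hugging Face model by toggling `requires_grad`:

```python
import re

import torch
from transformers import AutoModelForCausalLM

# Illustrative only: loading the real 8x22B base model needs a multi-GPU
# setup (the run above used 8 GPUs with DeepSpeed ZeRO-3 and CPU offload).
model = AutoModelForCausalLM.from_pretrained(
    "mistral-community/Mixtral-8x22B-v0.1", torch_dtype=torch.bfloat16
)

# A small subset of the patterns from the config above.
unfrozen_patterns = [
    r"^lm_head.weight$",
    r"^model.embed_tokens.weight$",
    r"model.layers.0.self_attn.q_proj",
    r"model.layers.0.block_sparse_moe.experts.0.w1",
]

# Freeze everything, then unfreeze parameters whose names match a pattern.
# (Unanchored patterns act as substring matches here; Axolotl's matching
# semantics may differ in detail.)
for name, param in model.named_parameters():
    param.requires_grad = any(re.search(p, name) for p in unfrozen_patterns)

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"trainable parameters: {trainable:,} of {total:,}")
```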

# out

This model is a fine-tuned version of [mistral-community/Mixtral-8x22B-v0.1](https://huggingface.co/mistral-community/Mixtral-8x22B-v0.1), trained with Axolotl on the datasets listed in the configuration above.
It achieves the following results on the evaluation set:
- Loss: 0.5217

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

Training used the Dolphin 2.9 ShareGPT-style datasets listed in the Axolotl configuration above, rendered with the ChatML template. Evaluation used a small held-out split (`val_set_size: 0.0002`, i.e. 0.02% of the data).

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2.7e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- distributed_type: multi-GPU
- num_devices: 8
- gradient_accumulation_steps: 8
- total_train_batch_size: 256
- total_eval_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 2
- num_epochs: 3

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 0.7022        | 0.0   | 1    | 0.6989          |
| 0.5344        | 0.25  | 238  | 0.5138          |
| 0.5204        | 0.5   | 476  | 0.5018          |
| 0.5059        | 0.75  | 714  | 0.4951          |
| 0.5112        | 1.0   | 952  | 0.4911          |
| 0.4561        | 1.24  | 1190 | 0.4978          |
| 0.478         | 1.49  | 1428 | 0.4935          |
| 0.4714        | 1.74  | 1666 | 0.4899          |
| 0.4626        | 1.99  | 1904 | 0.4861          |
| 0.3675        | 2.22  | 2142 | 0.5240          |
| 0.3595        | 2.47  | 2380 | 0.5229          |
| 0.3438        | 2.72  | 2618 | 0.5217          |

### Framework versions

- Transformers 4.40.0.dev0
- Pytorch 2.2.2+cu121
- Datasets 2.15.0
- Tokenizers 0.15.0
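
### Inference

Training used ChatML formatting (`chat_template: chatml`, with `<|im_start|>` added as a token and `<|im_end|>` as the EOS token), so prompts at inference time should follow the same format. The snippet below is a minimal, illustrative sketch using the `transformers` chat-template API; the repository id is a placeholder for wherever these weights are published, and the generation settings are arbitrary.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-org/your-mixtral-8x22b-finetune"  # placeholder, not a real repo

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Explain mixture-of-experts routing in two sentences."},
]

# Renders the ChatML turns, e.g. "<|im_start|>user\n...<|im_end|>\n<|im_start|>assistant\n"
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```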