# mixtral dare test

The following models were merged with DARE using https://github.com/martyn/safetensors-merge-supermario
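
DARE ("drop and rescale") merges a fine-tuned model into a base model by dropping a random fraction of each weight delta and rescaling the survivors so the expected update is preserved. A minimal per-tensor sketch in PyTorch, assuming the `-p` flag below is the drop probability and `-lambda` an extra scale on the merged delta (both are assumptions about the merge script's semantics, not confirmed behavior):

```python
import torch

def dare_merge(base: torch.Tensor, finetuned: torch.Tensor,
               p: float = 0.3, lam: float = 2.1) -> torch.Tensor:
    """DARE-merge one tensor pair: drop deltas with probability p,
    rescale survivors by 1/(1-p), scale by lambda, add to base."""
    delta = finetuned - base
    keep = torch.bernoulli(torch.full_like(delta, 1.0 - p))  # 1 = keep, 0 = drop
    delta = delta * keep / (1.0 - p)  # rescale so the expected delta is unchanged
    return base + lam * delta
```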

## Mergelist

- mistralai/Mixtral-8x7B-Instruct-v0.1
- Open-Orca/Mixtral-SlimOrca-8x7B

## Merge command

```sh
python3 hf_merge.py to_merge_mixtral0.txt mixtral-0 -p 0.3 -lambda 2.1
```
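
The merged checkpoint loads like any other Mixtral model. A standard `transformers` snippet (the repo id is this merge's published result; BF16 matches the stored tensor type):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "martyn/mixtral-dare-8x7b-v0"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # checkpoint is stored in BF16
    device_map="auto",           # requires accelerate
)

prompt = "[INST] What is a mixture-of-experts model? [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```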

## Notes

- This is primarily a test to see whether merging Mixtral models works.
- MoE gates are not merged (see the sketch after this list).
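
One way to leave the gates alone is to filter them out by tensor name and keep the base model's weights; a hypothetical sketch reusing the `dare_merge` helper above (the `block_sparse_moe.gate` name filter follows Mixtral's parameter naming and is not taken from the merge script itself):

```python
def merge_state_dicts(base_sd: dict, ft_sd: dict,
                      p: float = 0.3, lam: float = 2.1) -> dict:
    merged = {}
    for name, base_w in base_sd.items():
        if "block_sparse_moe.gate" in name:
            merged[name] = base_w  # MoE router gates: keep base weights unmerged
        else:
            merged[name] = dare_merge(base_w, ft_sd[name], p, lam)
    return merged
```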