What do you mean by "is a merge"?

#1
by inkasaras - opened

"speechless-llama2-hermes-orca-platypus-wizardlm-13b is a merge of NousResearch/Nous-Hermes-Llama2-13b, Open-Orca/OpenOrca-Platypus2-13B and WizardLM/WizardLM-13B-V1.2."

What do you mean by "is a merge"? Did you use MoE? How exactly did you merge them?

We merged the model weights using TIES merging (https://github.com/cg123/ties-merge). This was an early version; nowadays we fine-tune speechless models on Llama2 and CodeLlama instead.
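For readers unfamiliar with the technique: TIES merging roughly works per-parameter in three steps — trim each model's delta from the shared base to its largest-magnitude entries, elect a sign for each parameter, then average only the values that agree with that sign. Here is a minimal sketch in plain Python on toy lists standing in for weight tensors; the function name, `density` parameter, and numbers are illustrative, not taken from the linked repository.

```python
def ties_merge(task_vectors, density=0.5):
    """Merge several task vectors (deltas of fine-tuned weights from a base model).

    Illustrative sketch of the TIES steps: trim, elect sign, disjoint mean.
    """
    n = len(task_vectors[0])
    k = max(1, int(density * n))
    # 1) Trim: in each vector, keep only the top-`density` fraction by magnitude.
    trimmed = []
    for vec in task_vectors:
        threshold = sorted((abs(v) for v in vec), reverse=True)[k - 1]
        trimmed.append([v if abs(v) >= threshold else 0.0 for v in vec])
    merged = []
    for i in range(n):
        vals = [vec[i] for vec in trimmed]
        # 2) Elect sign: take the sign of the summed values for this parameter.
        elected_positive = sum(vals) >= 0
        # 3) Disjoint mean: average only nonzero values agreeing with that sign.
        agree = [v for v in vals if v != 0.0 and (v > 0) == elected_positive]
        merged.append(sum(agree) / len(agree) if agree else 0.0)
    return merged

# Toy example: two fine-tunes' deltas from a shared base model.
base = [1.0, 1.0, 1.0]
deltas = [[0.4, -0.2, 0.0], [0.6, 0.1, 0.05]]
merged_delta = ties_merge(deltas, density=1.0)
merged_weights = [b + d for b, d in zip(base, merged_delta)]
print(merged_weights)
```

In practice this is applied tensor-by-tensor across all model weights; the sign election is what keeps fine-tunes from cancelling each other out where they disagree.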

This model is super good

inkasaras changed discussion status to closed
