hello mlabonne, can you share mergekit's merge.yaml for phixtral-4x2_8?

#15 opened by TomPei

hello mlabonne, can you share mergekit's merge.yaml for phixtral-4x2_8?
I want to merge some other models for testing.
Thank you very much!

It's in the model card

I tried to merge models using the same YAML file from the model card, but I got the error below.
It seems mergekit-moe rejects the config because there's nothing in positive_prompts.

root@ubuntu:/workspace# mergekit-moe ./my_setting/phixtral-4x2_8.yml ./models/moe/phixtral-4x2_8_test
ERROR:root:Your positive and negative prompts are identical for all experts. This will not produce a functioning MoE.
ERROR:root:For each expert, `positive_prompts` must contain one or more example prompt reflecting what should be routed to that expert.

In the YAML file of another model, Undi95/Mixtral-8x7B-MoE-RP-Story for example, keywords are listed under positive_prompts and negative_prompts for each expert model, along these lines:
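
For reference, a config shaped like this passes mergekit-moe's validation. The four source_model entries are the experts listed in the phixtral-4x2_8 model card; the gate_mode and the keyword lists are just placeholders I made up for testing, not the values you actually used:

base_model: cognitivecomputations/dolphin-2_6-phi-2
gate_mode: cheap_embed  # placeholder choice; "hidden" is mergekit-moe's default
experts:
  - source_model: cognitivecomputations/dolphin-2_6-phi-2
    positive_prompts: ["chat", "assistant", "tell me a story"]
  - source_model: lxuechen/phi-2-dpo
    positive_prompts: ["reason step by step", "solve", "logic"]
  - source_model: Yhyu13/phi-2-sft-dpo-gpt4_en-ep1
    positive_prompts: ["instruction", "summarize", "rewrite"]
  - source_model: mrm8488/phi-2-coder
    positive_prompts: ["code", "python", "write a function"]

With non-empty, distinct positive_prompts per expert, the identical-prompts error goes away, but I don't know which prompts (or gate_mode) were used for phixtral-4x2_8.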

Could you please share the details of your merge.yaml file?
Or did I do something wrong?

Thanks in advance!
