ERROR: For each expert, `positive_prompts` must contain one or more example prompt reflecting what should be routed to that expert.

#1
by h2m - opened

"""
base_model: leveldevai/TurdusBeagle-7B
gate_mode: hidden
dtype: bfloat16
experts:

  • source_model: leveldevai/TurdusBeagle-7B
    positive_prompts: [""]
  • source_model: udkai/Turdus
    positive_prompts: [""]
  • source_model: nfaheem/Marcoroni-7b-DPO-Merge
    positive_prompts: [""]
  • source_model: Toten5/Marcoroni-neural-chat-7B-v2
    positive_prompts: [""]
    """

How do you create that model?
