Merge tags for MoE models.

#533
by moreh-sungmin - opened

I am curious about the criteria for merge tags.
Recently, a merged tag has been applied to models on the leaderboard that achieved high rankings without being directly trained, but rather by merging several models. A similar situation seems to be occurring with MoE models now. It appears we need to establish clear criteria. Should models that are combined without any additional training be removed from the default list? I believe they should. I am interested in hearing the thoughts of the Huggingface staff on this matter. If you agree with my viewpoint, then the models currently in the top 1-3 positions should be removed from the default list. I await your response on this matter, thank you in advance.

@clefourrier I'd like to hear the admins' opinion. Thank you.

Open LLM Leaderboard org

Hi!
We have added 2 filters:

  • one for merges and moerges (models merged as an MoE without additional training), which will be hidden by default on the main view, as long as their metadata is correct (see the tag-check sketch after this list)
  • one for pretrained or fine-tuned MoEs (like Mixtral), which users can choose to hide if needed.
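
For reference, here is a minimal sketch, not the leaderboard's actual implementation, of how such model-card metadata can be inspected with the `huggingface_hub` library. The tag names `merge` and `moerge` and the tag-based filtering are assumptions for illustration; the leaderboard may key off different metadata fields.

```python
# Minimal sketch: check the Hub metadata tags a merge/moerge filter could rely on.
# Assumes models self-report "merge" / "moerge" tags in their model card metadata.
from huggingface_hub import HfApi

api = HfApi()

def is_merge_or_moerge(repo_id: str) -> bool:
    """Return True if the repo's model card tags mark it as a merge or moerge.

    The tag names checked here are assumptions; a real filter may use other fields.
    """
    tags = set(api.model_info(repo_id).tags or [])
    return bool(tags & {"merge", "moerge"})

# Example: list a few popular models that carry the "merge" tag.
for model in api.list_models(filter="merge", sort="downloads", limit=5):
    print(model.id, model.tags)
```

Models whose metadata omits these tags would not be caught by a check like this, which is why the reply above notes the filter only works "as long as their metadata is correct".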
clefourrier changed discussion status to closed
