Current ranking of pre-trained (non-chat) open-access LLMs according to the leaderboard. Models 1-4 are from China-based groups. Does training models on Chinese data somehow lead to better metrics? 🤔 WDYT?

1. Qwen/Qwen-72B
2. 01-ai/Yi-34B
3. internlm/internlm2-20b
4. deepseek-ai/deepseek-llm-67b-base
5. mistralai/Mixtral-8x7B-v0.1