[AUTOMATED] Model Memory Requirements (#16, opened 7 months ago by model-sizer-bot)
4x version (#15, opened 8 months ago by ehartford, 1 comment)
Adding Evaluation Results (#14, opened 8 months ago by leaderboard-pr-bot)
What's the model architecture? (#13, opened 8 months ago by JamesShao)
base or chat model? (#12, opened 9 months ago by horaceai)
I am a newbie; how can I use existing open-source LLMs to train an MoE? Thank you (#11, opened 9 months ago by EEEmpty)
Quantization, please (#9, opened 10 months ago by bingw5, 1 comment; a quantized-loading sketch follows the listing)
How much GPU memory does the MoE model need? (#8, opened 10 months ago by Jazzlee, 2 comments; a rough memory estimate follows the listing)
Multilingual? (#7, opened 10 months ago by oFDz, 1 comment)
Perfecting MoEs: my write-up, to help you make MoEs (#6, opened 10 months ago by rombodawg)
Add MoE (mixture of experts) tag (#5, opened 10 months ago by davanstrien)
What are the merging parameters? (#4, opened 10 months ago by rombodawg, 3 comments)
Is this a base model or an SFT model? (#3, opened 10 months ago by lucasjin, 1 comment)
Can vLLM be used for inference acceleration? (#2, opened 10 months ago by obtion, 2 comments; a minimal vLLM sketch follows the listing)
You hold all three top spots on the leaderboard (#1, opened 10 months ago by dillfrescott)
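
For the quantization request in #9: a minimal sketch of loading the checkpoint in 4-bit with transformers and bitsandbytes, assuming a standard Hugging Face causal-LM checkpoint. The model id below is a placeholder, not the actual repository name.

```python
# Hypothetical 4-bit load with transformers + bitsandbytes.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "org/moe-model"  # placeholder; substitute the real repository id

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # store weights in 4-bit
    bnb_4bit_quant_type="nf4",              # NF4 quantization
    bnb_4bit_compute_dtype=torch.bfloat16,  # compute in bf16
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # spread layers across available GPUs
)

prompt = "Explain mixture-of-experts routing in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```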
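
For the GPU-memory question in #8 (and the automated report in #16): a back-of-envelope estimate, assuming inference memory is roughly parameter count times bytes per parameter plus overhead for activations and KV cache. The parameter count below is an assumption, not this model's actual size.

```python
# Rough inference memory estimate: params * bytes_per_param * overhead.
def estimate_inference_gib(n_params: float, bytes_per_param: float, overhead: float = 1.2) -> float:
    """Return an approximate GiB requirement for inference."""
    return n_params * bytes_per_param * overhead / 1024**3

n_params = 46.7e9  # assumed total parameter count; replace with the real number
for label, bytes_per_param in [("fp16/bf16", 2), ("int8", 1), ("4-bit", 0.5)]:
    print(f"{label}: ~{estimate_inference_gib(n_params, bytes_per_param):.0f} GiB")
```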
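
For the vLLM question in #2: a minimal vLLM generation sketch. Whether it works for this checkpoint depends on vLLM supporting its architecture (Mixtral-style MoEs are supported); the model id and GPU count below are placeholders.

```python
# Minimal vLLM inference sketch.
from vllm import LLM, SamplingParams

llm = LLM(model="org/moe-model", tensor_parallel_size=2)  # placeholder id; shard over 2 GPUs
params = SamplingParams(temperature=0.7, max_tokens=128)

outputs = llm.generate(["What is a mixture of experts?"], params)
print(outputs[0].outputs[0].text)
```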