Mixtral-8x22B-v0.1 / params.json
{
    "dim": 6144,
    "n_layers": 56,
    "head_dim": 128,
    "hidden_dim": 16384,
    "n_heads": 48,
    "n_kv_heads": 8,
    "rope_theta": 1000000,
    "norm_eps": 1e-05,
    "vocab_size": 32000,
    "moe": {
        "num_experts": 8,
        "num_experts_per_tok": 2
    },
    "max_seq_len": 65536
}
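
For reference, a minimal sketch of how these fields might be read into typed hyperparameters. The ModelArgs and MoEArgs dataclasses and the load_params helper below are illustrative names chosen for this example, not part of the official Mistral codebase; the only assumption is that params.json sits in the working directory.

import json
from dataclasses import dataclass


@dataclass
class MoEArgs:
    num_experts: int           # 8 experts per MoE layer
    num_experts_per_tok: int   # 2 experts routed per token


@dataclass
class ModelArgs:
    dim: int          # model (embedding) width
    n_layers: int     # transformer blocks
    head_dim: int     # per-head dimension
    hidden_dim: int   # FFN / expert hidden size
    n_heads: int      # query heads
    n_kv_heads: int   # key/value heads (grouped-query attention)
    rope_theta: float # RoPE base frequency
    norm_eps: float   # RMSNorm epsilon
    vocab_size: int
    moe: MoEArgs
    max_seq_len: int


def load_params(path: str = "params.json") -> ModelArgs:
    """Parse the raw JSON above into typed hyperparameter objects."""
    with open(path) as f:
        raw = json.load(f)
    moe = MoEArgs(**raw.pop("moe"))
    return ModelArgs(moe=moe, **raw)


if __name__ == "__main__":
    args = load_params()
    # Grouped-query attention: 48 query heads share 8 key/value heads.
    print("query heads per kv head:", args.n_heads // args.n_kv_heads)
    # Attention projection width: 48 heads * 128 dims = 6144 = model dim.
    print("attention width:", args.n_heads * args.head_dim)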