requests/zhengr/MixTAO-7Bx2-MoE-v8.1_eval_request_False_bfloat16_Original.json
Add zhengr/MixTAO-7Bx2-MoE-v8.1 to eval queue
{
  "model": "zhengr/MixTAO-7Bx2-MoE-v8.1",
  "base_model": "",
  "revision": "main",
  "private": false,
  "precision": "bfloat16",
  "params": 12.879,
  "architectures": "MixtralForCausalLM",
  "weight_type": "Original",
  "status": "PENDING",
  "submitted_time": "2024-05-16T06:27:39Z",
  "model_type": "\ud83e\udd1d : base merges and moerges",
  "job_id": -1,
  "job_start_time": null
}
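A file like this can be parsed and sanity-checked with a few lines of Python. The sketch below is a minimal example, not part of the actual eval-queue tooling: it assumes only the field names visible in the JSON above, and the `REQUIRED_FIELDS` set and `load_eval_request` helper are hypothetical names chosen for illustration.

```python
import json

# Fields present in the request file above; other requests may carry more.
REQUIRED_FIELDS = {
    "model", "revision", "precision", "weight_type", "status",
}

def load_eval_request(text: str) -> dict:
    """Parse the JSON text and verify the expected fields are present."""
    request = json.loads(text)
    missing = REQUIRED_FIELDS - request.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    return request

example = """{
  "model": "zhengr/MixTAO-7Bx2-MoE-v8.1",
  "revision": "main",
  "precision": "bfloat16",
  "weight_type": "Original",
  "status": "PENDING"
}"""

req = load_eval_request(example)
print(req["model"], req["status"])  # zhengr/MixTAO-7Bx2-MoE-v8.1 PENDING
```

Note that the `model_type` value stores its emoji as JSON `\uXXXX` escape pairs (`\ud83e\udd1d` decodes to 🤝), which `json.loads` handles automatically.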