inference-cache-config/Llama3.1-70b.json
{
  "meta-llama/Llama-3.1-70B": [
    {
      "batch_size": 1,
      "sequence_length": 4096,
      "num_cores": 24,
      "auto_cast_type": "bf16"
    },
    {
      "batch_size": 4,
      "sequence_length": 4096,
      "num_cores": 24,
      "auto_cast_type": "bf16"
    }
  ]
}
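Each entry above describes one cached compilation configuration for meta-llama/Llama-3.1-70B: the batch size, sequence length, number of Neuron cores, and cast dtype used when the model was exported. As a minimal sketch, assuming the standard optimum-neuron export API (the call itself is not part of this config file), a model matching the first entry would be loaded roughly like this, with argument names mirroring the JSON keys:

```python
# Sketch only: assumes the optimum-neuron NeuronModelForCausalLM export API;
# the argument names mirror the JSON keys above, but this call is an
# illustration, not something defined by the config file itself.
from optimum.neuron import NeuronModelForCausalLM

model = NeuronModelForCausalLM.from_pretrained(
    "meta-llama/Llama-3.1-70B",
    export=True,             # compile for Neuron; a matching cached config avoids recompilation
    batch_size=1,            # first entry in the config above
    sequence_length=4096,
    num_cores=24,
    auto_cast_type="bf16",
)
```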