Llama-3-8B-instruct-layer-mix-bpw-3.0 / generation_config.json
{
"_from_model_config": true,
"bos_token_id": 128000,
"eos_token_id": 128001,
"transformers_version": "4.39.2"
}
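
A minimal sketch of how this file is typically consumed with 🤗 Transformers' GenerationConfig, assuming the repository has been downloaded to a local directory (the path below is illustrative, not confirmed by this page):

from transformers import GenerationConfig

# Load generation settings from a local directory containing this
# generation_config.json; the directory name is an assumption.
gen_config = GenerationConfig.from_pretrained("./Llama-3-8B-instruct-layer-mix-bpw-3.0")

print(gen_config.bos_token_id)  # 128000
print(gen_config.eos_token_id)  # 128001

The "_from_model_config" flag simply records that these values were copied from the model's config.json rather than set explicitly by the uploader.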