CausalLM-7B-GPTQ / generation_config.json
{
"chat_format": "chatml",
"do_sample": true,
"eos_token_id": 151643,
"max_new_tokens": 512,
"max_window_size": 6144,
"pad_token_id": 151643,
"top_k": 0,
"top_p": 0.5,
"transformers_version": "4.35.0.dev0"
}
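
Below is a minimal sketch of how a generation_config.json like this is typically consumed with the Hugging Face transformers library: GenerationConfig.from_pretrained picks up the sampling settings (do_sample, top_p, top_k, max_new_tokens, eos/pad token ids), and the ChatML-style prompt mirrors the "chat_format": "chatml" field above. The repo id, prompt text, and device handling are illustrative assumptions, and running a GPTQ checkpoint also requires a GPTQ-capable backend (e.g. AutoGPTQ) to be installed, which is not configured here.

# Minimal usage sketch (assumptions: repo id, prompt, GPTQ backend installed).
from transformers import AutoModelForCausalLM, AutoTokenizer, GenerationConfig

repo_id = "TheBloke/CausalLM-7B-GPTQ"  # assumed repo containing this file

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, device_map="auto")

# Loads generation_config.json: do_sample, top_p, top_k, max_new_tokens,
# eos_token_id/pad_token_id, etc. Non-standard keys such as "chat_format"
# and "max_window_size" are carried along as extra attributes.
gen_config = GenerationConfig.from_pretrained(repo_id)

# ChatML-style prompt, matching "chat_format": "chatml" above (illustrative).
prompt = (
    "<|im_start|>user\nWrite a haiku about autumn.<|im_end|>\n"
    "<|im_start|>assistant\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, generation_config=gen_config)

# Strip the prompt tokens and print only the newly generated text.
new_tokens = output[0][inputs["input_ids"].shape[1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True))

Passing generation_config explicitly is optional: when the config file ships with the model repo, generate() falls back to it by default, but loading it separately makes the effective sampling parameters explicit and easy to override.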