qwen-7b-flock-1719198637 / generation_config.json
{
  "bos_token_id": 151643,
  "eos_token_id": 151643,
  "max_new_tokens": 2048,
  "transformers_version": "4.40.2"
}
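
For reference, the sketch below shows how these defaults are typically consumed with the transformers library (version 4.40.2, as recorded in the config). The model identifier is a placeholder taken from the repo name; in practice it may need an owner prefix on the Hub or a local directory path.

# Minimal sketch, assuming the checkpoint is reachable under the id below;
# "qwen-7b-flock-1719198637" is the repo name from this page and may need an
# owner namespace (e.g. a Hub repo id) or a local path instead.
from transformers import AutoModelForCausalLM, AutoTokenizer, GenerationConfig

model_id = "qwen-7b-flock-1719198637"  # placeholder repo id / local path

# generation_config.json can be loaded and inspected on its own:
gen_config = GenerationConfig.from_pretrained(model_id)
print(gen_config.bos_token_id)    # 151643
print(gen_config.eos_token_id)    # 151643
print(gen_config.max_new_tokens)  # 2048

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Hello, world!", return_tensors="pt")
# generate() falls back to the defaults from generation_config.json
# (eos_token_id=151643, max_new_tokens=2048) unless overridden per call.
outputs = model.generate(**inputs)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))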