LLaMA-7B-HF / generation_config.json
LLaMA: convert_llama_weights_to_hf.py as of d2ffc3fc48430f629c38c36fa8f308b045d1f715 & add .bin
d48acf8
{
  "_from_model_config": true,
  "bos_token_id": 1,
  "eos_token_id": 2,
  "pad_token_id": 0,
  "transformers_version": "4.28.1"
}
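For reference, the special-token ids above can be sanity-checked with a short Python snippet. This is a minimal sketch using only the standard-library `json` module; in normal use, `transformers` reads this file automatically when the model is loaded (e.g. via `GenerationConfig.from_pretrained`), so parsing it by hand is rarely necessary.

```python
import json

# Contents of generation_config.json as shipped with the converted
# LLaMA-7B checkpoint (values copied verbatim from the file above).
config_text = """
{
  "_from_model_config": true,
  "bos_token_id": 1,
  "eos_token_id": 2,
  "pad_token_id": 0,
  "transformers_version": "4.28.1"
}
"""

cfg = json.loads(config_text)

# LLaMA's tokenizer uses <s> = 1 (BOS) and </s> = 2 (EOS);
# this config also maps padding to token id 0.
assert cfg["bos_token_id"] == 1
assert cfg["eos_token_id"] == 2
assert cfg["pad_token_id"] == 0
print("generation config OK:", cfg)
```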