GPTJ_tweet-to-question_0_epoch_5 / generation_config.json
{
  "_from_model_config": true,
  "bos_token_id": 50256,
  "eos_token_id": 50256,
  "transformers_version": "4.27.4"
}