{
"_from_model_config": true,
"transformers_version": "4.28.1",
"use_cache": false,
"eos_token_id": [0, 50278]
}
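For reference, a minimal sketch of producing an equivalent file with the transformers GenerationConfig API. The output directory name is a placeholder, and the `_from_model_config` / `transformers_version` fields are bookkeeping values written by the library rather than settings you pass in.

from transformers import GenerationConfig

# Recreate the same generation settings programmatically.
gen_config = GenerationConfig(
    use_cache=False,          # disable the key/value cache during generation
    eos_token_id=[0, 50278],  # stop when either token id is generated
)

# Writes a generation_config.json like the one shown above.
# "./my_model_dir" is a placeholder path.
gen_config.save_pretrained("./my_model_dir")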