Tags: Text Generation · Transformers · PyTorch · mistral · openchat · C-RLFT · conversational · Inference Endpoints · text-generation-inference
Commit 394df69 by imone and TheBloke (1 parent: f85ea3d)

EOS should be 32000 (#3)


- EOS should be 32000 (a74ef171432c307c8c5f3da002536d184e815269)


Co-authored-by: Tom Jobbins <TheBloke@users.noreply.huggingface.co>

Files changed (1): config.json (+1, -1)
config.json CHANGED
@@ -4,7 +4,7 @@
     "MistralForCausalLM"
   ],
   "bos_token_id": 1,
-  "eos_token_id": 2,
+  "eos_token_id": 32000,
   "hidden_act": "silu",
   "hidden_size": 4096,
   "initializer_range": 0.02,
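For context, the whole fix is one field in `config.json`. The sketch below reproduces the relevant fields in memory (abridged; the comments about token meanings are an assumption based on the commit message, not taken from this repo's documentation) and shows the value generation code would read after the change:

```python
# Relevant subset of config.json as it stands after this commit (abridged).
config = {
    "architectures": ["MistralForCausalLM"],
    "bos_token_id": 1,
    # Before this commit the value was 2 (the Mistral base model's default EOS).
    # The fix sets it to 32000 so generation stops at the fine-tuned model's
    # intended end token instead of running past it.
    "eos_token_id": 32000,
    "hidden_act": "silu",
    "hidden_size": 4096,
    "initializer_range": 0.02,
}

# A generation loop compares each sampled token id against eos_token_id
# to decide when to stop; with the old value (2) it would never match.
print(config["eos_token_id"])  # → 32000
```
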