OrionZheng committed on
Commit
43447fe
1 Parent(s): 8b07911

Align bos_token_id with umt5


https://huggingface.co/google/umt5-small/blob/main/generation_config.json
{
"_from_model_config": true,
"decoder_start_token_id": 0,
"eos_token_id": 1,
"max_new_tokens": 64,
"pad_token_id": 0,
"transformers_version": "4.31.0.dev0"
}
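
For context, the referenced values can be checked directly with transformers (a minimal sketch; only google/umt5-small comes from the link above, and the printed ids simply restate the JSON):

from transformers import GenerationConfig

# Load umt5-small's generation config from the Hub and print the token ids
# that this commit aligns OpenMoE's bos_token_id with.
gen_cfg = GenerationConfig.from_pretrained("google/umt5-small")
print(gen_cfg.decoder_start_token_id, gen_cfg.pad_token_id, gen_cfg.eos_token_id)
# expected: 0 0 1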

Files changed (1)
  1. config.json +1 -1
config.json CHANGED
@@ -6,7 +6,7 @@
     "AutoModelForCausalLM": "modeling_openmoe.OpenMoeForCausalLM"
   },
   "attention_bias": false,
-  "bos_token_id": 2,
+  "bos_token_id": 0,
   "dropout_rate": 0.0,
   "enable_comm_overlap": false,
   "enable_hierarchical_alltoall": false,