ildodeltaRule committed
Commit ea0db78
1 Parent(s): ab10c1a

Update config.json


Updating config.json to use the correct max_position_embeddings size of 512, instead of 514. When more than 512 tokens are used, an error is generated.

--> Libraries like FastChat use max_position_embeddings to define the number of tokens that can be used. In this case it should be 512.
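For context, a minimal sketch of how a downstream library can read this limit through the Hugging Face transformers API; the model id here is a placeholder assumption, not necessarily this repo's actual name:

    from transformers import AutoConfig, AutoTokenizer

    # Placeholder model id (assumption); substitute this repo's id.
    model_id = "xlm-roberta-large"

    config = AutoConfig.from_pretrained(model_id)
    tokenizer = AutoTokenizer.from_pretrained(model_id)

    # Serving libraries read this field to cap input length;
    # with this commit it reports 512.
    print(config.max_position_embeddings)

    # Truncating inputs to the reported limit avoids the error above.
    tokens = tokenizer("some long text", truncation=True,
                       max_length=config.max_position_embeddings)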

Files changed (1)
  1. config.json +1 -1
config.json CHANGED
@@ -13,7 +13,7 @@
   "initializer_range": 0.02,
   "intermediate_size": 4096,
   "layer_norm_eps": 1e-05,
-  "max_position_embeddings": 514,
+  "max_position_embeddings": 512,
   "model_type": "xlm-roberta",
   "num_attention_heads": 16,
   "num_hidden_layers": 24,