ildodeltaRule committed on
Commit 526af32
Parent: ab10c1a

Update config.json


The max positional embeddings seem to be 512 instead of 514, which throws an error when using the transformers library.
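A minimal sketch of how this kind of mismatch surfaces, assuming a hypothetical repo id (substitute the actual model): when the checkpoint's position-embedding matrix has fewer rows than config.json declares, transformers hits a size mismatch at load time.

from transformers import AutoConfig, AutoModel

# Hypothetical repo id, for illustration only.
repo_id = "some-org/some-xlm-roberta-model"

config = AutoConfig.from_pretrained(repo_id)
print(config.max_position_embeddings)  # 514 before this commit, 512 after

# With the old config (514) and a 512-row position-embedding weight in the
# checkpoint, loading raises a size-mismatch error; passing
# ignore_mismatched_sizes=True works around it by re-initializing the
# mismatched weight, at the cost of discarding those trained parameters.
model = AutoModel.from_pretrained(repo_id, ignore_mismatched_sizes=True)

Fixing the value in config.json, as this commit does, avoids the workaround entirely, since the declared size then matches the checkpoint.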

Files changed (1): config.json +1 -1
config.json CHANGED
@@ -13,7 +13,7 @@
   "initializer_range": 0.02,
   "intermediate_size": 4096,
   "layer_norm_eps": 1e-05,
- "max_position_embeddings": 514,
+ "max_position_embeddings": 512,
   "model_type": "xlm-roberta",
   "num_attention_heads": 16,
   "num_hidden_layers": 24,