karan4d committed on
Commit 92ef4bc
1 Parent(s): 4d07599

Update config.json


"pretraining_tp": 2 should be 1 otherwise on 4.31 transformers you cant load it in BnB 4-bit anymore
Details here : https://github.com/huggingface/transformers/issues/24961"
~henk717

Files changed (1)
  1. config.json +1 -1
config.json CHANGED
@@ -15,7 +15,7 @@
  "num_hidden_layers": 40,
  "num_key_value_heads": 40,
  "pad_token_id": 0,
- "pretraining_tp": 2,
+ "pretraining_tp": 1,
  "rms_norm_eps": 1e-05,
  "rope_scaling": null,
  "tie_word_embeddings": false,