Safetensors
llama
deepseekrzz llllvvuu committed on
Commit 304287b
1 Parent(s): d0cf27f

fix config.json (#1)


- fix config.json (a656ef28a924938e1e80493e1147d4475672ce93)


Co-authored-by: L <llllvvuu@users.noreply.huggingface.co>

Files changed (1)
  1. config.json +0 -4
config.json CHANGED
@@ -16,10 +16,6 @@
    "num_hidden_layers": 30,
    "num_key_value_heads": 32,
    "pretraining_tp": 1,
-   "quantization": {
-     "group_size": 64,
-     "bits": 8
-   },
    "rms_norm_eps": 1e-06,
    "rope_scaling": null,
    "rope_theta": 10000,
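The net effect of the patch is that the four `quantization` lines are dropped from config.json while the surrounding keys are untouched. A minimal sketch of the same edit done programmatically — the key names and values below are taken from the hunk above, but applying the fix this way (rather than editing the file by hand) is an illustration, not part of the commit:

```python
import json

# "Before" state of the relevant region of config.json, per the hunk above.
config = {
    "num_hidden_layers": 30,
    "num_key_value_heads": 32,
    "pretraining_tp": 1,
    "quantization": {"group_size": 64, "bits": 8},
    "rms_norm_eps": 1e-06,
    "rope_scaling": None,
    "rope_theta": 10000,
}

# The fix: remove the stale "quantization" block; every other key is kept.
config.pop("quantization", None)

print(json.dumps(config, indent=2))
```

This matches the diff stats (`+0 -4`): four lines removed, none added.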