Commit a656ef2 by llllvvuu
Parent: d0cf27f

fix config.json


This is my mistake. I pasted from the wrong file.

Files changed (1):
  config.json (+0 -4)
config.json CHANGED
@@ -16,10 +16,6 @@
   "num_hidden_layers": 30,
   "num_key_value_heads": 32,
   "pretraining_tp": 1,
-  "quantization": {
-    "group_size": 64,
-    "bits": 8
-  },
   "rms_norm_eps": 1e-06,
   "rope_scaling": null,
   "rope_theta": 10000,