llama-2-13b / params.json
{
  "dim": 4096,
  "multiple_of": 256,
  "n_heads": 32,
  "n_layers": 32,
  "norm_eps": 1e-05,
  "vocab_size": -1
}
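A minimal sketch of how these parameters might be consumed, assuming the usual LLaMA-style convention that the model dimension is split evenly across attention heads (the JSON values are inlined here so the sketch is self-contained; in practice you would read params.json from the checkpoint directory):

```python
import json

# The exact contents of params.json shown above.
params = json.loads(
    '{"dim": 4096, "multiple_of": 256, "n_heads": 32, '
    '"n_layers": 32, "norm_eps": 1e-05, "vocab_size": -1}'
)

# Per-head dimension: model dim divided across the attention heads.
head_dim = params["dim"] // params["n_heads"]  # 4096 // 32 = 128

# vocab_size of -1 is a placeholder; the reference LLaMA loader fills it
# in from the tokenizer's vocabulary size at load time.
print(head_dim, params["n_layers"], params["vocab_size"])
```

Note that `vocab_size` is deliberately -1 in the shipped file, so any loader must resolve it against the tokenizer before building the embedding table.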