CodeLlama-13b-Instruct / params.json
{
    "dim": 5120,
    "n_layers": 40,
    "n_heads": 40,
    "multiple_of": 256,
    "ffn_dim_multiplier": 1.0,
    "norm_eps": 1e-5,
    "rope_theta": 1000000
}
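
For reference, the sketch below (Python) shows how these fields translate into the model's derived dimensions, following the FeedForward sizing logic used in the Llama reference code; the helper name derive_dims and the file path "params.json" are chosen here for illustration.

import json

def derive_dims(params: dict) -> dict:
    """Derive per-head and FFN dimensions from a Llama-style params.json (illustrative helper)."""
    dim = params["dim"]                          # model (embedding) width
    n_heads = params["n_heads"]                  # number of attention heads
    multiple_of = params["multiple_of"]          # FFN width is rounded up to a multiple of this
    ffn_mult = params.get("ffn_dim_multiplier")  # optional FFN scaling factor

    # FFN sizing as in the reference FeedForward module: start from 4*dim,
    # take 2/3 of it (SwiGLU uses three projections), apply the multiplier,
    # then round up to the nearest multiple of `multiple_of`.
    hidden_dim = 4 * dim
    hidden_dim = int(2 * hidden_dim / 3)
    if ffn_mult is not None:
        hidden_dim = int(ffn_mult * hidden_dim)
    hidden_dim = multiple_of * ((hidden_dim + multiple_of - 1) // multiple_of)

    return {
        "head_dim": dim // n_heads,    # 5120 / 40 = 128
        "ffn_hidden_dim": hidden_dim,  # 13824 for the values in this file
    }

with open("params.json") as f:
    print(derive_dims(json.load(f)))   # {'head_dim': 128, 'ffn_hidden_dim': 13824}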