olm-chat-7b/open_lm/model_configs/m1b_tiktoken.json
{
"hidden_dim": 2048,
"n_layers": 24,
"n_heads": 16,
"seq_len": 2048,
"vocab_size": 50304,
"post_embed_norm": false,
"weight_tying": false
}
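For context, a minimal sketch of consuming this config in Python (the field names come from the JSON above; how open_lm itself loads the file is an assumption, so this just uses the standard library and derives one sanity-checked quantity):

```python
import json

# The config above, verbatim from m1b_tiktoken.json.
config_text = """
{
    "hidden_dim": 2048,
    "n_layers": 24,
    "n_heads": 16,
    "seq_len": 2048,
    "vocab_size": 50304,
    "post_embed_norm": false,
    "weight_tying": false
}
"""

config = json.loads(config_text)

# The attention head dimension is derived, not stored:
# hidden_dim must split evenly across n_heads.
assert config["hidden_dim"] % config["n_heads"] == 0
head_dim = config["hidden_dim"] // config["n_heads"]
print(head_dim)  # 2048 / 16 = 128
```

With these values the config describes a ~1B-parameter-scale transformer shape (24 layers, width 2048), matching the "m1b" name.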