FLOR-760M / config.json
{
  "architectures": [
    "BloomForCausalLM"
  ],
  "model_type": "bloom",
  "vocab_size": 50257,
  "hidden_size": 1536,
  "tie_word_embeddings": true,
  "n_layer": 24,
  "hidden_dropout": 0.0,
  "layer_norm_epsilon": 1e-05,
  "n_head": 16,
  "attention_dropout": 0.0
}
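
For reference, a minimal sketch (not part of the repo itself) of how this configuration is typically consumed with the Hugging Face transformers library. The hub id "projecte-aina/FLOR-760M" is an assumption inferred from the file's title, not confirmed by this page.

from transformers import AutoConfig, AutoModelForCausalLM

# Hub id assumed from the page title; adjust if the model lives elsewhere.
config = AutoConfig.from_pretrained("projecte-aina/FLOR-760M")
print(config.model_type)              # "bloom"
print(config.hidden_size)             # 1536
print(config.n_layer, config.n_head)  # 24 16

# Instantiation resolves BloomForCausalLM from the "architectures" field.
model = AutoModelForCausalLM.from_pretrained("projecte-aina/FLOR-760M")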