semo_stage1 / encoder_config.json
{
  "sentence_encoder_config": {
    "num_hidden_layers": 2,
    "hidden_size": 4096,
    "rms_norm_eps": 1e-05,
    "attention_dropout": 0.0,
    "num_attention_heads": 32,
    "num_key_value_heads": 8,
    "max_position_embeddings": 256,
    "rope_theta": 500000.0,
    "intermediate_size": 14336,
    "hidden_act": "silu"
  }
}
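
The field names and values mirror `transformers`' `LlamaConfig`: a shallow 2-layer encoder at Llama-3-8B width (hidden size 4096, intermediate size 14336, grouped-query attention with 32 query / 8 key-value heads, SiLU MLP, RMSNorm, `rope_theta` 500000), capped at 256 positions. Below is a minimal sketch of how such a config could be consumed, assuming a Llama-style encoder; `LlamaConfig`/`LlamaModel` are an assumption for illustration, not the project's confirmed loading path.

```python
import json

from transformers import LlamaConfig, LlamaModel

# Read the nested config; the filename matches this repo's file.
with open("encoder_config.json") as f:
    cfg = json.load(f)["sentence_encoder_config"]

# Assumption: every key in the JSON maps one-to-one onto a LlamaConfig
# argument (num_hidden_layers, hidden_size, rms_norm_eps, ...).
config = LlamaConfig(**cfg)

# Randomly initialized 2-layer encoder matching the config; real weights
# would come from the checkpoint files stored alongside this config.
encoder = LlamaModel(config)
print(encoder.config.num_hidden_layers, encoder.config.hidden_size)  # 2 4096
```

With 2 layers at full Llama-3-8B width, this reads as a lightweight sentence-encoder head reusing a pretrained model's layer dimensions rather than a standalone language model.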