model_2lags300epoch.pt / config.json
{
"model_type": "distilgpt2",
"hidden_size": 768,
"num_attention_heads": 12,
"num_hidden_layers": 12,
"vocab_size": 30522
}
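As a quick sanity check, the config above can be parsed with Python's standard `json` module and validated: the hidden size must divide evenly among the attention heads. This is a minimal sketch; the inline `config_text` string is a local copy of the JSON above, not a download from the Hub.

```python
import json

# Local copy of the config.json shown above (hypothetical inline string
# standing in for the file on disk).
config_text = """
{
  "model_type": "distilgpt2",
  "hidden_size": 768,
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "vocab_size": 30522
}
"""

config = json.loads(config_text)

# hidden_size must be divisible by num_attention_heads so each head
# gets an equal slice of the embedding dimension.
assert config["hidden_size"] % config["num_attention_heads"] == 0
head_dim = config["hidden_size"] // config["num_attention_heads"]
print(config["model_type"], head_dim)  # distilgpt2 64
```

With these values each of the 12 heads attends over a 64-dimensional slice of the 768-dimensional hidden state.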