dm16 / config.json
Upload final model (step 75000) and all checkpoints at 2024-10-18T07:05:54.447826
commit b3912d8
{
  "architectures": [
    "HFHookedTransformer"
  ],
  "hidden_size": 16,
  "num_attention_heads": 8,
  "num_hidden_layers": 2,
  "torch_dtype": "float32",
  "transformers_version": "4.45.2",
  "vocab_size": 5000
}
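Since `HFHookedTransformer` is a custom architecture (loading it through `transformers` would require the repo's own modeling code and `trust_remote_code=True`), a minimal sketch of inspecting this config with plain `json` is shown below. The embedded string reproduces the config above; the derived `head_dim` is an assumption based on the usual convention `hidden_size / num_attention_heads`.

```python
import json

# The dm16 config.json shown above, embedded verbatim for illustration.
CONFIG_JSON = """
{
  "architectures": ["HFHookedTransformer"],
  "hidden_size": 16,
  "num_attention_heads": 8,
  "num_hidden_layers": 2,
  "torch_dtype": "float32",
  "transformers_version": "4.45.2",
  "vocab_size": 5000
}
"""

config = json.loads(CONFIG_JSON)

# Assuming the standard convention, each attention head gets
# hidden_size / num_attention_heads dimensions.
head_dim = config["hidden_size"] // config["num_attention_heads"]
print(head_dim)  # 16 / 8 = 2
```

Note the unusually small scale: a 16-dimensional residual stream split across 8 heads leaves only 2 dimensions per head, consistent with a toy model trained for interpretability checkpointing rather than capability.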