tuned-lens / gpt2-large / config.json
Uploaded by levmckinney via huggingface_hub (commit fedd82b)
{
  "dropout": 0.0,
  "identity_init": true,
  "include_input": true,
  "layer_norm": false,
  "mlp_hidden_sizes": [],
  "rank": null,
  "shared_mlp_hidden_sizes": [],
  "share_weights": false,
  "sublayers": false,
  "num_layers": 36,
  "vocab_size": 50257,
  "bias": true,
  "d_model": 1280
}
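Read as a tuned-lens translator config, these fields suggest one learned probe per transformer layer: with `mlp_hidden_sizes` empty, `rank` null, and `bias` true, each translator is a full affine map on the `d_model`-wide residual stream. A minimal sketch (plain `json` parsing, no tuned-lens dependency) that loads the config and, under that reading, estimates the parameter count per lens:

```python
import json

# Config copied verbatim from the file above.
config_text = """
{"dropout": 0.0, "identity_init": true, "include_input": true,
 "layer_norm": false, "mlp_hidden_sizes": [], "rank": null,
 "shared_mlp_hidden_sizes": [], "share_weights": false,
 "sublayers": false, "num_layers": 36, "vocab_size": 50257,
 "bias": true, "d_model": 1280}
"""
config = json.loads(config_text)

# With no MLP hidden layers and no low-rank factorization, each lens is
# a d_model x d_model weight matrix plus (since "bias" is true) a bias vector.
d = config["d_model"]
params_per_lens = d * d + d

print(config["num_layers"], params_per_lens)
```

Note that `share_weights` is false, so this per-lens count applies independently to each of the 36 layers.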