FalconAlpaca-7B / lit_config.json
Added finetuned adapter and base model (commit 00e0f22)
{"block_size": 2048, "vocab_size": 50254, "padding_multiple": 512, "padded_vocab_size": 65024, "n_layer": 32, "n_head": 71, "n_embd": 4544, "rotary_percentage": 1.0, "parallel_residual": true, "bias": false, "n_query_groups": 1, "shared_attention_norm": true}