---
base_model:
- meta-llama/Meta-Llama-3-8B
library_name: transformers
tags:
- mergekit
- merge
- llama3
license: llama3
language:
- en
---

Meta's Llama 3 8B pruned to 7B parameters (29 layers). The layers to prune were selected using the PruneMe repository on GitHub.

- layers_to_skip = 3
- The block from layer 24 to layer 27 has the minimum average distance of 0.15680849609375 (a rough sketch of this measurement is given after the configuration below).
- [ ] To do: post-pruning training.

![layers](https://pbs.twimg.com/media/GNNMXnzW8AAHZvM?format=jpg&name=4096x4096)

# model

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the passthrough merge method.

### Models Merged

The following models were included in the merge:

* [meta-llama/Meta-Llama-3-8B](https://huggingface.co/meta-llama/Meta-Llama-3-8B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
  - sources:
      - model: meta-llama/Meta-Llama-3-8B
        layer_range: [0, 24]
  - sources:
      - model: meta-llama/Meta-Llama-3-8B
        layer_range: [27, 32]
merge_method: passthrough
dtype: bfloat16
```
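For reference, the sketch below illustrates the kind of block-distance measurement used to pick the pruned layers: for each candidate block of `layers_to_skip` consecutive layers, measure how little the hidden state changes across the block on calibration text; the block with the smallest average distance is the one removed. This is a minimal illustration, not PruneMe's exact implementation; the calibration text and the cosine-based distance are stand-ins.

```python
# Rough sketch of the block-distance measurement behind the layer selection
# (not PruneMe's exact implementation; the calibration text is a placeholder).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "meta-llama/Meta-Llama-3-8B"
layers_to_skip = 3

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)

text = "Example calibration text; PruneMe averages over a calibration dataset."
inputs = tokenizer(text, return_tensors="pt").to(model.device)
with torch.no_grad():
    # hidden_states has num_hidden_layers + 1 entries: the embedding output,
    # then the output of each decoder layer.
    hidden = model(**inputs, output_hidden_states=True).hidden_states

num_layers = model.config.num_hidden_layers  # 32 for Llama 3 8B
for start in range(num_layers - layers_to_skip + 1):
    h_in = hidden[start].float()                    # input to layer `start`
    h_out = hidden[start + layers_to_skip].float()  # input to layer `start + layers_to_skip`
    cos = torch.nn.functional.cosine_similarity(h_in, h_out, dim=-1).mean()
    print(f"layer {start} to {start + layers_to_skip}: "
          f"average distance {(1 - cos).item():.4f}")
```

Assuming the standard mergekit CLI, the configuration above should be reproducible with `mergekit-yaml config.yaml ./output-dir`. Once the merged weights are available, the pruned checkpoint loads like any other `transformers` causal LM; the repository id below is a placeholder, not the actual location of this model's weights.

```python
# Minimal usage sketch. "your-username/llama-3-8b-pruned" is a placeholder repo id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/llama-3-8b-pruned"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# 24 layers from [0, 24) plus 5 layers from [27, 32) -> 29 decoder layers.
print(model.config.num_hidden_layers)  # expected: 29

inputs = tokenizer("The capital of France is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```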