---
base_model:
- refuelai/Llama-3-Refueled
library_name: transformers
tags:
- mergekit
- merge
license: llama3
datasets:
- yahma/alpaca-cleaned
language:
- en
---

### Pruning Details

This is a prune of [Llama 3 Refueled](https://www.huggingface.co/refuelai/llama-3-refueled) produced with [mergekit](https://github.com/cg123/mergekit) and [PruneMe](https://www.github.com/arcee-ai/PruneMe).

The model is untested and still needs some debugging, particularly around conversion to GGUF; I am working on that.

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
  - sources:
      - model: refuelai/Llama-3-Refueled
        layer_range: [0, 19]
  - sources:
      - model: refuelai/Llama-3-Refueled
        layer_range: [29, 32]
merge_method: passthrough
dtype: bfloat16
```
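As a rough sketch of what the passthrough slicing does, the snippet below computes which decoder layers of the 32-layer base model survive the merge, assuming mergekit's `layer_range` is end-exclusive (a `[start, end)` interval, as in Python's `range`):

```python
# Layer ranges transcribed from the YAML config above,
# treated as half-open intervals [start, end).
slices = [(0, 19), (29, 32)]
total_layers = 32  # Llama-3-8B has 32 decoder layers

# Layers copied through into the pruned model, in order.
kept = [layer for start, end in slices for layer in range(start, end)]

# Layers dropped by the prune.
pruned = [layer for layer in range(total_layers) if layer not in kept]

print(f"kept {len(kept)} of {total_layers} layers; pruned layers {pruned[0]}-{pruned[-1]}")
```

Under that assumption, the prune removes the contiguous block of layers 19 through 28, leaving a 22-layer model.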