---
base_model:
- unsloth/Llama-3.2-3B
- meta-llama/Llama-3.2-3B
library_name: transformers
tags:
- mergekit
- merge
---

# merge

This merge was made from 29 copies of Meta's Llama 3.2 3B. Why? Don't ask me. Just say I was compelled to by some greater force.

*God wept.*

## Merge Details

### Merge Method

This model was merged using the passthrough merge method.

### Models Merged

The following models were included in the merge:

* [unsloth/Llama-3.2-3B](https://huggingface.co/unsloth/Llama-3.2-3B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: unsloth/Llama-3.2-3B
  - model: unsloth/Llama-3.2-3B
  - model: unsloth/Llama-3.2-3B
  - model: unsloth/Llama-3.2-3B
  - model: unsloth/Llama-3.2-3B
  - model: unsloth/Llama-3.2-3B
  - model: unsloth/Llama-3.2-3B
  - model: unsloth/Llama-3.2-3B
  - model: unsloth/Llama-3.2-3B
  - model: unsloth/Llama-3.2-3B
  - model: unsloth/Llama-3.2-3B
  - model: unsloth/Llama-3.2-3B
  - model: unsloth/Llama-3.2-3B
  - model: unsloth/Llama-3.2-3B
  - model: unsloth/Llama-3.2-3B
  - model: unsloth/Llama-3.2-3B
  - model: unsloth/Llama-3.2-3B
  - model: unsloth/Llama-3.2-3B
  - model: unsloth/Llama-3.2-3B
  - model: unsloth/Llama-3.2-3B
  - model: unsloth/Llama-3.2-3B
  - model: unsloth/Llama-3.2-3B
  - model: unsloth/Llama-3.2-3B
  - model: unsloth/Llama-3.2-3B
  - model: unsloth/Llama-3.2-3B
  - model: unsloth/Llama-3.2-3B
  - model: unsloth/Llama-3.2-3B
  - model: unsloth/Llama-3.2-3B
  - model: unsloth/Llama-3.2-3B
merge_method: passthrough
dtype: float16
```
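For context, a passthrough merge applies no weight arithmetic: unlike averaging- or interpolation-based merge methods, each tensor is copied through to the output unchanged. A minimal toy sketch of that idea, with plain Python dicts standing in for model state dicts (`passthrough` is an illustrative name here, not mergekit's actual API):

```python
def passthrough(state_dict):
    """Passthrough merge: emit each input tensor unchanged.

    No averaging, interpolation, or other arithmetic is applied;
    every value is copied through as-is.
    """
    return {name: tensor for name, tensor in state_dict.items()}


# Toy "state dict" with plain lists standing in for weight tensors.
source = {
    "model.embed_tokens.weight": [0.1, 0.2, 0.3],
    "model.layers.0.self_attn.q_proj.weight": [0.4, 0.5],
}
merged = passthrough(source)
assert merged == source  # tensors pass through unmodified
```

In practice, a config like the one above is run with mergekit's `mergekit-yaml` CLI, along the lines of `mergekit-yaml config.yml ./merged-model` (exact flags depend on your mergekit version and hardware).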