---
language:
- en
library_name: transformers
tags:
- merge
- llm
- stablelm
inference: true
license: other
---

This model is a merge of [Aryanne/Astridboros-3B](https://huggingface.co/Aryanne/Astridboros-3B) and [stabilityai/stablelm-zephyr-3b](https://huggingface.co/stabilityai/stablelm-zephyr-3b): 28 layers of Zephyr plus 12 layers of Astridboros (see `zephyr-3.43b.yml` or the configuration below).

```yaml
slices:
  - sources:
      - model: stabilityai/stablelm-zephyr-3b
        layer_range: [0, 14]
  - sources:
      - model: Aryanne/Astridboros-3B
        layer_range: [10, 22]
  - sources:
      - model: stabilityai/stablelm-zephyr-3b
        layer_range: [18, 32]
merge_method: passthrough
dtype: float16
```

I recommend using the Zephyr prompt format.

GGUF quants: not yet available.
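As a reference for the recommended prompt format, here is a minimal sketch of a single-turn Zephyr-style prompt builder. The exact template string is an assumption based on the stablelm-zephyr-3b documentation; verify it against the model tokenizer's chat template (e.g. via `tokenizer.apply_chat_template`) before relying on it.

```python
def zephyr_prompt(user_message: str) -> str:
    """Format a single-turn prompt in the Zephyr style.

    The special tokens below are an assumption taken from the
    stablelm-zephyr-3b model card; check the tokenizer's chat
    template to confirm them.
    """
    return f"<|user|>\n{user_message}<|endoftext|>\n<|assistant|>\n"

print(zephyr_prompt("What is a passthrough merge?"))
```

For multi-turn chats, prefer the tokenizer's built-in chat template over hand-building strings like this.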