---
library_name: transformers
datasets:
- HuggingFaceH4/ultrachat_200k
- HuggingFaceH4/ultrafeedback_binarized
base_model: mistralai/Mistral-7B-v0.1
license: apache-2.0
---
# A Pruned Mistral model
This model is a pruned Mistral model re-aligned using the Zephyr recipe.
## Details
- This model was trained in two stages: SFT followed by DPO (see the training sketch after this list).
- The initial model was built by selecting a subset of layers from Mistral-7B to produce a smaller model (see the pruning sketch after this list).
- The code can be found here: https://github.com/tcapelle/shear
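
A minimal sketch of this kind of layer pruning, assuming the Hugging Face `transformers` Mistral implementation; keeping every other layer here is purely illustrative, and the actual layer subset used for this model is defined in the repo above:

```python
import torch
from transformers import AutoModelForCausalLM

# Load the full Mistral-7B base model.
model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-7B-v0.1", torch_dtype=torch.bfloat16
)

# Hypothetical selection: keep every other decoder layer.
# The real subset used by this model is chosen in tcapelle/shear.
keep = list(range(0, model.config.num_hidden_layers, 2))
model.model.layers = torch.nn.ModuleList(model.model.layers[i] for i in keep)
model.config.num_hidden_layers = len(keep)
```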
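And a minimal sketch of the two-stage Zephyr-style training with TRL, using the two datasets listed in the frontmatter; the trainer arguments and hyperparameters here are placeholders (the real configuration lives in the repo and the W&B workspace below), and the `processing_class` argument assumes a recent TRL version:

```python
from datasets import load_dataset
from transformers import AutoTokenizer
from trl import DPOConfig, DPOTrainer, SFTConfig, SFTTrainer

tok = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-v0.1")

# Stage 1 (SFT): supervised fine-tuning of the pruned model on UltraChat.
sft_trainer = SFTTrainer(
    model=model,  # the pruned model from the sketch above
    processing_class=tok,
    train_dataset=load_dataset("HuggingFaceH4/ultrachat_200k", split="train_sft"),
    args=SFTConfig(output_dir="sft"),
)
sft_trainer.train()

# Stage 2 (DPO): preference alignment on the binarized UltraFeedback pairs.
dpo_trainer = DPOTrainer(
    model=sft_trainer.model,
    processing_class=tok,
    train_dataset=load_dataset("HuggingFaceH4/ultrafeedback_binarized", split="train_prefs"),
    args=DPOConfig(output_dir="dpo"),
)
dpo_trainer.train()
```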
## W&B workspace
https://wandb.ai/llm_surgery/shearllama/