---
license: cc-by-nc-4.0
base_model:
- migtissera/Tess-70B-v1.6
- 152334H/miqu-1-70b-sf
- NeverSleep/MiquMaid-v2-70B
- sophosympatheia/Midnight-Miqu-70B-v1.0
library_name: transformers
tags:
- mergekit
- merge
---
# Miqu-MS-70B
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
The new Model Stock merge method was used; see below for more information!
[Feedback](https://huggingface.co/Undi95/Miqu-MS-70B-GGUF/discussions/1) on this model is greatly appreciated! I hope this new merge method will be able to fill some of the gaps Miqu has.
## Other quants
- [EXL2 (5.0 bpw) by lucyknada](https://huggingface.co/lucyknada/Undi95-Miqu-MS-70B-EXL2-5.0bpw) - [measurement.json](https://huggingface.co/lucyknada/Undi95-Miqu-MS-70B-EXL2-5.0bpw/blob/main/measurement.json)
- [Static GGUF by mradermacher](https://huggingface.co/mradermacher/Miqu-MS-70B-GGUF)
- [iMatrix GGUF by mradermacher](https://huggingface.co/mradermacher/Miqu-MS-70B-i1-GGUF) - [imatrix.dat](https://huggingface.co/mradermacher/Miqu-MS-70B-i1-GGUF/blob/main/imatrix.dat)
Thank you all!
## Prompt format
Since this merge combines models that use different prompt formats, any of the following should work. A short generation example follows the templates below.
### Alpaca
```
### Instruction:
{system prompt}
### Input:
{prompt}
### Response:
{output}
```
### Mistral
```
[INST] {prompt} [/INST]
```
### Vicuna
```
SYSTEM: <ANY SYSTEM CONTEXT>
USER:
ASSISTANT:
```
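As a minimal sketch of how to try the model with 🤗 Transformers and the Mistral-style template (assuming the repo ID `Undi95/Miqu-MS-70B` and enough GPU memory; otherwise use one of the quants above):
```python
# Minimal sketch: load the merge and generate with the Mistral-style [INST] template.
# Assumes the repo ID "Undi95/Miqu-MS-70B" and sufficient GPU memory (device_map="auto" needs accelerate).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Undi95/Miqu-MS-70B"  # assumed repo ID for this card
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

prompt = "[INST] Write a short haiku about model merging. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```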
## Merge Details
### Merge Method
This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method using [152334H/miqu-1-70b-sf](https://huggingface.co/152334H/miqu-1-70b-sf) as a base.
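Roughly, Model Stock interpolates each tensor between the base weights and the average of the fine-tuned weights, with the ratio derived from the angle between the fine-tuned deltas. The toy per-tensor sketch below follows the paper's formula t = N·cosθ / (1 + (N−1)·cosθ); it is an illustration only, not mergekit's actual implementation:
```python
# Toy per-tensor Model Stock sketch (paper formula, not mergekit's code).
import torch

def model_stock_merge(base: torch.Tensor, finetuned: list[torch.Tensor]) -> torch.Tensor:
    n = len(finetuned)
    deltas = [(w - base).flatten() for w in finetuned]
    # Estimate cos(theta) as the mean pairwise cosine similarity between the fine-tune deltas.
    cos_vals = [
        torch.nn.functional.cosine_similarity(deltas[i], deltas[j], dim=0)
        for i in range(n) for j in range(i + 1, n)
    ]
    cos_theta = torch.stack(cos_vals).mean().clamp(min=0.0)  # simplification: guard against negative angles
    # Interpolation ratio from the paper: t = N*cos / (1 + (N-1)*cos)
    t = n * cos_theta / (1 + (n - 1) * cos_theta)
    w_avg = torch.stack(finetuned).mean(dim=0)
    return t * w_avg + (1 - t) * base
```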
### Models Merged
The following models were included in the merge:
* [migtissera/Tess-70B-v1.6](https://huggingface.co/migtissera/Tess-70B-v1.6)
* [NeverSleep/MiquMaid-v2-70B](https://huggingface.co/NeverSleep/MiquMaid-v2-70B)
* [sophosympatheia/Midnight-Miqu-70B-v1.0](https://huggingface.co/sophosympatheia/Midnight-Miqu-70B-v1.0)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: NeverSleep/MiquMaid-v2-70B
  - model: sophosympatheia/Midnight-Miqu-70B-v1.0
  - model: migtissera/Tess-70B-v1.6
  - model: 152334H/miqu-1-70b-sf
merge_method: model_stock
base_model: 152334H/miqu-1-70b-sf
dtype: bfloat16
```
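To reproduce the merge, the config above can be saved to a file and passed to mergekit (for example via the `mergekit-yaml` CLI). A hedged sketch of the Python API, assuming a mergekit version that exposes `run_merge` and `MergeOptions` (details may differ between versions):
```python
# Sketch: run the merge from Python; API details may vary across mergekit versions.
import yaml
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("miqu-ms-70b.yml", "r", encoding="utf-8") as fp:  # the YAML config above, saved to a file
    config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    config,
    out_path="./Miqu-MS-70B",
    options=MergeOptions(cuda=True, copy_tokenizer=True),
)
```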
## Support
If you want to support me, you can do so [here](https://ko-fi.com/undiai).