---
base_model: [KatyTheCutie/EstopianMaid-13B]
tags:
- mergekit
- merge
license_name: microsoft-research-license
license_link: https://huggingface.co/microsoft/Orca-2-13b/blob/main/LICENSE
pipeline_tag: text-generation
---
# EstopianOrcaMaid-13b

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

The goal of this merge is to create an unusually intelligent and human-like model, especially suited for roleplay (RP).

The prompt format is Alpaca. The standard format shown below works as-is, but for best results you should customize the system prompt to your specific needs.

```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{YOUR MESSAGE HERE}

### Response:
{BOT MESSAGE HERE}
```
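As an illustration, a prompt in this format can be assembled programmatically. This is only a sketch: the `build_prompt` helper and its parameter names are our own, not part of the model or its tooling.

```python
def build_prompt(
    instruction: str,
    system: str = (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request."
    ),
) -> str:
    """Assemble an Alpaca-style prompt, leaving the Response section
    open for the model to complete."""
    return f"{system}\n\n### Instruction:\n{instruction}\n\n### Response:\n"


prompt = build_prompt("Introduce yourself in one sentence.")
print(prompt)
```

Replacing the default `system` string is the customization step mentioned above.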
### Misc. information
- BOS token is `<s>`
- EOS token is `</s>`
- Native context length is `4096`
- Base model is Llama 2
- Due to the inclusion of Orca-2-13b, the model is subject to the terms of the [Microsoft Research License](https://huggingface.co/microsoft/Orca-2-13b/blob/main/LICENSE)

### Merge Method

This model was merged using the [linear](https://arxiv.org/abs/2203.05482) merge method.
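Conceptually, a linear merge is a weighted average of corresponding weight tensors across the source models. A minimal NumPy sketch, using the 0.8/0.2 weights from the configuration below (the toy tensors are illustrative stand-ins, not values from the actual checkpoints):

```python
import numpy as np


def linear_merge(tensors, weights):
    """Weighted average of corresponding weight tensors, one per model."""
    total = sum(weights)
    return sum(w * t for w, t in zip(weights, tensors)) / total


# Toy stand-ins for one layer's weights from each source model.
estopian_layer = np.full((2, 2), 1.0)
orca_layer = np.full((2, 2), 2.0)

merged = linear_merge([estopian_layer, orca_layer], [0.8, 0.2])
# Every entry is 0.8 * 1.0 + 0.2 * 2.0 = 1.2
```

In the real merge, this averaging is applied tensor-by-tensor across every layer of the two 13B checkpoints.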

### Models Merged

The following models were included in the merge:
* [KatyTheCutie/EstopianMaid-13B](https://huggingface.co/KatyTheCutie/EstopianMaid-13B)
* [microsoft/Orca-2-13b](https://huggingface.co/microsoft/Orca-2-13b)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: /Volumes/Sabrent/LLMs/EstopianMaid-13B
    parameters:
      weight: 0.8
  - model: "/Volumes/SanDisk/LLM Archive/Orca-2-13b"
    parameters:
      weight: 0.2
merge_method: linear
dtype: float16
```
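To reproduce a merge like this, mergekit provides the `mergekit-yaml` command-line entry point. A sketch, assuming the YAML above is saved as `config.yml` (paths are placeholders, and available flags may vary by mergekit version):

```shell
# Run the merge described in config.yml and write the result to ./merged
mergekit-yaml config.yml ./merged
```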

### Thanks
- Thanks to [Katy Vetteriano](https://huggingface.co/KatyTheCutie) for [EstopianMaid-13B](https://huggingface.co/KatyTheCutie/EstopianMaid-13B)