Quantization made by Richard Erkhov.

[Github](https://github.com/RichardErkhov)

[Discord](https://discord.gg/pvy7H8DZMG)

[Request more models](https://github.com/RichardErkhov/quant_request)


Phi-Elothir - bnb 8bits
- Model creator: https://huggingface.co/Replete-AI/
- Original model: https://huggingface.co/Replete-AI/Phi-Elothir/
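
Since this repo holds an 8-bit bitsandbytes quantization, a minimal loading sketch may help (my assumption of the usual stack: `transformers`, `accelerate`, and `bitsandbytes` installed; the repo id below is a placeholder, not the real id):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder id: substitute the actual id of this quantized repo.
repo_id = "RichardErkhov/Phi-Elothir-8bits"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
# Pre-quantized 8-bit bnb weights load directly; device_map="auto"
# places the layers on the available GPU(s).
model = AutoModelForCausalLM.from_pretrained(repo_id, device_map="auto")
```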


Original model description:
---
license: mit
language:
- en
thumbnail: "https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/TqnMpteVAyfiiNHx4lVkU.png"
---
# You are welcome here, traveler.
![image/png](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/TqnMpteVAyfiiNHx4lVkU.png)

### Named after the method used to create it: interleaving the layers of its predecessor to make it far larger, giving it much more potential.

[Elothir](https://wowpedia.fandom.com/wiki/Elothir) was an ancient treant, and I couldn't think of a better naming convention for a model created using the passthrough method.

By concatenating layers from different LLMs, the passthrough method can produce models with an exotic number of parameters (e.g., a 9B model from two 7B-parameter models). These models are often referred to as "frankenmerges" or "Frankenstein models" by the community.
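
For this model specifically, the config in "The Sauce" below slides a two-layer window, one layer at a time, across the 32 layers of abacaj/phi-2-super. A minimal sketch of that arithmetic (mine, not from the original card):

```python
# Each slice copies two consecutive layers, and consecutive slices overlap
# by one layer, so the merged stack is nearly twice as deep as the source.
slices = [(i, i + 2) for i in range(31)]           # [0,2], [1,3], ..., [30,32]
merged_depth = sum(end - start for start, end in slices)
print(len(slices), merged_depth)                   # 31 slices -> 62 layers
```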


Many thanks to [Abacaj](https://huggingface.co/abacaj) for providing the [fine-tuned weights](https://huggingface.co/abacaj/phi-2-super) that were used in the creation of this base model, and thanks to [KatyTheCutie](https://huggingface.co/KatyTheCutie) for inspiring me to test out this script.

## This idea was brought to me by [The Face of Goonery](https://huggingface.co/The-Face-Of-Goonery), also known as Caleb Morgan. I have him to thank if fine-tuning this model turns out to be a success.
# How to run inference:

```python
import transformers
import torch

if __name__ == "__main__":
    model_name = "Replete-AI/Phi-Elothir"
    tokenizer = transformers.AutoTokenizer.from_pretrained(model_name)

    model = (
        transformers.AutoModelForCausalLM.from_pretrained(
            model_name,
        )
        .to("cuda:0")
        .eval()
    )

    messages = [
        {"role": "user", "content": "Hello, who are you?"}
    ]
    # Render the chat template and move the prompt to the model's device.
    inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)
    # Remember where the prompt ends so only new tokens are decoded later.
    input_ids_cutoff = inputs.size(dim=1)

    with torch.no_grad():
        generated_ids = model.generate(
            input_ids=inputs,
            use_cache=True,
            max_new_tokens=512,
            temperature=0.2,
            top_p=0.95,
            do_sample=True,
            eos_token_id=tokenizer.eos_token_id,
            pad_token_id=tokenizer.pad_token_id,
        )

    # Strip the prompt tokens and decode only the completion.
    completion = tokenizer.decode(
        generated_ids[0][input_ids_cutoff:],
        skip_special_tokens=True,
    )

    print(completion)
```

# Chat template

The model uses the same chat template as found in Mistral instruct models:
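
The template itself isn't reproduced here; as a quick check (a sketch, assuming the repo ships a chat template in its tokenizer config), you can render it and expect Mistral-style `[INST] ... [/INST]` markers:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Replete-AI/Phi-Elothir")
messages = [{"role": "user", "content": "Hello, who are you?"}]
# Render without tokenizing to inspect the raw prompt string; with a
# Mistral-style template it should resemble "<s>[INST] ... [/INST]".
print(tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True))
```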
85
+
86
+ # [Join the Replete AI Discord here!](https://discord.gg/tG5aY4EX4T)
87
+
88
+ # The Sauce:
89
+
```yml
dtype: float16
merge_method: passthrough
slices:
- sources:
  - model: abacaj/phi-2-super
    layer_range: [0,2]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [1,3]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [2,4]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [3,5]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [4,6]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [5,7]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [6,8]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [7,9]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [8,10]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [9,11]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [10,12]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [11,13]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [12,14]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [13,15]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [14,16]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [15,17]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [16,18]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [17,19]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [18,20]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [19,21]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [20,22]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [21,23]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [22,24]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [23,25]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [24,26]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [25,27]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [26,28]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [27,29]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [28,30]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [29,31]
- sources:
  - model: abacaj/phi-2-super
    layer_range: [30,32]
```
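
The config is mechanical enough to generate rather than type out; a short sketch (mine, assuming the same sliding-window pattern as above) that reproduces it:

```python
# Regenerate the passthrough config above: a two-layer window slid one
# layer at a time across phi-2-super's 32 transformer layers.
lines = ["dtype: float16", "merge_method: passthrough", "slices:"]
for start in range(31):
    lines += [
        "- sources:",
        "  - model: abacaj/phi-2-super",
        f"    layer_range: [{start},{start + 2}]",
    ]
print("\n".join(lines))
```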