---
tags:
- mistral
- merge
---
# TripletBoreas-7B-t0.0001
A sequential merge of the following models using a custom NearSwap algorithm:

- fearlessdots/WizardLM-2-7B-abliterated (base model)
- Frostwind-v2.1 (merged second)
- Erebus-Holodeck (merged third)

All merges use t_value 0.0001.
Thanks to mradermacher for the quants!

- https://huggingface.co/v000000/TripletBoreas-7B-t0.0001-Q8_0-GGUF
- https://huggingface.co/v000000/TripletBoreas-7B-t0.0001-Q5_K_S-GGUF
```python
import numpy as np

# Fixed
def lerp(a, b, t):
    # Standard linear interpolation between a and b by weight t.
    return a * (1 - t) + b * t

def nearswap(v0, v1, t):
    # Per-element weight t / |v0 - v1|, clipped to [0, 1]: elements of v0
    # within t of v1 are swapped fully, divergent elements stay near v0.
    lweight = np.abs(v0 - v1)
    with np.errstate(divide='ignore', invalid='ignore'):
        lweight = np.where(lweight != 0, t / lweight, 1.0)
    lweight = np.nan_to_num(lweight, nan=1.0, posinf=1.0, neginf=1.0)
    np.clip(lweight, a_min=0.0, a_max=1.0, out=lweight)
    return lerp(v0, v1, lweight)
```
Credit: Alchemonaut
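In effect, NearSwap only pulls the base tensor toward the secondary model where the two already nearly agree: the weight t / |v0 - v1| is clipped to [0, 1], so elements within t of each other take the secondary value outright, while strongly divergent elements keep the base value. Below is a minimal sketch of how the sequential three-way merge could be driven per-tensor using the nearswap function above; the helper name and the dict-of-arrays representation are illustrative assumptions, not the actual merge script.

```python
import numpy as np  # the lerp/nearswap functions above are assumed in scope

def sequential_nearswap_merge(base_sd, second_sd, third_sd, t=0.0001):
    # Hypothetical driver: base_sd, second_sd, third_sd are dicts mapping
    # tensor names to numpy arrays (e.g. exported model state dicts).
    merged = {}
    for name, v0 in base_sd.items():
        step1 = nearswap(v0, second_sd[name], t)           # base <- Frostwind-v2.1
        merged[name] = nearswap(step1, third_sd[name], t)  # then <- Erebus-Holodeck
    return merged
```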
WizardLM-2 adopts the prompt format from Vicuna and supports multi-turn conversation. The prompt should be as follows:

```
A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful,
detailed, and polite answers to the user's questions. USER: Hi ASSISTANT: Hello.</s>
USER: Who are you? ASSISTANT: I am WizardLM.</s>......
```
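A small sketch of how this format could be assembled programmatically; the helper name and the turn structure are assumptions for illustration:

```python
SYSTEM = ("A chat between a curious user and an artificial intelligence assistant. "
          "The assistant gives helpful, detailed, and polite answers to the user's questions.")

def build_vicuna_prompt(turns):
    # turns: list of (user_message, assistant_reply) pairs; pass None as the
    # final reply to leave the prompt open for the model to complete.
    prompt = SYSTEM
    for user_msg, assistant_msg in turns:
        prompt += f" USER: {user_msg} ASSISTANT:"
        if assistant_msg is not None:
            prompt += f" {assistant_msg}</s>"
    return prompt

print(build_vicuna_prompt([("Hi", "Hello."), ("Who are you?", None)]))
```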
The Alpaca format also works, and may work better for RP because of Frostwind.
```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
Take the role of {{char}} in a play where you leave a lasting impression on {{user}}. Never skip or gloss over {{char}}'s actions.

### Instruction:
{prompt}

### Response:
{output}
```
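Likewise, a minimal sketch for filling in the Alpaca template; the function name and placeholder handling are assumptions, not part of the model card:

```python
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n"
    "Take the role of {char} in a play where you leave a lasting impression "
    "on {user}. Never skip or gloss over {char}'s actions.\n\n"
    "### Instruction:\n{prompt}\n\n"
    "### Response:\n"
)

def build_alpaca_prompt(char, user, prompt):
    # Hypothetical helper: substitutes the roleplay names and user prompt;
    # the model then generates its reply after "### Response:".
    return ALPACA_TEMPLATE.format(char=char, user=user, prompt=prompt)
```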
Add genres in a list like this:

```
[Genre: <genre1>, <genre2>]
```