---
license: cc-by-nc-4.0
language:
- en
---
<p align="center"><font size="5"> <i>Strange quirk: This model seems to need a context size of EXACTLY 4096 ONLY. I'm assuming this is a dares_ties effect?</i> </font></p>
<p align="center"><img src="https://i.ibb.co/pbpJHpk/iambe-sml.png"/><font size="6"> <b>Iambe-20b-DARE-v2</b> </font></p>
<p align="center"><font size="4"> <b>Alpaca prompt formatting</b> </font></p>

### Description

Named after a charming daughter of Echo and Pan in Greek myth, Iambe-20b-DARE-v2 is an improved [DARE](https://github.com/yule-BUAA/MergeLM) merge building on my recent experiments.

Iambe is intended to have the best understanding of anatomy and of a scene's state that is realistically possible for a 20b merge, while remaining personable and authentic in "voice".
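
For reference, here is a minimal usage sketch with the Hugging Face `transformers` library. The repo id, the generation settings, and the exact Alpaca template wording are my assumptions rather than something this card specifies, so adjust them to your setup.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Standard Alpaca instruction template (assumed; tweak if your frontend differs).
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Response:\n"
)

MODEL_ID = "athirdpath/Iambe-20b-DARE-v2"  # assumed repo id for this card

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID, torch_dtype="auto", device_map="auto"
)

prompt = ALPACA_TEMPLATE.format(instruction="Describe the scene in the tavern.")
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Per the quirk noted above, keep prompt + generation within 4096 tokens.
outputs = model.generate(**inputs, max_new_tokens=512, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```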

### Update Methodology

Noromaid and the general "no-robots" vibe didn't come through in v1 as I'd hoped. My hypothesis is that the "soul" MythoMax and Noromaid have is probably distributed widely over many low-value deltas, due to the "ephemeral" nature of such a thing.
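
For intuition: DARE randomly drops a fraction of each fine-tune's parameter deltas and rescales the survivors, so a trait that lives in many small deltas (rather than a few large ones) is exactly the kind of thing aggressive dropping can wash out. A minimal PyTorch sketch of the idea, following the MergeLM paper (my own illustration, not mergekit's actual implementation):

```python
import torch

def dare(base: torch.Tensor, finetuned: torch.Tensor, density: float) -> torch.Tensor:
    """Drop-And-REscale: keep a random `density` fraction of the fine-tune's
    deltas, zero the rest, and rescale survivors by 1/density so the expected
    update is unchanged. The `density` values in the recipe below are this keep rate."""
    delta = finetuned - base                            # task vector relative to the base model
    keep = torch.bernoulli(torch.full_like(delta, density))
    return base + keep * delta / density                # sparse, rescaled delta
```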

My old base model was likely giving DARE conniption fits, so I replaced that with a truly vanilla 20b base model.

CleverGirl was updated to the DARE version, as Mallory said of Everest, simply because it was there.

Without a large base of dare_ties models to compare to, I'm basically feeling my way through this intuitively, so here's to good results!

### Recipe

    merge_method: dare_ties
    base_model: athirdpath/BigLlama-20b-v1.1
    models:
      - model: Noromaid-20b-v0.1.1
        parameters:
          weight: 0.38
          density: 0.60
      - model: athirdpath/Eileithyia-20b
        parameters:
          weight: 0.22
          density: 0.40
      - model: athirdpath/CleverGirl-20b-Blended-v1.1-DARE
        parameters:
          weight: 0.40
          density: 0.33
    parameters:
      int8_mask: true
    dtype: bfloat16
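
A config in this shape can typically be run directly with mergekit, e.g. `mergekit-yaml iambe-dare-v2.yml ./Iambe-20b-DARE-v2` (the config filename and output path here are placeholders; check your mergekit version's docs for the exact flags).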