sophosympatheia committed on
Commit
855d2d6
1 Parent(s): 68f2f55

Update README.md


Updating long context information after confirming 32K context performance

Files changed (1)
  1. README.md +2 -4
README.md CHANGED
@@ -14,7 +14,7 @@ tags:
  ### Overview
 
  This is a SLERP merge between [152334H/miqu-1-70b-sf](https://huggingface.co/152334H/miqu-1-70b-sf) and [sophosympatheia/Midnight-Rose-70B-v2.0.3](https://huggingface.co/sophosympatheia/Midnight-Rose-70B-v2.0.3).
- I think this model retains much of what made Midnight Rose special while gaining some capabilities from Miqu, including better long-context support. (YMMV)
+ I think this model retains much of what made Midnight Rose special while gaining some capabilities from Miqu, including long-context capabilities.
 
  This model is uncensored. *You are responsible for whatever you do with it.*
 
@@ -22,9 +22,7 @@ This model was designed for roleplaying and storytelling and I think it does wel
 
  ### Long Context Tips
 
- You can run this model past 4096 context with alpha_rope set to 1.
- I have tested my 5.0bpw exl2 quant of this model out to 16K context using 8-bit cache with alpha_rope 1 and it performs great without any noticeable drop in quality as the context size filled from < 4K to the full 16K context.
- Miqu can go up to 32K context, so in theory this merge can too. I will test that theory soon.
+ You can run this model out to 32K context with alpha_rope set to 1, just like with Miqu. Enjoy!
 
  ### Sampler Tips
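For anyone who wants to try the 32K-context setting described in the updated tips, below is a minimal sketch using the ExLlamaV2 Python API to load an exl2 quant of this merge at 32,768 tokens of context with rope alpha left at 1. The model directory path is a placeholder, and exact attribute names can vary between exllamav2 versions, so treat this as an illustrative example rather than an official recipe from the model card.

```python
# Illustrative sketch (not from the model card): load an exl2 quant of the merge
# at 32K context with rope alpha = 1, as the updated Long Context Tips suggest.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "/models/midnight-miqu-70b-exl2"  # hypothetical local path to the exl2 quant
config.prepare()
config.max_seq_len = 32768        # 32K context window
config.scale_alpha_value = 1.0    # alpha_rope = 1 (no NTK scaling needed)

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)  # swap in ExLlamaV2Cache_8bit for an 8-bit cache
model.load_autosplit(cache)

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 1.0

print(generator.generate_simple("Once upon a time,", settings, num_tokens=200))
```

The same settings map onto common front ends: in text-generation-webui's ExLlamav2 loader, for example, they correspond to max_seq_len 32768 and alpha_value 1.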