ycros committed on
Commit 0f47123
1 parent: ddebaee

Update README.md

Files changed (1):
  1. README.md +2 -4
README.md CHANGED
@@ -20,10 +20,6 @@ They come in two sizes:
  - [128g (standard AWQ)](https://huggingface.co/ycros/BagelMIsteryTour-v2-8x7B-AWQ/tree/128g)
  - [32g (higher quality, more VRAM)](https://huggingface.co/ycros/BagelMIsteryTour-v2-8x7B-AWQ/tree/32g)
 
- [GGUF versions here](https://huggingface.co/ycros/BagelMIsteryTour-v2-8x7B-GGUF)
-
- Version 2: lowered the mix of Sensualize.
-
  Bagel, Mixtral Instruct, with extra spices. Give it a taste. Works with Alpaca prompt formats, though the Mistral format should also work.
 
  ![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/63044fa07373aacccd8a7c53/lxNMzXo_dq_JCP9YyUyaw.jpeg)
@@ -34,6 +30,8 @@ Somehow I ended up here, Bagel, Mixtral Instruct, a little bit of LimaRP, a litt
 
  I've been running (temp last) minP 0.1, dynatemp 0.5-4, rep pen 1.07, rep range 1024. I've been testing Alpaca-style Instruction/Response and Instruction/Input/Response prompts, and both seem to work well; I expect Mistral's prompt format would also work. You may need to add a stopping string on "{{char}}:" for RPs, because it can sometimes emit those in responses and waffle on. Seems to hold up and not fall apart at long contexts the way Bagel and some other Mixtral tunes do, and definitely doesn't seem prone to loopiness either. Can be pushed into extravagant prose if the scene/setting calls for it.
 
+ __Version 2:__ lowered the mix of Sensualize.
+
  This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
 
  ## Merge Details
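
For anyone trying the two AWQ branches linked above, here is a minimal loading sketch using vLLM. It is an illustration, not part of this commit: the `revision` values come from the branch links (`128g`, `32g`), while the use of vLLM and its `quantization="awq"` option are assumptions about your serving stack.

```python
# Minimal sketch (assumed setup): load one of the AWQ branches with vLLM.
# Branch names "128g" / "32g" come from the links in the README above.
from vllm import LLM, SamplingParams

llm = LLM(
    model="ycros/BagelMIsteryTour-v2-8x7B-AWQ",
    revision="128g",       # or "32g" for the higher-quality, more-VRAM build
    quantization="awq",
)

out = llm.generate(["Give it a taste."], SamplingParams(max_tokens=64))
print(out[0].outputs[0].text)
```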
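The README recommends Alpaca-style prompts, with Mistral's format as a likely alternative. The commit doesn't spell the templates out, so the sketch below assumes the conventional Alpaca and Mistral Instruct forms:

```python
# Conventional Alpaca templates (assumed; not spelled out in this commit).
ALPACA = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Response:\n"
)

ALPACA_WITH_INPUT = (
    "Below is an instruction that describes a task, paired with an input that "
    "provides further context. Write a response that appropriately completes "
    "the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Input:\n{input}\n\n### Response:\n"
)

# Mistral Instruct format, which the README expects to also work.
MISTRAL = "<s>[INST] {instruction} [/INST]"

print(ALPACA.format(instruction="Describe this model in one sentence."))
```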
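The sampler settings quoted in the README (min-P 0.1 with temperature applied last, dynamic temperature 0.5-4, repetition penalty 1.07 over a 1024-token range, and a stopping string on "{{char}}:" for RP) translate roughly into the payload below. The parameter names assume a text-generation-webui-style API and may differ on other backends; `CHAR_NAME` is a hypothetical stand-in for the `{{char}}` placeholder.

```python
# Rough mapping of the README's sampler settings onto text-generation-webui-style
# parameter names (assumed; other backends name these differently).
CHAR_NAME = "Alice"  # hypothetical: substitute your RP character for "{{char}}"

generation_params = {
    "min_p": 0.1,                      # minP 0.1
    "temperature_last": True,          # "(temp last)": temperature after min-P
    "dynamic_temperature": True,       # dynatemp 0.5-4
    "dynatemp_low": 0.5,
    "dynatemp_high": 4.0,
    "repetition_penalty": 1.07,        # rep pen 1.07
    "repetition_penalty_range": 1024,  # rep range 1024
    "stopping_strings": [f"{CHAR_NAME}:"],  # stop on "{{char}}:" in RPs
}
```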