ChuckMcSneed committed (verified)
Commit 5f6a898 · 1 Parent(s): b6526e3

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -5,7 +5,7 @@ tags:
 - mergekit
 ---
 # BETTER THAN GOLIATH?!
-I've merged [Xwin-lora that I made](https://huggingface.co/ChuckMcSneed/Xwin-LM-70B-V0.1-LORA) with [Euryale](https://huggingface.co/Sao10K/Euryale-1.3-L2-70B) and then merged it with itself in a [goliath-style merge](./config.yml) using [mergekit](https://github.com/arcee-ai/mergekit). The resulting model performs better than [goliath](https://huggingface.co/alpindale/goliath-120b) on my tests (note: performance on tests is not necessarily performance in practice). Test it, have fun with it. I'll upload the Xwin-LORAEuryale selfmerge next.
+I've merged [Xwin-lora that I made](https://huggingface.co/ChuckMcSneed/Xwin-LM-70B-V0.1-LORA) with [Euryale](https://huggingface.co/Sao10K/Euryale-1.3-L2-70B) and then merged it with itself in a [goliath-style merge](./config.yml) using [mergekit](https://github.com/arcee-ai/mergekit). The resulting model performs better than [goliath](https://huggingface.co/alpindale/goliath-120b) on my tests (note: performance on tests is not necessarily performance in practice). Test it, have fun with it. This is a sister model of [Premerge-XE-XE-123B](https://huggingface.co/ChuckMcSneed/Premerge-XE-XE-123B).
 # Prompt format
 Alpaca.
 # Ideas behind it
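The goliath-style self-merge described in the changed line is defined by the repo's config.yml. As a rough illustration only, a mergekit passthrough self-merge config typically has the shape sketched below; the model name and layer ranges here are hypothetical assumptions, not the actual values from config.yml:

```yaml
# Hypothetical sketch of a goliath-style passthrough self-merge.
# The real layer layout lives in this repo's config.yml; the model
# name and layer_range values below are illustrative assumptions.
slices:
  - sources:
      - model: ChuckMcSneed/Xwin-Euryale-70B   # hypothetical merged base
        layer_range: [0, 40]
  - sources:
      - model: ChuckMcSneed/Xwin-Euryale-70B
        layer_range: [20, 60]
  - sources:
      - model: ChuckMcSneed/Xwin-Euryale-70B
        layer_range: [40, 80]
merge_method: passthrough
dtype: float16
```

A passthrough merge like this stacks overlapping layer slices of the same model to produce a deeper network, which is how goliath-style 120B-class models are built from 70B bases.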