tags:
- mergekit
---
# BETTER THAN GOLIATH?!
I've merged the [Xwin-lora that I made](https://huggingface.co/ChuckMcSneed/Xwin-LM-70B-V0.1-LORA) with [Euryale](https://huggingface.co/Sao10K/Euryale-1.3-L2-70B) and then merged the result with itself in a [goliath-style merge](./config.yml) using [mergekit](https://github.com/arcee-ai/mergekit). The resulting model performs better than [goliath](https://huggingface.co/alpindale/goliath-120b) on my tests (note: performance on tests is not necessarily performance in practice). Test it, have fun with it. This is a sister model of [Premerge-XE-XE-123B](https://huggingface.co/ChuckMcSneed/Premerge-XE-XE-123B).
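The exact recipe is in the linked config.yml. For readers unfamiliar with goliath-style self-merges: mergekit's `passthrough` method stacks overlapping layer slices of the source model(s) into a deeper frankenmerge. A sketch of what such a config generally looks like (the model name and layer ranges below are made-up placeholders, not the ones actually used here):

```yaml
# Hypothetical goliath-style passthrough merge.
# See ./config.yml for the real model name and layer ranges.
slices:
  - sources:
      - model: ChuckMcSneed/xwin-euryale-premerge  # placeholder name
        layer_range: [0, 40]
  - sources:
      - model: ChuckMcSneed/xwin-euryale-premerge
        layer_range: [20, 60]
  - sources:
      - model: ChuckMcSneed/xwin-euryale-premerge
        layer_range: [40, 80]
merge_method: passthrough
dtype: float16
```

The overlapping `layer_range` slices are what turn a 70B base into a ~120B merge.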
# Prompt format
Alpaca.
# Ideas behind it