Update README.md
README.md CHANGED
@@ -7,7 +7,7 @@ tags:
 ---
 
 # Miquella 120B
-
+## Model has been remade with the [fixed dequantization](https://huggingface.co/152334H/miqu-1-70b-sf) of miqu.
 This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
 An attempt at re-creating [goliath-120b](https://huggingface.co/alpindale/goliath-120b) using the new miqu-1-70b model instead of Xwin.
 