---
base_model: []
tags:
  - mergekit
  - merge
---

# Miquella 120B

The model has been remade with the fixed dequantization of miqu.

This is a merge of pre-trained language models created using mergekit: an attempt at re-creating goliath-120b with the new miqu-1-70b model in place of Xwin.

The merge ratios are the same as goliath's; only Xwin is swapped out for miqu.
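For illustration, goliath-style merges are expressed in mergekit as interleaved layer slices combined with the `passthrough` merge method. The sketch below assumes that recipe with miqu swapped in for Xwin; the layer ranges and the `152334H/miqu-1-70b-sf` and `Sao10K/Euryale-1.3-L2-70B` repository names are illustrative assumptions, and the authoritative values are those in the original goliath-120b config.

```yaml
# Hypothetical goliath-style passthrough merge with miqu in place of Xwin.
# Layer ranges below are illustrative, not the published goliath recipe.
slices:
  - sources:
      - model: 152334H/miqu-1-70b-sf      # assumed dequantized miqu repo
        layer_range: [0, 16]
  - sources:
      - model: Sao10K/Euryale-1.3-L2-70B  # the non-Xwin half of goliath
        layer_range: [8, 24]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [17, 32]
  - sources:
      - model: Sao10K/Euryale-1.3-L2-70B
        layer_range: [25, 40]
  # ...continue interleaving overlapping slices through the final layer...
merge_method: passthrough
dtype: float16
```

With mergekit installed, a config like this is typically run with `mergekit-yaml config.yml ./miquella-120b`.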

## Models Merged

The following models were included in the merge:

- 152334H/miqu-1-70b-sf
- Sao10K/Euryale-1.3-L2-70B

*Miquella the Unalloyed, by @eldrtchmoon*