---
license: apache-2.0
tags:
- mistral
- mix
---
~~Full model card soon. Early release;~~
Spherical Hexa-Merge of hand-picked Mistral-7B models.
This is the successor to Naberius-7B, building on its findings.
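
The card does not document the exact merge recipe, but "Spherical" merges typically use SLERP (spherical linear interpolation) over model weights. The sketch below is illustrative only: it interpolates a single pair of tensors, and the commented six-way chaining is a hypothetical recipe, not the one used for this model.

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the great-circle
    arc between the two (normalized) weight vectors.
    """
    v0_flat, v1_flat = v0.flatten().float(), v1.flatten().float()
    # Angle between the two weight vectors.
    cos_omega = torch.dot(v0_flat, v1_flat) / (v0_flat.norm() * v1_flat.norm() + eps)
    omega = torch.acos(cos_omega.clamp(-1.0, 1.0))
    sin_omega = torch.sin(omega)
    if sin_omega.abs() < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        return (1.0 - t) * v0 + t * v1
    scale0 = torch.sin((1.0 - t) * omega) / sin_omega
    scale1 = torch.sin(t * omega) / sin_omega
    return (scale0 * v0_flat + scale1 * v1_flat).reshape(v0.shape).to(v0.dtype)

# A six-model merge could, for example, chain pairwise SLERPs over state dicts
# (hypothetical order and weights -- not the recipe used for Hexoteric-7B):
# merged = state_a
# for other in (state_b, state_c, state_d, state_e, state_f):
#     merged = {k: slerp(0.5, merged[k], other[k]) for k in merged}
```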
[11 Dec 2023 UPDATE]
The original compute resources for this experiment are no longer accessible.
Long story:
https://huggingface.co/CalderaAI/Hexoteric-7B/discussions/2#6576d3e5412ee701851fd567
The Stanford Alpaca format works best for instruct-style test driving of this enigma.
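
For reference, a minimal usage sketch with the standard Stanford Alpaca instruct template; the model id matches this repo, but the instruction text and generation settings are placeholders, not recommendations from the author.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Standard Stanford Alpaca template (no-input variant).
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

model_id = "CalderaAI/Hexoteric-7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

prompt = ALPACA_TEMPLATE.format(instruction="Summarize the plot of Moby-Dick in two sentences.")
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```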