macadeliccc committed
Commit 664de93 · Parent(s): 0ece180
Update README.md

README.md CHANGED
@@ -16,6 +16,9 @@ If this 2x7b model is loaded in 4 bit the hellaswag score is .8270 which is high
 
 The process is outlined in this [notebook](https://github.com/cognitivecomputations/laserRMT/blob/main/examples/laser-dolphin-mixtral-2x7b.ipynb)
 
+Quantizations provided by [TheBloke](https://huggingface.co/TheBloke/laser-dolphin-mixtral-2x7b-dpo-GGUF)
+
+
 ## Prompt Format
 
 This model follows the same prompt format as the aforementioned model.