---
license: apache-2.0
library_name: transformers
pipeline_tag: text-generation
---
|
# Starling-LM-10.7B-beta
|
This is Starling-LM-10.7B-beta, a depth-upscaled version of [Nexusflow/Starling-LM-7B-beta](https://huggingface.co/Nexusflow/Starling-LM-7B-beta).
|
This model is intended as a drop-in upgrade over the original 7-billion-parameter model.
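Since the model is meant as a drop-in replacement, it should accept the same chat format as the 7B original. The OpenChat-style template below is assumed from the parent Starling-LM-7B-beta card, not stated here; a minimal sketch:

```python
# Hypothetical helper reproducing the OpenChat-style prompt format used by the
# parent Starling-LM-7B-beta (assumed to carry over to this upscaled model).
def build_prompt(turns):
    """turns: list of (role, message) pairs, role in {"user", "assistant"}."""
    prompt = ""
    for role, message in turns:
        prompt += f"GPT4 Correct {role.capitalize()}: {message}<|end_of_turn|>"
    # A trailing assistant tag cues the model to produce the next reply.
    return prompt + "GPT4 Correct Assistant:"
```

For example, `build_prompt([("user", "Hello")])` yields `GPT4 Correct User: Hello<|end_of_turn|>GPT4 Correct Assistant:`, which can then be tokenized and passed to `generate` as usual.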
|
# ExLlamaV2 quantizations (courtesy of [blockblockblock](https://huggingface.co/blockblockblock))
|
- [2.5 bpw](https://huggingface.co/blockblockblock/Starling-LM-10.7B-beta-bpw2.5)
- [3 bpw](https://huggingface.co/blockblockblock/Starling-LM-10.7B-beta-bpw3)
- [3.5 bpw](https://huggingface.co/blockblockblock/Starling-LM-10.7B-beta-bpw3.5)
- [3.7 bpw](https://huggingface.co/blockblockblock/Starling-LM-10.7B-beta-bpw3.7)
- [4 bpw](https://huggingface.co/blockblockblock/Starling-LM-10.7B-beta-bpw4)
- [4.4 bpw](https://huggingface.co/blockblockblock/Starling-LM-10.7B-beta-bpw4.4)
- [4.6 bpw](https://huggingface.co/blockblockblock/Starling-LM-10.7B-beta-bpw4.6)
- [4.8 bpw](https://huggingface.co/blockblockblock/Starling-LM-10.7B-beta-bpw4.8)
- [5 bpw](https://huggingface.co/blockblockblock/Starling-LM-10.7B-beta-bpw5)
- [5.5 bpw](https://huggingface.co/blockblockblock/Starling-LM-10.7B-beta-bpw5.5)
- [6 bpw](https://huggingface.co/blockblockblock/Starling-LM-10.7B-beta-bpw6)
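When choosing among these quantizations, the weight footprint scales roughly as parameter count × bits per weight. A back-of-the-envelope sketch (assuming ~10.7B parameters and ignoring KV cache, activations, and per-tensor quantization overhead):

```python
# Rough weight-memory estimate for a quantized model: params * bpw / 8 bytes.
# Ignores KV cache, activation memory, and quantization metadata overhead.
PARAMS = 10.7e9  # ~10.7B parameters (assumed from the model name)

def weight_gib(bpw: float, params: float = PARAMS) -> float:
    """Approximate weight size in GiB for a given bits-per-weight."""
    return params * bpw / 8 / 2**30

for bpw in (2.5, 3.0, 4.0, 5.0, 6.0):
    print(f"{bpw:>3} bpw ~ {weight_gib(bpw):.1f} GiB")
```

This puts the 4 bpw variant at roughly 5 GiB of weights, before accounting for context-length-dependent KV cache.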