---
license: apache-2.0
language:
- en
pipeline_tag: text-generation
tags:
- llm
- ggml
---
# GGML-converted versions of [MosaicML](https://huggingface.co/mosaicml)'s MPT models
## CAUTION: MPT development is still ongoing and not finished!
- Rust & Python: Rustformers implementation, see here: [Implement MPT Model](https://github.com/rustformers/llm/pull/218)

Once these implementations are complete, I will add instructions on how to run the models and update them if necessary!
## Converted Models:
| Name | Based on | Type | Container |
|-|-|-|-|
| [mpt-7b-f16.bin](https://huggingface.co/LLukas22/mpt-7b-ggml/blob/main/mpt-7b-f16.bin) | [mpt-7b](https://huggingface.co/mosaicml/mpt-7b) | fp16 | GGML |
| [mpt-7b-q4_0-ggjt.bin](https://huggingface.co/LLukas22/mpt-7b-ggml/blob/main/mpt-7b-q4_0-ggjt.bin) | [mpt-7b](https://huggingface.co/mosaicml/mpt-7b) | int4 | GGJT |
| [mpt-7b-chat-f16.bin](https://huggingface.co/LLukas22/mpt-7b-ggml/blob/main/mpt-7b-chat-f16.bin) | [mpt-7b-chat](https://huggingface.co/mosaicml/mpt-7b-chat) | fp16 | GGML |
| [mpt-7b-chat-q4_0-ggjt.bin](https://huggingface.co/LLukas22/mpt-7b-ggml/blob/main/mpt-7b-chat-q4_0-ggjt.bin) | [mpt-7b-chat](https://huggingface.co/mosaicml/mpt-7b-chat) | int4 | GGJT |
| [mpt-7b-instruct-f16.bin](https://huggingface.co/LLukas22/mpt-7b-ggml/blob/main/mpt-7b-instruct-f16.bin) | [mpt-7b-instruct](https://huggingface.co/mosaicml/mpt-7b-instruct) | fp16 | GGML |
| [mpt-7b-instruct-q4_0-ggjt.bin](https://huggingface.co/LLukas22/mpt-7b-ggml/blob/main/mpt-7b-instruct-q4_0-ggjt.bin) | [mpt-7b-instruct](https://huggingface.co/mosaicml/mpt-7b-instruct) | int4 | GGJT |
| [mpt-7b-storywriter-f16.bin](https://huggingface.co/LLukas22/mpt-7b-ggml/blob/main/mpt-7b-storywriter-f16.bin) | [mpt-7b-storywriter](https://huggingface.co/mosaicml/mpt-7b-storywriter) | fp16 | GGML |
| [mpt-7b-storywriter-q4_0-ggjt.bin](https://huggingface.co/LLukas22/mpt-7b-ggml/blob/main/mpt-7b-storywriter-q4_0-ggjt.bin) | [mpt-7b-storywriter](https://huggingface.co/mosaicml/mpt-7b-storywriter) | int4 | GGJT |
## Usage
### Rust & Python
TBD, see the caution note above!
### Via GGML
The upstream `ggml` example program only supports the GGML container type, so use the `f16` (GGML) files from the table above!
#### Installation
```sh
git clone https://github.com/ggerganov/ggml
cd ggml
mkdir build && cd build
cmake ..
make -j4 mpt    # build only the mpt example binary, using 4 parallel jobs
```
#### Run inference
```sh
./bin/mpt -m path/to/model.bin -p "The meaning of life is"
```
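The example binary also accepts the usual ggml sampling and threading options; run `./bin/mpt -h` to list the flags available in your build, since they can change between ggml versions. A minimal sketch, assuming the common `-t` (threads) and `-n` (tokens to predict) flags:
```sh
# Hedged example: -t (thread count) and -n (number of tokens to generate) are
# assumed from ggml's common example options -- verify with ./bin/mpt -h.
./bin/mpt -m path/to/model.bin \
          -t 8 \
          -n 128 \
          -p "The meaning of life is"
```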