# GGML converted versions of Mosaic's MPT Models

MPT-7B is a decoder-style transformer pretrained from scratch on 1T tokens of English text and code. This model was trained by MosaicML.

MPT-7B is part of the family of MosaicPretrainedTransformer (MPT) models, which use a modified transformer architecture optimized for efficient training and inference.

## Converted Models:

⚠️ Caution ⚠️: `mpt-7b-storywriter` is still under development!

## Usage

### Python via llm-rs:

#### Installation

Via pip:

```sh
pip install llm-rs
```

#### Run inference

```python
from llm_rs import AutoModel

# Load the model; pick any file from the converted models list above as `model_file`
model = AutoModel.from_pretrained("rustformers/mpt-7b-ggml", model_file="mpt-7b-q4_0-ggjt.bin")

# Generate text from a prompt
print(model.generate("The meaning of life is"))
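```

To control sampling, the llm-rs bindings also accept a generation configuration. Below is a minimal sketch, assuming the bindings expose a `GenerationConfig` with `temperature`, `top_p`, and `max_new_tokens` fields; the exact names may differ between llm-rs versions:

```python
from llm_rs import AutoModel, GenerationConfig

# Load a quantized MPT model from this repository
model = AutoModel.from_pretrained(
    "rustformers/mpt-7b-ggml",
    model_file="mpt-7b-q4_0-ggjt.bin",
)

# Assumed sampling parameters; check your llm-rs version for the exact field names
generation_config = GenerationConfig(
    temperature=0.8,     # higher values give more varied output
    top_p=0.9,           # nucleus sampling cutoff
    max_new_tokens=128,  # stop after this many generated tokens
)

print(model.generate("The meaning of life is", generation_config=generation_config))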

### Rust via rustformers/llm:

#### Installation

```sh
git clone --recurse-submodules https://github.com/rustformers/llm.git
cd llm
cargo build --release
```

#### Run inference

```sh
cargo run --release -- mpt infer -m path/to/model.bin -p "Tell me how cool the Rust programming language is:"
```

### C via GGML

The GGML example only supports the `ggml` container type! Use a model file in the plain `ggml` format, i.e. one without the `-ggjt` suffix in its name.

#### Installation

```sh
git clone https://github.com/ggerganov/ggml
cd ggml
mkdir build && cd build
cmake ..
make -j4 mpt
```

#### Run inference

```sh
./bin/mpt -m path/to/model.bin -p "The meaning of life is"
```