---
license: apache-2.0
language:
  - en
pipeline_tag: text-generation
tags:
  - llm
  - ggml
---

# GGML converted versions of MosaicML's MPT models

MPT-7B is a decoder-style transformer pretrained from scratch on 1T tokens of English text and code. This model was trained by MosaicML.

MPT-7B is part of the family of MosaicPretrainedTransformer (MPT) models, which use a modified transformer architecture optimized for efficient training and inference.

## Converted Models

⚠️ Caution ⚠️: `mpt-7b-storywriter` is still under development!
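The list of converted files isn't reproduced in this section. To see which quantized variants are currently available, you can query the repository with `huggingface_hub`; a minimal sketch (only the `repo_id` comes from this card, and the `.bin` filter is an assumption based on the file naming in the usage example below):

```python
from huggingface_hub import list_repo_files

# List all files in the repository and keep the GGML binaries.
files = list_repo_files(repo_id="LLukas22/mpt-7b-ggml")
for filename in (f for f in files if f.endswith(".bin")):
    print(filename)
```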

## Usage

### Python via llm-rs

#### Installation

Via pip:

```bash
pip install llm-rs
```

#### Run inference

```python
from llm_rs import Mpt
from huggingface_hub import hf_hub_download

# Download the model
hf_hub_download(repo_id="LLukas22/mpt-7b-ggml", filename="mpt-7b-q4_0-ggjt.bin", local_dir=".")

# Load the model
model = Mpt("mpt-7b-q4_0-ggjt.bin")

# Generate
print(model.generate("The meaning of life is"))
```
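`generate` can also be tuned: llm-rs exposes `SessionConfig` for load-time options (e.g. thread count) and `GenerationConfig` for sampling options. A minimal sketch, assuming the current llm-rs API; all parameter values are illustrative, not recommendations:

```python
from llm_rs import Mpt, SessionConfig, GenerationConfig

# Load-time options (assumed llm-rs API; values are illustrative)
session_config = SessionConfig(threads=8, context_length=2048)
model = Mpt("mpt-7b-q4_0-ggjt.bin", session_config=session_config)

# Per-call sampling options (values are illustrative)
generation_config = GenerationConfig(
    top_k=40,
    top_p=0.9,
    temperature=0.8,
    max_new_tokens=256,
)
print(model.generate("The meaning of life is", generation_config=generation_config))
```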

### Rust via rustformers/llm

#### Installation

```bash
git clone --recurse-submodules git@github.com:rustformers/llm.git
cd llm
cargo build --release
```

#### Run inference

```bash
cargo run --release -- mpt infer -m path/to/model.bin -p "Tell me how cool the Rust programming language is:"
```

### C via GGML

The GGML example only supports the `ggml` container type!

#### Installation

```bash
git clone https://github.com/ggerganov/ggml
cd ggml
mkdir build && cd build
cmake ..
make -j4 mpt
```

#### Run inference

```bash
./bin/mpt -m path/to/model.bin -p "The meaning of life is"
```