---
license: apache-2.0
language:
- en
pipeline_tag: text-generation
tags:
- llm-rs
- ggml
datasets:
- mc4
- c4
- togethercomputer/RedPajama-Data-1T
- bigcode/the-stack
- allenai/s2orc
inference: false
---

# GGML converted versions of [Mosaic's](https://huggingface.co/mosaicml) MPT Models

MPT-7B is a decoder-style transformer pretrained from scratch on 1T tokens of English text and code. This model was trained by [MosaicML](https://www.mosaicml.com). MPT-7B is part of the family of MosaicPretrainedTransformer (MPT) models, which use a modified transformer architecture optimized for efficient training and inference.

## Converted Models:

$MODELS$

⚠️ Caution ⚠️: mpt-7b-storywriter is still under development!

## Usage

### Python via [llm-rs](https://github.com/LLukas22/llm-rs-python):

#### Installation

Via pip: `pip install llm-rs`

#### Run inference

```python
from llm_rs import AutoModel

# Load the model; any file from the list above can be passed as `model_file`
model = AutoModel.from_pretrained("rustformers/mpt-7b-ggml", model_file="mpt-7b-q4_0-ggjt.bin")

# Generate text from a prompt
print(model.generate("The meaning of life is"))
```

### Rust via [rustformers/llm](https://github.com/rustformers/llm):

#### Installation

```
git clone --recurse-submodules https://github.com/rustformers/llm.git
cd llm
cargo build --release
```

#### Run inference

```
cargo run --release -- mpt infer -m path/to/model.bin -p "Tell me how cool the Rust programming language is:"
```

### C via [GGML](https://github.com/ggerganov/ggml)

The `GGML` example only supports the `ggml` container type!

#### Installation

```
git clone https://github.com/ggerganov/ggml
cd ggml
mkdir build && cd build
cmake ..
make -j4 mpt
```

#### Run inference

```
./bin/mpt -m path/to/model.bin -p "The meaning of life is"
```
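
Both the Rust and the C examples above expect a local model file passed via `-m`. As a sketch (assuming the `huggingface_hub` package is installed, e.g. via `pip install huggingface_hub`), one of the quantized files from this repo can be downloaded like this; the filename here is just the example used earlier, pick whichever file from the list above you need:

```python
from huggingface_hub import hf_hub_download

# Download one quantized file from this repo into the local Hugging Face cache
# and print the resulting path, which can then be passed to the CLIs via `-m`.
# Note: the plain GGML example above needs a file in the `ggml` container
# format, not a `ggjt` file like this one.
model_path = hf_hub_download(
    repo_id="rustformers/mpt-7b-ggml",
    filename="mpt-7b-q4_0-ggjt.bin",  # example file from this repo
)
print(model_path)
```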
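
For the Python route, sampling can usually be tuned as well. This is only a sketch: it assumes the `GenerationConfig` class documented in the llm-rs README, and the exact field names (`temperature`, `top_p`, `max_new_tokens`, ...) may differ between llm-rs versions, so check the version you have installed:

```python
from llm_rs import AutoModel, GenerationConfig  # GenerationConfig availability assumed

model = AutoModel.from_pretrained("rustformers/mpt-7b-ggml", model_file="mpt-7b-q4_0-ggjt.bin")

# Assumed field names, following the llm-rs README; verify against your version.
config = GenerationConfig(temperature=0.8, top_p=0.9, max_new_tokens=256)
print(model.generate("The meaning of life is", generation_config=config))
```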