
GGML-converted versions of StabilityAI's StableLM models

Description

StableLM-Base-Alpha is a suite of 3B and 7B parameter decoder-only language models pre-trained on a diverse collection of English and Code datasets with a sequence length of 4096 to push beyond the context window limitations of existing open-source language models.

Converted Models

| Name | Based on | Type | Container | GGML Version |
|------|----------|------|-----------|--------------|
| stablelm-base-alpha-3b-f16.bin | stabilityai/stablelm-base-alpha-3b | F16 | GGML | V3 |
| stablelm-base-alpha-3b-q4_0.bin | stabilityai/stablelm-base-alpha-3b | Q4_0 | GGML | V3 |
| stablelm-base-alpha-3b-q4_0-ggjt.bin | stabilityai/stablelm-base-alpha-3b | Q4_0 | GGJT | V3 |
| stablelm-base-alpha-3b-q5_1.bin | stabilityai/stablelm-base-alpha-3b | Q5_1 | GGML | V3 |
| stablelm-base-alpha-3b-q5_1-ggjt.bin | stabilityai/stablelm-base-alpha-3b | Q5_1 | GGJT | V3 |
| stablelm-base-alpha-7b-f16.bin | stabilityai/stablelm-base-alpha-7b | F16 | GGML | V3 |
| stablelm-base-alpha-7b-q4_0.bin | stabilityai/stablelm-base-alpha-7b | Q4_0 | GGML | V3 |
| stablelm-base-alpha-7b-q4_0-ggjt.bin | stabilityai/stablelm-base-alpha-7b | Q4_0 | GGJT | V3 |
| stablelm-base-alpha-7b-q5_1.bin | stabilityai/stablelm-base-alpha-7b | Q5_1 | GGML | V3 |
| stablelm-base-alpha-7b-q5_1-ggjt.bin | stabilityai/stablelm-base-alpha-7b | Q5_1 | GGJT | V3 |
| stablelm-tuned-alpha-3b-f16.bin | stabilityai/stablelm-tuned-alpha-3b | F16 | GGML | V3 |
| stablelm-tuned-alpha-3b-q4_0.bin | stabilityai/stablelm-tuned-alpha-3b | Q4_0 | GGML | V3 |
| stablelm-tuned-alpha-3b-q4_0-ggjt.bin | stabilityai/stablelm-tuned-alpha-3b | Q4_0 | GGJT | V3 |
| stablelm-tuned-alpha-3b-q5_1.bin | stabilityai/stablelm-tuned-alpha-3b | Q5_1 | GGML | V3 |
| stablelm-tuned-alpha-3b-q5_1-ggjt.bin | stabilityai/stablelm-tuned-alpha-3b | Q5_1 | GGJT | V3 |
| stablelm-tuned-alpha-7b-f16.bin | stabilityai/stablelm-tuned-alpha-7b | F16 | GGML | V3 |
| stablelm-tuned-alpha-7b-q4_0.bin | stabilityai/stablelm-tuned-alpha-7b | Q4_0 | GGML | V3 |
| stablelm-tuned-alpha-7b-q4_0-ggjt.bin | stabilityai/stablelm-tuned-alpha-7b | Q4_0 | GGJT | V3 |
| stablelm-tuned-alpha-7b-q5_1.bin | stabilityai/stablelm-tuned-alpha-7b | Q5_1 | GGML | V3 |
| stablelm-tuned-alpha-7b-q5_1-ggjt.bin | stabilityai/stablelm-tuned-alpha-7b | Q5_1 | GGJT | V3 |
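The file names in the table follow a regular pattern, so picking the right `model_file` can be done programmatically. The helper below is purely illustrative (the function name `ggml_filename` is not part of any library); it just reconstructs the naming scheme visible above:

```python
def ggml_filename(variant: str, size: str, dtype: str, container: str = "ggml") -> str:
    """Build a model file name from the table's naming scheme.

    variant:   "base" or "tuned"
    size:      "3b" or "7b"
    dtype:     "f16", "q4_0", or "q5_1"
    container: "ggml" (no suffix) or "ggjt" (adds "-ggjt")
    """
    suffix = "-ggjt" if container.lower() == "ggjt" else ""
    return f"stablelm-{variant}-alpha-{size}-{dtype}{suffix}.bin"
```

For example, `ggml_filename("base", "3b", "q4_0", "ggjt")` yields the file used in the Python snippet below. Note that, per the table, F16 weights are only published in the plain GGML container.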

Usage

Python via llm-rs:

Installation

Via pip: pip install llm-rs

Run inference

from llm_rs import AutoModel

# Load the model; any file name from the table above can be passed as `model_file`
model = AutoModel.from_pretrained(
    "rustformers/stablelm-ggml",
    model_file="stablelm-base-alpha-3b-q4_0-ggjt.bin",
)

# Generate text
print(model.generate("The meaning of life is"))
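The `stablelm-tuned-*` models were fine-tuned on a chat format that wraps turns in the special tokens `<|SYSTEM|>`, `<|USER|>`, and `<|ASSISTANT|>` (see StabilityAI's StableLM-Tuned-Alpha model card). When prompting a tuned model, a small helper like the following (the name `tuned_prompt` is hypothetical, for illustration) can build a correctly formatted prompt:

```python
def tuned_prompt(user_message: str, system: str = "") -> str:
    """Wrap a user message in the special tokens the tuned models were trained on."""
    return f"<|SYSTEM|>{system}<|USER|>{user_message}<|ASSISTANT|>"
```

The resulting string would then be passed to `model.generate(...)` in place of the plain prompt shown above. The base models, by contrast, are plain completion models and need no special tokens.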

Rust via Rustformers/llm:

Installation

git clone --recurse-submodules https://github.com/rustformers/llm.git
cd llm
cargo build --release

Run inference

cargo run --release -- gptneox infer -m path/to/model.bin -p "Tell me how cool the Rust programming language is:"