---
language:
  - en
license: apache-2.0
tags:
  - mlx
datasets:
  - Skylion007/openwebtext
  - JeanKaddour/minipile
pipeline_tag: text-generation
inference:
  parameters:
    do_sample: true
    temperature: 0.5
    top_p: 0.5
    top_k: 50
    max_new_tokens: 250
    repetition_penalty: 1.176
---

# mlx-community/TinyMistral-248M-4bits

The model mlx-community/TinyMistral-248M-4bits was converted to MLX format from Locutusque/TinyMistral-248M using mlx-lm version 0.14.0.

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/TinyMistral-248M-4bits")
response = generate(model, tokenizer, prompt="hello", verbose=True)
```
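The inference defaults in the metadata above combine temperature scaling (0.5), top-k filtering (50), and top-p nucleus filtering (0.5). As a rough illustration of how those three filters interact when picking the next token, here is a minimal pure-Python sketch (independent of MLX; the function name and toy logits are made up for the example, not part of this model's API):

```python
import math
import random

def sample_next_token(logits, temperature=0.5, top_k=50, top_p=0.5):
    """Toy sampler: temperature scaling, then top-k, then top-p, then sample."""
    # Temperature scaling: values below 1.0 sharpen the distribution.
    scaled = [l / temperature for l in logits]

    # Softmax over the scaled logits (subtract max for numerical stability).
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [(i, e / total) for i, e in enumerate(exps)]

    # Top-k: keep only the k most probable tokens.
    probs.sort(key=lambda ip: ip[1], reverse=True)
    probs = probs[:top_k]

    # Top-p (nucleus): keep the smallest prefix whose cumulative mass >= p.
    kept, mass = [], 0.0
    for i, p in probs:
        kept.append((i, p))
        mass += p
        if mass >= top_p:
            break

    # Renormalise the surviving tokens and draw one.
    total = sum(p for _, p in kept)
    r = random.random() * total
    for i, p in kept:
        r -= p
        if r <= 0:
            return i
    return kept[-1][0]
```

With a strongly peaked distribution, the low top-p value (0.5) collapses the candidate set to the single most likely token, so sampling becomes nearly greedy; raising temperature or top_p widens the pool.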