---
language:
  - en
license: apache-2.0
tags:
  - mlx
datasets:
  - cerebras/SlimPajama-627B
  - bigcode/starcoderdata
  - HuggingFaceH4/ultrachat_200k
  - HuggingFaceH4/ultrafeedback_binarized
widget:
  - text: >
      <|system|>

      You are a chatbot who can help code!</s>

      <|user|>

      Write me a function to calculate the first 10 digits of the fibonacci
      sequence in Python and print it out to the CLI.</s>

      <|assistant|>
---

# mlx-community/TinyLlama-1.1B-Chat-v1.0-mlx

This model was converted to MLX format from [`TinyLlama/TinyLlama-1.1B-Chat-v1.0`](https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v1.0). Refer to the [original model card](https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v1.0) for more details on the model.
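For reference, a conversion like this can be reproduced locally with `mlx-lm`'s conversion utility. The snippet below is a minimal sketch that assumes the `mlx_lm.convert` Python API and a hypothetical output directory name; check your installed `mlx-lm` version for the exact signature and defaults.

```python
# Minimal sketch, assuming the mlx_lm.convert API (signature may vary by version).
from mlx_lm import convert

# Download the original Hugging Face weights and write them out in MLX format.
convert(
    "TinyLlama/TinyLlama-1.1B-Chat-v1.0",       # source Hugging Face repo
    mlx_path="TinyLlama-1.1B-Chat-v1.0-mlx",    # local output directory (hypothetical name)
)
```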

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

# Load the converted weights and tokenizer from the Hugging Face Hub.
model, tokenizer = load("mlx-community/TinyLlama-1.1B-Chat-v1.0-mlx")
response = generate(model, tokenizer, prompt="hello", verbose=True)
```
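
Since the widget above uses TinyLlama's Zephyr-style chat format (`<|system|>` / `<|user|>` / `<|assistant|>`), a chat prompt can also be built from the tokenizer's chat template instead of hand-writing the special tokens. The following is a minimal sketch; it assumes the tokenizer returned by `load` exposes the standard `apply_chat_template` method.

```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/TinyLlama-1.1B-Chat-v1.0-mlx")

# Build the prompt from chat messages using the model's chat template
# (assumes the wrapped tokenizer provides apply_chat_template).
messages = [
    {"role": "system", "content": "You are a chatbot who can help code!"},
    {
        "role": "user",
        "content": "Write me a function to calculate the first 10 digits of "
        "the fibonacci sequence in Python and print it out to the CLI.",
    },
]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```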