MLX · English · llama
mc0ps committed
Commit 62ef0b5
Parent: 11babb2

Update README.md

Files changed (1)
  1. README.md (+11, -5)
README.md CHANGED
@@ -24,13 +24,19 @@ widget:
  '
  ---

- # TinyLlama-1.1B-Chat-v1.0-mlx
+ # mlx-community/TinyLlama-1.1B-Chat-v1.0-mlx
  This model was converted to MLX format from [`TinyLlama/TinyLlama-1.1B-Chat-v1.0`]().
  Refer to the [original model card](https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v1.0) for more details on the model.
  ## Use with mlx
+
  ```bash
- pip install mlx
- git clone https://github.com/ml-explore/mlx-examples.git
- cd mlx-examples/llms/hf_llm
- python generate.py --model mlx-community/TinyLlama-1.1B-Chat-v1.0-mlx --prompt "My name is"
+ pip install mlx-lm
  ```
+
+ ```python
+ from mlx_lm import load, generate
+
+ model, tokenizer = load("mlx-community/TinyLlama-1.1B-Chat-v1.0-mlx")
+ response = generate(model, tokenizer, prompt="hello", verbose=True)
+ ```
+
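The updated snippet calls `mlx_lm.load` and `generate` with a bare string prompt. Since TinyLlama-1.1B-Chat-v1.0 is an instruction-tuned chat model, a slightly fuller sketch is shown below; it assumes only the `load`/`generate` API already used in the diff plus the standard Hugging Face `apply_chat_template` tokenizer method, and the example message is illustrative rather than taken from the README.

```python
from mlx_lm import load, generate

# Load the MLX-converted weights and tokenizer from the repo named in the diff.
model, tokenizer = load("mlx-community/TinyLlama-1.1B-Chat-v1.0-mlx")

# TinyLlama-Chat is instruction-tuned, so formatting the request with the
# model's chat template usually gives better results than a raw string prompt.
messages = [{"role": "user", "content": "Write a haiku about the ocean."}]  # illustrative prompt
prompt = tokenizer.apply_chat_template(messages, add_generation_prompt=True, tokenize=False)

# verbose=True streams the generated tokens to stdout as they are produced.
response = generate(model, tokenizer, prompt=prompt, verbose=True)
print(response)
```

For a shell one-liner closer to the removed mlx-examples flow, the `mlx-lm` package also ships a command-line generator (roughly `python -m mlx_lm.generate --model mlx-community/TinyLlama-1.1B-Chat-v1.0-mlx --prompt "My name is"`), though the Python API above is what the updated README documents.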