AMKCode committed · Commit 9656926 · verified · 1 Parent(s): 9481ba1

Upload README.md with huggingface_hub

Files changed (1): README.md (+70, −0)
 
---
base_model: Qwen/Qwen2.5-0.5B
language:
- en
library_name: mlc-llm
license: apache-2.0
license_link: https://huggingface.co/Qwen/Qwen2.5-0.5B/blob/main/LICENSE
pipeline_tag: text-generation
tags:
- mlc-ai
- MLC-Weight-Conversion
- mlc-llm
- web-llm
---

# AMKCode/Qwen2.5-0.5B-q4f16_1-MLC

This is the [Qwen2.5-0.5B](https://huggingface.co/Qwen/Qwen2.5-0.5B) model in MLC format `q4f16_1`.
The conversion was done using the [MLC-Weight-Conversion](https://huggingface.co/spaces/mlc-ai/MLC-Weight-Conversion) space.
The model can be used with the [MLC-LLM](https://github.com/mlc-ai/mlc-llm) and [WebLLM](https://github.com/mlc-ai/web-llm) projects.

## Example Usage

Here are some examples of using this model in MLC LLM.
Before running the examples, please install MLC LLM by following the [installation documentation](https://llm.mlc.ai/docs/install/mlc_llm.html#install-mlc-packages).
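
For quick reference, the prebuilt packages can typically be installed with pip. The command below is only a sketch for the CPU-only nightly wheels; the exact package names depend on your platform and GPU backend, so check the linked installation documentation for the variant that matches your system.

```bash
# Sketch: CPU-only nightly wheels. See the installation documentation
# for the CUDA / ROCm / Metal package names that match your hardware.
python -m pip install --pre -U -f https://mlc.ai/wheels mlc-llm-nightly-cpu mlc-ai-nightly-cpu
```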

### Chat

On the command line, run
```bash
mlc_llm chat HF://AMKCode/Qwen2.5-0.5B-q4f16_1-MLC
```

### REST Server

On the command line, run
```bash
mlc_llm serve HF://AMKCode/Qwen2.5-0.5B-q4f16_1-MLC
```
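
The server exposes an OpenAI-compatible REST API. The request below is a minimal sketch of a chat completion against it; the host, port, and endpoint shown are the documented defaults, so adjust them if you start the server with different options.

```bash
# Sketch: query the local OpenAI-compatible endpoint started by `mlc_llm serve`.
curl -X POST http://127.0.0.1:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "HF://AMKCode/Qwen2.5-0.5B-q4f16_1-MLC",
        "messages": [{"role": "user", "content": "What is the meaning of life?"}]
      }'
```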

### Python API

```python
from mlc_llm import MLCEngine

# Create engine
model = "HF://AMKCode/Qwen2.5-0.5B-q4f16_1-MLC"
engine = MLCEngine(model)

# Run chat completion with the OpenAI-compatible API (streaming).
for response in engine.chat.completions.create(
    messages=[{"role": "user", "content": "What is the meaning of life?"}],
    model=model,
    stream=True,
):
    for choice in response.choices:
        print(choice.delta.content, end="", flush=True)
print("\n")

engine.terminate()
```
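
If you do not need token-by-token streaming, the same call can be made with `stream=False`. The sketch below assumes the `engine` and `model` from the example above and that the returned object mirrors the OpenAI chat-completion response, as the streaming example suggests.

```python
# Sketch: non-streaming variant of the call above; the full reply is
# returned at once instead of as incremental deltas.
response = engine.chat.completions.create(
    messages=[{"role": "user", "content": "What is the meaning of life?"}],
    model=model,
    stream=False,
)
print(response.choices[0].message.content)
```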

## Documentation

For more information on the MLC LLM project, please visit our [documentation](https://llm.mlc.ai/docs/) and [GitHub repo](http://github.com/mlc-ai/mlc-llm).