zhujiangang committed
Commit bb76afb · verified · 1 Parent(s): 64ed0bb

Update README.md

Files changed (1)
  1. README.md +91 -3
README.md CHANGED

---
license: mit
---
# Ling

<p align="center">
    <img src="https://huggingface.co/inclusionAI/Ling-lite/resolve/main/ant-bailing.png" width="100"/>
</p>

<p align="center">
    🤗 <a href="https://huggingface.co/inclusionAI">Hugging Face</a>
</p>

## Introduction

Ling is a MoE LLM provided and open-sourced by InclusionAI. We introduce two sizes: Ling-Lite and Ling-Plus. Ling-Lite has 16.8 billion total parameters with 2.75 billion activated parameters, while Ling-Plus has 290 billion total parameters with 28.8 billion activated parameters. Both models demonstrate impressive performance compared to existing models in the industry.

Their MoE structure makes the models easy to scale up or down and to adapt to different workloads, so they can be applied to a wide range of tasks, from natural language processing to complex problem solving. Furthermore, the open-source nature of Ling promotes collaboration and innovation within the AI community, fostering a diverse range of use cases and enhancements.

As more developers and researchers engage with the platform, we can expect rapid advancements and improvements, leading to even more sophisticated applications. This collaborative approach accelerates development and ensures that the models remain at the forefront of technology, addressing emerging challenges in various fields.

## Model Downloads

The table below lists the available models so you can choose the one that fits your use case. If you are located in mainland China, we also provide the models on ModelScope.cn to speed up the download process.

<div align="center">

| **Model**      | **#Total Params** | **#Activated Params** | **Context Length** | **Download** |
| :------------: | :---------------: | :-------------------: | :----------------: | :----------: |
| Ling-lite-base | 16.8B             | 2.75B                 | 64K                | [🤗 HuggingFace](https://huggingface.co/inclusionAI/Ling-lite-base) |
| Ling-lite      | 16.8B             | 2.75B                 | 64K                | [🤗 HuggingFace](https://huggingface.co/inclusionAI/Ling-lite) |

</div>
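
If you prefer to fetch the weights programmatically, the snippet below is a minimal sketch using the `huggingface_hub` library; the repo id comes from the table above, and the target directory is an arbitrary example.

```python
# Minimal sketch: download a Ling checkpoint with huggingface_hub.
# The repo id is taken from the table above; local_dir is an arbitrary example path.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="inclusionAI/Ling-lite",   # or "inclusionAI/Ling-lite-base"
    local_dir="./Ling-lite",           # where to place the files (example path)
)
print(f"Model files downloaded to {local_dir}")
```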

## Evaluation

Detailed evaluation results are reported in our technical report [TBD].

## Quickstart

### 🤗 Hugging Face Transformers

Here is a code snippet showing how to use the chat model with `transformers`:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "inclusionAI/Ling-lite"

model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype="auto",
    device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(model_name)

prompt = "Give me a short introduction to large language models."
messages = [
    {"role": "system", "content": "You are Ling, an assistant created by inclusionAI"},
    {"role": "user", "content": prompt}
]
text = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True
)
model_inputs = tokenizer([text], return_tensors="pt").to(model.device)

generated_ids = model.generate(
    **model_inputs,
    max_new_tokens=512
)
generated_ids = [
    output_ids[len(input_ids):] for input_ids, output_ids in zip(model_inputs.input_ids, generated_ids)
]

response = tokenizer.batch_decode(generated_ids, skip_special_tokens=True)[0]
```
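
After decoding, `response` holds the model's reply to the prompt; for example, `print(response)` will display the generated text.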

## Deployment

### vLLM
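
As a starting point, here is a minimal sketch of offline inference with vLLM's Python API. It is an assumption-laden example: it presumes a vLLM build that supports this model architecture, reuses the chat template shipped with the tokenizer, and mirrors the sampling settings of the Transformers example above; adjust dtype, parallelism, and sampling parameters to your environment.

```python
# Minimal sketch: offline inference with vLLM (assumes vLLM supports this architecture).
from transformers import AutoTokenizer
from vllm import LLM, SamplingParams

model_name = "inclusionAI/Ling-lite"

# Build the chat prompt with the tokenizer's chat template.
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
messages = [
    {"role": "system", "content": "You are Ling, an assistant created by inclusionAI"},
    {"role": "user", "content": "Give me a short introduction to large language models."}
]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

# Load the model into vLLM and generate.
llm = LLM(model=model_name, trust_remote_code=True)
outputs = llm.generate([prompt], SamplingParams(temperature=0.7, max_tokens=512))
print(outputs[0].outputs[0].text)
```

Depending on your vLLM version, you can also expose an OpenAI-compatible API server, e.g. `python -m vllm.entrypoints.openai.api_server --model inclusionAI/Ling-lite --trust-remote-code`, which may be more convenient for deployment.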

### MindIE

## License

This code repository is licensed under [the MIT License](https://huggingface.co/inclusionAI/Ling-lite/blob/main/LICENCE).

## Citation

[TBD]