renillhuang committed on
Commit 996a317 • 1 Parent(s): 8ccede2

Add github link

Files changed (1)
  1. README.md +10 -20
README.md CHANGED
@@ -1,21 +1,12 @@
- ---
- license: other
- license_name: orion
- license_link: https://huggingface.co/OrionStarAI/Orion-14B-LongChat/blob/main/ModelsCommunityLicenseAgreement
- widget:
- - text: "Hi!"
-   output:
-     text: "Hello! How can I help you today?"
- pipeline_tag: text-generation
- ---
-
  <!-- markdownlint-disable first-line-h1 -->
  <!-- markdownlint-disable html -->
- ![](./assets/imgs/assets_imgs_orion_start.PNG)
+ <div align="center">
+ <img src="./assets/imgs/orion_start.PNG" alt="logo" width="50%" />
+ </div>

  <div align="center">
  <h1>
- Orion-14B-LongChat
+ Orion-14B
  </h1>
  </div>

@@ -25,8 +16,8 @@ pipeline_tag: text-generation
  <h4 align="center">
  <p>
  <b>🌐English</b> |
- <a href="https://huggingface.co/OrionStarAI/Orion-14B-LongChat/blob/main/README_cn.md">🇨🇳中文</a><br><br>
- 🤗 <a href="https://huggingface.co/OrionStarAI" target="_blank">HuggingFace Mainpage</a> | 🤖 <a href="https://modelscope.cn/organization/OrionStarAI" target="_blank">ModelScope Mainpage</a><br>🎬 <a href="https://huggingface.co/spaces/OrionStarAI/Orion-14B-App-Demo" target="_blank">HuggingFace Demo</a> | 🎫 <a href="https://modelscope.cn/studios/OrionStarAI/Orion-14B-App-Demo/summary" target="_blank">ModelScope Demo</a><br>📖 <a href="https://github.com/OrionStarAI/Orion/blob/master/doc/Orion14B_v3.pdf" target="_blank">Tech Report</a>
+ <a href="./README_zh.MD">🇨🇳中文</a><br><br>
+ 🤗 <a href="https://huggingface.co/OrionStarAI" target="_blank">HuggingFace Mainpage</a> | 🤖 <a href="https://modelscope.cn/organization/OrionStarAI" target="_blank">ModelScope Mainpage</a><br>🎬 <a href="https://huggingface.co/spaces/OrionStarAI/Orion-14B-App-Demo" target="_blank">HuggingFace Demo</a> | 🎫 <a href="https://modelscope.cn/studios/OrionStarAI/Orion-14B-App-Demo/summary" target="_blank">ModelScope Demo</a><br>😺 <a href="https://github.com/OrionStarAI/Orion" target="_blank">GitHub</a><br>📖 <a href="https://github.com/OrionStarAI/Orion/blob/master/doc/Orion14B_v3.pdf" target="_blank">Tech Report</a>
  <p>
  </h4>

@@ -40,12 +31,12 @@ pipeline_tag: text-generation
  - [🔗 Model Download](#model-download)
  - [🔖 Model Benchmark](#model-benchmark)
  - [📊 Model Inference](#model-inference)
- - [🥇 Company Introduction](#company-introduction)
  - [📜 Declarations & License](#declarations-license)
+ - [🥇 Company Introduction](#company-introduction)

  # 1. Model Introduction

- - Orion-14B-LongChat is based on Orion-14B and further trained on a longer-text corpus. It handles contexts of over 200K tokens and performs well at that length.
+ - Orion-14B series models are open-source multilingual large language models trained from scratch by OrionStarAI. The base model is trained on a 2.5T multilingual corpus covering Chinese, English, Japanese, Korean, and more, and it exhibits superior performance in these languages. For details, please refer to the [tech report](https://github.com/OrionStarAI/Orion/blob/master/doc/Orion14B_v3.pdf).

  - The Orion-14B series models exhibit the following features:
    - Among models at the 20B-parameter scale, the Orion-14B-Base model shows outstanding performance in comprehensive evaluations.
@@ -53,7 +44,6 @@ pipeline_tag: text-generation
    - The fine-tuned models demonstrate strong adaptability, excelling in human-annotated blind tests.
    - The long-chat version supports extremely long texts, performing exceptionally well at a token length of 200k and supporting up to a maximum of 320k.
    - The quantized versions reduce model size by 70% and improve inference speed by 30%, with a performance loss of less than 1%.
-
  <table style="border-collapse: collapse; width: 100%;">
  <tr>
  <td style="border: none; padding: 10px; box-sizing: border-box;">
@@ -336,10 +326,10 @@ Truly Useful Robots", OrionStar empowers more people through AI technology.

  **The core strengths of OrionStar lie in its end-to-end AI application capabilities,** including big data preprocessing, large model pretraining, fine-tuning, prompt engineering, agents, etc. With comprehensive end-to-end model training capabilities, including systematic data processing workflows and parallel training across hundreds of GPUs, these capabilities have been successfully applied in industry scenarios such as government affairs, cloud services, international e-commerce, and fast-moving consumer goods.

- Companies with demands for deploying large-scale model applications are welcome to contact us.
+ Companies with demands for deploying large-scale model applications are welcome to contact us.<br>
  **Enquiry Hotline: 400-898-7779**<br>
  **E-mail: ai@orionstar.com**

  <div align="center">
- <img src="./assets/imgs/assets_imgs_wechat_group.jpg" alt="wechat" width="40%" />
+ <img src="./assets/imgs/wechat_group.jpg" alt="wechat" width="40%" />
  </div>
 
 
 
 
 
 
 
 
 
 
 
 
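Below is a minimal loading sketch for the Orion-14B models that the README above describes. It is an illustration, not code taken from this commit: the `OrionStarAI/Orion-14B` repo id is inferred from the README's links, and `trust_remote_code=True`, the dtype, and the generation settings are assumptions; swap in a chat or long-chat variant as needed.

```python
# Minimal sketch (assumptions noted above), using the standard Hugging Face
# transformers API to load a 14B causal LM and run a short generation.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "OrionStarAI/Orion-14B"  # assumed repo id from the README links

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # half precision so the 14B weights fit on fewer GPUs
    device_map="auto",            # spread layers across available devices
    trust_remote_code=True,       # assumed: the model ships custom modeling code on the Hub
)

prompt = "Hello! Please introduce yourself."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```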