pemywei committed
Commit b651219
1 Parent(s): 0446a4a

Update README.md

Files changed (1)
  1. README.md +24 -1
README.md CHANGED
@@ -7,4 +7,27 @@ license: apache-2.0
 This model is finetuned on [polyLM-13b](https://huggingface.co/DAMO-NLP-MT/polylm-13b) using [multialpaca](https://huggingface.co/datasets/DAMO-NLP-MT/multialpaca), a self-instruction dataset.

 # Demo
-[Open](https://modelscope.cn/studios/damo/demo-polylm-multialpaca-13b/summary)
+[Open](https://modelscope.cn/studios/damo/demo-polylm-multialpaca-13b/summary)
+
+# Bias, Risks, and Limitations
+
+The information in this section is copied from the model's [official model card](https://arxiv.org/pdf/2307.06018.pdf):
+
+> Our contributions are fully methodological: adding the support of multilingualism to LLM during training and SFT phases. It is unavoidable that PolyLM might exhibit several common deficiencies of language models, e.g. hallucination and toxicity. PolyLM should not be used directly in any application, without a prior assessment of safety and fairness concerns specific to the application.
+
+> This version activates the instruction-following capability of PolyLM through self-instruction, but currently the training instructions are relatively simple, and support for abilities such as multi-turn dialogue, context understanding, CoT, plugins, etc. is still limited. We are working on a new version.
+
+# Citation
+
+**BibTeX:**
+
+```bibtex
+@misc{wei2023polylm,
+      title={PolyLM: An Open Source Polyglot Large Language Model},
+      author={Xiangpeng Wei and Haoran Wei and Huan Lin and Tianhao Li and Pei Zhang and Xingzhang Ren and Mei Li and Yu Wan and Zhiwei Cao and Binbin Xie and Tianxiang Hu and Shangjie Li and Binyuan Hui and Bowen Yu and Dayiheng Liu and Baosong Yang and Fei Huang and Jun Xie},
+      year={2023},
+      eprint={2307.06018},
+      archivePrefix={arXiv},
+      primaryClass={cs.CL}
+}
+```
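
For reference, a minimal sketch of loading this finetuned checkpoint with Hugging Face Transformers (not part of the committed README). The repo id `DAMO-NLP-MT/polylm-multialpaca-13b` and the "instruction followed by a blank line" prompt format are assumptions carried over from the base PolyLM documentation, and the decoding parameters are illustrative:

```python
# Minimal usage sketch for the multialpaca-finetuned PolyLM checkpoint.
# The repo id and prompt template below are assumptions, not pinned down
# by this commit; check the model card before relying on them.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "DAMO-NLP-MT/polylm-multialpaca-13b"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=False)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# multialpaca instructions are assumed to be passed as plain text
# terminated by a blank line, matching the base PolyLM examples.
prompt = "Translate this sentence from English to French: How are you?\n\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```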