---
license: apache-2.0
language:
- en
pipeline_tag: text-generation
library_name: transformers
tags:
- llm
- code
---

# CrystalChat

<center><img src="crystalcoder_logo.jpg" alt="crystal coder logo" width="300"/></center>

We present CrystalChat, an instruction-following model fine-tuned from [LLM360/CrystalCoder](https://huggingface.co/LLM360/CrystalCoder).

| Model | Trained Tokens | ARC | HellaSwag | MMLU (5-shot) | TruthfulQA | Language Avg. | HumanEval (pass@1) | MBPP (pass@1) | Coding Avg. | Avg. of Avg. |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Mistral 7B | - | 59.98 | 83.31 | 64.16 | 42.15 | 62.40 | 29.12 | 38.78 | 33.95 | 48.68 |
| **CrystalChat 7B** | 1.4T | 51.71 | 76.12 | 53.22 | 47.29 | 57.08 | 34.12 | 39.11 | 36.62 | 46.85 |
| CrystalCoder 7B | 1.4T | 47.01 | 71.97 | 48.78 | 35.91 | 50.92 | 28.38 | 36.38 | 32.38 | 41.65 |
| CodeLlama 7B | 2.5T | 39.93 | 60.80 | 31.12 | 37.82 | 42.42 | 33.50 | 41.40 | 37.45 | 39.94 |
| OpenLLaMA v2 7B | 1T | 43.60 | 72.20 | 41.29 | 35.54 | 48.18 | 15.32 | 12.69 | 28.01 | 38.10 |
| LLaMA 2 7B | 2T | 53.07 | 77.74 | 43.80 | 38.98 | 53.39 | 13.05 | 20.09 | 16.57 | 34.98 |
| StarCoder-15B | 1.03T | - | - | - | - | - | 33.63 | 43.28 | 38.46 | - |

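For reference, the aggregate columns appear to be plain means: Coding Avg. averages HumanEval and MBPP, Language Avg. averages the four language benchmarks, and Avg. of Avg. averages those two. The short sketch below checks this for the CrystalChat 7B row; how the averages were actually computed is inferred from the numbers, not stated explicitly.

```python
# CrystalChat 7B per-task scores from the table above
language = [51.71, 76.12, 53.22, 47.29]  # ARC, HellaSwag, MMLU (5-shot), TruthfulQA
coding = [34.12, 39.11]                  # HumanEval (pass@1), MBPP (pass@1)

language_avg = sum(language) / len(language)
coding_avg = sum(coding) / len(coding)
avg_of_avg = (language_avg + coding_avg) / 2

# Matches the table within rounding: 57.08, 36.62, 46.85
print(language_avg, coding_avg, avg_of_avg)
```
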
## Model Description

- **Model type:** Language model with the same architecture as LLaMA-7B
- **Language(s) (NLP):** English
- **License:** Apache 2.0
- **Resources for more information:**
  - [Training Code](https://github.com/LLM360/crystalcoder-train)
  - [Data Preparation](https://github.com/LLM360/crystalcoder-data-prep)
  - [Metrics](https://github.com/LLM360/Analysis360)
  - [Fully processed CrystalCoder pretraining data](https://huggingface.co/datasets/LLM360/CrystalCoderDatasets)

# Loading CrystalChat

```python
import torch
from transformers import LlamaTokenizer, LlamaForCausalLM

# Load the tokenizer and model from the Hugging Face Hub
tokenizer = LlamaTokenizer.from_pretrained("LLM360/CrystalChat", trust_remote_code=True)
model = LlamaForCausalLM.from_pretrained("LLM360/CrystalChat", trust_remote_code=True)

prompt = 'int add(int x, int y) {'

# Tokenize the prompt and sample a completion
input_ids = tokenizer(prompt, return_tensors="pt").input_ids
gen_tokens = model.generate(input_ids, do_sample=True, max_length=400)

print("-" * 20 + "Output for model" + "-" * 20)
print(tokenizer.batch_decode(gen_tokens)[0])
```
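
For longer generations, loading the weights in half precision on a GPU is usually faster. The snippet below is a minimal sketch, assuming a CUDA device is available and that the checkpoint loads cleanly in float16; the sampling parameters are illustrative rather than recommended defaults.

```python
import torch
from transformers import LlamaTokenizer, LlamaForCausalLM

# Assumes a CUDA device and that the checkpoint can be loaded in float16;
# fall back to the CPU snippet above otherwise.
tokenizer = LlamaTokenizer.from_pretrained("LLM360/CrystalChat", trust_remote_code=True)
model = LlamaForCausalLM.from_pretrained(
    "LLM360/CrystalChat",
    torch_dtype=torch.float16,
    trust_remote_code=True,
).to("cuda")

prompt = "def fibonacci(n):"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids.to("cuda")

# Illustrative sampling settings; tune temperature/top_p for your use case
gen_tokens = model.generate(
    input_ids,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
    max_new_tokens=200,
)
print(tokenizer.batch_decode(gen_tokens, skip_special_tokens=True)[0])
```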

# Citation

**BibTeX:**

```bibtex
@misc{liu2023llm360,
      title={LLM360: Towards Fully Transparent Open-Source LLMs},
      author={Zhengzhong Liu and Aurick Qiao and Willie Neiswanger and Hongyi Wang and Bowen Tan and Tianhua Tao and Junbo Li and Yuqi Wang and Suqi Sun and Omkar Pangarkar and Richard Fan and Yi Gu and Victor Miller and Yonghao Zhuang and Guowei He and Haonan Li and Fajri Koto and Liping Tang and Nikhil Ranjan and Zhiqiang Shen and Xuguang Ren and Roberto Iriondo and Cun Mu and Zhiting Hu and Mark Schulze and Preslav Nakov and Tim Baldwin and Eric P. Xing},
      year={2023},
      eprint={2312.06550},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```