---
license: apache-2.0
datasets:
- zd21/SciInstruct
language:
- en
---

# SciGLM: Training Scientific Language Models with Self-Reflective Instruction Annotation and Tuning

<p align="center">
📃 <a href="https://arxiv.org/abs/2401.07950" target="_blank">[SciGLM]</a> <a href="https://github.com/THUDM/SciGLM" target="_blank">[GitHub]</a> <br>
</p>

**SciGLM** is a suite of scientific language models capable of college-level scientific reasoning. Central to our approach is a novel self-reflective instruction annotation framework that addresses the data scarcity challenge in the science domain: it leverages existing LLMs to generate step-by-step reasoning for unlabelled scientific questions, followed by a self-reflective critic-and-revise process. Applying this framework, we curated SciInstruct, a diverse and high-quality dataset covering physics, chemistry, math, and formal proofs.
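
To make the annotation pipeline concrete, here is a minimal sketch of the critic-and-revise loop. The function names and prompt wording are illustrative assumptions rather than the exact ones used to build SciInstruct; `llm` stands for any text-completion call:

```python
from typing import Callable, Optional

def critic_and_revise(
    llm: Callable[[str], str],
    question: str,
    max_rounds: int = 3,
) -> Optional[str]:
    """Annotate one unlabelled question with a step-by-step solution."""
    # Step 1: ask an existing LLM to solve the question step by step.
    solution = llm(f"Solve the following problem step by step:\n{question}")
    for _ in range(max_rounds):
        # Step 2: the same LLM critiques its own reasoning.
        verdict = llm(
            f"Problem:\n{question}\n\nSolution:\n{solution}\n\n"
            "Is this solution correct? Answer CORRECT, or list the errors."
        )
        if verdict.strip().upper().startswith("CORRECT"):
            return solution  # keep as a (question, solution) training pair
        # Step 3: revise the solution according to the critique.
        solution = llm(
            f"Problem:\n{question}\n\nFlawed solution:\n{solution}\n\n"
            f"Critique:\n{verdict}\n\nRewrite the solution, fixing the errors."
        )
    return None  # drop questions that never pass the critic
```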

## **SciInstruct**

We construct SciInstruct as follows:

| Subject | Math | Physics & Chemistry | Formal Proofs (Lean) | Total |
| --- | --- | --- | --- | --- |
| # Samples | 89,934 | 123,869 | 40,248 | 254,051 |

We release our data and model for public use. If you wish to use SciInstruct or SciGLM, you can download them from the following links.

Download data:
[[Google Drive](https://drive.google.com/file/d/1UlvMEau9659BoBxbMG6sk-oKaiIIO-hJ/view?usp=sharing)]
[[Tsinghua Cloud](https://cloud.tsinghua.edu.cn/d/da691b9466544d55be8e/)]

Download model:
[[Hugging Face](https://huggingface.co/zd21/SciGLM-6B)]
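
Since the dataset is listed on the Hub (see the `datasets` field in the metadata above), it may also be loadable programmatically. Below is a hedged sketch with the `datasets` library; the split and column names depend on the repository's layout, so inspect the returned object before use:

```python
# Hypothetical: load SciInstruct directly from the Hugging Face Hub.
from datasets import load_dataset

sci_instruct = load_dataset("zd21/SciInstruct")
print(sci_instruct)  # shows the available splits and columns
```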

## **Training & Inference**

### **Fine-tuning**

You can use the SciGLM model through Hugging Face's Transformers library. To fine-tune it, first clone the repository and install its dependencies:

```bash
git clone https://github.com/THUDM/SciGLM.git
cd SciGLM
pip install -r requirements.txt
```

To train the 6B model, run:

```bash
bash /path/training/finetune.sh
```

### Inference

```bash
cd /path/to/inference
python cli_demo.py
```
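
Alternatively, the model can be queried directly from Python. The snippet below is a minimal sketch that assumes the checkpoint exposes the ChatGLM-style `chat` method via `trust_remote_code`; see `cli_demo.py` in the repository for the exact usage:

```python
# Sketch: query SciGLM-6B through Transformers (assumes a CUDA GPU).
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("zd21/SciGLM-6B", trust_remote_code=True)
model = AutoModel.from_pretrained("zd21/SciGLM-6B", trust_remote_code=True).half().cuda()
model = model.eval()

question = "A ball is thrown straight up at 12 m/s. How long until it returns to its launch height?"
response, history = model.chat(tokenizer, question, history=[])
print(response)
```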

## **Citation**

If you find our work helpful, please kindly cite our paper:

```bibtex
@misc{zhang2024sciglm,
      title={SciGLM: Training Scientific Language Models with Self-Reflective Instruction Annotation and Tuning},
      author={Dan Zhang and Ziniu Hu and Sining Zhoubian and Zhengxiao Du and Kaiyu Yang and Zihan Wang and Yisong Yue and Yuxiao Dong and Jie Tang},
      year={2024},
      eprint={2401.07950},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```