Quantization made by Richard Erkhov.

[Github](https://github.com/RichardErkhov)

[Discord](https://discord.gg/pvy7H8DZMG)

[Request more models](https://github.com/RichardErkhov/quant_request)


Eurus-7b-kto - GGUF
- Model creator: https://huggingface.co/openbmb/
- Original model: https://huggingface.co/openbmb/Eurus-7b-kto/


| Name | Quant method | Size |
| ---- | ---- | ---- |
| [Eurus-7b-kto.Q2_K.gguf](https://huggingface.co/RichardErkhov/openbmb_-_Eurus-7b-kto-gguf/blob/main/Eurus-7b-kto.Q2_K.gguf) | Q2_K | 2.53GB |
| [Eurus-7b-kto.IQ3_XS.gguf](https://huggingface.co/RichardErkhov/openbmb_-_Eurus-7b-kto-gguf/blob/main/Eurus-7b-kto.IQ3_XS.gguf) | IQ3_XS | 2.81GB |
| [Eurus-7b-kto.IQ3_S.gguf](https://huggingface.co/RichardErkhov/openbmb_-_Eurus-7b-kto-gguf/blob/main/Eurus-7b-kto.IQ3_S.gguf) | IQ3_S | 2.96GB |
| [Eurus-7b-kto.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/openbmb_-_Eurus-7b-kto-gguf/blob/main/Eurus-7b-kto.Q3_K_S.gguf) | Q3_K_S | 2.95GB |
| [Eurus-7b-kto.IQ3_M.gguf](https://huggingface.co/RichardErkhov/openbmb_-_Eurus-7b-kto-gguf/blob/main/Eurus-7b-kto.IQ3_M.gguf) | IQ3_M | 3.06GB |
| [Eurus-7b-kto.Q3_K.gguf](https://huggingface.co/RichardErkhov/openbmb_-_Eurus-7b-kto-gguf/blob/main/Eurus-7b-kto.Q3_K.gguf) | Q3_K | 3.28GB |
| [Eurus-7b-kto.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/openbmb_-_Eurus-7b-kto-gguf/blob/main/Eurus-7b-kto.Q3_K_M.gguf) | Q3_K_M | 3.28GB |
| [Eurus-7b-kto.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/openbmb_-_Eurus-7b-kto-gguf/blob/main/Eurus-7b-kto.Q3_K_L.gguf) | Q3_K_L | 3.56GB |
| [Eurus-7b-kto.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/openbmb_-_Eurus-7b-kto-gguf/blob/main/Eurus-7b-kto.IQ4_XS.gguf) | IQ4_XS | 3.67GB |
| [Eurus-7b-kto.Q4_0.gguf](https://huggingface.co/RichardErkhov/openbmb_-_Eurus-7b-kto-gguf/blob/main/Eurus-7b-kto.Q4_0.gguf) | Q4_0 | 3.83GB |
| [Eurus-7b-kto.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/openbmb_-_Eurus-7b-kto-gguf/blob/main/Eurus-7b-kto.IQ4_NL.gguf) | IQ4_NL | 3.87GB |
| [Eurus-7b-kto.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/openbmb_-_Eurus-7b-kto-gguf/blob/main/Eurus-7b-kto.Q4_K_S.gguf) | Q4_K_S | 3.86GB |
| [Eurus-7b-kto.Q4_K.gguf](https://huggingface.co/RichardErkhov/openbmb_-_Eurus-7b-kto-gguf/blob/main/Eurus-7b-kto.Q4_K.gguf) | Q4_K | 4.07GB |
| [Eurus-7b-kto.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/openbmb_-_Eurus-7b-kto-gguf/blob/main/Eurus-7b-kto.Q4_K_M.gguf) | Q4_K_M | 4.07GB |
| [Eurus-7b-kto.Q4_1.gguf](https://huggingface.co/RichardErkhov/openbmb_-_Eurus-7b-kto-gguf/blob/main/Eurus-7b-kto.Q4_1.gguf) | Q4_1 | 4.24GB |
| [Eurus-7b-kto.Q5_0.gguf](https://huggingface.co/RichardErkhov/openbmb_-_Eurus-7b-kto-gguf/blob/main/Eurus-7b-kto.Q5_0.gguf) | Q5_0 | 4.65GB |
| [Eurus-7b-kto.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/openbmb_-_Eurus-7b-kto-gguf/blob/main/Eurus-7b-kto.Q5_K_S.gguf) | Q5_K_S | 4.65GB |
| [Eurus-7b-kto.Q5_K.gguf](https://huggingface.co/RichardErkhov/openbmb_-_Eurus-7b-kto-gguf/blob/main/Eurus-7b-kto.Q5_K.gguf) | Q5_K | 4.78GB |
| [Eurus-7b-kto.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/openbmb_-_Eurus-7b-kto-gguf/blob/main/Eurus-7b-kto.Q5_K_M.gguf) | Q5_K_M | 4.78GB |
| [Eurus-7b-kto.Q5_1.gguf](https://huggingface.co/RichardErkhov/openbmb_-_Eurus-7b-kto-gguf/blob/main/Eurus-7b-kto.Q5_1.gguf) | Q5_1 | 5.07GB |
| [Eurus-7b-kto.Q6_K.gguf](https://huggingface.co/RichardErkhov/openbmb_-_Eurus-7b-kto-gguf/blob/main/Eurus-7b-kto.Q6_K.gguf) | Q6_K | 5.53GB |
| [Eurus-7b-kto.Q8_0.gguf](https://huggingface.co/RichardErkhov/openbmb_-_Eurus-7b-kto-gguf/blob/main/Eurus-7b-kto.Q8_0.gguf) | Q8_0 | 7.17GB |
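
The files above can be used with any GGUF-compatible runtime (llama.cpp, llama-cpp-python, LM Studio, and similar). The snippet below is a minimal, illustrative sketch, not part of the original card: it assumes `huggingface_hub` and `llama-cpp-python` are installed and picks the Q4_K_M file from the table; the context size and sampling settings are likewise assumptions.

```python
# Minimal sketch (not from the original card): fetch one quant from this repo
# and run it with llama-cpp-python. Repo id and filename are taken from the
# table above; n_ctx and the sampling settings are illustrative assumptions.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

model_path = hf_hub_download(
    repo_id="RichardErkhov/openbmb_-_Eurus-7b-kto-gguf",
    filename="Eurus-7b-kto.Q4_K_M.gguf",
)

llm = Llama(model_path=model_path, n_ctx=4096)

# Eurus expects [INST] ... [/INST] prompts; see the templates further below.
prompt = (
    "[INST] Write Python code to solve the task:\n"
    "Print the first 10 Fibonacci numbers. [/INST]"
)
result = llm(prompt, max_tokens=256, temperature=0.2)
print(result["choices"][0]["text"])
```

Any other file from the table can be swapped in via `filename`; smaller (lower-bit) quants reduce memory use at some cost in output quality.
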


Original model description:
---
license: apache-2.0
datasets:
- openbmb/UltraFeedback
- openbmb/UltraInteract_pair
tags:
- reasoning
- preference_learning
- kto
pipeline_tag: text-generation
---

<div align="center">

<img src="https://huggingface.co/openbmb/Eurus-7b-sft/resolve/main/figures/Eurus-logo.png" width="200px">

**Eurus: A suite of open-source LLMs optimized for reasoning**

<p align="center">
  <a href="#introduction"> Introduction</a> •
  <a href="#evaluation">Evaluation</a>
</p>


</div>

# Links

- 📜 [Paper](https://arxiv.org/abs/2404.02078)
- 🤗 [Eurus Collection](https://huggingface.co/collections/openbmb/eurus-660bc40bec5376b3adc9d1c5)
- 🤗 UltraInteract
  - [SFT](https://huggingface.co/datasets/openbmb/UltraInteract_sft)
  - [Preference Learning](https://huggingface.co/datasets/openbmb/UltraInteract_pair)
- [GitHub Repo](https://github.com/OpenBMB/Eurus)


# Introduction

Eurus-7B-KTO is [KTO](https://arxiv.org/abs/2402.01306) fine-tuned from [Eurus-7B-SFT](https://huggingface.co/openbmb/Eurus-7b-sft) on all multi-turn trajectory pairs in [UltraInteract](https://huggingface.co/openbmb/UltraInteract) and all pairs in [UltraFeedback](https://huggingface.co/openbmb/UltraFeedback).

It achieves the best overall performance among open-source models of similar sizes and even outperforms specialized models in corresponding domains in many cases. Notably, Eurus-7B-KTO outperforms baselines that are 5× larger.

## Usage

We apply tailored prompts for coding and math, consistent with UltraInteract data formats:

**Coding**

```
[INST] Write Python code to solve the task:
{Instruction} [/INST]
```

**Math-CoT**

```
[INST] Solve the following math problem step-by-step.
Simplify your answer as much as possible. Present your final answer as \\boxed{Your Answer}.
{Instruction} [/INST]
```

**Math-PoT**

```
[INST] Tool available:
[1] Python interpreter
When you send a message containing Python code to python, it will be executed in a stateful Jupyter notebook environment.
Solve the following math problem step-by-step.
Simplify your answer as much as possible.
{Instruction} [/INST]
```

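As a minimal sketch of how one of these templates might be applied with the full-precision model via `transformers` (the model id comes from the links above; the helper function, dtype, and decoding settings are illustrative assumptions, not from the original card):

```python
# Minimal sketch (illustrative, not from the original card): wrap a question in
# the Math-CoT template above and generate with transformers.
# Assumptions: the full-precision openbmb/Eurus-7b-kto checkpoint, fp16 weights,
# and greedy decoding with a 512-token budget.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "openbmb/Eurus-7b-kto"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID, torch_dtype=torch.float16, device_map="auto"
)

def math_cot_prompt(instruction: str) -> str:
    # Builds the Math-CoT prompt; "\\\\" keeps the literal \\boxed{Your Answer}
    # text exactly as it appears in the template above.
    return (
        "[INST] Solve the following math problem step-by-step.\n"
        "Simplify your answer as much as possible. "
        "Present your final answer as \\\\boxed{Your Answer}.\n"
        f"{instruction} [/INST]"
    )

prompt = math_cot_prompt("What is the sum of the first 100 positive integers?")
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512, do_sample=False)
# Strip the prompt tokens and print only the newly generated answer.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

The same pattern applies to the Coding and Math-PoT templates; only the wrapper text changes.
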
## Evaluation
- Eurus, both the 7B and 70B variants, achieve the best overall performance among open-source models of similar sizes. Eurus even outperforms specialized models in corresponding domains in many cases. Notably, Eurus-7B outperforms baselines that are 5× larger, and Eurus-70B achieves better performance than GPT-3.5 Turbo.
- Preference learning with UltraInteract can further improve performance, especially in math and multi-turn ability.
<img src="./figures/main_exp.png" alt="stats" style="zoom: 40%;" />


## Citation
```
@misc{yuan2024advancing,
      title={Advancing LLM Reasoning Generalists with Preference Trees},
      author={Lifan Yuan and Ganqu Cui and Hanbin Wang and Ning Ding and Xingyao Wang and Jia Deng and Boji Shan and Huimin Chen and Ruobing Xie and Yankai Lin and Zhenghao Liu and Bowen Zhou and Hao Peng and Zhiyuan Liu and Maosong Sun},
      year={2024},
      eprint={2404.02078},
      archivePrefix={arXiv},
      primaryClass={cs.AI}
}
```