Update README.md
README.md CHANGED

@@ -17,7 +17,9 @@ base_model: codellama/CodeLlama-7b-hf

Tulu is a series of language models that are trained to act as helpful assistants.
Codetulu 2 7B is a fine-tuned version of Codellama that was trained on a mix of publicly available, synthetic and human datasets.
-
+
+ For more details, read the paper: [Camels in a Changing Climate: Enhancing LM Adaptation with Tulu 2
+ ](https://arxiv.org/abs/2311.10702).


## Model description

@@ -106,12 +108,13 @@ The following hyperparameters were used during finetuning:

If you find Tulu 2 is useful in your work, please cite it with:

```
- @misc{
-
-
-
-
-
+ @misc{ivison2023camels,
+   title={Camels in a Changing Climate: Enhancing LM Adaptation with Tulu 2},
+   author={Hamish Ivison and Yizhong Wang and Valentina Pyatkin and Nathan Lambert and Matthew Peters and Pradeep Dasigi and Joel Jang and David Wadden and Noah A. Smith and Iz Beltagy and Hannaneh Hajishirzi},
+   year={2023},
+   eprint={2311.10702},
+   archivePrefix={arXiv},
+   primaryClass={cs.CL}
}
```
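For readers who want to try the checkpoint this card describes, here is a minimal loading sketch. It is not part of the diff above: the repo id `allenai/codetulu-2-7b` and the Tulu-style `<|user|>` / `<|assistant|>` prompt format are assumptions borrowed from the Tulu 2 family and should be verified against the model card itself.

```python
# Minimal sketch, not taken from the card: repo id and prompt format are assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/codetulu-2-7b"  # assumed repo id; verify on the Hub

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # load in the checkpoint's native precision
    device_map="auto",    # requires `accelerate`; places the 7B weights on available devices
)

# Tulu-style chat format, assumed from the Tulu 2 family of models.
prompt = "<|user|>\nWrite a Python function that checks whether a string is a palindrome.\n<|assistant|>\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)

# Strip the prompt tokens so only the model's reply is printed.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```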