hamishivi committed on
Commit
049df6f
1 Parent(s): eb27f3a

Update README.md

Files changed (1):
  1. README.md +10 -7
README.md CHANGED
@@ -18,7 +18,9 @@ base_model: meta-llama/Llama-2-70b-hf
 Tulu is a series of language models that are trained to act as helpful assistants.
 Tulu 1 llama2 70B is a fine-tuned version of Llama 2 that was trained on a mix of publicly available, synthetic and human datasets.
 Specifically, this model is trained on our v1 Tulu data mixture.
-Check out our paper [TODO: link]() for more details!
+
+For more details, read the paper: [Camels in a Changing Climate: Enhancing LM Adaptation with Tulu 2
+](https://arxiv.org/abs/2311.10702).
 
 
 ## Model description
@@ -118,12 +120,13 @@ If you use this model, please cite the original Tulu work:
 If you find Tulu 2 is useful in your work, please cite it with:
 
 ```
-@misc{ivison2023changing,
-  title={Camels in a Changing Climate: Enhancing LM Adaptation with Tulu 2},
-  author={Hamish Ivison and Yizhong Wang and Valentina Pyatkin and Nathan Lambert and Matthew Peters and Pradeep Dasigi and Joel Jang and David Wadden and Noah A. Smith and Iz Beltagy and Hannaneh Hajishirzi},
-  year={2023},
-  archivePrefix={arXiv},
-  primaryClass={cs.CL}
+@misc{ivison2023camels,
+  title={Camels in a Changing Climate: Enhancing LM Adaptation with Tulu 2},
+  author={Hamish Ivison and Yizhong Wang and Valentina Pyatkin and Nathan Lambert and Matthew Peters and Pradeep Dasigi and Joel Jang and David Wadden and Noah A. Smith and Iz Beltagy and Hannaneh Hajishirzi},
+  year={2023},
+  eprint={2311.10702},
+  archivePrefix={arXiv},
+  primaryClass={cs.CL}
 }
 ```
 