Text Generation
Transformers
PyTorch
Safetensors
English
llama
conversational
Inference Endpoints
text-generation-inference
hamishivi committed on
Commit f275d38
1 Parent(s): b3c64ee

Update README.md

Files changed (1)
  1. README.md +10 -6
README.md CHANGED
@@ -20,6 +20,9 @@ Tulu is a series of language models that are trained to act as helpful assistant
 Tulu V2 DPO 13B is a fine-tuned version of Llama 2 that was trained on a mix of publicly available, synthetic and human datasets using [Direct Preference Optimization (DPO)](https://arxiv.org/abs/2305.18290).
 This model is a strong alternative to Llama 2 13b Chat.
 
+For more details, read the paper: [Camels in a Changing Climate: Enhancing LM Adaptation with Tulu 2
+](https://arxiv.org/abs/2311.10702).
+
 
 ## Model description
 
@@ -135,12 +138,13 @@ The following hyperparameters were used during DPO training:
 If you find Tulu 2 is useful in your work, please cite it with:
 
 ```
-@misc{ivison2023changing,
-   title={Camels in a Changing Climate: Enhancing LM Adaptation with Tulu 2},
-   author={Hamish Ivison and Yizhong Wang and Valentina Pyatkin and Nathan Lambert and Matthew Peters and Pradeep Dasigi and Joel Jang and David Wadden and Noah A. Smith and Iz Beltagy and Hannaneh Hajishirzi},
-   year={2023},
-   archivePrefix={arXiv},
-   primaryClass={cs.CL}
+@misc{ivison2023camels,
+   title={Camels in a Changing Climate: Enhancing LM Adaptation with Tulu 2},
+   author={Hamish Ivison and Yizhong Wang and Valentina Pyatkin and Nathan Lambert and Matthew Peters and Pradeep Dasigi and Joel Jang and David Wadden and Noah A. Smith and Iz Beltagy and Hannaneh Hajishirzi},
+   year={2023},
+   eprint={2311.10702},
+   archivePrefix={arXiv},
+   primaryClass={cs.CL}
 }
 ```
 
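The diff above links the DPO paper without restating the objective. For readers skimming the commit, here is a minimal sketch of the DPO loss in PyTorch, assuming per-sequence log-probabilities have already been computed; the function and argument names are illustrative, not taken from the Tulu training code:

```python
import torch
import torch.nn.functional as F

def dpo_loss(policy_chosen_logps: torch.Tensor,
             policy_rejected_logps: torch.Tensor,
             ref_chosen_logps: torch.Tensor,
             ref_rejected_logps: torch.Tensor,
             beta: float = 0.1) -> torch.Tensor:
    """Direct Preference Optimization loss (Rafailov et al., 2023).

    Each argument is a batch of summed log-probabilities of the chosen
    or rejected completion under the trainable policy or the frozen
    reference model. `beta` scales how far the policy may drift from
    the reference (its value here is an illustrative default, not the
    Tulu 2 setting).
    """
    # Implicit rewards: log-ratio of policy to reference, scaled by beta.
    chosen_rewards = beta * (policy_chosen_logps - ref_chosen_logps)
    rejected_rewards = beta * (policy_rejected_logps - ref_rejected_logps)
    # Maximize the margin between chosen and rejected implicit rewards.
    return -F.logsigmoid(chosen_rewards - rejected_rewards).mean()
```

Unlike RLHF with PPO, this objective needs no separately trained reward model or sampling loop: each update is a single forward pass over a preference pair through the policy and the frozen reference.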