FalconLLM committed
Commit 88b92ae • 1 parent: b046281

Update citation info
Files changed (1): README.md (+23 -1)
README.md CHANGED
@@ -219,7 +219,29 @@ Falcon-40B was trained on a custom distributed training codebase, Gigatron. It uses
 
 ## Citation
 
-*Paper coming soon 😊.*
+*Paper coming soon* 😊. In the meantime, you can use the following information to cite:
+```
+@article{falcon40b,
+title={{Falcon-40B}: an open large language model with state-of-the-art performance},
+author={Almazrouei, Ebtesam and Alobeidli, Hamza and Alshamsi, Abdulaziz and Cappelli, Alessandro and Cojocaru, Ruxandra and Debbah, Merouane and Goffinet, Etienne and Hesslow, Daniel and Launay, Julien and Malartic, Quentin and Noune, Badreddine and Pannier, Baptiste and Penedo, Guilherme},
+year={2023}
+}
+```
+
+To learn more about the pretraining dataset, see the 📓 [RefinedWeb paper](https://arxiv.org/abs/2306.01116).
+
+```
+@article{refinedweb,
+title={The {R}efined{W}eb dataset for {F}alcon {LLM}: outperforming curated corpora with web data, and web data only},
+author={Guilherme Penedo and Quentin Malartic and Daniel Hesslow and Ruxandra Cojocaru and Alessandro Cappelli and Hamza Alobeidli and Baptiste Pannier and Ebtesam Almazrouei and Julien Launay},
+journal={arXiv preprint arXiv:2306.01116},
+eprint={2306.01116},
+eprinttype={arXiv},
+url={https://arxiv.org/abs/2306.01116},
+year={2023}
+}
+```
+
 
 ## License
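For reference, a minimal sketch of how the entries added above could be used, assuming they are saved to a hypothetical `references.bib` next to the LaTeX source (the file name and document setup are assumptions, not part of the model card):

```latex
% Minimal citation sketch, assuming the BibTeX entries from the diff
% above are stored in a hypothetical references.bib file.
\documentclass{article}
\begin{document}
Falcon-40B~\cite{falcon40b} was pretrained on RefinedWeb~\cite{refinedweb}.
\bibliographystyle{plain}
\bibliography{references}
\end{document}
```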