Update README.md
README.md
@@ -220,6 +220,9 @@ For the LongLLaMA Code see [codellama/CodeLlama-7b-hf](https://huggingface.co/codellama/CodeLlama-7b-hf).
 Some of the examples use external code (see headers of files for copyright notices and licenses).
 
 ## Acknowledgments
+Special thanks to [Keiran Paster](https://twitter.com/keirp1) for providing immensely valuable suggestions about the pre-training data.
+
 We gratefully acknowledge the TPU Research Cloud program, which was instrumental to our research by providing significant computational resources. We are also grateful to Xinyang Geng and Hao Liu for releasing [OpenLLaMA](https://github.com/openlm-research/open_llama) checkpoints and the [EasyLM](https://github.com/young-geng/EasyLM) library.
 
 We would like to thank [Xiaosong He](https://github.com/hxs91) for suggestions on how to improve the explanations of the cross-batch code.
+