---
license: cc-by-4.0
task_categories:
- text-generation
language:
- en
tags:
- web
- common crawl
size_categories:
- 100B<n<1T
---
# 📚 DCLM-pro
<p align="center">
<img src="prox-teaser.png">
</p>
[ArXiv](http://arxiv.org/abs/2409.17115) | [Models](https://huggingface.co/collections/gair-prox/prox-general-models-65f1674f0607712c4d6eec76) | [Code](https://github.com/GAIR-NLP/ProX)
DCLM-pro is refined from [DCLM](https://huggingface.co/datasets/mlfoundations/dclm-baseline-1.0-parquet) using the **ProX** refining framework.
It contains more than 500B high-quality tokens, ready for general language model pre-training.
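At this scale, streaming the data is usually preferable to a full download. Below is a minimal sketch using the Hugging Face `datasets` library; the repository ID `gair-prox/DCLM-pro` and the `text` field name are assumptions based on this card and common conventions for web-text datasets, so check the first example's keys against the actual schema.

```python
from datasets import load_dataset

# Stream the corpus instead of materializing >500B tokens on disk.
# Repo ID is assumed from this card's title; adjust if it differs.
ds = load_dataset("gair-prox/DCLM-pro", split="train", streaming=True)

for example in ds:
    # "text" is the assumed document field; print(example.keys()) to confirm.
    print(example["text"][:200])
    break
```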
## License
DCLM-pro is based on DCLM, which is made available under a CC-BY-4.0 license.
### Citation
```
@article{zhou2024programming,
title={Programming Every Example: Lifting Pre-training Data Quality like Experts at Scale},
author={Zhou, Fan and Wang, Zengzhi and Liu, Qian and Li, Junlong and Liu, Pengfei},
journal={arXiv preprint arXiv:2409.17115},
year={2024}
}
```