FalconLLM committed 184df75 (1 parent: 43965da)

Update for paper release

Files changed (1): README.md (+16 -6)
@@ -20,7 +20,7 @@ dataset_info:
   num_examples: 968000015
   download_size: 466888198663
   dataset_size: 2766953721769
-license: apache-2.0
+license: odc-by
 task_categories:
 - text-generation
 language:
@@ -32,9 +32,9 @@ size_categories:
 
 # 📀 Falcon RefinedWeb
 
-**Falcon RefinedWeb is a massive English web dataset built by [TII](https://www.tii.ae) and released under an Apache 2.0 license.**
+**Falcon RefinedWeb is a massive English web dataset built by [TII](https://www.tii.ae) and released under an ODC-By 1.0 license.**
 
-*Paper coming soon 😊.*
+See the 📓 [paper on arXiv](https://arxiv.org/abs/2306.01116) for more details.
 
 RefinedWeb is built through stringent filtering and large-scale deduplication of CommonCrawl; we found models trained on RefinedWeb to achieve performance in line with or better than models trained on curated datasets, while only relying on web data.
 
@@ -58,7 +58,7 @@ RefinedWeb is the main dataset we have used for training the [Falcon LLM](https:
 ## Dataset Description
 
 * **Homepage:** [falconllm.tii.ae](falconllm.tii.ae)
-* **Paper:** coming soon 😊
+* **Paper:** [https://arxiv.org/abs/2306.01116](https://arxiv.org/abs/2306.01116)
 * **Point of Contact:** [falconllm@tii.ae](mailto:falconllm@tii.ae)
 
 ### Dataset Summary
@@ -151,11 +151,21 @@ Despite our best efforts to filter content that does not qualify as natural lang
 ### Licensing Information
 
-Apache 2.0.
+This public extract is made available under an [ODC-By 1.0](https://opendatacommons.org/licenses/by/1-0/) license; users should also abide by the [CommonCrawl ToU](https://commoncrawl.org/terms-of-use/).
 
 ### Citation Information
 
-Paper coming soon 😊.
+```
+@article{refinedweb,
+  title={The {R}efined{W}eb dataset for {F}alcon {LLM}: outperforming curated corpora with web data, and web data only},
+  author={Guilherme Penedo and Quentin Malartic and Daniel Hesslow and Ruxandra Cojocaru and Alessandro Cappelli and Hamza Alobeidli and Baptiste Pannier and Ebtesam Almazrouei and Julien Launay},
+  journal={arXiv preprint arXiv:2306.01116},
+  eprint={2306.01116},
+  eprinttype={arXiv},
+  url={https://arxiv.org/abs/2306.01116},
+  year={2023}
+}
+```
 
 ### Contact
 falconllm@tii.ae
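The `dataset_info` numbers in the first hunk pin down the scale of the corpus; a quick back-of-the-envelope sketch makes them concrete. This is only an illustration using the figures from the diff; the trailing comment assumes the `tiiuae/falcon-refinedweb` Hub repo id and the 🤗 `datasets` streaming API, neither of which appears in this commit.

```python
# Scale sanity check using the numbers from the dataset_info hunk above.
num_examples = 968_000_015           # rows in the parquet split
download_size = 466_888_198_663      # compressed size on the Hub, in bytes
dataset_size = 2_766_953_721_769     # decompressed size, in bytes

avg_example_bytes = dataset_size / num_examples      # ~2.9 KB of text per row
compression_ratio = dataset_size / download_size     # parquet compression factor

print(f"~{avg_example_bytes:.0f} bytes per example")
print(f"~{compression_ratio:.1f}x compression")

# At ~2.7 TB decompressed, streaming usually beats a full download.
# (Repo id assumed, not stated in this diff):
#   from datasets import load_dataset
#   ds = load_dataset("tiiuae/falcon-refinedweb", streaming=True, split="train")
```

The ~467 GB download versus ~2.8 TB in memory is the practical reason to prefer the streaming path sketched in the comment.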