Update README.md
README.md
CHANGED
@@ -17,6 +17,10 @@ BigBanyanTree is an initiative to empower colleges to set up their data engineer
# Content

Each `arrow` file contains a table with fields extracted from Common Crawl WARC files.
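As a quick look at the files, here is a minimal sketch of inspecting one of them with `pyarrow` (the file path is a placeholder, and it assumes the files are in the Arrow IPC stream format; if they are in the random-access file format, use `pa.ipc.open_file` instead):

```python
import pyarrow as pa
import pyarrow.ipc

# Placeholder path -- point this at any `arrow` file from the dataset.
path = "data/example.arrow"

# Memory-map the file and read it as an Arrow IPC stream.
with pa.memory_map(path, "r") as source:
    table = pa.ipc.open_stream(source).read_all()

print(table.schema)                   # fields extracted from the WARC records
print(table.num_rows)
print(table.slice(0, 3).to_pylist())  # peek at a few rows
```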
The datasets provided are derived from processing 895 randomly sampled WARC files from the [2020-50 Common Crawl dump](https://data.commoncrawl.org/crawl-data/CC-MAIN-2020-50/index.html).
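For reference, a sample like that can be drawn from the dump's WARC path listing roughly as follows (a hedged sketch: `warc.paths.gz` is the standard Common Crawl listing of WARC paths, but the seed and the exact sampling code used for this dataset are not documented here):

```python
import gzip
import io
import random

import requests

# Gzipped list of all WARC file paths in the 2020-50 dump.
PATHS_URL = "https://data.commoncrawl.org/crawl-data/CC-MAIN-2020-50/warc.paths.gz"

resp = requests.get(PATHS_URL, timeout=60)
resp.raise_for_status()

with gzip.open(io.BytesIO(resp.content), mode="rt") as f:
    warc_paths = [line.strip() for line in f if line.strip()]

# Draw 895 paths at random (the seed is shown only to make the sketch reproducible).
random.seed(0)
sampled = random.sample(warc_paths, 895)

sampled_urls = ["https://data.commoncrawl.org/" + p for p in sampled]
print(len(sampled_urls), sampled_urls[0])
```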
The MaxMind database used to enrich WARC data with geolocation information is GeoLite2-City_20240903 (released on 3rd Sept. 2024).
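A geolocation lookup against a GeoLite2 City database typically looks like the sketch below (using the `geoip2` Python package; the `.mmdb` path and the IP address are placeholders, and the exact fields kept in this dataset are whatever the `arrow` schema shows):

```python
import geoip2.database

# Placeholder path -- point this at the GeoLite2-City_20240903 .mmdb file.
MMDB_PATH = "GeoLite2-City.mmdb"

with geoip2.database.Reader(MMDB_PATH) as reader:
    # Placeholder IP; in the pipeline this would be the server IP recorded in the WARC record.
    response = reader.city("93.184.216.34")

    print(response.country.iso_code)  # e.g. "US"
    print(response.city.name)         # may be None if no city-level resolution exists
    print(response.location.latitude, response.location.longitude)
```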
## <span style="color:red">⚠️ WARNING ⚠️</span>

The **URLs** and **IP addresses** extracted in this dataset are sourced from **publicly available Common Crawl data dumps**. Please be aware that: