Please provide a list of file hashes in order to check integrity of downloads

#24
by markusheimerl - opened

I just finished downloading and I'd like to verify the integrity of my download like this:

import os
import requests

# Read the list of URLs from v1_6.txt
with open("dolma/urls/v1_6.txt", "r") as file:
    urls = file.read().splitlines()

# Extract the filenames from the URLs
expected_files = [url.split("/")[-1] for url in urls]

# List the downloaded files directly (no need to shell out and parse ls -lh)
download_dir = "/home/markusheimerl/xvit-415020_dolma/"
downloaded_files = os.listdir(download_dir)

# Find missing files
missing_files = set(expected_files) - set(downloaded_files)

# Print the missing files
if missing_files:
    print("Missing files:")
    for file in missing_files:
        print(file)
else:
    print("All files have been downloaded.")

# Verify the integrity of downloaded files by comparing sizes
print("\nVerifying file integrity:")
for url in urls:
    filename = url.split("/")[-1]
    if filename in downloaded_files:
        response = requests.head(url, allow_redirects=True)
        # Content-Length is a string; cast it before comparing to the local size
        expected_size = int(response.headers["Content-Length"])
        local_size = os.path.getsize(os.path.join(download_dir, filename))

        if expected_size == local_size:
            print(f"{filename}: OK")
        else:
            print(f"{filename}: Mismatch (Expected: {expected_size} bytes, Local: {local_size} bytes)")

But response.headers["Content-Length"] throws a KeyError. I'm sure that has more to do with how Cloudflare hosts those files, but a separate text file with all the file hashes, provided by you, would be great!
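The KeyError likely means the server answered the HEAD request without a Content-Length header (this can happen when responses are served with chunked or compressed transfer encoding). A sketch of a workaround, assuming nothing about the hosting beyond standard HTTP: request the identity encoding explicitly, and fall back to a one-byte ranged GET whose Content-Range header carries the total size. The helper names (remote_size, parse_content_range) are my own, not part of any library.

```python
import re
import requests

def parse_content_range(value):
    # A Content-Range header looks like "bytes 0-0/12345"; the total size
    # is the number after the slash. Return None if the format is unexpected.
    match = re.match(r"bytes \d+-\d+/(\d+)", value)
    return int(match.group(1)) if match else None

def remote_size(url):
    # Ask for the uncompressed representation so Content-Length is more
    # likely to be present on the HEAD response.
    resp = requests.head(url, headers={"Accept-Encoding": "identity"},
                         allow_redirects=True)
    size = resp.headers.get("Content-Length")
    if size is not None:
        return int(size)
    # Fallback: fetch a single byte and read the total from Content-Range.
    resp = requests.get(url, headers={"Range": "bytes=0-0"}, stream=True)
    return parse_content_range(resp.headers.get("Content-Range", ""))
```

Note that size equality only catches truncated downloads, not corruption, so a published hash list would still be the more reliable check.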

Hi, did you solve this? I'd also like to verify the files — there are so many of them, and the downloads finished without reporting any errors.

I did not solve this. I simply hope that all downloads finished successfully.
