Unable to load the dataset in Google Colab

#2
by ChatbotML - opened

Hi Shawhin, I'm following your LLM series (excellent, by the way) and so far so good. But in the 5th article I am unable to load this dataset in Google Colab.

When I pass the full URL (or download one of the .parquet files locally) and run "dataset = load_dataset("xxxxx")", it always fails with an error like "FileNotFoundError: Couldn't find a dataset script at xxxx/shawhin/imdb-truncated/imdb-truncated.py"

Which files should I download locally? Only the .parquet files? And how do I load them with load_dataset()?
Any suggestions?

Looking through HF's documentation on loading datasets (I'm not an expert in HF), I've found a solution to my problem:

from datasets import load_dataset

base_url = "https://huggingface.co/datasets/shawhin/imdb-truncated/resolve/main/data/"
data_files = {
    "train": base_url + "train-00000-of-00001-5a744bf76a1d84b2.parquet",
    "test": base_url + "test-00000-of-00001-9a6632370b120d19.parquet",
    "validation": base_url + "validation-00000-of-00001-a3a52fabb70c739f.parquet",
}

# Load the parquet files directly, bypassing the dataset-script lookup
dataset = load_dataset("parquet", data_files=data_files)
dataset
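
For anyone who would rather download the .parquet files first (as I originally asked about), the same call accepts local paths instead of URLs. A minimal sketch, assuming the three files were saved into a local data/ folder; the folder name is just a placeholder:

from datasets import load_dataset

# Local copies of the same parquet files (the data/ path is an assumption)
data_files = {
    "train": "data/train-00000-of-00001-5a744bf76a1d84b2.parquet",
    "test": "data/test-00000-of-00001-9a6632370b120d19.parquet",
    "validation": "data/validation-00000-of-00001-a3a52fabb70c739f.parquet",
}

dataset = load_dataset("parquet", data_files=data_files)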

Everything is working now.

Okay great, thanks for sharing. I'm glad the series has been helpful :)

shawhin changed discussion status to closed
