Issue while downloading the dataset (#7)
opened by tushifire
```python
from datasets import load_dataset
dataset = load_dataset("DFKI-SLT/few-nerd")
```
I got the above code by clicking the "Use this dataset" button on the dataset card, ran it on Colab, and hit the following error trace:
```
/usr/local/lib/python3.10/dist-packages/huggingface_hub/utils/_token.py:88: UserWarning:
The secret `HF_TOKEN` does not exist in your Colab secrets.
To authenticate with the Hugging Face Hub, create a token in your settings tab (https://huggingface.co/settings/tokens), set it as secret in your Google Colab and restart your session.
You will be able to reuse this secret in all of your notebooks.
Please note that authentication is recommended but still optional to access public models or datasets.
  warnings.warn(
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-5-04726d0e7b2f> in <cell line: 2>()
      1 from datasets import load_dataset
----> 2 dataset = load_dataset("DFKI-SLT/few-nerd")

3 frames
/usr/local/lib/python3.10/dist-packages/datasets/load.py in load_dataset(path, name, data_dir, data_files, split, cache_dir, features, download_config, download_mode, verification_mode, ignore_verifications, keep_in_memory, save_infos, revision, token, use_auth_token, task, streaming, num_proc, storage_options, trust_remote_code, **config_kwargs)
   2554
   2555     # Create a dataset builder
-> 2556     builder_instance = load_dataset_builder(
   2557         path=path,
   2558         name=name,

/usr/local/lib/python3.10/dist-packages/datasets/load.py in load_dataset_builder(path, name, data_dir, data_files, cache_dir, features, download_config, download_mode, revision, token, use_auth_token, storage_options, trust_remote_code, _require_default_config_name, **config_kwargs)
   2263     builder_cls = get_dataset_builder_class(dataset_module, dataset_name=dataset_name)
   2264     # Instantiate the dataset builder
-> 2265     builder_instance: DatasetBuilder = builder_cls(
   2266         cache_dir=cache_dir,
   2267         dataset_name=dataset_name,

/usr/local/lib/python3.10/dist-packages/datasets/builder.py in __init__(self, cache_dir, dataset_name, config_name, hash, base_path, info, features, token, use_auth_token, repo_id, data_files, data_dir, storage_options, writer_batch_size, name, **config_kwargs)
    369         if data_dir is not None:
    370             config_kwargs["data_dir"] = data_dir
--> 371         self.config, self.config_id = self._create_builder_config(
    372             config_name=config_name,
    373             custom_features=features,

/usr/local/lib/python3.10/dist-packages/datasets/builder.py in _create_builder_config(self, config_name, custom_features, **config_kwargs)
    575         if not config_kwargs:
    576             example_of_usage = f"load_dataset('{self.dataset_name}', '{self.BUILDER_CONFIGS[0].name}')"
--> 577             raise ValueError(
    578                 "Config name is missing."
    579                 f"\nPlease pick one among the available configs: {list(self.builder_configs.keys())}"

ValueError: Config name is missing.
Please pick one among the available configs: ['inter', 'intra', 'supervised']
Example of usage:
    `load_dataset('few-nerd', 'inter')`
```
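As a side note, the config names the error refers to can also be queried programmatically. A minimal sketch using `get_dataset_config_names` from the `datasets` package:

```python
# Ask the Hub which configs this dataset exposes before calling load_dataset().
# On newer `datasets` releases, script-based datasets may additionally prompt
# for trust_remote_code.
from datasets import get_dataset_config_names

configs = get_dataset_config_names("DFKI-SLT/few-nerd")
print(configs)  # per the error above: ['inter', 'intra', 'supervised']
```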
Try 2
Even the suggested call, `load_dataset('few-nerd', 'inter')`, failed:
```
---------------------------------------------------------------------------
DatasetNotFoundError                      Traceback (most recent call last)
<ipython-input-8-284f722048d5> in <cell line: 2>()
      1 from datasets import load_dataset
----> 2 dataset = load_dataset('few-nerd', 'inter')

3 frames
/usr/local/lib/python3.10/dist-packages/datasets/load.py in load_dataset(path, name, data_dir, data_files, split, cache_dir, features, download_config, download_mode, verification_mode, ignore_verifications, keep_in_memory, save_infos, revision, token, use_auth_token, task, streaming, num_proc, storage_options, trust_remote_code, **config_kwargs)
   2554
   2555     # Create a dataset builder
-> 2556     builder_instance = load_dataset_builder(
   2557         path=path,
   2558         name=name,

/usr/local/lib/python3.10/dist-packages/datasets/load.py in load_dataset_builder(path, name, data_dir, data_files, cache_dir, features, download_config, download_mode, revision, token, use_auth_token, storage_options, trust_remote_code, _require_default_config_name, **config_kwargs)
   2226     download_config = download_config.copy() if download_config else DownloadConfig()
   2227     download_config.storage_options.update(storage_options)
-> 2228     dataset_module = dataset_module_factory(
   2229         path,
   2230         revision=revision,

/usr/local/lib/python3.10/dist-packages/datasets/load.py in dataset_module_factory(path, revision, download_config, download_mode, dynamic_modules_path, data_dir, data_files, cache_dir, trust_remote_code, _require_default_config_name, _require_custom_configs, **download_kwargs)
   1871                 raise ConnectionError(f"Couldn't reach the Hugging Face Hub for dataset '{path}': {e1}") from None
   1872             if isinstance(e1, (DataFilesNotFoundError, DatasetNotFoundError, EmptyDatasetError)):
-> 1873                 raise e1 from None
   1874             if isinstance(e1, FileNotFoundError):
   1875                 raise FileNotFoundError(

/usr/local/lib/python3.10/dist-packages/datasets/load.py in dataset_module_factory(path, revision, download_config, download_mode, dynamic_modules_path, data_dir, data_files, cache_dir, trust_remote_code, _require_default_config_name, _require_custom_configs, **download_kwargs)
   1817             msg = f"Dataset '{path}' doesn't exist on the Hub or cannot be accessed"
   1818             msg = msg + f" at revision '{revision}'" if revision else msg
-> 1819             raise DatasetNotFoundError(
   1820                 msg
   1821                 + f". If the dataset is private or gated, make sure to log in with `huggingface-cli login` or visit the dataset page at https://huggingface.co/datasets/{path} to ask for access."

DatasetNotFoundError: Dataset 'few-nerd' doesn't exist on the Hub or cannot be accessed. If the dataset is private or gated, make sure to log in with `huggingface-cli login` or visit the dataset page at https://huggingface.co/datasets/few-nerd to ask for access.
```
Hi,
both attempts were almost correct. In the second one you dropped the repository namespace from the dataset name, i.e. "DFKI-SLT/few-nerd". The correct call would be `dataset = load_dataset("DFKI-SLT/few-nerd", "inter")` (see the sketch below).
Leo
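For completeness, a minimal sketch of the corrected call. The `trust_remote_code=True` argument is an assumption for newer `datasets` releases that ask before running a dataset's loading script; drop it on versions that do not need it:

```python
from datasets import load_dataset

# Full repo ID plus one of the advertised configs ('inter', 'intra', 'supervised').
# trust_remote_code=True is an assumption for newer `datasets` releases that
# prompt before executing a dataset loading script; older releases accept the
# call without it.
dataset = load_dataset(
    "DFKI-SLT/few-nerd",
    "inter",
    trust_remote_code=True,
)
print(dataset)
```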
tushifire changed discussion status to closed.