Dataset Preview

The full dataset viewer is not available; only a preview of the rows is shown below.
The dataset generation failed
Error code:   DatasetGenerationError
Exception:    DatasetGenerationError
Message:      An error occurred while generating the dataset
Traceback:    Traceback (most recent call last):
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1282, in compute_config_parquet_and_info_response
                  fill_builder_info(builder, hf_endpoint=hf_endpoint, hf_token=hf_token, validate=validate)
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 707, in fill_builder_info
                  ) = retry_validate_get_features_num_examples_size_and_compression_ratio(
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 626, in retry_validate_get_features_num_examples_size_and_compression_ratio
                  validate(pf)
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 664, in validate
                  raise TooBigRowGroupsError(
              worker.job_runners.config.parquet_and_info.TooBigRowGroupsError: Parquet file has too big row groups. First row group has 509618876 which exceeds the limit of 300000000
              
              During handling of the above exception, another exception occurred:
              
              Traceback (most recent call last):
                File "/src/services/worker/.venv/lib/python3.9/site-packages/huggingface_hub/utils/_errors.py", line 304, in hf_raise_for_status
                  response.raise_for_status()
                File "/src/services/worker/.venv/lib/python3.9/site-packages/requests/models.py", line 1024, in raise_for_status
                  raise HTTPError(http_error_msg, response=self)
              requests.exceptions.HTTPError: 429 Client Error: Too Many Requests for url: https://huggingface.co/datasets/1aurent/BACH/resolve/55effe690290867396fc4403792fd27553376b34/data/train-00004-of-00015-9aafc9a5d645bda6.parquet
              
              The above exception was the direct cause of the following exception:
              
              Traceback (most recent call last):
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1995, in _prepare_split_single
                  for _, table in generator:
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 821, in wrapped
                  for item in generator(*args, **kwargs):
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/packaged_modules/parquet/parquet.py", line 86, in _generate_tables
                  parquet_file = pq.ParquetFile(f)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/pyarrow/parquet/core.py", line 341, in __init__
                  self.reader.open(
                File "pyarrow/_parquet.pyx", line 1250, in pyarrow._parquet.ParquetReader.open
                File "pyarrow/types.pxi", line 88, in pyarrow.lib._datatype_to_pep3118
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/utils/file_utils.py", line 1101, in read_with_retries
                  out = read(*args, **kwargs)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/huggingface_hub/hf_file_system.py", line 747, in read
                  return super().read(length)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/fsspec/spec.py", line 1846, in read
                  out = self.cache._fetch(self.loc, self.loc + length)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/fsspec/caching.py", line 189, in _fetch
                  self.cache = self.fetcher(start, end)  # new block replaces old
                File "/src/services/worker/.venv/lib/python3.9/site-packages/huggingface_hub/hf_file_system.py", line 710, in _fetch_range
                  hf_raise_for_status(r)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/huggingface_hub/utils/_errors.py", line 371, in hf_raise_for_status
                  raise HfHubHTTPError(str(e), response=response) from e
              huggingface_hub.utils._errors.HfHubHTTPError: 429 Client Error: Too Many Requests for url: https://huggingface.co/datasets/1aurent/BACH/resolve/55effe690290867396fc4403792fd27553376b34/data/train-00004-of-00015-9aafc9a5d645bda6.parquet
              
              The above exception was the direct cause of the following exception:
              
              Traceback (most recent call last):
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1295, in compute_config_parquet_and_info_response
                  parquet_operations, partial = stream_convert_to_parquet(
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 912, in stream_convert_to_parquet
                  builder._prepare_split(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1882, in _prepare_split
                  for job_id, done, content in self._prepare_split_single(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 2038, in _prepare_split_single
                  raise DatasetGenerationError("An error occurred while generating the dataset") from e
              datasets.exceptions.DatasetGenerationError: An error occurred while generating the dataset


[Preview, truncated: two columns, image (image) and label (class label); every row shown carries label 0 (Benign). End of preview.]
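The label column in the preview pairs a class index with a class name. A minimal decoding sketch; only index 0 ("Benign") is confirmed by the preview, and the ordering of the other three classes below is an assumption, not taken from the dataset's metadata:

```python
# Hypothetical index-to-name mapping for the four BACH classes. The preview
# confirms only that index 0 is "Benign"; the (alphabetical) order of the
# remaining three classes is assumed, not read from the dataset.
BACH_CLASS_NAMES = ["Benign", "InSitu", "Invasive", "Normal"]

def label_name(index: int) -> str:
    """Translate a class-label index into its class name."""
    return BACH_CLASS_NAMES[index]

print(label_name(0))  # Benign
```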

BreAst Cancer Histology (BACH) Dataset: Grand Challenge on Breast Cancer Histology images

Description

The dataset is composed of hematoxylin and eosin (H&E) stained breast histology microscopy images.

Microscopy images are labelled as normal, benign, in situ carcinoma, or invasive carcinoma according to the predominant cancer type in each image. The annotation was performed by two medical experts, and images on which they disagreed were discarded. Images have the following specifications:

  • Color model: RGB
  • Size: 2048 × 1536 pixels
  • Pixel scale: 0.42 µm × 0.42 µm
  • File size: approx. 10-20 MB per image
  • Type of label: image-wise
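From these specifications one can derive the physical field of view and a rough uncompressed size per image; a small pure-Python sketch using only the numbers listed above (the 3-bytes-per-pixel figure assumes 8-bit RGB, which is not stated explicitly in the card):

```python
# Derived quantities from the image specifications listed above.
WIDTH_PX, HEIGHT_PX = 2048, 1536  # image size in pixels
PIXEL_SCALE_UM = 0.42             # micrometres per pixel
BYTES_PER_PIXEL = 3               # assumed: 8-bit RGB, one byte per channel

# Physical field of view covered by one image, in micrometres.
fov_w = round(WIDTH_PX * PIXEL_SCALE_UM, 2)
fov_h = round(HEIGHT_PX * PIXEL_SCALE_UM, 2)
print(fov_w, fov_h)  # 860.16 645.12 (roughly 0.86 mm x 0.65 mm)

# Uncompressed RGB size, consistent with the ~10-20 MB files on disk.
raw_mb = WIDTH_PX * HEIGHT_PX * BYTES_PER_PIXEL / 1_000_000
print(round(raw_mb, 1))  # 9.4 (MB)
```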

Citation

@dataset{polonia_2020_3632035,
  author    = {Polónia, António and Eloy, Catarina and Aguiar, Paulo},
  title     = {{BACH Dataset: Grand Challenge on Breast Cancer Histology images}},
  month     = jan,
  year      = 2020,
  publisher = {Zenodo}
}
Downloads last month: 26
