
DDOS: The Drone Depth and Obstacle Segmentation Dataset

The Drone Depth and Obstacle Segmentation (DDOS) dataset comprises synthetic aerial images captured by drones, along with corresponding depth maps and pixel-wise semantic segmentation masks. DDOS is purpose-built to support research and development in computer vision, focusing on tasks such as depth estimation and obstacle segmentation from aerial imagery. Emphasizing the detection of thin structures like wires and effective navigation in diverse weather conditions, DDOS serves as a valuable resource for advancing algorithms in autonomous drone technology.


Data Structure

DDOS is organised as follows:

  • Data Splits:

    • Train: Contains 300 flights with a total of 30k images for training.
    • Validation: Contains 20 flights with a total of 2k images for validation during model development.
    • Test: Contains 20 flights with a total of 2k images for the final evaluation of the trained model.
  • Environments:

    • Neighbourhood: Contains data captured in urban and residential environments.
    • Park: Contains data captured in park and natural environments.
  • Flights:

    • Each flight is represented by a unique flight ID and is contained within the corresponding environment directory.
  • Data for Each Flight:

    • Image: Contains RGB images captured by the drone camera.
    • Depth: Contains depth maps representing the distance of objects from the camera. These maps are saved as uint16 PNG images, where pixel values range from 0 to 65535, representing distances from 0 to 100 meters linearly.
    • Segmentation: Contains pixel-wise segmentation masks for semantic segmentation. Classes and their corresponding pixel-value mappings are listed below.
    • Flow: Contains optical flow data representing the apparent motion of objects between consecutive frames.
    • Surface Normal: Contains surface normal maps representing the orientation of object surfaces.

Overview of file structure:

data/
β”œβ”€β”€ train/
β”‚   β”œβ”€β”€ neighbourhood/
β”‚   β”‚   β”œβ”€β”€ 0/
β”‚   β”‚   β”‚   β”œβ”€β”€ depth/
β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ 0.png
β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ ...
β”‚   β”‚   β”‚   β”‚   └── 99.png
β”‚   β”‚   β”‚   β”œβ”€β”€ flow/
β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ 0.png
β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ ...
β”‚   β”‚   β”‚   β”‚   └── 99.png
β”‚   β”‚   β”‚   β”œβ”€β”€ image/
β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ 0.png
β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ ...
β”‚   β”‚   β”‚   β”‚   └── 99.png
β”‚   β”‚   β”‚   β”œβ”€β”€ segmentation/
β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ 0.png
β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ ...
β”‚   β”‚   β”‚   β”‚   └── 99.png
β”‚   β”‚   β”‚   β”œβ”€β”€ surfacenormals/
β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ 0.png
β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ ...
β”‚   β”‚   β”‚   β”‚   └── 99.png
β”‚   β”‚   β”‚   β”œβ”€β”€ metadata.csv
β”‚   β”‚   β”‚   └── weather.csv
β”‚   β”‚   β”œβ”€β”€ ...
β”‚   β”‚   └── 249/
β”‚   β”‚       └── ...
β”‚   └── park/
β”‚       β”œβ”€β”€ 0/
β”‚       β”‚   β”œβ”€β”€ depth/
β”‚       β”‚   β”‚   └── ...
β”‚       β”‚   β”œβ”€β”€ flow/
β”‚       β”‚   β”‚   └── ...
β”‚       β”‚   β”œβ”€β”€ image/
β”‚       β”‚   β”‚   └── ...
β”‚       β”‚   β”œβ”€β”€ segmentation/
β”‚       β”‚   β”‚   └── ...
β”‚       β”‚   β”œβ”€β”€ surfacenormals/
β”‚       β”‚   β”‚   └── ...
β”‚       β”‚   β”œβ”€β”€ metadata.csv
β”‚       β”‚   └── weather.csv
β”‚       β”œβ”€β”€ ...
β”‚       └── 49/
β”‚           └── ...
β”œβ”€β”€ validation/
β”‚   └── ...
└── test/
    └── ...

Additional Information

Class Mapping: The segmentation masks use the following class labels for obstacle segmentation:

CLASS_MAPPING = {
    'ultra_thin': 255,
    'thin_structures': 240,
    'small_mesh': 220,
    'large_mesh': 200,
    'trees': 180,
    'buildings': 160,
    'vehicles': 140,
    'animals': 100,
    'other': 80
}
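Since each class occupies a single pixel value, a per-class boolean mask can be pulled out of a loaded segmentation image with a simple comparison. A sketch, assuming the mask is loaded as a NumPy uint8 array; `class_mask` is an illustrative helper:

```python
import numpy as np

CLASS_MAPPING = {
    'ultra_thin': 255,
    'thin_structures': 240,
    'small_mesh': 220,
    'large_mesh': 200,
    'trees': 180,
    'buildings': 160,
    'vehicles': 140,
    'animals': 100,
    'other': 80,
}

def class_mask(segmentation, class_name):
    """Boolean mask of pixels belonging to the given class."""
    return segmentation == CLASS_MAPPING[class_name]
```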

Metadata: The dataset contains metadata, such as coordinates, pose, acceleration, weather conditions and camera parameters, which provide valuable contextual information about each flight.


Dataset Usage

  • Data Loading: To load and use DDOS in your projects, refer to the official PyTorch data loading tutorial, which walks through loading data, creating data loaders, and preparing a dataset for training or evaluation.

  • Respect the Data Splits: Please ensure that the testing data is not used for validation. Mixing these datasets could lead to inaccurate assessments of model performance. Maintaining separate datasets for testing and validation helps ensure reliable evaluation and accurate reporting of results.


License

DDOS is openly licensed under CC BY-NC 4.0.


Citation

If you use DDOS in your research or projects, please cite our paper:

@article{kolbeinsson2023ddos,
  title={{DDOS}: The Drone Depth and Obstacle Segmentation Dataset},
  author={Benedikt Kolbeinsson and Krystian Mikolajczyk},
  journal={arXiv preprint arXiv:2312.12494},
  year={2023}
}