---
size_categories:
- 1K<N<10K
source_datasets:
- original
task_categories:
- image-segmentation
task_ids:
- instance-segmentation
pretty_name: XAMI-dataset
tags:
- COCO format
- Astronomy
- XMM-Newton
- CC BY-NC 3.0 IGO
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: valid
    path: data/valid-*
dataset_info:
  features:
  - name: observation id
    dtype: string
  - name: segmentation
    dtype: image
  - name: bbox
    dtype: image
  - name: label
    dtype: string
  - name: area
    dtype: string
  - name: image shape
    dtype: string
  splits:
  - name: train
    num_bytes: 154137131
    num_examples: 272
  - name: valid
    num_bytes: 210925170
    num_examples: 360
  download_size: 365017887
  dataset_size: 365062301
---
# XAMI: XMM-Newton optical Artefact Mapping for astronomical Instance segmentation
## The Dataset

Check the XAMI model and the XAMI dataset on GitHub.
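Each record follows the features listed in the card metadata above: an `observation id`, `segmentation` and `bbox` masks stored as images, and `label`, `area`, and `image shape` stored as strings. As a minimal sketch (the record values below are hypothetical placeholders, not real dataset entries), the string-typed fields can be parsed back into numbers like this:

```python
import ast

# Hypothetical record shaped like the card's features; real values
# come from the dataset itself.
record = {
    "observation id": "0000000000",   # placeholder ID
    "label": "artefact",              # placeholder class name
    "area": "1234.5",                 # numeric value stored as a string
    "image shape": "(512, 512)",      # tuple stored as a string
}

area = float(record["area"])                              # -> 1234.5
height, width = ast.literal_eval(record["image shape"])   # -> 512, 512
print(area, height, width)
```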
## Downloading the dataset

- Using a Python script:
```python
from huggingface_hub import hf_hub_download
import zipfile

dataset_name = 'xami_dataset' # the dataset name on Hugging Face
images_dir = '.' # the output directory for the dataset images

hf_hub_download(
    repo_id="iulia-elisa/XAMI-dataset", # the Hugging Face repo ID
    repo_type='dataset',
    filename=dataset_name + '.zip',
    local_dir=images_dir,
)

# Unzip the downloaded archive
with zipfile.ZipFile(f'{images_dir}/{dataset_name}.zip') as zf:
    zf.extractall('path/to/dest')
```
- Alternatively, download only the dataset zip file from Hugging Face with a single CLI command:

```bash
DEST_DIR='/path/to/local/dataset/dir'

huggingface-cli download iulia-elisa/XAMI-dataset xami_dataset.zip --repo-type dataset --local-dir "$DEST_DIR" && unzip "$DEST_DIR/xami_dataset.zip" -d "$DEST_DIR" && rm "$DEST_DIR/xami_dataset.zip"
```