---
size_categories:
- 1K<N<10K
source_datasets:
- original
task_categories:
- image-segmentation
task_ids:
- instance-segmentation
pretty_name: XAMI-dataset
tags:
- COCO format
- Astronomy
- XMM-Newton
- CC BY-NC 3.0 IGO
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: observation id
dtype: string
- name: segmentation
dtype: image
- name: bbox
dtype: image
- name: label
dtype: string
- name: area
dtype: string
- name: image shape
dtype: string
splits:
- name: train
num_bytes: 154137131.0
num_examples: 272
- name: valid
num_bytes: 210925170.0
num_examples: 360
download_size: 365017887
dataset_size: 365062301.0
---
<div align="center">
<h1> XAMI: XMM-Newton optical Artefact Mapping for astronomical Instance segmentation </h1>
<i> The Dataset </i>
</div>
Check out the **[XAMI model](https://github.com/ESA-Datalabs/XAMI-model)** and the **[XAMI dataset](https://github.com/ESA-Datalabs/XAMI-dataset)** on GitHub.
# Downloading the dataset
- Using a Python script:
```python
import zipfile
from huggingface_hub import hf_hub_download

dataset_name = 'xami_dataset'  # the dataset name on Hugging Face
images_dir = '.'               # the output directory for the dataset archive

hf_hub_download(
    repo_id="iulia-elisa/XAMI-dataset",  # the Hugging Face repo ID
    repo_type='dataset',
    filename=dataset_name + '.zip',
    local_dir=images_dir,
)

# Unzip the downloaded archive
with zipfile.ZipFile(f'{images_dir}/{dataset_name}.zip') as zf:
    zf.extractall('path/to/dest')
```
Alternatively, download only the dataset zip file from Hugging Face with a single CLI command:
```bash
DEST_DIR='/path/to/local/dataset/dir'
huggingface-cli download iulia-elisa/XAMI-dataset xami_dataset.zip --repo-type dataset --local-dir "$DEST_DIR" && unzip "$DEST_DIR/xami_dataset.zip" -d "$DEST_DIR" && rm "$DEST_DIR/xami_dataset.zip"
```
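The annotations ship in COCO format (per the tags above). As a minimal sketch of how COCO-style instance annotations are typically consumed, the snippet below groups annotations by image and resolves category names. The in-memory `coco` dict here is a hypothetical toy example illustrating the schema, not the actual XAMI annotation content:

```python
from collections import defaultdict

# Hypothetical minimal COCO-style structure; the real XAMI annotation
# files follow the same schema (images, categories, annotations).
coco = {
    "images": [{"id": 1, "file_name": "obs_0001.png", "width": 512, "height": 512}],
    "categories": [{"id": 1, "name": "artefact"}],
    "annotations": [
        {"id": 10, "image_id": 1, "category_id": 1,
         "bbox": [20, 30, 100, 40],  # COCO bbox convention: [x, y, width, height]
         "area": 4000.0,
         "segmentation": [[20, 30, 120, 30, 120, 70, 20, 70]]},
    ],
}

# Index annotations by their parent image id.
anns_by_image = defaultdict(list)
for ann in coco["annotations"]:
    anns_by_image[ann["image_id"]].append(ann)

# Map category ids to human-readable names.
cat_names = {c["id"]: c["name"] for c in coco["categories"]}

for img in coco["images"]:
    for ann in anns_by_image[img["id"]]:
        print(img["file_name"], cat_names[ann["category_id"]], ann["bbox"])
```

With a real annotation file, replace the inline dict with `coco = json.load(open(path))`; the grouping logic is unchanged.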
## Licence
**[CC BY-NC 3.0 IGO](https://creativecommons.org/licenses/by-nc/3.0/igo/deed.en).**