Dataset Card for GOES-MRMS

Dataset Summary

This dataset combines GOES-16 satellite data with MRMS radar precipitation data to roughly match the unreleased dataset used to train Google Research's MetNet. In the MetNet papers, GOES-16 satellite imagery, Multi-Radar/Multi-Sensor (MRMS) instantaneous precipitation, hourly cumulative precipitation, and High-Resolution Rapid Refresh (HRRR) NWP initializations were used as inputs to predict future MRMS precipitation rates. The precipitation rates were binned into 0.2 mm/hr bins, turning the output into a classification task and allowing the models to predict a probability distribution over the region of interest.
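The rate-binning step described above can be sketched as follows. This is an illustrative reading, not the card's actual preprocessing code: the 0.2 mm/hr bin width is from the text, while the number of bins is an assumption of this sketch.

```python
import numpy as np

# Bin width is stated in the card; the class count is an assumption
# of this sketch (MetNet used a fixed number of rate bins).
BIN_WIDTH = 0.2  # mm/hr
NUM_BINS = 512   # assumed number of classes

def rates_to_classes(rates_mm_hr: np.ndarray) -> np.ndarray:
    """Map an (H, W) array of precipitation rates to integer class indices."""
    classes = np.floor(rates_mm_hr / BIN_WIDTH).astype(np.int64)
    # Clip so extreme rates fall into the last bin rather than overflowing.
    return np.clip(classes, 0, NUM_BINS - 1)

rates = np.array([[0.0, 0.1], [0.5, 200.0]])
print(rates_to_classes(rates))  # [[0 0] [2 511]]
```

A model trained on these class indices can emit a softmax over the bins, which is the per-pixel probability distribution the summary refers to.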

Additionally, the input image patches are much larger than the target image patches. For MetNet, the input images covered a 512x512 km area, while the target was the central 64x64 km crop. For MetNet-2, the input covered 2048x2048 km, with the target being the central 512x512 km.
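The input/target relationship above is a simple center crop. A minimal sketch, working in pixels (the km figures map to pixels via the data resolution, which is an assumption here):

```python
import numpy as np

def center_crop(patch: np.ndarray, size: int) -> np.ndarray:
    """Return the central size x size crop of an (H, W, ...) array."""
    h, w = patch.shape[:2]
    top = (h - size) // 2
    left = (w - size) // 2
    return patch[top:top + size, left:left + size]

inp = np.zeros((512, 512))     # MetNet-style input patch (sizes assumed 1 px = 1 km)
target = center_crop(inp, 64)  # central 64 x 64 target region
print(target.shape)            # (64, 64)
```

The same function with `size=512` on a 2048x2048 input gives the MetNet-2 geometry.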

Supported Tasks and Leaderboards

[Needs More Information]

Languages

[Needs More Information]

Dataset Structure

Data Instances

[Needs More Information]

Data Fields

[Needs More Information]

Data Splits

MetNet (January 2018-July 2019): repeating cycles of 16 days training, 2 days validation, 2 days test
MetNet-2 (July 2017-August 2020): non-overlapping time ranges with 12-hour blackouts in between
Full (July 2017-January 2022): Train: 2017-2020, except the first of each month; Validation: the first of each month, July 2017-2020; Test: 2021-2022
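The "Full" split rule can be sketched as a function of the calendar date. This is a hedged reading of the description above (first-of-month days go to validation, 2021 onward to test), not the dataset's actual loading code:

```python
from datetime import date

def full_split(d: date) -> str:
    """Assign a day in the Full split (July 2017-January 2022) to a split,
    per the rule stated in the card."""
    if d.year >= 2021:
        return "test"
    if d.day == 1:
        return "validation"
    return "train"

print(full_split(date(2018, 3, 15)))  # train
print(full_split(date(2019, 6, 1)))   # validation
print(full_split(date(2021, 2, 10)))  # test
```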

Dataset Creation

Curation Rationale

The original curation rationale was forecasting precipitation rate in a probabilistic way. This dataset covers a different time period than the original paper, going from July 2017 through December 2021. Splits are available to match the temporal coverage of the original MetNet paper (January 2018 to July 2019) or the MetNet-2 paper (July 2017 to August 2020).

Source Data

Initial Data Collection and Normalization

From the MetNet paper: "For both MRMS and GOES we acquired data for the period January 2018 through July 2019. We split the data temporally into three non-overlapping data sets by repeatedly using approximately 16 days for training followed by two days for validation and two days for testing. From these temporal splits we randomly extracted 13,717 test and validation samples and kept increasing the training set size until we observed no over-fitting at 1.72 million training samples."
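The repeating 16/2/2-day pattern quoted above can be sketched as a 20-day cycle over a 0-based day index. The exact block boundaries within the cycle are an assumption; the paper says "approximately 16 days":

```python
def metnet_split(day_index: int) -> str:
    """Assign a split from a 0-based day index within the MetNet period,
    using repeating blocks of 16 train / 2 validation / 2 test days."""
    pos = day_index % 20  # 16 + 2 + 2 day cycle
    if pos < 16:
        return "train"
    if pos < 18:
        return "validation"
    return "test"

print([metnet_split(i) for i in (0, 15, 16, 17, 18, 19, 20)])
# ['train', 'train', 'validation', 'validation', 'test', 'test', 'train']
```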

From the MetNet-2 paper: "The training data consists of 1,230,585 patches of size 2048 km x 2048 km at the input and targets of size 512 km x 512 km including all 360 (2 to 720 minutes) time slices. The training area covers a region of 7000x2500 kilometers. We sample target patches from the input context region minus an all around border of 512 km. The input context is padded for all regions outside of the 7000x2500 CONUS. The validation data used for developing the models consists of 11,991 patches and the test data of 39,864 patches. The training, validation and test data are drawn from non-overlapping ranges of hours, with black out periods of 12 hours in between, over a period of observations of 3 years from July 2017 to August 2020. This ensures that the model does not learn any spurious training and evaluation correlations within any single day. HRRR only generates forecasts starting at full hours."
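The "non-overlapping ranges of hours, with black out periods of 12 hours in between" structure quoted above might be sketched as follows. Only the 12-hour blackout is from the text; the block lengths are made-up for illustration:

```python
# Assumed block lengths (hours); only BLACKOUT = 12 comes from the paper.
TRAIN_HOURS, VAL_HOURS, TEST_HOURS, BLACKOUT = 96, 24, 24, 12

def metnet2_split(hour_index: int) -> str:
    """Assign an hour index to a split, with 12-hour blackouts between
    splits; blackout hours are dropped to prevent leakage."""
    cycle = TRAIN_HOURS + BLACKOUT + VAL_HOURS + BLACKOUT + TEST_HOURS + BLACKOUT
    pos = hour_index % cycle
    if pos < TRAIN_HOURS:
        return "train"
    pos -= TRAIN_HOURS + BLACKOUT
    if 0 <= pos < VAL_HOURS:
        return "validation"
    pos -= VAL_HOURS + BLACKOUT
    if 0 <= pos < TEST_HOURS:
        return "test"
    return "blackout"

print(metnet2_split(0), metnet2_split(100), metnet2_split(110), metnet2_split(150))
# train blackout validation test
```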

Who are the source language producers?

[Needs More Information]

Annotations

Annotation process

[Needs More Information]

Who are the annotators?

[Needs More Information]

Personal and Sensitive Information

[Needs More Information]

Considerations for Using the Data

Social Impact of Dataset

[Needs More Information]

Discussion of Biases

[Needs More Information]

Other Known Limitations

[Needs More Information]

Additional Information

Dataset Curators

Jacob Bieker (jacob@openclimatefix.org)
MetNet-1 split: MetNet authors
MetNet-2 split: MetNet-2 authors

Licensing Information

All data is open and provided without restrictions by NOAA.

Citation Information

Please cite NOAA as the data provider.
