Muthukumaran committed
Commit e53570d
1 parent: eacf6ff

Upload 3 files

Files changed (3)
  1. README.md +73 -0
  2. fire_scars_train_val.tar.gz +3 -0
  3. hls_burn_scars.py +93 -0
README.md ADDED
@@ -0,0 +1,73 @@
---
size_categories:
- n<1K
license: cc-by-4.0
language:
- en
---

# Dataset Card for HLS Burn Scar Scenes

## Dataset Description

- **Homepage:** https://huggingface.co/datasets/nasa-impact/hls_burn_scars
- **Point of Contact:** Dr. Christopher Phillips (cep0013@uah.edu)

### Dataset Summary

This dataset contains Harmonized Landsat and Sentinel-2 (HLS) imagery of burn scars, with the associated masks, for the years 2018-2021 over the contiguous United States. There are 804 scenes, each 512x512 pixels. Its primary purpose is training geospatial machine learning models.

## Dataset Structure

The data are distributed as paired scene and mask GeoTIFF files, described below.

## TIFF Metadata

Each file is a single 512x512 pixel GeoTIFF. Scenes contain six bands, and masks have one band. For satellite scenes, each band has already been converted to reflectance.

## Band Order

For scenes:

| Channel | Name | HLS S30 band number |
|---------|------|---------------------|
| 1 | Blue | B02 |
| 2 | Green | B03 |
| 3 | Red | B04 |
| 4 | NIR | B8A |
| 5 | SWIR 1 | B11 |
| 6 | SWIR 2 | B12 |

Masks are a single band with values:

- 1 = Burn scar
- 0 = Not burned
- -1 = Missing data
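Given this encoding, the per-class pixel distribution of a mask can be tallied with a simple count. A minimal sketch, using a small hypothetical mask rather than real dataset files:

```python
from collections import Counter

# Hypothetical 4x4 mask using the encoding above:
# 1 = burn scar, 0 = not burned, -1 = missing data.
mask = [
    [1, 1, 0, 0],
    [1, 0, 0, -1],
    [0, 0, 0, 0],
    [0, 0, 0, -1],
]

# Count each class over all pixels, then report fractions.
counts = Counter(v for row in mask for v in row)
total = sum(counts.values())
for value, label in [(1, "Burn scar"), (0, "Not burned"), (-1, "Missing data")]:
    print(f"{label}: {counts[value] / total:.0%}")
```

The same tally over real mask files is how the class distribution below could be reproduced.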

## Class Distribution

- Burn scar: 11%
- Not burned: 88%
- No data: 1%

## Data Splits

The 804 files have been randomly split into training (2/3) and validation (1/3) directories, each containing the masks, scenes, and index files.
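A 2/3–1/3 random split of this kind can be sketched as below. The file list and seed here are hypothetical stand-ins; the published split is fixed in the archive's directories, not re-derived by users:

```python
import random

# Hypothetical list standing in for the 804 scene files.
files = [f"scene_{i:03d}_merged.tif" for i in range(804)]

rng = random.Random(0)  # hypothetical seed, for reproducibility of the sketch
shuffled = files[:]
rng.shuffle(shuffled)

split_at = (2 * len(shuffled)) // 3  # 2/3 training, 1/3 validation
training = shuffled[:split_at]
validation = shuffled[split_at:]

print(len(training), len(validation))
```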

## Dataset Creation

After co-locating the shapefile and HLS scene, each 512x512 chip was formed by taking a window with the burn scar at its center; burn scars near the edges of HLS tiles are offset from the center.
Images were manually filtered for cloud cover and missing data to provide as clean a scene as possible, and burn scar presence was also manually verified.
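The centered-window rule above (center on the burn scar, shift only when the window would fall off the tile) can be sketched as a clamped-window computation. The function name and the 3660x3660 tile size are illustrative, not taken from the dataset's actual pipeline:

```python
def chip_window(center_row, center_col, tile_height, tile_width, chip=512):
    """Return (row0, col0) of a chip-sized window centered on the burn scar,
    clamped so the window stays inside the tile."""
    row0 = min(max(center_row - chip // 2, 0), tile_height - chip)
    col0 = min(max(center_col - chip // 2, 0), tile_width - chip)
    return row0, col0

# Scar well inside the tile: the window is centered on it.
print(chip_window(1830, 1830, 3660, 3660))
# Scar near the tile's top-left corner: the window is clamped,
# so the scar ends up offset from the chip center.
print(chip_window(100, 50, 3660, 3660))
```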

## Source Data

Imagery is from V1.4 of HLS. A full description of HLS and data access may be found at https://hls.gsfc.nasa.gov/

Burn scar locations are from shapefiles maintained by the Monitoring Trends in Burn Severity (MTBS) program. The original data may be found at https://mtbs.gov/

## Citation

If this dataset helped your research, please cite `HLS Burn Scars` in your publications. Here is an example BibTeX entry:

```
@software{HLS_Foundation_2023,
  author = {Phillips, Christopher and Roy, Sujit and Ankur, Kumar and Ramachandran, Rahul},
  doi = {10.57967/hf/0956},
  month = aug,
  title = {{HLS Foundation Burnscars Dataset}},
  url = {https://huggingface.co/ibm-nasa-geospatial/hls_burn_scars},
  year = {2023}
}
```
fire_scars_train_val.tar.gz ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d84e413358c69724fb793da620c3158c0683dc895787ccc2e3fd2cd5a1581fa5
size 1779237070
hls_burn_scars.py ADDED
@@ -0,0 +1,93 @@
import os
from glob import glob

import datasets

_CITATION = """\
@software{HLS_Foundation_2023,
author = {Phillips, Christopher and Roy, Sujit and Ankur, Kumar and Ramachandran, Rahul},
doi = {10.57967/hf/0956},
month = aug,
title = {{HLS Foundation Burnscars Dataset}},
url = {https://huggingface.co/ibm-nasa-geospatial/hls_burn_scars},
year = {2023}
}
"""

_DESCRIPTION = """\
This dataset contains Harmonized Landsat and Sentinel-2 imagery of burn scars and the associated masks for the years 2018-2021 over the contiguous United States. There are 804 512x512 scenes. Its primary purpose is for training geospatial machine learning models.
"""

_HOMEPAGE = "https://huggingface.co/datasets/ibm-nasa-geospatial/hls_burn_scars"

_LICENSE = "cc-by-4.0"

_URLS = {
    "hls_burn_scars": {
        "train/val": "https://huggingface.co/datasets/ibm-nasa-geospatial/hls_burn_scars/resolve/main/hls_burn_scars.tar.gz"
    }
}


class HLSBurnScars(datasets.GeneratorBasedBuilder):
    """HLS Burn Scar Scenes dataset."""

    VERSION = datasets.Version("0.0.1")

    BUILDER_CONFIGS = [
        datasets.BuilderConfig(name="hls_burn_scars", version=VERSION, description=_DESCRIPTION),
    ]

    def _info(self):
        features = datasets.Features(
            {
                "image": datasets.Image(),
                "annotation": datasets.Image(),
            }
        )
        return datasets.DatasetInfo(
            description=_DESCRIPTION,
            features=features,
            homepage=_HOMEPAGE,
            license=_LICENSE,
            citation=_CITATION,
        )

    def _split_generators(self, dl_manager):
        urls = _URLS[self.config.name]

        data_dirs = dl_manager.download_and_extract(urls)
        train_data = os.path.join(data_dirs["train/val"], "training")
        val_data = os.path.join(data_dirs["train/val"], "validation")

        # Note: no held-out test set ships with the archive, so the
        # validation directory is reused for the TEST split.
        return [
            datasets.SplitGenerator(
                name=datasets.Split.TRAIN,
                gen_kwargs={
                    "data": train_data,
                    "split": "training",
                },
            ),
            datasets.SplitGenerator(
                name=datasets.Split.VALIDATION,
                gen_kwargs={
                    "data": val_data,
                    "split": "validation",
                },
            ),
            datasets.SplitGenerator(
                name=datasets.Split.TEST,
                gen_kwargs={
                    "data": val_data,
                    "split": "testing",
                },
            ),
        ]

    def _generate_examples(self, data, split):
        # Scenes are named "*_merged.tif"; each has a matching "*.mask.tif"
        # annotation. Sorting makes example order deterministic across runs.
        files = sorted(glob(f"{data}/*_merged.tif"))
        for idx, filename in enumerate(files):
            annotation_filename = filename.replace("_merged.tif", ".mask.tif")
            yield idx, {
                "image": {"path": filename},
                "annotation": {"path": annotation_filename},
            }
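The generator above pairs each scene with its mask purely by filename convention: `<stem>_merged.tif` maps to `<stem>.mask.tif`. A minimal, dependency-free sketch of that convention (the example filename is hypothetical):

```python
def mask_for_scene(scene_path: str) -> str:
    """Return the annotation path for a '*_merged.tif' scene path,
    following the naming convention used by the loading script."""
    if not scene_path.endswith("_merged.tif"):
        raise ValueError(f"not a scene file: {scene_path}")
    # Swap the "_merged.tif" suffix for ".mask.tif".
    return scene_path[: -len("_merged.tif")] + ".mask.tif"

# Hypothetical example filename:
print(mask_for_scene("training/HLS.S30.T10SEH.2018190.v1.4_merged.tif"))
```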