
Dataset Metadata

Identification Information

Citation

  • Title: Aerial surveys of a sunflower crop’s lifecycle from April to September 2023
  • Originator: Sofia University - Faculty of Mathematics and Informatics, SAP LABS Bulgaria
  • Publication Date: 2023.11.08

Abstract

Efficient food production is shaping up to be one of the next frontiers for new technologies and solutions. One such prominent domain is the remote sensing ecosystem, and more precisely, technologies such as multispectral and hyperspectral sensing equipment. These devices are gradually moving from academia to industry, and their decreasing cost allows many new applications to emerge.

Multispectral drones are advanced unmanned aerial vehicles (UAVs) equipped with cameras or sensors capable of capturing imagery across multiple spectral bands. Unlike traditional RGB counterparts, they capture data not only within but also beyond the visible spectrum, such as near-infrared (NIR). This data can provide valuable insights for various applications, including agriculture, environmental monitoring, land surveying, and more. One of the main uses of multispectral drones in agriculture is the calculation of vegetation indices (NDVI, NDRE, etc.) and other indices that inform the farmer about crop development, stress, and related conditions. The latter can also serve as an indirect indicator of soil conditions and water distribution. This approach enables more accurate and detailed assessments compared to traditional visual inspections.

Similar multispectral data is provided by Earth observation satellites such as Sentinel-2; however, satellites are limited with respect to revisit time, spatial resolution and, most importantly, their inability to see through clouds. Therefore, multispectral drones can fill these operational gaps and provide more precise and timely data to farmers. However, to work simultaneously with satellite and drone data, analysts must have confidence in the precision and comparability of these two data sources (e.g., for NDVI). For example, the DJI P4 Multispectral images have slightly different band sensitivities compared with Sentinel-2, which may cause deviations in the index values. Another prominent problem is field illumination, which depends on the time of day and weather conditions. Even though the DJI P4 drone has a calibration sensor intended to compensate for deviations in the illuminating spectrum, to the best of our knowledge no public dataset exists that demonstrates the tolerance of deviations between, e.g., different drone footages or between the DJI P4 and Sentinel-2. Moreover, Sentinel-2 applies atmospheric corrections that may contribute to such deviations as well.

Machine learning models can be utilized to extract valuable insights from multispectral data in precision agriculture applications. By leveraging the rich information captured across multiple spectral bands, machine learning algorithms can analyze and interpret the data to provide actionable recommendations for farmers and agronomists, such as highlighting the areas with the most vegetation stress. Successful implementation of machine learning models for precision agriculture based on multispectral data requires high-quality datasets, which are currently scarce. Therefore, the collection of a high-quality multispectral dataset is a prerequisite for future machine learning experiments in the domain of precision farming.

For these reasons, our research team conducted multiple surveys, tracking the entire lifecycle of a sunflower field and gathering spectral data.
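The vegetation indices mentioned above are simple band ratios. As a minimal sketch, NDVI can be computed from a pair of reflectance arrays for the red and NIR bands (the function name, array inputs, and the `eps` divide-by-zero guard are illustrative assumptions, not part of the dataset's tooling):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """NDVI = (NIR - Red) / (NIR + Red); values near 1 indicate dense vegetation."""
    nir = np.asarray(nir, dtype=np.float64)
    red = np.asarray(red, dtype=np.float64)
    # eps guards against division by zero on pixels where both bands are 0
    return (nir - red) / (nir + red + eps)
```

The same pattern applies to NDRE, with the red-edge band substituted for red.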

Purpose

This dataset was developed as part of a research project investigating the capabilities and applications of drones and multispectral cameras in the agricultural domain. The provided data can be used for the following scenarios:

  1. Training models that rely on multispectral data sources.
  2. Improving existing algorithms in the computer vision domain.

Time Period of Content

  • Range of Dates/Times: 2023-04-25 to 2023-09-04

Data Quality Information

Composite images have been generated with DJI Terra, with 70% frontal and 60% side overlap. In some instances a survey was completed over the span of 2 days due to adverse environmental conditions. Although there was an effort to execute the surveys in a constant time window (morning and afternoon), this is not the case for some of the runs. The raw data is validated to be complete, representing the entirety of the observed field for every survey.

Horizontal Coordinate System

  • Geographic Coordinate System: EPSG:4326
    • Angular Unit: Decimal degrees
    • Datum: WGS 84
    • Prime Meridian: Greenwich
    • Domain: Raster

Entity and Attribute Information

Detailed Description

Entities

Data is organized into directories. Each directory corresponds to one survey and is named using the DD.MM.YYYY date format.

Each survey directory contains 2 subdirectories: raw and results. The results directory is the output of the DJI Terra processing of the raw data collected by the drone.

  • Contents:
    • raw
      • subdirectories for each flight executed to complete the survey
      • each subdirectory keeps the raw data for each sensing point on the drone's mission path
      • one point is represented by one JPG image and 5 grayscale TIF images, corresponding to each sensor of the drone
    • results
      • composite images, each derived from a single drone sensor; images follow the result_<Blue, Green, etc.> nomenclature
      • a .prj projection file for every composite image
      • a .tfw georeference file for every composite image
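The .tfw files listed above are plain-text world files: six lines defining the affine transform from pixel (column, row) coordinates to the map coordinates of the matching composite image. A minimal reader and pixel-to-geo mapping could look like this (the file name and helper names are hypothetical):

```python
def read_tfw(path):
    """Read a six-line world file: x-scale, y-rotation, x-rotation,
    y-scale (usually negative), then x and y of the upper-left pixel center."""
    with open(path) as fh:
        a, d, b, e, c, f = (float(line) for line in fh)
    return a, d, b, e, c, f

def pixel_to_geo(col, row, params):
    """Map a pixel (column, row) to map coordinates using world-file parameters."""
    a, d, b, e, c, f = params
    x = a * col + b * row + c
    y = d * col + e * row + f
    return x, y
```

Since the composites use EPSG:4326, the resulting x and y are longitude and latitude in decimal degrees.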

Composite image sample

Raw data images

All images are embedded with geo-referencing data, timestamps, image quality metrics, and camera properties.

The dataset holds additional metadata in two files:

  • field_shape.geojson - bounding box for the sunflower field
  • crop_details.txt - information about the crop
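field_shape.geojson can be read with the standard library alone. The sketch below extracts the field's bounding box, assuming the file is a FeatureCollection containing a single Polygon feature (the exact schema is not documented here, so treat that structure as an assumption):

```python
import json

def field_bounds(path):
    """Return (min_lon, min_lat, max_lon, max_lat) of the field polygon."""
    with open(path) as fh:
        gj = json.load(fh)
    # Assumes one Polygon feature; its first ring is the outer boundary
    ring = gj["features"][0]["geometry"]["coordinates"][0]
    lons = [pt[0] for pt in ring]
    lats = [pt[1] for pt in ring]
    return min(lons), min(lats), max(lons), max(lats)
```

Coordinates are in EPSG:4326, so the tuple can be passed directly to mapping tools that expect a lon/lat bounding box.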

Capture apparatus

Drone surveys are executed with a DJI Phantom 4 Multispectral drone. The drone uses the following sensors to capture data:

Sensors: Six 1/2.9” CMOS

Filters:

  • Blue (B): 450 nm ± 16 nm
  • Green (G): 560 nm ± 16 nm
  • Red (R): 650 nm ± 16 nm
  • Red edge (RE): 730 nm ± 16 nm
  • Near-infrared (NIR): 840 nm ± 26 nm
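For programmatic access, the filter table above can be captured as a small lookup of band centers and half-widths; the dictionary keys are illustrative names, not file-naming conventions from the dataset:

```python
# Band filter centers and half-widths in nm, as listed in the drone's specification
FILTERS = {
    "Blue":    (450, 16),
    "Green":   (560, 16),
    "Red":     (650, 16),
    "RedEdge": (730, 16),
    "NIR":     (840, 26),
}

def band_range(name):
    """Return the (min, max) wavelength in nm passed by a filter."""
    center, half_width = FILTERS[name]
    return center - half_width, center + half_width
```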

Lenses:

  • FOV (Field of View): 62.7°
  • Focal Length: 5.74 mm
  • Aperture: f/2.2

Software used for generating composite images: DJI Terra 3.6.8.

Metadata Reference Information

  • Metadata Contact:

    • Name: Pavel Genevski
    • Organization: SAP LABS Bulgaria
    • Position: Research expert
    • Email: pavel.genevski@sap.com
  • Metadata Date: 2023.11.08

  • Metadata Standard Name: FGDC Content Standard for Digital Geospatial Metadata

Additional Information

  • Keywords: agriculture, multispectral, crop, sunflower
  • Access Constraints: CC BY 4.0
  • Use Constraints: CC BY 4.0