Monado SLAM Datasets

The Monado SLAM datasets (MSD) are egocentric visual-inertial SLAM datasets recorded to improve the Basalt-based inside-out tracking component of the Monado project. They are released under the permissive CC-BY 4.0 license, meaning you can use them for any purpose, including commercial ones, as long as you credit the original project. The creation of these datasets was supported by Collabora.

Monado is an open-source OpenXR runtime that you can use to make devices OpenXR compatible. Thanks to contributors in the community, it also provides drivers for various existing hardware, and it offers XR-related modules that these drivers can use. Inside-out head tracking is one of those modules and, while different tracking systems can be plugged in, the main one is a fork of Basalt. Creating a good open-source tracking solution requires a solid measurement pipeline to understand how changes in the system affect tracking quality, and for this reason the creation of these datasets was essential.

These datasets are very specific to the XR use case: they contain VI-SLAM footage recorded from devices such as VR headsets, and other devices like phones or AR glasses might be added in the future. They were made because existing SLAM datasets like EuRoC or TUM-VI are either not specific enough for XR or do not have sufficiently permissive licenses.

For questions or comments, you can use the Hugging Face Community, join Monado's Discord server and ask in the #slam channel, or send an email to mateo.demayo@collabora.com.

List of sequences

Valve Index datasets

These datasets were recorded using a Valve Index with the vive driver in Monado. Ground truth comes from three lighthouses tracking the headset through the proprietary OpenVR implementation provided by SteamVR. The exact Monado commit used at the time of recording is a4e7765d. The datasets are in the ASL dataset format, the same as the EuRoC datasets. Besides the main EuRoC-format files, we provide some extra files with raw timestamp data for exploring real-time timestamp alignment techniques.
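
To get oriented, here is a minimal Python sketch of loading the main EuRoC-style CSV files of one sequence; the sequence path is a hypothetical placeholder, and the column meanings are described in the sections below.

```python
import numpy as np
from pathlib import Path

seq = Path("MIO01")  # hypothetical path to one extracted sequence

# EuRoC-style CSVs start with a commented header line ("#timestamp [ns],...").
cam_csvs = sorted(seq.glob("cam*/data.csv"))
imu = np.genfromtxt(seq / "imu0" / "data.csv", delimiter=",", comments="#")
gt = np.genfromtxt(seq / "gt" / "data.csv", delimiter=",", comments="#")

print("cameras found:", [c.parent.name for c in cam_csvs])
print("imu samples:", imu.shape[0], "| ground-truth poses:", gt.shape[0])
```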

The datasets are post-processed to minimize the special treatment SLAM systems need: camera-IMU and ground-truth-IMU timestamp alignment, IMU alignment and bias calibration have been applied, the lighthouse-tracked pose has been converted to an IMU pose, and so on. Most of the post-processing was done with the Basalt calibration and alignment tools, as well as the xrtslam-metrics scripts for Monado tracking. The post-processing is documented in this video, which goes through making the MIPB08 dataset ready for use starting from its raw version.

Data

Camera samples

In the vive driver from Monado, we don't have direct access to the camera device timestamps but only to V4L2 timestamps. These are not exactly hardware timestamps and have some offset with respect to the device clock in which the IMU samples are timestamped.

The camera frames can be found in the camX/data directory as PNG files whose names correspond to their V4L2 timestamps. The camX/data.csv file contains the aligned timestamp of each frame. The camX/data.extra.csv file additionally contains the original V4L2 timestamp and the "host timestamp", which is the time at which the host computer had the frame ready to use after USB transmission. By separating arrival time from exposure time, algorithms can be made more robust for real-time operation.
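
As a small illustration, the sketch below compares the aligned timestamps with the raw V4L2 ones and estimates the frame period. It is a rough example: the column order of data.extra.csv is assumed here rather than taken from its header, so check the file first.

```python
import csv
import numpy as np

def load_ns_column(path, col):
    """Read one integer nanosecond column from an EuRoC-style CSV, skipping '#' headers."""
    with open(path) as f:
        rows = [r for r in csv.reader(f) if r and not r[0].startswith("#")]
    return np.array([int(r[col]) for r in rows], dtype=np.int64)

t_aligned = load_ns_column("cam0/data.csv", 0)
t_v4l2 = load_ns_column("cam0/data.extra.csv", 1)  # assumed: second column holds the V4L2 time

period_ms = np.diff(t_aligned) / 1e6
offset_ms = (t_aligned - t_v4l2) / 1e6
print(f"median frame period: {np.median(period_ms):.2f} ms (54 fps is about 18.5 ms)")
print(f"median aligned-vs-V4L2 offset: {np.median(offset_ms):.2f} ms")
```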

The cameras of the Valve Index have global shutters, a resolution of 960×960, and stream at 54 fps with auto exposure enabled. While the cameras of the Index are RGB, you will find only grayscale images in these datasets: the original images come in YUYV422 format, but only the luma component is stored.
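
For reference, this is roughly how the stored luma plane relates to a packed YUYV 4:2:2 buffer (a sketch only; the datasets already ship the grayscale PNGs, so you do not need this to use them):

```python
import numpy as np

def yuyv422_to_gray(buf: bytes, width: int, height: int) -> np.ndarray:
    # Packed YUYV orders bytes as Y0 U0 Y1 V0 ..., so every row holds
    # width*2 bytes and the luma samples sit at the even byte offsets.
    frame = np.frombuffer(buf, dtype=np.uint8).reshape(height, width * 2)
    return frame[:, 0::2].copy()
```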

For each dataset, the camera timestamps are aligned to the IMU timestamps by running visual-only odometry with Basalt on a 30-second subset of the dataset. The resulting trajectory is then fed to the basalt_time_alignment tool, which matches the rotational velocities of the trajectory against the gyroscope samples and returns the offset in nanoseconds. That correction is then applied to the dataset. Refer to the post-processing walkthrough video for more details.
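
The following is a rough, hypothetical sketch of the idea behind that step (not the actual basalt_time_alignment implementation): derive the rotational speed of the visual-only trajectory, compare it with the gyroscope magnitude, and pick the lag with the highest cross-correlation. The function name and inputs are assumptions for illustration.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def estimate_time_offset_ns(traj_t_ns, traj_quat_xyzw, gyro_t_ns, gyro_xyz,
                            grid_dt_ns=1_000_000):
    # Rotational speed of the trajectory from consecutive relative rotations.
    rot = Rotation.from_quat(traj_quat_xyzw)
    rel = rot[:-1].inv() * rot[1:]
    dt_s = np.diff(traj_t_ns) * 1e-9
    traj_speed = np.linalg.norm(rel.as_rotvec(), axis=1) / dt_s
    traj_mid_t = (np.asarray(traj_t_ns)[:-1] + np.asarray(traj_t_ns)[1:]) / 2

    gyro_speed = np.linalg.norm(gyro_xyz, axis=1)

    # Resample both signals onto a common uniform grid and cross-correlate.
    t0 = max(traj_mid_t[0], gyro_t_ns[0])
    t1 = min(traj_mid_t[-1], gyro_t_ns[-1])
    grid = np.arange(t0, t1, grid_dt_ns)
    a = np.interp(grid, traj_mid_t, traj_speed)
    b = np.interp(grid, gyro_t_ns, gyro_speed)
    a -= a.mean()
    b -= b.mean()
    lag = np.argmax(np.correlate(a, b, mode="full")) - (len(b) - 1)
    # The sign convention (which stream the offset is added to) should be
    # validated against a sequence with a known offset.
    return int(lag * grid_dt_ns)
```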

IMU samples

The IMU timestamps are device timestamps and come at about 1000 Hz. The imu0/data.raw.csv file contains the raw measurements without any axis scale/misalignment or bias correction. imu0/data.csv has the scale/misalignment and bias corrections applied so that the SLAM system can ignore them. imu0/data.extra.csv contains the arrival time of each IMU sample at the host computer, for algorithms that want to adapt themselves to real-time operation.
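
For example, a quick sanity check of the sample rate and of how large the applied corrections are (assuming the EuRoC column order of timestamp, gyro xyz, accel xyz, and that both files share the same rows):

```python
import numpy as np

raw = np.genfromtxt("imu0/data.raw.csv", delimiter=",", comments="#")
cor = np.genfromtxt("imu0/data.csv", delimiter=",", comments="#")

rate_hz = 1e9 / np.median(np.diff(cor[:, 0]))
gyro_delta = np.abs(cor[:, 1:4] - raw[:, 1:4]).mean(axis=0)
accel_delta = np.abs(cor[:, 4:7] - raw[:, 4:7]).mean(axis=0)
print(f"~{rate_hz:.0f} Hz | mean |corrected-raw| gyro: {gyro_delta} accel: {accel_delta}")
```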

Ground truth information

The ground truth setup consists of three Lighthouse 2.0 base stations and a SteamVR session providing tracking data to Monado through the OpenVR API. While not as precise as MoCap tracking systems like OptiTrack or Vicon, it should still provide good accuracy and precision, close to the 1 mm range. There are different studies of the accuracy of SteamVR tracking that you can check out, like this, this, or this. Once a tracking system gets close to millimeter accuracy, these datasets will no longer be as useful for improving it.

The raw ground truth data is stored in gt/data.raw.csv. OpenVR does not provide timestamps, so the recorded timestamps are from when the host asks OpenVR for the latest pose with a call to GetDeviceToAbsoluteTrackingPose. The poses contained in this file are not of the IMU but of the headset origin as interpreted by SteamVR, which usually sits between the eyes and faces towards the displays. The file gt/data.csv corrects each entry of the previous file with timestamps aligned to the IMU clock and poses of the IMU instead of the headset origin.
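
For illustration, this is the kind of rigid-transform correction already baked into gt/data.csv; the extrinsic below is a hypothetical placeholder, not the real headset-origin-to-IMU transform.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def to_matrix(pos_xyz, quat_wxyz):
    """Build a 4x4 pose matrix from a position and a wxyz quaternion (EuRoC order)."""
    T = np.eye(4)
    T[:3, :3] = Rotation.from_quat(np.roll(quat_wxyz, -1)).as_matrix()  # scipy wants xyzw
    T[:3, 3] = pos_xyz
    return T

T_world_origin = to_matrix([0.1, 1.5, 0.3], [1.0, 0.0, 0.0, 0.0])  # e.g. one gt/data.raw.csv row
T_origin_imu = to_matrix([0.0, 0.0, 0.0], [1.0, 0.0, 0.0, 0.0])    # placeholder extrinsic
T_world_imu = T_world_origin @ T_origin_imu                        # pose of the IMU in the world
```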

Calibration

There are multiple calibration datasets in the MIC_calibration directory, some focused on camera calibration and some on IMU calibration. See the README.md file in there for more information on what each sequence is.

In the MI_valve_index/extras directory you can find the following files:

  • calibration.json: Calibration file produced with the basalt_calibrate_imu tool from the MIC01_camcalib1 and MIC04_imucalib1 datasets, with the camera-IMU time offset and IMU bias/misalignment info removed so that it works by default with all the datasets, which are fully post-processed and don't require those fields (see the loading sketch after this list).
  • calibration.extra.json: Same as calibration.json but with the cam-IMU time offset and IMU bias and misalignment information filled in.
  • factory.json: JSON file exposed by the headset's firmware with information about the device. It includes camera and display calibration, as well as other data that might be of interest. It is not used here but is included for completeness' sake.
  • other_calibrations/: Calibration results obtained from the other calibration datasets, shown for comparison and to verify that all of them have similar values. Each MICXX_camcalibY entry has a camera-only calibration produced with the basalt_calibrate tool, while the corresponding MICXX_imucalibY entry uses it as a starting point and has the basalt_calibrate_imu calibration results.
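
A small sketch of loading calibration.json; the field names (value0, intrinsics, T_imu_cam, ...) follow Basalt's calibration schema as far as I can tell, so verify them against the actual file:

```python
import json

with open("MI_valve_index/extras/calibration.json") as f:
    calib = json.load(f)["value0"]  # assumed top-level key used by Basalt

for i, intr in enumerate(calib["intrinsics"]):
    print(f"cam{i}: {intr['camera_type']} -> {intr['intrinsics']}")

for i, T in enumerate(calib["T_imu_cam"]):
    print(f"cam{i} camera-to-IMU transform: {T}")
```
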
Camera model

By default, the calibration.json file provides parameters k1, k2, k3, and k4 for the Kannala-Brandt camera model with fish-eye distortion (also known as OpenCV's fish-eye).
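
As a reference, here is a small sketch of that projection model (the standard Kannala-Brandt / OpenCV fisheye formulation), taking fx, fy, cx, cy and k1..k4 from the per-camera intrinsics of calibration.json:

```python
import numpy as np

def project_kb4(points_cam, fx, fy, cx, cy, k1, k2, k3, k4):
    """Project Nx3 points given in the camera frame to pixel coordinates."""
    x, y, z = points_cam[:, 0], points_cam[:, 1], points_cam[:, 2]
    r = np.sqrt(x * x + y * y)
    theta = np.arctan2(r, z)
    theta2 = theta * theta
    # d(theta) = theta + k1*theta^3 + k2*theta^5 + k3*theta^7 + k4*theta^9
    d = theta * (1 + theta2 * (k1 + theta2 * (k2 + theta2 * (k3 + theta2 * k4))))
    scale = d / np.maximum(r, 1e-9)  # points on the optical axis project to (cx, cy)
    return np.stack([fx * scale * x + cx, fy * scale * y + cy], axis=1)
```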

Calibrations with other camera models might be added later on; otherwise, you can use the calibration sequences for custom calibrations.

IMU model

For the default calibration.json, where all IMU model parameters are zero, you can ignore any model and use the measurements in imu0/data.csv directly. If instead you want to use the raw measurements from imu0/data.raw.csv, you will need to apply the Basalt accelerometer and gyroscope models, which use a misalignment-scale correction matrix together with a constant initial bias. The random walk and white noise parameters were not computed; reasonable default values are used instead.
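
A minimal sketch of applying such a correction yourself, assuming it takes the form corrected = M @ (raw - bias); the matrices and biases below are placeholders, and the exact convention should be checked against calibration.extra.json and Basalt's documentation.

```python
import numpy as np

raw = np.genfromtxt("imu0/data.raw.csv", delimiter=",", comments="#")
t_ns = raw[:, 0].astype(np.int64)
gyro_raw, accel_raw = raw[:, 1:4], raw[:, 4:7]  # assumed EuRoC order: gyro xyz, accel xyz

M_gyro, b_gyro = np.eye(3), np.zeros(3)        # placeholders for the calibrated values
M_accel, b_accel = np.eye(3), np.zeros(3)

gyro = (M_gyro @ (gyro_raw - b_gyro).T).T      # corrected angular velocity, rad/s
accel = (M_accel @ (accel_raw - b_accel).T).T  # corrected acceleration, m/s^2
```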

Post-processing walkthrough

If you are interested in the step-by-step post-processing procedure, below is a video detailing it for the MIPB08 dataset.

Post-processing walkthrough video

Sequences

  • MIC_calibration: Calibration sequences recording this calibration target from Kalibr, whose squares have 3 cm sides. Some sequences focus on camera calibration, covering the image planes of both stereo cameras, while others focus on IMU calibration, properly exciting all six components of the IMU.
  • MIP_playing: Datasets in which the user plays a particular VR game on SteamVR while Monado records the data.
    • MIPB_beat_saber: Different songs played at different speeds. The fitbeat song requires a lot of head movement, while MIPB08 is a long 40-minute dataset with many levels played.
    • MIPP_pistol_whip: A shooting and music game; each dataset is a different level/song.
    • MIPT_thrill_of_the_fight: A boxing game.
  • MIO_others: Other datasets that might be useful. They include play-pretend scenarios in which the user pretends to play a particular game, inspection and scanning/mapping of the room, some very short and lightweight datasets for quick testing, and some datasets with a lot of movement around the environment.

Evaluation

These are the results of running the current Basalt-based Monado tracker on the dataset sequences.

| Seq. | Avg. time* | Avg. feature count | ATE (m) | RTE 100ms (m)** | SDM 0.01m (m/m)*** |
|---|---|---|---|---|---|
| MIO01 | 10.04 ± 1.43 | [36 23] ± [28 18] | 0.605 ± 0.342 | 0.035671 ± 0.033611 | 0.4246 ± 0.5161 |
| MIO02 | 10.41 ± 1.48 | [32 18] ± [25 16] | 1.182 ± 0.623 | 0.063340 ± 0.059176 | 0.4681 ± 0.4329 |
| MIO03 | 10.24 ± 1.37 | [47 26] ± [26 16] | 0.087 ± 0.033 | 0.006293 ± 0.004259 | 0.2113 ± 0.2649 |
| MIO04 | 9.47 ± 1.08 | [27 16] ± [25 16] | 0.210 ± 0.100 | 0.013121 ± 0.010350 | 0.3086 ± 0.3715 |
| MIO05 | 9.95 ± 1.01 | [66 34] ± [33 21] | 0.040 ± 0.016 | 0.003188 ± 0.002192 | 0.1079 ± 0.1521 |
| MIO06 | 9.65 ± 1.06 | [44 28] ± [33 22] | 0.049 ± 0.019 | 0.010454 ± 0.008578 | 0.2620 ± 0.3684 |
| MIO07 | 9.63 ± 1.16 | [46 26] ± [30 19] | 0.019 ± 0.008 | 0.002442 ± 0.001355 | 0.0738 ± 0.0603 |
| MIO08 | 9.74 ± 0.87 | [29 22] ± [18 16] | 0.059 ± 0.021 | 0.007167 ± 0.004657 | 0.1644 ± 0.3433 |
| MIO09 | 9.94 ± 0.72 | [44 29] ± [14 8] | 0.006 ± 0.003 | 0.002940 ± 0.002024 | 0.0330 ± 0.0069 |
| MIO10 | 9.48 ± 0.82 | [35 21] ± [18 10] | 0.016 ± 0.009 | 0.004623 ± 0.003310 | 0.0620 ± 0.0340 |
| MIO11 | 9.34 ± 0.79 | [32 20] ± [19 10] | 0.024 ± 0.010 | 0.007255 ± 0.004821 | 0.0854 ± 0.0540 |
| MIO12 | 11.05 ± 2.20 | [43 23] ± [31 19] | 0.420 ± 0.160 | 0.005298 ± 0.003603 | 0.1546 ± 0.2641 |
| MIO13 | 10.47 ± 1.89 | [35 21] ± [24 18] | 0.665 ± 0.290 | 0.026294 ± 0.022790 | 1.0180 ± 1.0126 |
| MIO14 | 9.27 ± 1.03 | [49 31] ± [30 21] | 0.072 ± 0.028 | 0.002779 ± 0.002487 | 0.1657 ± 0.2409 |
| MIO15 | 9.75 ± 1.16 | [52 26] ± [29 16] | 0.788 ± 0.399 | 0.011558 ± 0.010541 | 0.6906 ± 0.6876 |
| MIO16 | 9.72 ± 1.26 | [33 17] ± [25 15] | 0.517 ± 0.135 | 0.013268 ± 0.011355 | 0.4397 ± 0.7167 |
| MIPB01 | 10.28 ± 1.25 | [63 46] ± [34 24] | 0.282 ± 0.109 | 0.006797 ± 0.004551 | 0.1401 ± 0.1229 |
| MIPB02 | 9.88 ± 1.08 | [55 37] ± [30 20] | 0.247 ± 0.097 | 0.005065 ± 0.003514 | 0.1358 ± 0.1389 |
| MIPB03 | 10.21 ± 1.12 | [66 44] ± [32 23] | 0.186 ± 0.103 | 0.005938 ± 0.004261 | 0.1978 ± 0.3590 |
| MIPB04 | 9.58 ± 1.02 | [51 37] ± [24 17] | 0.105 ± 0.060 | 0.004822 ± 0.003428 | 0.0652 ± 0.0555 |
| MIPB05 | 9.97 ± 0.97 | [73 48] ± [32 23] | 0.039 ± 0.017 | 0.004426 ± 0.002828 | 0.0826 ± 0.1313 |
| MIPB06 | 9.95 ± 0.85 | [58 35] ± [32 21] | 0.050 ± 0.022 | 0.004164 ± 0.002638 | 0.0549 ± 0.0720 |
| MIPB07 | 10.07 ± 1.00 | [73 47] ± [31 20] | 0.064 ± 0.038 | 0.004984 ± 0.003170 | 0.0785 ± 0.1411 |
| MIPB08 | 9.97 ± 1.08 | [71 47] ± [36 24] | 0.636 ± 0.272 | 0.004066 ± 0.002556 | 0.0740 ± 0.0897 |
| MIPP01 | 10.03 ± 1.21 | [36 22] ± [21 15] | 0.559 ± 0.241 | 0.009227 ± 0.007765 | 0.3472 ± 0.9075 |
| MIPP02 | 10.19 ± 1.20 | [42 22] ± [22 15] | 0.257 ± 0.083 | 0.011046 ± 0.010201 | 0.5014 ± 0.7665 |
| MIPP03 | 10.13 ± 1.24 | [37 20] ± [23 15] | 0.260 ± 0.101 | 0.008636 ± 0.007166 | 0.3205 ± 0.5786 |
| MIPP04 | 9.74 ± 1.09 | [38 23] ± [22 16] | 0.256 ± 0.144 | 0.007847 ± 0.006743 | 0.2586 ± 0.4557 |
| MIPP05 | 9.71 ± 0.84 | [37 24] ± [21 15] | 0.193 ± 0.086 | 0.005606 ± 0.004400 | 0.1670 ± 0.2398 |
| MIPP06 | 9.92 ± 3.11 | [37 21] ± [21 14] | 0.294 ± 0.136 | 0.009794 ± 0.008873 | 0.4016 ± 0.5648 |
| MIPT01 | 10.78 ± 2.06 | [68 44] ± [33 23] | 0.108 ± 0.060 | 0.003995 ± 0.002716 | 0.7109 ± 13.3461 |
| MIPT02 | 10.85 ± 1.27 | [79 54] ± [39 28] | 0.198 ± 0.109 | 0.003709 ± 0.002348 | 0.0839 ± 0.1175 |
| MIPT03 | 10.80 ± 1.55 | [76 52] ± [42 30] | 0.401 ± 0.206 | 0.005623 ± 0.003694 | 0.1363 ± 0.1789 |
| AVG | 11.33 ± 1.83 | [49 23] ± [37 15] | 0.192 ± 0.090 | 0.009439 ± 0.007998 | 0.3247 ± 0.6130 |

| Seq. | Avg. time* | Avg. feature count | ATE (m) | RTE 100ms (m)** | SDM 0.01m (m/m)*** |
|---|---|---|---|---|---|
| MGO01 | 12.06 ± 2.10 | [19 16] ± [13 12] | 0.680 ± 0.249 | 0.022959 ± 0.019026 | 0.3604 ± 1.3031 |
| MGO02 | 11.20 ± 1.83 | [19 15] ± [19 16] | 0.556 ± 0.241 | 0.027931 ± 0.019074 | 0.3218 ± 0.4599 |
| MGO03 | 9.88 ± 1.92 | [22 16] ± [16 16] | 0.145 ± 0.041 | 0.013003 ± 0.008555 | 0.2433 ± 0.3512 |
| MGO04 | 9.43 ± 1.45 | [16 14] ± [16 16] | 0.261 ± 0.113 | 0.024674 ± 0.017380 | 0.3609 ± 0.4829 |
| MGO05 | 9.93 ± 1.71 | [39 40] ± [17 26] | 0.030 ± 0.011 | 0.004212 ± 0.002632 | 0.0621 ± 0.1044 |
| MGO06 | 10.40 ± 1.84 | [24 22] ± [18 18] | 0.111 ± 0.038 | 0.018013 ± 0.011398 | 0.2496 ± 0.2802 |
| MGO07 | 9.74 ± 1.54 | [30 24] ± [13 12] | 0.021 ± 0.010 | 0.005628 ± 0.003707 | 0.0992 ± 0.1538 |
| MGO08 | 9.42 ± 1.43 | [17 13] ± [11 8] | 0.027 ± 0.015 | 0.013162 ± 0.009729 | 0.1667 ± 0.4068 |
| MGO09 | 10.90 ± 1.70 | [39 34] ± [11 9] | 0.008 ± 0.004 | 0.006278 ± 0.004054 | 0.0738 ± 0.0492 |
| MGO10 | 9.31 ± 1.36 | [29 37] ± [14 17] | 0.008 ± 0.003 | 0.003496 ± 0.002333 | 0.0439 ± 0.0311 |
| MGO11 | 9.26 ± 1.08 | [30 22] ± [13 17] | 0.017 ± 0.006 | 0.006065 ± 0.004285 | 0.0687 ± 0.0604 |
| MGO12 | 9.33 ± 1.39 | [20 19] ± [17 19] | 0.610 ± 0.270 | 0.017372 ± 0.016246 | 0.7225 ± 10.7366 |
| MGO13 | 10.08 ± 1.98 | [18 17] ± [16 17] | 0.683 ± 0.211 | 0.025764 ± 0.017900 | 0.2542 ± 0.3324 |
| MGO14 | 10.00 ± 1.83 | [29 25] ± [17 21] | 0.070 ± 0.025 | 0.012013 ± 0.007674 | 0.1417 ± 0.1850 |
| MGO15 | 9.07 ± 1.39 | [9 7] ± [10 7] | 0.037 ± 0.016 | 0.003737 ± 0.003425 | 0.7053 ± 4.3405 |
| AVG | 10.00 ± 1.64 | [24 21] ± [15 15] | 0.218 ± 0.084 | 0.013620 ± 0.009828 | 0.2583 ± 1.2852 |

| Seq. | Avg. time* | Avg. feature count | ATE (m) | RTE 100ms (m)** | SDM 0.01m (m/m)*** |
|---|---|---|---|---|---|
| MOO01 | 7.58 ± 1.55 | [30 23] ± [21 20] | 0.281 ± 0.131 | 0.016662 ± 0.010451 | 0.2358 ± 0.3848 |
| MOO02 | 6.89 ± 1.65 | [27 21] ± [24 25] | 0.237 ± 0.101 | 0.015469 ± 0.009201 | 0.1710 ± 0.2281 |
| MOO03 | 7.33 ± 1.77 | [30 26] ± [21 24] | 0.177 ± 0.088 | 0.013521 ± 0.009276 | 0.2610 ± 0.6376 |
| MOO04 | 6.11 ± 1.35 | [22 14] ± [20 16] | 0.065 ± 0.026 | 0.009849 ± 0.005401 | 0.0889 ± 0.1166 |
| MOO05 | 7.04 ± 1.54 | [53 46] ± [20 30] | 0.018 ± 0.007 | 0.003070 ± 0.001838 | 0.0284 ± 0.0181 |
| MOO06 | 6.66 ± 1.58 | [38 35] ± [21 27] | 0.056 ± 0.028 | 0.008395 ± 0.005154 | 0.0847 ± 0.1033 |
| MOO07 | 6.38 ± 1.71 | [43 31] ± [16 21] | 0.013 ± 0.006 | 0.003422 ± 0.002073 | 0.0317 ± 0.0326 |
| MOO08 | 7.17 ± 1.65 | [25 19] ± [19 15] | 0.028 ± 0.015 | 0.011164 ± 0.006958 | 0.0939 ± 0.1051 |
| MOO09 | 8.31 ± 1.84 | [43 38] ± [19 17] | 0.004 ± 0.002 | 0.003284 ± 0.002181 | 0.0063 ± 0.0000 |
| MOO10 | 6.94 ± 1.43 | [38 21] ± [18 15] | 0.010 ± 0.005 | 0.003765 ± 0.002338 | 0.0440 ± 0.0232 |
| MOO11 | 6.66 ± 1.57 | [32 32] ± [18 22] | 0.019 ± 0.010 | 0.005102 ± 0.003253 | 0.0433 ± 0.0356 |
| MOO12 | 5.78 ± 1.40 | [32 34] ± [21 26] | 0.694 ± 0.329 | 0.008292 ± 0.007220 | 0.1275 ± 0.2512 |
| MOO13 | 6.12 ± 1.60 | [21 16] ± [22 19] | 0.501 ± 0.188 | 0.017042 ± 0.010342 | 0.1448 ± 0.1551 |
| MOO14 | 7.07 ± 1.32 | [26 19] ± [17 16] | 0.113 ± 0.058 | 0.007743 ± 0.004316 | 0.1130 ± 0.1661 |
| MOO15 | 6.51 ± 1.70 | [20 11] ± [15 6] | 0.629 ± 0.312 | 0.015308 ± 0.014007 | 0.7254 ± 0.3257 |
| MOO16 | 5.21 ± 1.08 | [23 28] ± [6 8] | 0.046 ± 0.022 | 0.001441 ± 0.001238 | 0.1750 ± 0.1788 |
| AVG | 6.74 ± 1.55 | [31 26] ± [19 19] | 0.181 ± 0.083 | 0.008971 ± 0.005953 | 0.1484 ± 0.1726 |
  • *: Average frame time, measured on an AMD Ryzen 7 5800X CPU with the pipeline fully saturated; frame times during real-time operation should be slightly lower.
  • **: RTE using a delta of 6 frames (≈111 ms at 54 fps).
  • ***: The SDM metric is similar to RTE; it represents the distance in meters drifted for each meter traveled in the dataset. The metric is implemented in the xrtslam-metrics project. A rough sketch of how the ATE column can be computed follows below.
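
For reference, here is a rough sketch of how an ATE number like the ones above can be computed (rigid alignment without scale, then position RMSE); xrtslam-metrics is the actual implementation used for these tables, and this is only meant to convey the idea.

```python
import numpy as np

def ate_rmse(est_xyz, gt_xyz):
    """ATE: rigidly align est to gt (Kabsch, no scale) and return the position RMSE.
    Assumes both trajectories are already time-associated and have equal length."""
    mu_e, mu_g = est_xyz.mean(axis=0), gt_xyz.mean(axis=0)
    E, G = est_xyz - mu_e, gt_xyz - mu_g
    U, _, Vt = np.linalg.svd(E.T @ G)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    R = Vt.T @ D @ U.T                      # rotation taking est into the gt frame
    err = np.linalg.norm((E @ R.T + mu_g) - gt_xyz, axis=1)
    return float(np.sqrt(np.mean(err ** 2)))
```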

License

This work is licensed under a Creative Commons Attribution 4.0 International License.
