
FrodoBots 2K Dataset

The FrodoBots 2K Dataset is a diverse collection of camera footage, GPS, IMU, audio recordings & human control data collected from ~2,000 hours of tele-operated sidewalk robots driving in 10+ cities.

This dataset is collected from Earth Rovers, a global scavenger hunt "Drive to Earn" game developed by FrodoBots Lab.

Please join our Discord for discussions with fellow researchers/makers!

  • If you're interested in contributing driving data, you can buy your own unit(s) from our online shop (US$299 per unit) and start driving around your neighborhood (& earn in-game points in the process)!

  • If you're interested in testing out your AI models on our existing fleet of Earth Rovers in various cities or your own Earth Rover, feel free to DM Michael Cho on Twitter/X to gain access to our Remote Access SDK.

  • If you're interested in playing the game (ie. remotely driving an Earth Rover), you may join as a gamer at Earth Rovers School.

Earth Rovers: Drive to Earn Game with Sidewalk Robots

Dataset Summary

There are 7 types of data associated with a typical Earth Rovers drive, as follows:

  1. Control data: Gamer's control inputs, captured at a nominal frequency of 10 Hz, along with RPM (revolutions per minute) readings for each of the robot's 4 wheels.

  2. GPS data: Latitude, longitude, and timestamp info collected at 1 Hz during the robot drives.

  3. IMU (Inertial Measurement Unit) data: 9-DOF sensor data, comprising accelerometer (100 Hz), gyroscope (1 Hz), and magnetometer (1 Hz) readings, along with timestamps.

  4. Rear camera video: Footage from the robot's rear-facing camera, at a typical frame rate of 20 FPS and a resolution of 540x360.

  5. Front camera video: Footage from the robot's front-facing camera, at a typical frame rate of 20 FPS and a resolution of 1024x576.

  6. Microphone: Audio recorded by the robot's microphone, at a 16 kHz sample rate, mono (single channel).

  7. Speaker: Audio of the robot's speaker output (i.e. the gamer's microphone), also at a 16 kHz sample rate, mono.

Note: As of 12 May 2024, ~1,300 hours are ready for download. The remaining ~700 hours are still undergoing data cleaning and will be available for download by the end of May or early June 2024.
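Because the streams above arrive at different rates (10 Hz control, 1 Hz GPS, 100 Hz accelerometer), a common first step is aligning records by timestamp. A minimal, dependency-free sketch with made-up records — the field names (`timestamp`, `lat`, `lon`, `rpm`) are illustrative, not the dataset's actual schema:

```python
def nearest_fix(records, t):
    """Return the record whose 'timestamp' is closest to time t (seconds)."""
    return min(records, key=lambda r: abs(r["timestamp"] - t))

# Hypothetical 1 Hz GPS stream
gps = [
    {"timestamp": 0.0, "lat": 37.4419, "lon": -122.1430},
    {"timestamp": 1.0, "lat": 37.4420, "lon": -122.1431},
    {"timestamp": 2.0, "lat": 37.4421, "lon": -122.1432},
]

# Hypothetical 10 Hz control sample: steering inputs plus per-wheel RPM
control = {"timestamp": 1.34, "linear": 0.8, "angular": -0.1,
           "rpm": [120, 118, 121, 119]}

# Attach the nearest GPS fix to the control sample (the one at t=1.0)
fix = nearest_fix(gps, control["timestamp"])
print(fix["lat"], fix["lon"])
```

For large sessions you would sort by timestamp and use `bisect` instead of a linear scan, but the nearest-neighbor idea is the same.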

Video Walkthrough

Our cofounder, Michael Cho, walks through the core components of the dataset and discusses the latency issues surrounding data collection.

In total, there were 9,000+ individual driving sessions recorded. The chart below shows the distribution of individual driving session duration.

[Chart: histogram of session duration]

These drives were done with Earth Rovers in 10+ cities. The chart below shows the distribution of recorded driving duration in the various cities.

[Chart: cumulative session hours by city]
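Charts like the session-duration histogram above reduce to simple binning once per-session durations are extracted. A dependency-free sketch — the durations here are made up for illustration:

```python
def histogram(values, bin_width):
    """Bucket values into fixed-width bins; returns {bin_start: count}."""
    counts = {}
    for v in values:
        start = int(v // bin_width) * bin_width
        counts[start] = counts.get(start, 0) + 1
    return dict(sorted(counts.items()))

# Hypothetical session durations in minutes
durations = [4.2, 12.5, 33.0, 7.8, 55.1, 21.3, 9.9, 41.7]

# 10-minute bins: {0: 3, 10: 1, 20: 1, 30: 1, 40: 1, 50: 1}
print(histogram(durations, 10))
```

With the real metadata you would feed the per-session durations into this (or into `matplotlib.pyplot.hist` directly) to reproduce the chart.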

About FrodoBots

FrodoBots is a project aiming to crowdsource the world's largest real-world teleoperation datasets with robotic gaming.

We have three core theses:

  1. Robotic gaming can be a thing: It is possible to create fun gaming experiences where gamers control robots remotely to complete missions in real life.
  2. Affordable robots are just as useful for collecting Embodied AI research data: We design our robots to be like "toys", so that as many people as possible can afford to buy one and play with it.
  3. DePIN can scale this project: We can create a global community of robot hardware owners/operators by incentivizing them with well-designed tokenomics, taking best practices from other DePIN (Decentralized Physical Infrastructure Network) projects.
Testing in Madrid
Testing in London
Testing in Stockholm
Testing in Wuhan
Testing in Liuzhou
Testing in Berlin
Game Controller + Browser = Control FrodoBots Anywhere
Chatting with locals via built-in microphone/speaker
Zero turning radius = Easy maneuvering
Night driving test in Palo Alto
Driving through rain
Road crossing in Palo Alto
Earth Rover being tested in Stanford University campus

Motivations for open-sourcing the dataset

The team behind FrodoBots is focused on building a real-world video gaming experience using real-life robots (we call it "robotic gaming"). The accompanying dataset is a by-product of gamers playing the game.

By sharing this dataset with the research community, we hope to see new innovations that can (1) take advantage of this dataset & (2) leverage our existing fleet of community-sourced robots (via our Remote Access SDK) as a platform for testing SOTA Embodied AI models in the real world.

Help needed!

We are a very small team with limited experience across the downstream data-pipeline and AI-research skill sets involved. One thing we do have is lots of real-world data.

Please reach out or join our Discord if you have any feedback or would like to contribute to our efforts, especially on the following:

  • Data cleaning: We have far more data than what we've open-sourced in this dataset, primarily because we struggle with various data cleaning tasks.
  • Data analytics: We have produced a couple of charts, but that's about it.
  • Data annotations: We have open-sourced the raw files, but it would be great to work with teams with data annotation know-how to further augment the current dataset.
  • Data visualization: A lot more can be done to visualize these raw inputs (e.g. layering timestamped data on top of the video footage).
  • Data anonymization: We'd like to build various anonymization steps (e.g. face blurring) into future releases. We attempted this but struggled with downstream data manipulation issues (e.g. dropped frames, lower video resolution, etc.).
  • Data streaming & hosting: If this project continues to scale, we'll have millions of hours of such data in the future. We will need help with storage/streaming.


Download the FrodoBots dataset using the links in this CSV file.
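Fetching the files listed in that CSV can be scripted with the standard library. A hedged sketch, assuming the CSV has a `url` column (the actual column name and layout may differ — check the file first):

```python
import csv
import io
import os
import urllib.request


def read_links(csv_text, url_field="url"):
    """Parse the link CSV and return the list of file URLs."""
    return [row[url_field] for row in csv.DictReader(io.StringIO(csv_text))]


def download_all(urls, dest="frodobots_data"):
    """Fetch each URL into dest/, skipping files already present."""
    os.makedirs(dest, exist_ok=True)
    for url in urls:
        path = os.path.join(dest, os.path.basename(url))
        if not os.path.exists(path):
            urllib.request.urlretrieve(url, path)


# Example with a hypothetical two-row CSV (not the real link file)
sample = ("url\n"
          "https://example.com/ride_001.zip\n"
          "https://example.com/ride_002.zip\n")
print(read_links(sample))
```

For the real archives, a resumable downloader (e.g. `wget -c` or `huggingface_hub`) is likely a better fit than `urlretrieve`, since individual rides can be large.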

Helper code

We've provided a helpercode.ipynb file that will hopefully serve as a quick-start for researchers to play around with the dataset.
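As one example of the kind of task a quick-start covers: aligning the 20 FPS video with the 10 Hz control stream is simple index arithmetic, assuming both start at session time zero. The constants come from the dataset summary above; the function names are ours:

```python
FPS = 20          # typical front/rear camera frame rate (from the summary)
CONTROL_HZ = 10   # nominal control input rate (from the summary)


def frame_time(frame_idx, fps=FPS):
    """Timestamp (seconds from session start) of a given video frame."""
    return frame_idx / fps


def control_index(t, hz=CONTROL_HZ):
    """Index of the control sample nearest to time t."""
    return round(t * hz)


# Frame 48 of a 20 FPS video lands at 2.4 s, i.e. control sample 24
t = frame_time(48)
print(t, control_index(t))
```

Real sessions may have dropped frames and clock drift between streams, so for serious work you would align on recorded timestamps rather than pure index math.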


The team at FrodoBots Lab created this dataset, including Michael Cho, Sam Cho, Aaron Tung, Niresh Dravin & Santiago Pravisani.
