
# Object Goal Navigation with Recursive Implicit Maps

## Cloning the repository

git clone --recurse-submodules https://huggingface.co/datasets/ShuoooooC/Navmamba_full onav_rim
cd onav_rim

## Environment option 1: Docker

docker pull shuochen0725/navmamba:2.10.0

# Create a container from the image, then inside it run:
conda activate navmamba

If the Docker environment does not work, install everything locally as described in the next section.

## Environment option 2: Local installation

  1. Environment preparation
conda remove -n navmamba --all
conda create -n navmamba python=3.10.13 cmake=3.14.0 
conda activate navmamba
git lfs install
git clone --recurse-submodules https://huggingface.co/datasets/ShuoooooC/Navmamba_full onav_rim
cd onav_rim
  2. Install Mamba and related packages
# Mamba Install 
conda install pytorch==1.12.0 torchvision==0.13.0 torchaudio==0.12.0 cudatoolkit=11.6 -c pytorch -c nvidia -c conda-forge
conda install cudatoolkit-dev -c conda-forge

cd offline_bc/models/causal-conv1d
pip install -e .

cd ../mamba-1p1p1/
pip install -e .
cd ../../..
  3. Install the packages for the Habitat module
# Habitat install
cd dependencies/habitat-sim
pip install -r requirements.txt
apt-get update || true
# These are fairly ubiquitous packages and your system likely has them already,
# but if not, let's get the essentials for EGL support:
apt-get install -y --no-install-recommends \
     libjpeg-dev libglm-dev libgl1-mesa-glx libegl1-mesa-dev mesa-utils xorg-dev freeglut3-dev g++
./build.sh --headless --bullet  # for headless systems

# avoid logging
export GLOG_minloglevel=2
export MAGNUM_LOG=quiet
cd ../..
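
The two logging variables exported above can also be set from Python, as long as this happens before habitat-sim is imported; a minimal sketch (the variable names are taken from the exports above, the import is commented out since it needs the built library):

```python
import os

# Set these before importing habitat_sim, otherwise the C++ loggers
# are already configured. Values mirror the shell exports above.
os.environ["GLOG_minloglevel"] = "2"  # glog: warnings and errors only
os.environ["MAGNUM_LOG"] = "quiet"    # Magnum engine: suppress info output

# import habitat_sim  # uncomment once habitat-sim is built
```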
  4. Install the other required packages
# install CLIP
pip install ftfy regex tqdm
pip install git+https://github.com/openai/CLIP.git
# or, if the install from GitHub fails, use the PyPI package (here via a mirror):
pip install openai-clip -i https://pypi.tuna.tsinghua.edu.cn/simple
# install the codebase
pip install -r requirements.txt
python setup.py develop --all
  5. Unzip the datasets
cd requires_datasets
unzip depth_fts.zip
unzip meta_infos_lmdb.zip
unzip objectnav_mp3d_70k.zip
unzip objectnav_mp3d.zip
unzip rgb_fts.zip
unzip sem_fts.zip
unzip scene_mp3d.zip
unzip test_assets.zip
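
If `unzip` is not available, the same extraction can be sketched with the Python standard library; `requires_datasets` is the folder from the steps above:

```python
import zipfile
from pathlib import Path

def extract_all(dataset_dir="requires_datasets"):
    """Extract every .zip archive in dataset_dir into that same folder."""
    for zip_path in sorted(Path(dataset_dir).glob("*.zip")):
        print("extracting", zip_path.name)
        with zipfile.ZipFile(zip_path) as zf:
            zf.extractall(dataset_dir)
```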

# This will take a long time; please be patient
git clone https://huggingface.co/sxyxs/mp3d_on
cd mp3d_on
mv rgb_fts/ code/onav_rim/data/datasets/objectnav/mp3d_70k_demos_prefts
cd zip_by_type
unzip depth_fts.zip 
# After unzipping, move the "depth_fts" folder into
# code/onav_rim/data/datasets/objectnav/mp3d_70k_demos_prefts as well; only this folder is needed.
# Note: onav_rim is the name of this project's directory; replace the paths with the correct ones for your setup.
cd ../..

mv ddppo-models/ code/onav_rim/data/
mv rednet-models/ code/onav_rim/data/
mv code/onav_rim/data/ ./..
rm -rf code/
cd .. 

The data structure should look like this:

  ├── onav_rim/
  │  ├── data/
  │  │  ├── ddppo-models/
  │  │  │  ├── gibson-2plus-resnet50.pth
  │  │  ├── rednet-models/
  │  │  │  ├── rednet_semmap_mp3d_tuned.pth
  │  │  ├── scene_datasets/
  │  │  │  ├── mp3d/
  │  │  ├── datasets/
  │  │  │  ├── objectnav/
  │  │  │  │  ├── objectnav_mp3d_70k/
  │  │  │  │  │  ├── train/
  │  │  │  │  ├── mp3d/
  │  │  │  │  │  ├── v1/
  │  │  │  │  │  │  ├── val/
  │  │  │  │  ├── mp3d_70k_demos_prefts/
  │  │  │  │  │  ├── depth_fts/
  │  │  │  │  │  ├── meta_infos_lmdb/
  │  │  │  │  │  ├── rgb_fts/
  │  │  │  │  │  ├── sem_fts/
  │  │  ├── test_assets/
  │  │  │  ├── objects/
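
A quick way to verify this layout is a small checker script. The checker itself is hypothetical (not part of the codebase), and the paths are read off the tree above; adjust them if your layout differs:

```python
from pathlib import Path

# Key files/folders from the directory tree above.
REQUIRED = [
    "data/ddppo-models/gibson-2plus-resnet50.pth",
    "data/rednet-models/rednet_semmap_mp3d_tuned.pth",
    "data/scene_datasets/mp3d",
    "data/datasets/objectnav/mp3d_70k_demos_prefts/depth_fts",
    "data/datasets/objectnav/mp3d_70k_demos_prefts/rgb_fts",
]

def missing_paths(repo_root="."):
    """Return every required path that does not exist under repo_root."""
    root = Path(repo_root)
    return [p for p in REQUIRED if not (root / p).exists()]

if __name__ == "__main__":
    for p in missing_paths():
        print("missing:", p)
```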

## Usage

  1. Train and evaluate the model

Try this first; if it does not work, run the steps below manually:

./job_scripts/three_tasks.sh   

For the learning rate test:

export PYTHONPATH=$PYTHONPATH:$(pwd)/dependencies/habitat-sim
export WORLD_SIZE=1
export MASTER_ADDR='gpu001'
export MASTER_PORT=10001

configfile=offline_bc/config/lr_large.yaml
CUDA_VISIBLE_DEVICES=0 python offline_bc/train_models.py --exp-config $configfile
export PYTHONPATH=$PYTHONPATH:$(pwd)/dependencies/habitat-sim
export WORLD_SIZE=1
export MASTER_ADDR='gpu002'
export MASTER_PORT=10002

configfile=offline_bc/config/lr_small.yaml
CUDA_VISIBLE_DEVICES=1 python offline_bc/train_models.py --exp-config $configfile
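
The exported variables are the standard torch.distributed rendezvous settings. As a sketch of how a training script might read them (an assumption for illustration, not verified against `train_models.py`):

```python
import os

def distributed_settings():
    """Collect the rendezvous settings exported in the shell snippets above."""
    return {
        "world_size": int(os.environ.get("WORLD_SIZE", "1")),
        "master_addr": os.environ.get("MASTER_ADDR", "127.0.0.1"),
        "master_port": int(os.environ.get("MASTER_PORT", "29500")),
    }
```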

For the input reshape experiment:

export PYTHONPATH=$PYTHONPATH:$(pwd)/dependencies/habitat-sim
export WORLD_SIZE=1
export MASTER_ADDR='gpu003'
export MASTER_PORT=10003

configfile=offline_bc/config/shape_model.yaml
CUDA_VISIBLE_DEVICES=2 python offline_bc/train_models.py --exp-config $configfile

We provide a trained model in [Dropbox](https://www.dropbox.com/scl/fo/bk0ok8ibxd5yirohqjmu6/h?rlkey=4riilnmjgk0e4o5rxt3prtsfl&dl=0).


## Acknowledgements
Part of the code is built upon [habitat-imitation-baselines](https://github.com/Ram81/habitat-imitation-baselines/tree/master). We thank the authors for their great work!


## Citation
If you find our work useful in your research, please consider citing:

@inproceedings{chen2023rim,
  author    = {Shizhe Chen and Thomas Chabal and Ivan Laptev and Cordelia Schmid},
  title     = {Object Goal Navigation with Recursive Implicit Maps},
  booktitle = {Proc. of The International Conference on Intelligent Robots and Systems (IROS)},
  year      = {2023}
}

