object-rel-nav

Object-Relative Navigation: ObjectReact [CoRL'25] | TANGO [ICRA'25] | RoboHop [ICRA'24]

Official code release for testing different controllers / trajectory planners.

Setup

Environment

Set up the conda environment (with habitat & ultralytics):

(see this to set up without habitat)

conda create -n nav
conda activate nav

conda install python=3.9 mamba -c conda-forge
mamba install pip numpy matplotlib pytorch torchvision pytorch-cuda=11.8 opencv=4.6 cmake=3.14.0 habitat-sim withbullet numba=0.57 pyyaml ipykernel networkx h5py natsort open-clip-torch transformers einops scikit-learn kornia pgmpy python-igraph pyvis -c pytorch -c nvidia -c aihabitat -c conda-forge

mamba install -c conda-forge ultralytics

mamba install -c conda-forge tyro faiss-gpu scikit-image ipykernel spatialmath-python gdown utm seaborn wandb kaggle yacs

# clone the repo and set up its git submodules (for the objectreact controller)
git clone https://github.com/oravus/object-rel-nav.git
cd object-rel-nav/
git submodule update --init --recursive

# install habitat-lab
cd libs/
git clone https://github.com/facebookresearch/habitat-lab.git
cd habitat-lab/
git checkout v0.2.4
pip install -e habitat-lab
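
To confirm the environment is usable, here is a quick optional sanity check (a minimal sketch; an import error points at the corresponding install step above):

# Verify that the key packages import and that PyTorch sees a CUDA device.
python -c "import torch; print('torch', torch.__version__, '| cuda:', torch.cuda.is_available())"
python -c "import habitat_sim, ultralytics; print('habitat-sim & ultralytics OK')"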

Data

Download official Habitat data and our benchmark trajectories.

In ./data/, symlink the following downloads as subdirectories: hm3d_v0.2, instance_imagenav_hm3d_v3, and hm3d_iin_val (see the layout sketch after this list).

  • Download the official HM3D v0.2 data following the instructions here (a free Matterport account is required), e.g., the val split for testing, specifically:

    • hm3d-val-glb-v0.2.tar (4 GB) - Download
    • hm3d-val-habitat-v0.2.tar (3.3 GB) - Download
    • hm3d-val-semantic-annots-v0.2.tar (2.0 GB) - Download
    • hm3d-val-semantic-configs-v0.2.tar (40 KB) - Download

    Extract the downloaded hm3d-val* archives to ./data/hm3d_v0.2/val/.

  • Download the official InstanceImageNav challenge dataset from here (Direct Link | ~512 MB)

  • Download our test trajectory data hm3d_iin_val and maps_via_alt_goal (for the shortcut task) from HuggingFace:

    cd data/
    huggingface-cli download oravus/objectreact_hm3d_iin --repo-type dataset --local-dir ./ --include "evaluation/**"
    unzip -q 'evaluation/*.zip'
    rm -r evaluation
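
For reference, a sketch of the final ./data/ layout (the /path/to/... sources are placeholders; adjust them to your actual download locations):

cd data/
# /path/to/... are placeholders, not real paths.
ln -s /path/to/hm3d_v0.2 hm3d_v0.2                                  # extracted hm3d-val-* archives, with val/ inside
ln -s /path/to/instance_imagenav_hm3d_v3 instance_imagenav_hm3d_v3  # InstanceImageNav challenge dataset
# hm3d_iin_val and maps_via_alt_goal are unzipped in place by the HuggingFace commands above.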
    

Models

Download controller models.

In model_weights/ (see the sketch after this list):

  • TANGO: Download the Depth Anything model from here [1.3 GB] and its base ViT from here [1.3 GB].
  • PixNav: Download the original authors' checkpoint from our hosted copy here [208 MB].
  • ObjectReact: Downloaded automatically from HuggingFace [18 MB].
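
The downloaded checkpoints then sit together under model_weights/; the filenames below are hypothetical, so keep whatever names the download links provide:

mkdir -p model_weights/
# Hypothetical filenames; use the actual names from the links above.
mv ~/Downloads/depth_anything_model.pth model_weights/   # TANGO: Depth Anything model
mv ~/Downloads/depth_anything_vit.pth model_weights/     # TANGO: base ViT
mv ~/Downloads/pixnav_checkpoint.ckpt model_weights/     # PixNav checkpoint
# ObjectReact weights are fetched automatically from HuggingFace at runtime.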

Experiment

NAVIGATE

To quickly run a navigation episode using the RoboHop controller (uses configs/defaults.yaml, with comments explaining the parameters):

python main.py

To use the ObjectReact or TANGO controller (via configs/object_react.yaml, configs/tango.yaml, or your own config), run:

python main.py -c configs/object_react.yaml

Check the output directory ./out/ for output.log and visualizations. The above config uses ground-truth perception for a quick start; to use inferred perception instead, set goal_source='topological' and edge_weight_str='e3d_max', as sketched below.
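
A minimal sketch of that switch (assuming goal_source and edge_weight_str are top-level keys; check configs/defaults.yaml for where they actually live):

# Copy an existing config, then edit the copy so that:
#   goal_source: topological
#   edge_weight_str: e3d_max
cp configs/object_react.yaml configs/object_react_inferred.yaml
python main.py -c configs/object_react_inferred.yaml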

EVAL

python scripts/evaluate_objecreact.py <path to dir containing timestamped results subdirs>
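
For example, if runs were written under the default output directory (a hypothetical layout; ./out/ holds one timestamped subdirectory per run):

python scripts/evaluate_objecreact.py ./out/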

MAP: Topological Object-Level Graph

To create a topological graph from a folder of RGB images, see/run this example script:

python scripts/create_maps_hm3d.py ./data/hm3d_iin_val/ fast_sam None 0 1

TRAIN

Please see instructions here.

Cite

ObjectReact:

@inproceedings{garg2025objectreact,
  title={ObjectReact: Learning Object-Relative Control for Visual Navigation},
  author={Garg, Sourav and Craggs, Dustin and Bhat, Vineeth and Mares, Lachlan and Podgorski, Stefan and Krishna, Madhava and Dayoub, Feras and Reid, Ian},
  booktitle={Conference on Robot Learning},
  year={2025},
  organization={PMLR}
}

TANGO:

@inproceedings{podgorski2025tango,
  title={TANGO: Traversability-Aware Navigation with Local Metric Control for Topological Goals},
  author={Podgorski, Stefan and Garg, Sourav and Hosseinzadeh, Mehdi and Mares, Lachlan and Dayoub, Feras and Reid, Ian},
  booktitle={2025 IEEE International Conference on Robotics and Automation (ICRA)},
  pages={2399--2406},
  year={2025},
  organization={IEEE}
}

RoboHop:

@inproceedings{garg2024robohop,
  title={Robohop: Segment-based topological map representation for open-world visual navigation},
  author={Garg, Sourav and Rana, Krishan and Hosseinzadeh, Mehdi and Mares, Lachlan and S{\"u}nderhauf, Niko and Dayoub, Feras and Reid, Ian},
  booktitle={2024 IEEE International Conference on Robotics and Automation (ICRA)},
  pages={4090--4097},
  year={2024},
  organization={IEEE}
}
