LIMP - Language Instruction Grounding for Motion Planning


This is the codebase for the paper "Verifiably Following Complex Robot Instructions with Foundation Models". We present a novel approach that leverages pre-trained foundation models and temporal logics to enable robots to verifiably follow expressive, complex, open-ended instructions in real-world environments without prebuilt semantic maps.

Installation

  1. Create the conda environment and install the required packages: conda env create -f environment.yml
    • Known issue on Macs with Apple silicon (osx-arm64):
      • Spot, a dependency used for LTL and automata manipulation, does not currently support installation via conda on osx-arm64.
      • You will need to remove the spot package from environment.yml to set up the limp environment, then install Spot from source or via Homebrew. A quick check that the installation is visible to Python appears after this list.
  2. Activate your conda environment: conda activate limp
  3. Set up the Open Spatial Grounding Library (OSG) submodule
    • Update the submodule:
      • git submodule update --init --recursive
      • git submodule foreach git pull origin main
    • Install OSG: pip install -e open-spatial-grounding
    • In the limp conda environment, install Mobile SAM or Segment Anything as described in OSG's installation instructions. Remember to download the model checkpoints into the model_ckpts folder.
    • Copy the osg folder from the Open Spatial Grounding library and place it in this root directory.
  4. Obtain an OpenAI API key and add it to your system environment variables (see the snippet after this list).
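If you installed Spot outside conda (the Apple-silicon workaround in step 1), it is worth confirming that the limp environment can see its Python bindings before going further. A minimal sketch, assuming Spot was built with its Python bindings enabled:

```python
# Verify the Spot LTL library is importable from the limp environment.
import spot

print(spot.version())                      # prints the installed Spot version
f = spot.formula("G(request -> F grant)")  # parse a sample LTL formula
aut = spot.translate(f)                    # translate it to an automaton
print("states:", aut.num_states())
```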
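For step 4, a sketch of checking that the key is visible to Python, assuming the conventional OPENAI_API_KEY variable name (check the repo's configuration if it uses a different one):

```python
# Confirm the OpenAI API key is set in the environment.
import os

key = os.environ.get("OPENAI_API_KEY")  # assumed variable name
if not key:
    raise RuntimeError(
        "OPENAI_API_KEY is not set. Add e.g. "
        '`export OPENAI_API_KEY="sk-..."` to your shell profile.'
    )
print("OpenAI key found, length:", len(key))
```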

Scan Environment

  • This codebase supports directly using RGBD data from robot exploration; however, the easiest way to get started is with an iPhone Pro equipped with a lidar sensor.
  • Download the Record3D app, scan the scene of your choice, and export the *.r3d file to your computer.
    • Make sure to capture all test objects in your recording.
    • You can download a sample .r3d file from here: apartment. A short sketch for inspecting such an export follows below.
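Before running the pipeline on your own scan, it can help to sanity-check the export. The sketch below is not part of this codebase; the archive layout, file names, and metadata keys are assumptions based on Record3D's commonly documented export format, and the pyliblzfse package is assumed for decompressing depth frames:

```python
# Inspect a Record3D .r3d export (a zip archive) without the app.
import json
import zipfile

import liblzfse  # pip install pyliblzfse
import numpy as np

with zipfile.ZipFile("apartment.r3d") as r3d:  # hypothetical filename
    names = r3d.namelist()
    print(f"{len(names)} entries, e.g. {names[:5]}")

    # Camera intrinsics and other capture info live in a JSON "metadata" entry.
    meta = json.loads(r3d.read("metadata"))
    print("intrinsics K:", meta.get("K"))

    # Depth frames are lzfse-compressed float32 buffers under rgbd/.
    depth = np.frombuffer(
        liblzfse.decompress(r3d.read("rgbd/0.depth")), dtype=np.float32
    )
    print("frame 0 depth values:", depth.size, "(meters)")
```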

Running Instructions

Citation

The methods implemented in this codebase were proposed in the paper "Verifiably Following Complex Robot Instructions with Foundation Models". If you find any part of this code useful, please consider citing:

@inproceedings{quartey2025verifiably,
  title        = {Verifiably following complex robot instructions with foundation models},
  author       = {Quartey, Benedict and Rosen, Eric and Tellex, Stefanie and Konidaris, George},
  booktitle    = {2025 IEEE International Conference on Robotics and Automation (ICRA)},
  pages        = {1--8},
  year         = {2025},
  organization = {IEEE}
}
