Lightweight, extensible whole-body teleoperation framework for humanoid robots.
Real-time motion retargeting from BVH / Pico 4 VR to Unitree G1, in MuJoCo sim or on real hardware.
Documentation • Chinese Documentation (中文文档) • Pico Sim2Sim • Pico Sim2Real • Training
1. Install

```bash
pip install -e .
```

2. Download assets

```bash
pip install modelscope
python scripts/setup/download_assets.py --only gmr ckpt bvh
```

3. Run
```bash
python scripts/run/run_sim.py \
  controller.policy_path=track.onnx \
  input.bvh_file=data/sample_bvh/aiming1_subject1.bvh
```

You should see a MuJoCo viewer with the robot tracking the BVH motion.
To show the simulated D435i RGB camera view, add the explicit camera viewer:
```bash
python scripts/run/run_sim.py \
  controller.policy_path=track.onnx \
  input.bvh_file=data/sample_bvh/aiming1_subject1.bvh \
  'viewers=[sim2sim,camera]'
```

Full docs are at BotRunner64.github.io/Teleopit, covering installation profiles, all tutorials, the configuration reference, and architecture.
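The `key=value` arguments follow a Hydra/OmegaConf-style dotlist syntax (an assumption based on their shape; Teleopit's actual config machinery may differ): dotted keys address nested config nodes, and bracketed values become lists. A sketch of how such overrides resolve:

```python
# Illustrative dotlist-override resolver (assumed OmegaConf-like
# semantics; not Teleopit's real implementation).

def apply_overrides(overrides: list[str]) -> dict:
    cfg: dict = {}
    for item in overrides:
        key, _, raw = item.partition("=")
        # Bracketed values become lists, e.g. viewers=[sim2sim,camera]
        if raw.startswith("[") and raw.endswith("]"):
            value = [v.strip() for v in raw[1:-1].split(",") if v.strip()]
        else:
            value = raw
        node = cfg
        *parents, leaf = key.split(".")
        for part in parents:
            node = node.setdefault(part, {})
        node[leaf] = value
    return cfg

cfg = apply_overrides([
    "controller.policy_path=track.onnx",
    "input.bvh_file=data/sample_bvh/aiming1_subject1.bvh",
    "viewers=[sim2sim,camera]",
])
print(cfg["viewers"])  # ['sim2sim', 'camera']
```

This is why the `viewers=[...]` override is quoted on the shell command line: the brackets must reach the program unexpanded.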
- Consolidated realtime input around pico-bridge 0.2.0 and removed the old ZMQ/onboard Pico path.
- Unified sim/sim2real reference buffering, resume realignment, and velocity smoothing.
- Added UDP BVH realtime input, online sim config, multi-viewer support, and fixed camera viewing.
- Split sim2real reference/safety runtime modules and updated the G1 MuJoCo camera asset.
- Added Pico 4 teleoperation through pico-bridge and the G1 Bridge SDK.
- Added offline playback keyboard controls, Pico sim2sim mode control, and a standalone standing controller.
- Improved realtime mocap buffering/catch-up and upgraded the released model to the 30k checkpoint.
- Dataset shard-only refactor and adaptive bin sampling
- External asset management (ModelScope), repository slimming
- Initial public release: General-Tracking-G1 training, ONNX sim2sim inference, Pico 4 VR teleoperation, Unitree G1 hardware deployment
