# XRoboToolkit-Unity-Client

A Unity-based application for PICO devices that facilitates robot training and remote teleoperation. It works in conjunction with PC-side software to provide the robot training and teleoperation functionality.
| Item | Description |
| --- | --- |
| Network - SN | Displays the serial number of the XR device; only functional on the PICO 4 Ultra Enterprise edition |
| Network - IP | IP address of the XR device |
| Network - FPS | Data-syncing frame rate (frames per second) |
| Network - Status | Connection status between the robot and the XR device |
| Network - PC Service | IP address of the PC running the PC service |
| Network - Enter | Manually enter the PC service's IP address |
| Tracking - Head | Toggle on/off to send the headset's 6-DoF pose |
| Tracking - Controller | Toggle on/off to include the controllers' 6-DoF poses and button states in the data stream |
| Tracking - Hand | Toggle on/off to include hand-tracking data in the data stream |
| Tracking - PICO Motion Tracker - Mode | Dropdown menu to select None or full-body tracking (requires PICO Motion Trackers) for inclusion in the data stream |
| Tracking - PICO Motion Tracker - TrackerNum | Number of connected trackers |
| Tracking - Data & Control - Send | Toggle on/off to sync the selected poses between the XR device and the robot PC |
| Tracking - Data & Control - Switch w/ A Button | Toggle on/off to quickly pause or resume syncing with the right controller's A button |
| Tracking - Status | Panel showing tracking-related information |
| Remote Vision - State | Shows the camera state |
| Remote Vision - Dropdown (Video Source) | Select a supported video source |
| Remote Vision - Listen | Open a connection to receive the video stream from the selected video source |
| Data Collection - Tracking | Whether to record pose-tracking data |
| Data Collection - Vision | Whether to record vision data |
| Data Collection - Record | Start/stop recording |
| Log | Shows logs |
- Pose sync between XR device and robot PC: transmits pose data from the XR headset to the robot-side PC for robot teleoperation.
- Local pose and stereo vision data collection: synchronously records stereo vision and pose data from the XR headset, stored in the device's `/Download` directory.
- Remote stereo vision sync between two XR headsets: transmits stereo vision from the robot-side headset to the operator-side headset for 3D display.
- Remote stereo vision sync between PC camera and XR headset: transmits stereo vision from the robot-side PC camera to the operator-side headset for 3D display.
- Connect the robot PC and the PICO 4 Ultra to the same network.
- On the robot PC, run the service:
  - Windows: run `\Program Files (x86)\roboticsservice\runService.bat`
  - Linux (Ubuntu/Debian x86): run `/opt/apps/roboticsservice/runService.sh`
- Open the XRoboToolkit app on the PICO 4 Ultra.
If the PC service is running, a prompt window for the server connection appears when you open the app on the PICO headset. Point at the IP address and click it with the controller's trigger button.
The main panel displays "WORKING" once the connection is established.
On the main panel, select the pose information to be synced, such as head, controller, or body tracking. Once the headset and PC are connected, the selected pose data is synced to the PC while "Send" is toggled on. When the "Switch w/ A Button" option is toggled on, you can also use the A button on the right controller to toggle "Send" on or off during data syncing.
Note: camera data collection currently requires special approval through PICO's enterprise service.
On the main panel, select the pose data to be collected and click Record. You will see a brief blur effect on the screen, the Record button will turn red, and the camera status will change to 6. When you finish recording, press the Record button again to end the collection session. The video and pose files are saved on the local headset. You may also click "Save Camera Parameters" to save the headset's camera intrinsic and extrinsic parameters.
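The recordings land in the headset's `/Download` directory, so they can be copied to a PC with `adb`. A minimal sketch, assuming the standard Android path `/sdcard/Download` (the exact subfolder layout may differ per app version); `build_pull_command` and `pull_recordings` are hypothetical helper names, not part of the toolkit:

```python
import subprocess

def build_pull_command(remote_dir="/sdcard/Download", local_dir="./recordings"):
    """Build the adb command that copies recorded files off the headset.

    The README states recordings land in the device's /Download directory;
    /sdcard/Download is an assumption about the full on-device path.
    """
    return ["adb", "pull", remote_dir, local_dir]

def pull_recordings():
    # Requires adb on PATH and the headset connected with USB debugging enabled.
    subprocess.run(build_pull_command(), check=True)

print(build_pull_command())
```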
1. Connect both XR headsets to the same network.
2. Make sure the camera to be used as the video source has VST camera permission (requires special approval).
3. Open XRoboToolkit on both headsets.
   Note: the headset serving as the robot's eyes (H1) must have the camera permission enabled; the other headset (H2) is for the human operator.
4. H1: note the headset's IP address and don't do anything else on it.
5. H2: on the Camera panel, select PICO4U as the video source, click Listen, enter H1's IP, and click Confirm.
6. H2: you should now see the live camera feed. Press the B button on the right controller to switch between side-by-side and stereo-3D views.
7. H2: if you close the live camera window, simply repeat Step 5.
8. To stop camera streaming, quit XRoboToolkit on both H1 and H2.
1. Clone XRoboToolkit-Orin-Video-Sender on the Orin.
2. Build it.
3. Run the following command on the Orin:
   `./OrinVideoSender --listen <IP of Orin>:13579`
4. Open XRoboToolkit on the XR headset.
5. On the Camera panel, select "ZEDMINI" as the video source.
6. Click "Listen", enter the Orin's IP, and click Confirm.
7. You should now see the live camera feed. Press the B button on the right controller to switch between side-by-side and stereo-3D views.
8. If you close the live camera window, simply repeat Step 6.
9. To stop camera streaming, quit XRoboToolkit on the XR headset and stop OrinVideoSender on the Orin.
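Before starting the headset app, it can help to confirm that OrinVideoSender is actually accepting connections. A small reachability sketch, assuming the stream is carried over TCP (the transport is not documented here, though the client includes a TcpHandler) and using port 13579 from the command above; `port_reachable` is a hypothetical helper:

```python
import socket

def port_reachable(host, port=13579, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Run it with the Orin's IP before launching the headset app; a False result usually means the sender is not running or a firewall is blocking the port.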
Core resource folder containing all project assets:
- InteractionTools: XR interaction scripts and 3D models.
- Plugins: Android interface implementations, including `robotassistant_lib-i18n-release.aar` (an Android library containing PICO device interfaces and image-processing logic) and Android platform configurations.
- Resources: project-specific assets.
- Scripts: core application logic:
  - Camera: camera-related functionality.
  - ExtraDev: PICO tracker peripheral integration.
  - Network: network communication implementation.
  - UI: user interface components.

Key script modules:
- UIOperater: UI interaction logic.
- UICameraCtrl: camera control implementation.
- TcpHandler: network data transmission handler.
- TrackingData: pose data processing module.
Dependencies are managed via the Unity Package Manager.
Unity project configuration files:
- Audio/Physics/Input settings
- Quality/Graphics configurations
Official SDK for PICO device integration: Download Link
- Unity 2022.3.16f1+
- Android Studio 4.2.2+
- Android SDK 29
- Android NDK 21.4.7075529
- PICO Integration SDK (com.unity.xr.picoxr) 3.1.2
- Use exact Unity version 2022.3.16f1 to avoid compatibility issues
- Verify Android SDK/NDK paths in Unity Preferences
- Ensure PICO SDK compatibility with Unity version
- Complete Android module installation during Unity setup
- Set platform to Android:
- File → Build Settings → Android → Switch Platform
- Configure signing:
- Player Settings → Publishing Settings
- Create new Keystore via Keystore Manager for first build
- Build execution:
- File → Build Settings → Build (macOS)
- Output path: `ProjectRoot/ProjectSettings/Android/`
- Hotkeys:
  - Windows: `Ctrl + Shift + B`
  - macOS: `Cmd + Shift + B`
- Menu path (Windows): Build → One-click packaging
  - Auto-increments the version number (format: `Major.Minor.Build`)
  - Example: 1.0.0 → 1.0.1 → ... → 1.1.0
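The auto-increment behavior can be sketched as follows. The exact threshold at which the Build field rolls over into Minor is not documented, so the `max_build` parameter here is an illustrative assumption:

```python
def bump_version(version, max_build=9):
    """Increment the Build field of a Major.Minor.Build version string.

    When Build exceeds max_build, roll over into the Minor field.
    max_build=9 is an assumption; the packaging script's actual
    rollover rule is not documented.
    """
    major, minor, build = (int(p) for p in version.split("."))
    build += 1
    if build > max_build:
        build = 0
        minor += 1
    return f"{major}.{minor}.{build}"

print(bump_version("1.0.0"))  # 1.0.1
print(bump_version("1.0.9"))  # 1.1.0
```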
```
ProjectRoot/
└── Builds/
    ├── Android/
    ├── iOS/
    ├── macOS/
    └── Windows/
```
- Windows: Automatically opens File Explorer with output file selected
- macOS: Reveals build output in Finder
- Universal: Displays build result dialog
- Hardware Interaction Layer
  - PICO Enterprise API calls (require device permissions):

    ```csharp
    PXR_Enterprise.SwitchSystemFunction(SystemFunctionSwitchEnum.SFS_SECURITY_ZONE_PERMANENTLY, SwitchEnum.S_OFF);
    PXR_Enterprise.OpenVSTCamera(); // Enable the VST passthrough camera
    ```
- Image Processing Pipeline
  - Android native decoder bridge:

    ```csharp
    private static AndroidJavaObject _javaObj =
        new AndroidJavaObject("com.picovr.robotassistantlib.MediaDecoder");

    public static void initialize(int unityTextureId, int width, int height)
    {
        GetJavaObject().Call("initialize", unityTextureId, width, height);
    }
    ```
- Network Transport Layer
  - Asynchronous UDP data reception:

    ```csharp
    UdpClient client = new UdpClient(port);
    BeginReceive();

    void BeginReceive()
    {
        client.BeginReceive(ReceiveCallback, null);
    }

    void ReceiveCallback(IAsyncResult ar)
    {
        IPEndPoint remoteEP = null;
        byte[] data = client.EndReceive(ar, ref remoteEP);
        // Data parsing...
    }
    ```
- Data Synchronization Mechanism
  - TcpHandler → NetPacket: data packet encapsulation
  - NetPacket → ByteBuffer: serialization
  - ByteBuffer → Socket: asynchronous transmission
  - Socket → TcpHandler: callback handling
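The encapsulation and serialization steps can be sketched with a hypothetical length-prefixed frame. The real NetPacket wire format is not documented, so the 1-byte packet type and big-endian 4-byte length header below are illustrative assumptions:

```python
import struct

def encapsulate(payload, packet_type):
    """Sketch of the NetPacket -> ByteBuffer step: a length-prefixed frame.

    Header layout (assumed, not the documented format):
    1-byte packet type + big-endian 4-byte payload length.
    """
    return struct.pack(">BI", packet_type, len(payload)) + payload

def decapsulate(frame):
    """Inverse of encapsulate: parse the header and slice out the payload."""
    packet_type, length = struct.unpack(">BI", frame[:5])
    return packet_type, frame[5:5 + length]

frame = encapsulate(b"pose-data", packet_type=1)
assert decapsulate(frame) == (1, b"pose-data")
```

Length-prefixing is the standard way to delimit packets on a TCP stream, which carries no message boundaries of its own.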
- Unity Business Logic
  - IP address validation:

    ```csharp
    if (!IPAddress.TryParse(ip, out _))
    {
        SetRemind(LogType.Error, "The IP format is incorrect!");
        return;
    }
    TcpHandler.Connect(ip); // Trigger the TCP connection
    ```
- Cross-Platform Hybrid Architecture: the Unity C# layer and the Android Java layer communicate via a JNI bridge for hardware-accelerated encoding/decoding.
- Dual Data Channels:
  - Video stream: 60 FPS with adaptive QoS
  - Pose data channel: 90 Hz with low-latency priority
- Thread Model:

  | Thread Type | Responsibilities |
  | --- | --- |
  | Main Thread | UI rendering & user input handling |
  | Worker Thread | Video encoding / network transmission |
  | GL Thread | OpenGL ES texture operations |

- Memory Management: uses a ring buffer for video frames to prevent GC stutter.
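The ring-buffer idea can be sketched as follows. This is a minimal model of the overwrite-oldest behavior only; the real implementation reuses preallocated native buffers to avoid per-frame allocation:

```python
from collections import deque

class FrameRingBuffer:
    """Fixed-capacity frame buffer: when full, the oldest frame is dropped.

    Bounding the buffer keeps memory flat under bursty video input, which
    is the property that prevents GC stutter in the real client.
    """

    def __init__(self, capacity):
        self._frames = deque(maxlen=capacity)

    def push(self, frame):
        self._frames.append(frame)  # silently evicts the oldest when full

    def pop(self):
        return self._frames.popleft() if self._frames else None

buf = FrameRingBuffer(capacity=3)
for i in range(5):
    buf.push(bytes([i]))
print(buf.pop())  # b'\x02' -- frames 0 and 1 were overwritten
```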
- Fault Recovery:
  - Automatic TCP reconnection
  - Keyframe retransmission support for video decoding
- Key Performance Metrics:

  | Metric | Value |
  | --- | --- |
  | End-to-End Latency | ~100 ms (720p ×2 @ 60 FPS) |
  | Pose Data Packet Size | 56 bytes/frame |
  | Video Encoding Bitrate | 5 Mbps |
  | Network Fault Tolerance | 3 retries + FEC |
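A quick sanity check of the pose channel's footprint from the numbers above (56 bytes/frame at the 90 Hz rate listed under Dual Data Channels) shows why the video stream dominates the link:

```python
# Back-of-envelope bandwidth check from the documented metrics.
pose_packet_bytes = 56          # bytes per frame (from the metrics table)
pose_rate_hz = 90               # pose channel rate (Dual Data Channels)

pose_bandwidth_bps = pose_packet_bytes * 8 * pose_rate_hz
print(f"Pose channel: {pose_bandwidth_bps / 1000:.1f} kbps")  # 40.3 kbps

video_bitrate_bps = 5_000_000   # 5 Mbps video encoding bitrate
print(f"Video / pose ratio: {video_bitrate_bps / pose_bandwidth_bps:.0f}x")
```

So the pose channel needs roughly 40 kbps, over two orders of magnitude less than the video stream.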
- Q: Cannot sign the application.
  A: Please refer to this link for the solution.
- Q: "No entitlement info ......"
  A: Connect the headset to the public internet and run the app again.
- Q: Failed to connect on Linux.
  A: Ensure the PC and headset are on the same Wi-Fi network. Run the 3D application first, then run the headset app.
- Q: How to update the video source? [V1.1.0+]
  A: Do the following:

  ```shell
  # Pull the file first
  adb pull /sdcard/Android/data/com.xrobotoolkit.client/files/video_source.yml
  # Edit video_source.yml as needed
  # Push the file back
  adb push video_source.yml /sdcard/Android/data/com.xrobotoolkit.client/files/video_source.yml
  ```
- Q: How to revert the video source to the default? [V1.1.0+]
  A: Run `adb shell rm /sdcard/Android/data/com.xrobotoolkit.client/files/video_source.yml`.
For technical support or documentation updates, contact the development team.
If you find this project useful, please consider citing it as follows.
```bibtex
@article{zhao2025xrobotoolkit,
  title={XRoboToolkit: A Cross-Platform Framework for Robot Teleoperation},
  author={Zhigen Zhao and Liuchuan Yu and Ke Jing and Ning Yang},
  journal={arXiv preprint arXiv:2508.00097},
  year={2025}
}
```