[Question] Why do OffsetCfg rotations make simulation impossible? #3819
Replies: 3 comments
- Does this also happen with TiledCameraCfg?
- Yes, it does.
- Thank you for following up. I will move this post to our Discussions for others to contribute. Here is a summary to consider in the meantime.

Why OffsetCfg Rotations Cause Simulation Failure

The error you're seeing comes from USD failing to orthonormalize the transform built from your camera offset. According to the Isaac Lab camera configuration API, OffsetCfg.rot is a quaternion given in (w, x, y, z) order and is interpreted in the convention you specify (here "ros").
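To see which physical rotation a given rot tuple actually encodes, you can decode it with SciPy. This is a small illustrative sketch (SciPy is not part of Isaac Lab itself); the tuple below is the one from your configuration:

from scipy.spatial.transform import Rotation as R

# rot from the question, written in Isaac Lab's (w, x, y, z) order
w, x, y, z = (0.5, -0.5, 0.5, -0.5)

# SciPy expects (x, y, z, w)
r = R.from_quat([x, y, z, w])
print(r.as_euler('xyz', degrees=True))  # Euler angles (degrees) this quaternion encodes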
When your quaternion does not describe the orientation you intend in that convention, the camera prim ends up with a transform that USD cannot orthonormalize, which is why the warning is printed and the robot disappears from the scene.

Correcting the Camera Orientation

To fix this, align the camera offset with the convention you are using:

1. Use a Neutral Orientation (Default Forward +Z)

If you just want the camera pointing forward relative to the robot in ROS convention:

offset=CameraCfg.OffsetCfg(
pos=(0.510, 0.0, 0.015),
rot=(1.0, 0.0, 0.0, 0.0), # identity rotation
convention="ros"
)

2. If the Camera Should Point Along +X in ROS

Apply a 90° rotation around the Y-axis. Using a right-handed quaternion:

offset=CameraCfg.OffsetCfg(
pos=(0.510, 0.0, 0.015),
rot=(0.7071, 0.0, 0.7071, 0.0), # rotate +90° about Y
convention="ros"
)

3. For Custom Angles

To manually generate quaternions consistent with Isaac Lab's (w, x, y, z) ordering, you can use SciPy:

from scipy.spatial.transform import Rotation as R
# Example: rotate 90° about X, 0° about Y, 180° about Z
quat = R.from_euler('xyz', [90, 0, 180], degrees=True).as_quat() # (x, y, z, w)
# Isaac Lab uses (w, x, y, z)
quat = (quat[3], quat[0], quat[1], quat[2])
print(quat)

Then plug the resulting (w, x, y, z) tuple into the rot field of your OffsetCfg.
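Alternatively, Isaac Lab ships its own quaternion helpers, so you can stay in its (w, x, y, z) convention without reordering by hand. A minimal sketch, assuming the math utilities are importable as isaaclab.utils.math (older releases expose them under omni.isaac.lab.utils.math, so adjust the import to your version):

import math
import torch
from isaaclab.utils.math import quat_from_euler_xyz

# roll, pitch, yaw in radians; the result is already in (w, x, y, z) order
roll = torch.tensor([0.0])
pitch = torch.tensor([math.pi / 2])  # +90° about Y
yaw = torch.tensor([0.0])
quat = quat_from_euler_xyz(roll, pitch, yaw)
print(quat)  # expected to be close to (0.7071, 0.0, 0.7071, 0.0)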
Summary of Recommended Practice

In short: keep pos in the parent body's frame, pass rot as a unit quaternion in (w, x, y, z) order, and make sure it matches the convention you declare in OffsetCfg.

If you still want to visualize or debug your camera alignment, you can query the camera prim from the USD stage in the Isaac Lab script (replace the {ENV_REGEX_NS} placeholder with a concrete environment path, e.g. /World/envs/env_0):

import omni.usd

stage = omni.usd.get_context().get_stage()
camera_prim = stage.GetPrimAtPath("/World/envs/env_0/Robot/base/front_cam")
print(camera_prim.GetAttribute('xformOp:orient').Get())

This will confirm what quaternion USD actually received after applying your offset.
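You can also check the pose the sensor itself reports once the simulation is stepping. A sketch, assuming your camera is registered in an InteractiveScene under the key "camera" (the attribute names below follow Isaac Lab's CameraData fields; verify them against your version):

# inside the simulation loop, after the scene has been updated
camera = scene["camera"]
print("world position:", camera.data.pos_w)          # shape (num_envs, 3)
print("orientation (ROS):", camera.data.quat_w_ros)  # shape (num_envs, 4), (w, x, y, z)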
Question
I'm trying to train a robot for obstacle avoidance, so I need to add cameras and orient them correctly. OffsetCfg allows this, but when the simulation or training starts, the robot disappears from the scene and the terminal reports this error:
[Warning] [omni.usd] Warning: in Orthonormalize at line 495 of /builds/omniverse/usd-ci/USD/pxr/base/gf/matrix4d.cpp -- OrthogonalizeBasis did not converge, matrix may not be orthonormal.
This is my camera declaration:
camera = CameraCfg(
prim_path="{ENV_REGEX_NS}/Robot/base/front_cam",
update_period=0.1,
height=480,
width=640,
data_types=["rgb", "distance_to_image_plane"],
spawn=sim_utils.PinholeCameraCfg(
focal_length=24.0, focus_distance=400.0, horizontal_aperture=20.955, clipping_range=(0.1, 1.0e5)
),
offset=CameraCfg.OffsetCfg(pos=(0.510, 0.0, 0.015), rot=(0.5, -0.5, 0.5, -0.5), convention="ros"),
)
What should I do to orient the camera correctly so I can train?