This project investigates the application of an Unscented Kalman Filter (UKF) for sensor fusion, combining data from an Inertial Measurement Unit (IMU) and a vision-based system for robot state estimation. The UKF is employed because it handles nonlinear process and measurement models without the first-order linearization required by the Extended Kalman Filter (EKF), which can improve estimation accuracy.
The project evaluates two scenarios:
- Part 1: Using visual pose estimation (position and orientation) as measurements.
- Part 2: Using optical flow-derived velocity as measurements.
The UKF framework is developed and tested on recorded datasets, with performance assessed by comparing the estimated trajectories against the recorded sensor data.
- Sensor Fusion: Combines IMU and vision-based pose/velocity estimation.
- Unscented Kalman Filter (UKF): Handles nonlinearities in the system model.
- Pose and Velocity Estimation: Tracks position, orientation, and velocity of the robot.
- Nonlinear System Handling: Uses sigma points to propagate state through a nonlinear process model.
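The sigma-point construction mentioned above can be sketched as follows. This is a minimal NumPy illustration of the scaled unscented transform (the Cholesky factorization provides the matrix square root); the function name, parameter defaults, and state dimension are illustrative assumptions, not the project's actual code:

```python
import numpy as np

def sigma_points(mean, cov, alpha=1e-3, beta=2.0, kappa=0.0):
    """Generate the 2n+1 scaled sigma points and their weights.

    Returns the points (one per row) plus the mean weights w_m and
    covariance weights w_c of the scaled unscented transform.
    """
    n = mean.size
    lam = alpha**2 * (n + kappa) - n
    # Matrix square root of the scaled covariance via Cholesky factorization.
    S = np.linalg.cholesky((n + lam) * cov)
    # Center point, then symmetric points along each column of the square root.
    pts = np.vstack([mean, mean + S.T, mean - S.T])
    w_m = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    w_c = w_m.copy()
    w_m[0] = lam / (n + lam)
    w_c[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    return pts, w_m, w_c
```

By construction, the weighted mean of the points recovers the original mean and the weighted outer products recover the original covariance, which is what lets the UKF propagate a Gaussian through a nonlinear model without computing Jacobians.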
In this part, the UKF estimates the robot’s position and orientation using data from the camera. The algorithm propagates the state estimate using IMU measurements and updates it with visual measurements for correction.
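The predict/update cycle described above can be sketched as follows. This is a generic single-step UKF in NumPy, not the project's implementation: the process model `f` (IMU-driven propagation), the measurement model `h` (camera pose), and the simplified unscaled sigma points are all stand-in assumptions.

```python
import numpy as np

def sigma_pts(x, P, kappa=1.0):
    """Minimal (unscaled) sigma points; one weight vector for mean and cov."""
    n = x.size
    S = np.linalg.cholesky((n + kappa) * P)   # matrix square root of scaled P
    pts = np.vstack([x, x + S.T, x - S.T])    # 2n+1 points, one per row
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return pts, w

def ukf_step(x, P, u_imu, z, dt, f, h, Q, R):
    """One UKF cycle: f(x, u, dt) propagates the state with the IMU input,
    h(x) maps the state to the camera measurement (e.g. pose)."""
    # Predict: push each sigma point through the nonlinear process model.
    pts, w = sigma_pts(x, P)
    Xp = np.array([f(p, u_imu, dt) for p in pts])
    x_pred = w @ Xp
    dX = Xp - x_pred
    P_pred = Q + (w * dX.T) @ dX
    # Update: transform the predicted points into measurement space.
    Zp = np.array([h(p) for p in Xp])
    z_pred = w @ Zp
    dZ = Zp - z_pred
    S_zz = R + (w * dZ.T) @ dZ                # innovation covariance
    C_xz = (w * dX.T) @ dZ                    # state/measurement cross covariance
    K = np.linalg.solve(S_zz.T, C_xz.T).T     # Kalman gain K = C_xz @ inv(S_zz)
    x_new = x_pred + K @ (z - z_pred)
    P_new = P_pred - K @ S_zz @ K.T
    return x_new, P_new
```

Because `f` and `h` are passed in as callables, the same step function serves both parts of the project: only the measurement model and noise covariance `R` change between the pose and velocity configurations.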
Here, the UKF relies on velocity estimates derived from optical flow to update the state. Because position is then observable only by integrating the velocity estimates, position errors accumulate as drift; this trade-off nonetheless demonstrates the UKF's robustness when measurement data are limited.
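The difference between the two configurations reduces to the measurement model. A minimal sketch, assuming an illustrative state ordering of [position (3), orientation (3), velocity (3)] — the ordering and function names are assumptions, not the project's actual layout:

```python
import numpy as np

# Assumed state ordering for illustration: [position(3), orientation(3), velocity(3)].

def h_pose(x):
    """Part 1 measurement model: the camera provides full pose,
    i.e. the position and orientation blocks of the state."""
    return x[0:6]

def h_velocity(x):
    """Part 2 measurement model: optical flow yields only linear velocity,
    so the measurement extracts just the velocity block of the state."""
    return x[6:9]
```

Since `h_velocity` never touches the position block, position corrections in Part 2 arrive only indirectly through the state covariance, which is why velocity-only updates trade away some position accuracy.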
Fig 1: Part 1 - Estimated Position X, Y, Z
Fig 2: Part 1 - Estimated Orientation X, Y, Z
Fig 3: Part 1 - Estimated Velocity X, Y, Z
Fig 4: Part 2 - Estimated Position X, Y, Z
Fig 5: Part 2 - Estimated Orientation X, Y, Z
Fig 6: Part 2 - Estimated Velocity X, Y, Z
This project successfully implements an Unscented Kalman Filter (UKF) for sensor fusion using IMU and vision-based systems. The UKF demonstrates its ability to handle nonlinearities, resulting in accurate state estimation in both configurations.
- Part 1 effectively estimated the robot’s position and orientation using visual data.
- Part 2 showed robust velocity estimation using optical flow, despite some trade-offs in position accuracy.
The UKF's performance highlights its potential for real-world applications where nonlinearities and noisy sensor data are common challenges.
- Extended Sensor Fusion: Incorporate additional sensor modalities, such as GPS, to further improve state estimation.
- Dynamic Environment Testing: Perform testing in more complex environments to evaluate the robustness of the UKF.
- Alternative Filters: Explore alternative state estimation filters, such as the Unscented Particle Filter (UPF), for improved performance.