Visual SLAM for AR Glasses
Problem
Enable real-time tracking and mapping on AR glasses where compute, power, and latency constraints are much tighter than on desktop-class systems.
Method
The project implements a visual-inertial SLAM pipeline, fusing camera and IMU data, with tracking and mapping components optimized for embedded deployment.
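To make the fusion idea concrete, here is a minimal, purely illustrative sketch of a loosely coupled visual-inertial loop: the IMU propagates the state at a high rate, and a lower-rate visual position fix corrects drift via a simple complementary-filter blend. This is a 1-D toy with assumed function names (`propagate`, `visual_update`, `run_vio`), not the project's actual pipeline, which would use full 6-DoF states and a proper estimator.

```python
def propagate(state, accel, dt):
    # IMU propagation step: constant-acceleration model (1-D for clarity).
    p, v = state
    p_new = p + v * dt + 0.5 * accel * dt * dt
    v_new = v + accel * dt
    return (p_new, v_new)

def visual_update(state, measured_p, gain=0.3):
    # Blend the propagated position with a visual position fix,
    # a complementary-filter stand-in for a full SLAM update.
    p, v = state
    return (p + gain * (measured_p - p), v)

def run_vio(imu_samples, visual_fixes, dt=0.01):
    # Tiny loosely coupled loop: IMU at every step, vision every 10th step,
    # mirroring the rate mismatch typical on embedded AR hardware.
    state = (0.0, 0.0)  # (position, velocity)
    for i, accel in enumerate(imu_samples):
        state = propagate(state, accel, dt)
        if i % 10 == 0 and i // 10 < len(visual_fixes):
            state = visual_update(state, visual_fixes[i // 10])
    return state
```

With zero IMU motion and repeated visual fixes at position 1.0, the estimate converges geometrically toward the fix, which is the drift-correction behavior the visual channel provides in a real system.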
My Role
I developed core tracking and mapping components, balancing runtime performance with accuracy and stability under real-time constraints.
Focus
- Robust real-time tracking
- Efficient mapping on embedded hardware
- Practical deployment constraints for AR devices