Stereo SLAM for smart glasses

Stereo Visual-Inertial SLAM for Mobile/Wearable Devices and Robots.

Abstract

Developed a lightweight stereo visual-inertial SLAM system for mobile XR platforms with these features:

  • Real-time tracking at 90Hz on resource-limited devices.
  • Accurate pose estimation with <1cm drift over 50m.
  • Cross-platform support: Android, Windows, Linux, macOS, and ROS.
  • Modular architecture for integration with smart glasses and XR systems.

Problem

Smart glasses and mobile XR devices face key challenges:
  • Limited CPU/GPU power for heavy SLAM computations
  • Need for high-frequency (90Hz) real-time tracking
  • Sensor fusion complexities with stereo cameras and IMUs
  • Cross-platform deployment difficulties

Contribution

  • Designed and implemented a modular, multithreaded SLAM pipeline optimized for ARM-based mobile SoCs
  • Built a sensor fusion system integrating stereo vision and IMU data for stable 6DoF pose estimation
  • Tuned performance to run efficiently on embedded and mobile platforms (Galaxy S8–S22, RealSense D435i)
  • Developed cross-platform abstractions supporting Android, Linux, macOS, Windows, and ROS
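To give a flavor of the sensor-fusion contribution above, here is a minimal sketch of strapdown IMU propagation, the step that lets a visual-inertial tracker output poses at IMU rate (e.g. 90Hz+) between stereo frames. This is an illustrative example, not the project's actual implementation; the `State` layout, gravity convention (-z world), and Euler integration are assumptions for the sketch, and a real system would add bias estimation and stereo-keyframe corrections.

```cpp
#include <array>
#include <cmath>

// Minimal 6DoF state: orientation as a unit quaternion (w, x, y, z),
// plus velocity and position in the world frame.
struct State {
    std::array<double, 4> q{1.0, 0.0, 0.0, 0.0};
    std::array<double, 3> v{0.0, 0.0, 0.0};
    std::array<double, 3> p{0.0, 0.0, 0.0};
};

// Rotate a body-frame vector into the world frame with unit quaternion q.
static std::array<double, 3> rotate(const std::array<double, 4>& q,
                                    const std::array<double, 3>& a) {
    double w = q[0], x = q[1], y = q[2], z = q[3];
    return {
        (1 - 2*(y*y + z*z))*a[0] + 2*(x*y - w*z)*a[1] + 2*(x*z + w*y)*a[2],
        2*(x*y + w*z)*a[0] + (1 - 2*(x*x + z*z))*a[1] + 2*(y*z - w*x)*a[2],
        2*(x*z - w*y)*a[0] + 2*(y*z + w*x)*a[1] + (1 - 2*(x*x + y*y))*a[2],
    };
}

// One integration step: gyro in rad/s and accel (specific force) in m/s^2,
// both in the body frame; dt in seconds. Gravity is -z in the world frame.
void propagate(State& s, const std::array<double, 3>& gyro,
               const std::array<double, 3>& accel, double dt) {
    // Quaternion kinematics: q_dot = 0.5 * q (x) [0, omega].
    double w = s.q[0], x = s.q[1], y = s.q[2], z = s.q[3];
    double gx = gyro[0], gy = gyro[1], gz = gyro[2];
    std::array<double, 4> dq{
        0.5 * (-x*gx - y*gy - z*gz),
        0.5 * ( w*gx + y*gz - z*gy),
        0.5 * ( w*gy - x*gz + z*gx),
        0.5 * ( w*gz + x*gy - y*gx),
    };
    for (int i = 0; i < 4; ++i) s.q[i] += dq[i] * dt;
    double n = std::sqrt(s.q[0]*s.q[0] + s.q[1]*s.q[1] +
                         s.q[2]*s.q[2] + s.q[3]*s.q[3]);
    for (auto& c : s.q) c /= n;  // renormalize to keep q a unit quaternion

    // Remove gravity in the world frame, then integrate velocity/position.
    std::array<double, 3> a_w = rotate(s.q, accel);
    a_w[2] -= 9.81;
    for (int i = 0; i < 3; ++i) {
        s.p[i] += s.v[i] * dt + 0.5 * a_w[i] * dt * dt;
        s.v[i] += a_w[i] * dt;
    }
}
```

In a full pipeline this propagation runs on a high-priority thread at IMU rate, while a slower thread triangulates stereo features and periodically corrects the accumulated drift, which is how the architecture decouples 90Hz pose output from the heavier vision computation.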

Result

  • Achieved <1cm pose drift over 50 meters in indoor/outdoor tests
  • Enabled 90Hz real-time SLAM on various mobile devices including Galaxy S8–S22
  • Successfully integrated with tethered smart glasses for XR applications