Asynchronous timewarping

Asynchronous timewarping (ATW) for smart glasses to reduce motion-to-photon (MTP) latency and increase framerate.

[Image: ATW project preview]

Abstract

To enable high-performance XR experiences on mobile smart glasses, I developed a real-time rendering engine based on the Vulkan API with the following goals:

  • Reduce MTP latency through IMU (Inertial Measurement Unit) data fusion (see the reprojection sketch after this list).
  • Increase framerate using multithreaded stereo rendering.
  • Design a modular software architecture compatible with computer vision systems such as SLAM and image tracking.
  • Ensure reusability and extensibility across tethered and untethered XR platforms.
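
The reprojection at the heart of ATW can be summarized in a few lines: just before scanout, a cheap warp pass rotates the already-rendered frame by the difference between the head pose it was rendered with and the newest IMU-fused pose. The sketch below shows that delta-rotation computation in plain C++; the types and names (Quat, deltaRotation, toMatrix) are illustrative stand-ins for the example, not the engine's actual API.

```cpp
// Minimal sketch of the ATW delta-rotation step (illustrative names):
// rotate the rendered frame by the difference between the render-time
// pose and the newest IMU-fused pose sampled just before scanout.
#include <array>
#include <cstdio>

struct Quat { float w, x, y, z; };                 // unit quaternion pose

// Hamilton product: composes rotation b followed by rotation a.
Quat mul(const Quat& a, const Quat& b) {
    return { a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
             a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
             a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
             a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w };
}

// Conjugate is the inverse for a unit quaternion.
Quat conj(const Quat& q) { return { q.w, -q.x, -q.y, -q.z }; }

// Rotation taking the render-time pose to the latest IMU pose.
Quat deltaRotation(const Quat& renderPose, const Quat& latestPose) {
    return mul(latestPose, conj(renderPose));
}

// Row-major 3x3 rotation matrix: the rotational part of the
// reprojection matrix a warp shader would consume.
std::array<float, 9> toMatrix(const Quat& q) {
    float w = q.w, x = q.x, y = q.y, z = q.z;
    return { 1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y),
             2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x),
             2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y) };
}

int main() {
    Quat atRender  = {1.0f, 0.0f, 0.0f, 0.0f};     // identity: pose when rendered
    Quat atScanout = {0.9998f, 0.0f, 0.0175f, 0.0f}; // ~2 degrees of yaw since then
    std::array<float, 9> m = toMatrix(deltaRotation(atRender, atScanout));
    std::printf("warp matrix row 0: %.4f %.4f %.4f\n", m[0], m[1], m[2]);
}
```

In a full pipeline this rotation is typically composed with the projection transform so the warp shader can re-sample the finished eye buffer directly, which is what makes the correction cheap enough to run every display refresh.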

Problem

XR applications on mobile smart glasses face several critical challenges:
  • High MTP latency, which causes visual lag and motion sickness.
  • Low framerates, especially when running computationally heavy tasks like SLAM and image tracking.
  • Limited performance of traditional rendering pipelines on mobile or embedded hardware.
  • Difficulty integrating CV modules into tightly constrained rendering systems without sacrificing responsiveness.

Contribution

  • Architected the Vulkan rendering pipeline from scratch for mobile XR.
  • Applied asynchronous programming and concurrency techniques to improve real-time performance (see the concurrency sketch after this list).
  • Prioritized modular, reusable software design, making the system extensible across other applications.
  • Focused on low-level optimization, integrating tightly with hardware (IMU, display).
  • Delivered a production-ready, cross-platform rendering module adaptable to future XR devices.
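
As a rough illustration of that decoupling, the sketch below runs a slow render loop and a fast warp loop on separate threads, with the warp loop always re-projecting the newest finished frame against the newest IMU pose. The thread layout, rates, and names (sampleImuPose, Frame, and so on) are assumptions made for the example; the real engine drives this with Vulkan submissions and vsync signals.

```cpp
// Sketch of decoupling the warp thread from the render thread, assuming
// a ~30Hz renderer and a 90Hz display (illustrative names throughout).
#include <atomic>
#include <chrono>
#include <cstdio>
#include <mutex>
#include <thread>

struct Pose  { float qw = 1, qx = 0, qy = 0, qz = 0; };   // head orientation
struct Frame { int id = -1; Pose renderPose; };           // finished frame + its pose

std::mutex        frameLock;
Frame             latestFrame;          // newest fully rendered frame
std::atomic<bool> running{true};

Pose sampleImuPose() { return {}; }     // stand-in for the IMU fusion filter

// Slow path: full stereo scene render, ~30Hz when SLAM is also running.
void renderThread() {
    for (int id = 0; running; ++id) {
        Pose p = sampleImuPose();                                   // pose used for rendering
        std::this_thread::sleep_for(std::chrono::milliseconds(33)); // simulated render cost
        std::lock_guard<std::mutex> g(frameLock);
        latestFrame = {id, p};
    }
}

// Fast path: reprojection only, locked to the 90Hz display refresh.
void warpThread() {
    while (running) {
        Frame f;
        { std::lock_guard<std::mutex> g(frameLock); f = latestFrame; }
        Pose now = sampleImuPose(); // freshest pose, just before scanout
        (void)now; // deltaRotation(f.renderPose, now) from the sketch above feeds the warp here
        std::printf("scanout reused frame %d\n", f.id);
        std::this_thread::sleep_for(std::chrono::milliseconds(11)); // simulated vsync
    }
}

int main() {
    std::thread r(renderThread), w(warpThread);
    std::this_thread::sleep_for(std::chrono::milliseconds(200));
    running = false;
    r.join();
    w.join();
}
```

Because the warp pass is cheap compared with a full scene render, the display can refresh at 90Hz even while the renderer and SLAM workloads run at 30Hz, which is the behavior behind the framerate result reported below.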

Result

  • Reduced MTP latency by 92ms (an 80% reduction), enabling a more responsive user experience.
  • Increased framerate from 30Hz to 90Hz (3×) in SLAM-integrated scenarios.
  • Verified improvements through simulation and hardware-in-the-loop testing.
[Figure: ATW results graph]