MAXSCAN

Real-time dense and sparse reconstruction and visual localization for AR.

Abstract

Designed and implemented a cross-platform mobile mapping system featuring:

  • Real-time sparse SLAM and localization on Android and iOS
  • Dense volumetric reconstruction using LiDAR on iOS devices
  • GPU-accelerated TSDF integration and surface extraction using Swift + Metal (the per-voxel update is sketched after this list)
  • Modular architecture for integration with AR frameworks and scalable map management
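
The core of TSDF integration is a per-voxel weighted running average of truncated signed distances. Below is a minimal CPU-side sketch of that update in Swift; the names and parameters (the truncation band and weight cap in particular) are illustrative assumptions, not the project's actual values.

    /// One voxel of the TSDF volume: a truncated signed distance and an
    /// integration weight (names and layout are illustrative).
    struct TSDFVoxel {
        var sdf: Float = 0      // truncated signed distance to the surface
        var weight: Float = 0   // accumulated integration weight
    }

    /// Fold one new depth observation into a voxel using the standard
    /// weighted running average; `truncation` and `maxWeight` are
    /// hypothetical tuning parameters.
    func integrate(_ voxel: inout TSDFVoxel,
                   signedDistance: Float,
                   truncation: Float = 0.04,
                   maxWeight: Float = 64) {
        // Skip voxels far behind the observed surface.
        guard signedDistance > -truncation else { return }
        // Clamp the distance to the truncation band and normalize it.
        let d = min(signedDistance, truncation) / truncation
        let w: Float = 1
        // Weighted running average: new = (W*D + w*d) / (W + w).
        voxel.sdf = (voxel.weight * voxel.sdf + w * d) / (voxel.weight + w)
        voxel.weight = min(voxel.weight + w, maxWeight)
    }

The GPU version applies the same update in parallel, one thread per voxel touched by the current depth frame, before surfaces are extracted from the zero crossing of the field.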

Problem

Mobile XR applications demand high-performance mapping, but:
  • Sparse and dense reconstruction are typically handled by separate pipelines, with limited real-time capability on mobile hardware
  • Dense reconstruction on iOS requires low-level GPU programming and sensor optimization
  • On-device localization and persistent map merging are technically complex and poorly supported in most toolchains

Contribution

  • Architected a modular, scalable SLAM system for mobile, supporting multiple reconstruction modes (sparse/dense)
  • Implemented a GPU-accelerated TSDF pipeline using Metal for dense LiDAR reconstruction (a host-side dispatch sketch follows this list)
  • Developed a localization and map-merging engine using lightweight data structures suitable for mobile memory constraints
  • Tuned the system to sustain real-time frame rates by optimizing memory access patterns and Metal shader performance
  • Integrated the system into mobile AR frameworks for live deployment and testing
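
On the GPU side, the pipeline reduces to dispatching a compute kernel over the voxels touched by each depth frame. The following is a host-side sketch of that dispatch in Swift, assuming a hypothetical integrate_tsdf Metal kernel and buffer layout (error handling elided):

    import Metal

    /// Host-side setup and dispatch for a hypothetical `integrate_tsdf`
    /// compute kernel; the kernel name and buffer bindings are assumptions.
    final class TSDFIntegrator {
        private let queue: MTLCommandQueue
        private let pipeline: MTLComputePipelineState

        init(device: MTLDevice) throws {
            queue = device.makeCommandQueue()!
            let library = device.makeDefaultLibrary()!
            let kernel = library.makeFunction(name: "integrate_tsdf")!
            pipeline = try device.makeComputePipelineState(function: kernel)
        }

        /// Integrate one depth frame into the voxel volume.
        func integrate(depth: MTLBuffer, voxels: MTLBuffer, voxelCount: Int) {
            let commandBuffer = queue.makeCommandBuffer()!
            let encoder = commandBuffer.makeComputeCommandEncoder()!
            encoder.setComputePipelineState(pipeline)
            encoder.setBuffer(depth, offset: 0, index: 0)
            encoder.setBuffer(voxels, offset: 0, index: 1)
            // One thread per voxel; threadgroup width matched to the
            // pipeline's execution width for efficient memory access.
            let width = pipeline.threadExecutionWidth
            let groups = MTLSize(width: (voxelCount + width - 1) / width,
                                 height: 1, depth: 1)
            encoder.dispatchThreadgroups(groups,
                                         threadsPerThreadgroup: MTLSize(width: width,
                                                                        height: 1,
                                                                        depth: 1))
            encoder.endEncoding()
            commandBuffer.commit()
        }
    }

Sizing threadgroups to the pipeline's execution width and keeping voxel buffers contiguous are examples of the memory-access and shader-level tuning referred to above.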

Result

  • Achieved real-time sparse SLAM and localization across Android and iOS devices
  • Enabled dense 3D reconstruction using LiDAR at high frame rates on iOS
  • Delivered <1s re-localization latency and consistent map merging across sessions
  • Deployed internally as a 3D space scanning tool for persistent AR use cases