Managing Latency in Complex Augmented Reality Systems

Marco C. Jacobs, Mark A. Livingston, Andrei State

Proceedings of 1997 Symposium on Interactive 3D Graphics (Providence, RI, USA; April 27-30)

Abstract

Registration (or alignment) of the synthetic imagery with the real world is crucial in augmented reality (AR) systems. The data from user-input devices, tracking devices, and imaging devices must be registered spatially and temporally with the user's view of the surroundings. Each device has an associated delay between its observation of the world and the moment when the AR display presented to the user appears to be affected by a change in the data. We call the differences between these delays the relative latencies. Relative latency is a source of misregistration and should be reduced. We give general methods for handling multiple data streams with different latencies in a working AR system: we measure the latency differences (part of the system-dependent set of calibrations), time-stamp the data on the host, adjust the moment of sampling, and interpolate or extrapolate the data streams. These schemes allow a more accurate and consistent view to be computed and presented to the user.
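As a concrete illustration of these schemes, the sketch below buffers tracker samples, stamps each on the host with the measured device latency subtracted, and linearly interpolates the stream to the observation time of a video frame, so that both streams describe the same real-world instant. This is a minimal sketch: the types and names (TrackerSample, interpolate_pose) are hypothetical rather than taken from the UNC system, and orientation interpolation is omitted for brevity.

```c
/* Minimal sketch of host-side time-stamping and temporal interpolation
 * of a tracker stream. Names and types are illustrative, not from the
 * original system. */
#include <stdio.h>

typedef struct {
    double t;        /* host time stamp minus measured device latency (s) */
    double pos[3];   /* tracker position; orientation omitted for brevity */
} TrackerSample;

/* Linearly interpolate the tracker stream (sorted by t) to the
 * observation time of a video frame. Returns 1 on success. */
static int interpolate_pose(const TrackerSample *buf, int n,
                            double t_frame, double out[3])
{
    for (int i = 1; i < n; i++) {
        if (buf[i - 1].t <= t_frame && t_frame <= buf[i].t) {
            double a = (t_frame - buf[i - 1].t) / (buf[i].t - buf[i - 1].t);
            for (int k = 0; k < 3; k++)
                out[k] = (1.0 - a) * buf[i - 1].pos[k] + a * buf[i].pos[k];
            return 1;
        }
    }
    return 0;  /* t_frame outside buffer: caller must extrapolate or wait */
}

int main(void)
{
    /* Two samples bracketing a video frame observed at t = 0.05 s. */
    TrackerSample buf[2] = {
        { 0.00, { 0.0, 0.0, 0.0 } },
        { 0.10, { 1.0, 0.0, 0.0 } },
    };
    double pose[3];
    if (interpolate_pose(buf, 2, 0.05, pose))
        printf("pose at frame time: %.2f %.2f %.2f\n",
               pose[0], pose[1], pose[2]);
    return 0;
}
```

When the frame time falls outside the buffered interval, the caller must extrapolate instead; cf. the predictive tracking shown in Plate 4.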

Color Plates

Plate 1. AR registration problems resulting from relative latency. Left: AR view of a mannequin from a moving camera. The wireframe model lags behind due to the relative latency of the magnetic tracker with respect to the camera video. Center: AR view of a moving ultrasound probe. The wireframe model of the probe and the probe's ultrasound image data field lag behind the real-world video due to the relative latency of the mechanical tracker with respect to the camera video. Right: AR view of a volume of ultrasound data. Note how the needle trace in the volume (green arrows) curves due to the relative latency between the ultrasound image data and the mechanical tracker.

Plate 2. Rotating drum used to measure relative latency in the ultrasound image data (black rectangular frame) compared to the camera video (background). Despite correct static spatial registration (left), the framed image lags behind as we spin the drum (center). We compute the latency from the measured rotational velocity and a known angular distance (right).
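The computation behind the drum experiment is a single division: if the framed ultrasound image trails the camera video by an angular distance Δθ while the drum spins at angular velocity ω, the relative latency is Δt = Δθ / ω. With illustrative numbers (not measurements from the paper), a lag of 18° at a rotational velocity of 180°/s gives Δt = 0.1 s, i.e. 100 ms of relative latency.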

Plate 3. Left: Verifying the accuracy of temporal interpolation for the mechanical tracker by performing a zig-zag ultrasound sweep of a cylinder in a water tank. Center: The reconstructed cylinder undulates due to the relative latency between the ultrasound image data and the mechanical tracker. Right: A sweep using the measured relative latency yields a straight cylinder in the volume data.

Plate 4. Left: AR view of the mannequin from a moving camera. Vision-based tracking eliminates relative latency and temporal registration error; cf. Plate 1, left. Center: AR view of the moving ultrasound probe. Predictive tracking of the mechanical tracker reduces the effect of relative latency and temporal registration error; cf. Plate 1, center. Right: Applying the relative latency measurement makes the ultrasound data lag behind the video image of the probe, but aligns the ultrasound data with the inside of the breast.
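The predictive tracking mentioned for the center image can be realized, in its simplest form, as constant-velocity extrapolation of the most recent tracker samples forward by the measured relative latency. The sketch below is a minimal, hypothetical version of such a predictor (the predictor actually used in the system may be more sophisticated); the Sample type mirrors the TrackerSample of the earlier sketch.

```c
/* Minimal constant-velocity predictor: extrapolate the two most recent
 * tracker samples forward by the measured relative latency dt_lat.
 * Illustrative only; the system's actual predictor may differ. */
typedef struct {
    double t;        /* host time stamp (s) */
    double pos[3];   /* tracker position */
} Sample;

static void predict_pose(const Sample *prev, const Sample *curr,
                         double dt_lat, double out[3])
{
    double dt = curr->t - prev->t;
    for (int k = 0; k < 3; k++) {
        double v = (curr->pos[k] - prev->pos[k]) / dt;  /* velocity estimate */
        out[k] = curr->pos[k] + v * dt_lat;             /* predict ahead */
    }
}
```

Prediction trades latency for noise: the further ahead the pose is extrapolated, the more any error in the velocity estimate is amplified.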

This work is part of the UNC Ultrasound Visualization project.

Created 17-24 Apr 97 by Mark A. Livingston, Marco C. Jacobs

Last Modified: 29 Jul 97 by Mark A. Livingston.

Mail Andrei State for more info