Mark Alan Livingston
Defended: 7 October 1998
Advisor: Prof. Henry Fuchs
Reader: Prof. Gregory Welch
Reader: David Eberly, Senior Programmer, Numerical Design Limited
Reader: Prof. James Coggins
Committee member: Prof. Gary Bishop
Abstract
Tracking has proven a difficult problem to solve accurately without limiting
the user or the application. Vision-based systems have shown promise, but are
limited by occlusion of the landmarks. We introduce a new approach to
vision-based tracking that uses structured light to generate landmarks. The
novel aspect of this approach is that the system need not know the 3D
locations of the landmarks. This implies that motion within the camera's
field of view does not disturb tracking, so long as landmarks are reflected
off some surface into the camera.
This dissertation specifies an algorithm that tracks a camera using
structured light. A simulator demonstrates excellent performance on
user-motion data from an application currently limited by inaccurate tracking.
Further analysis reveals directions for implementation of the system,
theoretical limitations, and potential extensions to the algorithm.
The term augmented reality (AR) has been given to applications that
merge computer graphics with images of the user's surroundings. AR could give
a doctor "X-ray vision" with which to examine the patient before or during
surgery. To date, however, AR systems have not replaced the traditional
methods of performing medical or other tasks.
One important problem that limits acceptance of AR systems is lack of precise
registration---alignment---between real and synthetic objects. There
are many components of an AR system that contribute to registration. One of
the most important is the tracking system: the tracking data must be
accurate so that the real and synthetic objects align properly.
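To make the link between tracking accuracy and registration concrete, here is a small sketch (not from the thesis; the pinhole-camera model and all numbers are hypothetical) of how a fraction of a degree of orientation error in the tracker becomes a visible misalignment on the image plane:

```python
import math

# Hypothetical illustration: a small yaw error in the tracked camera pose
# shifts where a virtual object lands on the image, i.e. misregistration.

def project(point, focal_px):
    """Project a camera-space 3D point (x, y, z) to pixel coordinates
    with a simple pinhole model."""
    x, y, z = point
    return (focal_px * x / z, focal_px * y / z)

def rotate_y(point, angle_rad):
    """Rotate a point about the camera's vertical (yaw) axis."""
    x, y, z = point
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (c * x + s * z, y, -s * x + c * z)

focal_px = 1000.0          # focal length in pixels (hypothetical camera)
target = (0.0, 0.0, 0.5)   # a point 0.5 m straight ahead of the camera

true_px = project(target, focal_px)
err_deg = 0.5              # half a degree of yaw error in the tracker
seen_px = project(rotate_y(target, math.radians(err_deg)), focal_px)

offset = seen_px[0] - true_px[0]
print(f"{offset:.1f} px registration error from {err_deg} deg yaw error")
```

Even this half-degree error displaces the overlay by roughly nine pixels, which suggests why demanding applications such as medical AR need trackers far more accurate than those that suffice for immersive virtual environments.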
Our work in augmented reality focuses on medical applications. These
require precise alignment of medical imagery with the physician's view of
the patient. Although many tracking technologies have been applied,
including mechanical, magnetic, and optical systems, we have yet to find
one sufficiently accurate and robust to provide correct and reliable
registration.
We believe the system specified here contributes to tracking in AR
applications in two key ways: it takes advantage of equipment already used for
AR, and it has the potential to provide sufficient registration for demanding
AR applications without imposing the limitations of current vision-based
tracking systems.
A PDF file of
the entire thesis is available. Inquiries regarding the thesis work should
be sent to me at .
This research was supported in part by the following agencies and institutions:
- Defense Advanced Research Projects Agency (ISTO DABT 63-93-C-0048,
Prof. Henry Fuchs and Prof. Frederick P. Brooks, Principal Investigators)
- National Science Foundation Science and Technology Center for Computer
Graphics and Scientific Visualization (ASC-8920219, Prof. Henry Fuchs,
Principal Investigator for UNC site)
- 1997-1998 Link Fellowship, Link Foundation of the Institute for
Simulation and Training
- Naval Command, Control, and Ocean Surveillance Center (N6601-97-1-8919,
Prof. Henry Fuchs and Prof. Nick England, Principal Investigators, Prof. Greg
Welch and Mark A. Livingston, Investigators)
Portions of this thesis work are included in U.S. Patent #5,870,136,
issued on 8 February 1999 to Dr. Gary Bishop, Dr. Henry Fuchs,
Dr. Mark A. Livingston, and Dr. Greg Welch, entitled "Dynamic
Generation of Imperceptible Structured Light for Tracking and
Acquisition of Three Dimensional Scene Geometry and Surface
Characteristics in Interactive Three Dimensional Computer Graphics
Applications."