UNC Ultrasound Research
First Augmented-Reality System
A paper from 1992 [1] describes our
initial ultrasound visualization system, which also introduced the
term "augmented reality." The concept of augmented reality dates back
to Ivan Sutherland's original head-mounted display, which was an
optical see-through design. Our HMD was video see-through.
Our system displayed a small number of individual ultrasound slices of
a fetus superimposed onto a pregnant patient's abdomen. The system
consisted of a standard ultrasound machine, a frame grabber, a
Polhemus 3Space® tracking system, our custom-built,
high-performance graphics engine called Pixel-Planes 5,
and a VPL head-mounted display fitted with a miniature video camera.
The video camera recorded the "real world" view of the ultrasound exam
of a pregnant patient, and an external chroma-keying device composited
the images from this camera with computer-generated imagery containing
the ultrasound data. The result was an image of the ultrasound data
laid over the patient's anatomy, but the data neither showed the
fetus clearly nor appeared three-dimensional.
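The external chroma-keying step can be illustrated in software. The following is a minimal sketch (not the actual hardware's algorithm): the renderer fills everything except the ultrasound imagery with a reserved key color, and the compositor lets the camera's "real world" pixels show through wherever that key color appears. The function name, key color, and tolerance are illustrative assumptions.

```python
import numpy as np

def chroma_key_composite(camera, graphics, key=(0, 0, 255), tol=30):
    """Composite computer-generated graphics over live camera video.

    Hypothetical software stand-in for an external chroma-keyer:
    wherever the graphics frame matches the key color (within tol),
    the camera pixel shows through; elsewhere the graphics pixel wins.
    """
    camera = np.asarray(camera, dtype=np.uint8)
    graphics = np.asarray(graphics, dtype=np.uint8)
    # Per-channel distance from the key color.
    diff = np.abs(graphics.astype(int) - np.array(key, dtype=int))
    is_key = np.all(diff <= tol, axis=-1)          # H x W boolean mask
    return np.where(is_key[..., None], camera, graphics)
```

Because the keying decision is purely per-pixel, such a compositor has no notion of depth: graphics always occlude the patient, never the reverse, which is one inherent limitation of this approach.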
On-line Volume Reconstruction
The second system [5] attempted to
improve the visualization of the fetus through the use of volume
rendering, a computer graphics rendering technique in which the data
is represented as a collection of tiny volume elements, or voxels.
Ultrasound slices were acquired and reconstructed on-line into a
rectilinear volume data set, which was rendered in real time on
Pixel-Planes 5. Slices were added to the volume at a rate of ~1 Hz,
and images were rendered at ~10 Hz. The display presented to the user
showed a synthetic ultrasound slice that appeared to emit volume
material into a volumetric data set superimposed onto the pregnant
patient's abdomen. Even with a newer ultrasound machine, better
tracking (Ascension Flock of Birds®), and better (volume)
rendering, the images were blurry and ultimately disappointing.
On-line Acquisition, Off-Line Reconstruction
In addition to a superior tracking system (the UNC optoelectronic ceiling
tracker), the third system [6]
also featured improved camera and ultrasound probe calibration. Its
major innovation, however, was the attempt to improve the display by
lifting the requirement to acquire and show the ultrasound data in
real time. Data acquisition was still performed in real time, as it
would be for a full-fledged, on-line system, but the task of
visualizing that data was performed off-line. This allowed a large
number of ultrasound slices to be used for a higher-quality volume
reconstruction. The reconstructed volumes were larger and could be
rendered at high resolution. This effectively bypassed limitations
imposed by the state of the art in real-time computer graphics
hardware and algorithms, and even in tracking equipment. The
experiments conducted with this system yielded a higher-quality
visualization and set a standard to be approached by our subsequent
real-time systems.
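The acquisition/visualization split described above can be sketched as a two-phase pipeline. This is an illustrative sketch only, with hypothetical function names: the on-line phase does nothing but record tracked slices, deferring all reconstruction work so that an arbitrarily large number of slices can be averaged into a finer grid off-line.

```python
import numpy as np

def acquire(stream):
    """On-line phase: record tracked slices; defer all rendering work.

    `stream` yields (intensities, positions) pairs, e.g. from a
    tracked ultrasound probe. Copying keeps the recording stable.
    """
    return [(i.copy(), p.copy()) for i, p in stream]

def reconstruct(recorded, shape=(128, 128, 128), spacing=0.5):
    """Off-line phase: average every recorded sample into a fine grid.

    With no real-time budget, the grid can be larger and the spacing
    finer than an on-line system could afford.
    """
    data = np.zeros(shape, dtype=np.float32)
    count = np.zeros(shape, dtype=np.int32)
    for vals, pos in recorded:
        idx = np.round(pos / spacing).astype(int)
        ok = np.all((idx >= 0) & (idx < np.array(shape)), axis=1)
        np.add.at(data, tuple(idx[ok].T), vals[ok])
        np.add.at(count, tuple(idx[ok].T), 1)
    # Average where samples exist; empty voxels stay zero.
    return np.where(count > 0, data / np.maximum(count, 1), 0.0)
```

The key design point is that `acquire` runs under real-time constraints while `reconstruct` does not, so reconstruction quality is limited by the data collected rather than by rendering hardware.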
Last Modified: 29 Jul 97 by Mark
A. Livingston