Click on each image to open a movie (in a new window if you have Javascript enabled). You need the QuickTime plugin to play them. To avoid occasional plugin errors, stop each movie with a mouse click before closing the window (if you have Javascript enabled) or before clicking the Back button (Javascript disabled).
This technology uses videometric tracking of color-coded fiducials together with a conventional magnetic tracker to achieve highly accurate registration between images acquired by tracked video cameras and real-time computer-generated views. Two miniature video cameras tracked by this method can be attached to a head-mounted display, thus transporting the wearer of the head-mount into a world which contains both familiar, real objects and unfamiliar, virtual elements. Here are some examples of what can be accomplished with augmented reality technology:
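The hybrid scheme described above can be sketched in miniature as follows. This is only an illustration under simplifying assumptions (a toy 2D image-plane pose, made-up fiducial coordinates, and hypothetical function names — not the actual system's algorithm): the magnetic tracker supplies a coarse pose, the known fiducials are projected into the image with that pose, and the offsets between predicted and detected fiducial positions correct it.

```python
# Toy sketch of vision-corrected tracking: a coarse pose (here reduced to
# a 2D image-plane offset) from a magnetic tracker is refined using the
# detected positions of color-coded fiducials. All numbers are made up.

def predict_fiducials(fiducials, pose):
    """Project the known fiducial layout into the image using the coarse pose."""
    dx, dy = pose
    return [(x + dx, y + dy) for (x, y) in fiducials]

def refine_pose(pose, predicted, detected):
    """Shift the pose by the mean offset between detections and predictions."""
    pairs = [(d, p) for d, p in zip(detected, predicted) if d is not None]
    if not pairs:
        return pose  # no fiducials visible: fall back to the magnetic tracker
    ex = sum(d[0] - p[0] for d, p in pairs) / len(pairs)
    ey = sum(d[1] - p[1] for d, p in pairs) / len(pairs)
    return (pose[0] + ex, pose[1] + ey)

# Known fiducial layout (image coordinates at the reference pose).
fiducials = [(100.0, 100.0), (200.0, 100.0), (150.0, 200.0)]
coarse_pose = (5.0, -3.0)   # magnetic tracker estimate: a few pixels off
true_pose = (7.0, -1.0)     # where the camera actually is
detected = [(x + true_pose[0], y + true_pose[1]) for (x, y) in fiducials]

refined = refine_pose(coarse_pose,
                      predict_fiducials(fiducials, coarse_pose),
                      detected)
print(refined)  # → (7.0, -1.0): the vision correction recovers the true pose
```

The real system solves for a full 6-degree-of-freedom camera pose rather than a 2D shift, but the division of labor is the same: the magnetic tracker bounds the search, and the fiducials supply the accuracy.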
3D morph of a steel ball into a shiny teapot which reflects its environment. The location of the steel ball is exactly pre-calibrated and thus known to the system. With the help of accurate tracking, the image of the ball is accurately extracted from the video image and remapped onto the computer-generated teapot. (no sound, 5.6 Megabytes)
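Remapping the image of a mirrored ball onto a synthetic object is essentially environment (sphere) mapping. As a hedged aside — the clip's exact procedure is not documented here — the standard sphere-map lookup, in the form popularized by OpenGL's GL_SPHERE_MAP texture generation, converts a reflection direction into coordinates in the ball's image:

```python
import math

def sphere_map_uv(r):
    """Map a reflection vector (rx, ry, rz) to (u, v) texture coordinates
    in the image of a mirrored sphere, using the classic sphere-map formula:
    m = 2*sqrt(rx^2 + ry^2 + (rz+1)^2), u = rx/m + 0.5, v = ry/m + 0.5."""
    rx, ry, rz = r
    m = 2.0 * math.sqrt(rx * rx + ry * ry + (rz + 1.0) * (rz + 1.0))
    return (rx / m + 0.5, ry / m + 0.5)

# A reflection straight back toward the viewer samples the sphere's center...
print(sphere_map_uv((0.0, 0.0, 1.0)))   # → (0.5, 0.5)
# ...while grazing reflections sample near the sphere's silhouette (u ≈ 0.854).
print(sphere_map_uv((1.0, 0.0, 0.0)))
```

With the ball's position pre-calibrated and the camera accurately tracked, its image region can be cropped from each video frame and used directly as this texture, which is what makes the synthetic teapot appear to reflect the real room.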
3D morph of a glass sphere into a transparent teapot. Same technique as above, except that a glass sphere is used instead of the steel ball. (no sound, 2.4 Megabytes)
Painting virtual chrome onto a real object (nose sculpture). Same technique again. The image of the steel ball is remapped onto the computer-generated paint. The sculpture has been scanned beforehand, thus its shape is precisely known to the computer. (no sound, 5.2 Megabytes)
Virtual object (knot) intersects and casts shadows onto real objects (sculpture, tabletop). This also uses a pre-scanned sculpture, so the computer can accurately render an object which intersects and shadows the sculpture. The clip also shows how a moving light source is tracked by the computer, so that the synthetic shadows move just like the real ones. (no sound, 2.5 Megabytes)
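Correct intersection between virtual and real objects is typically resolved per pixel with a depth buffer that also contains the pre-scanned real geometry as an invisible "phantom" object. A minimal sketch of that idea, with hypothetical names and made-up depth values rather than the actual renderer:

```python
# Per-pixel compositing sketch: the pre-scanned sculpture is rendered into
# the depth buffer only. Where the phantom geometry is closer than the
# virtual object, the live video pixel wins, so the virtual knot appears
# to pass behind and through the real sculpture.

INF = float("inf")

def composite(video, virtual, phantom_depth, virtual_depth):
    out = []
    for v_px, s_px, pd, vd in zip(video, virtual, phantom_depth, virtual_depth):
        if vd < pd:             # virtual surface in front of the real one
            out.append(s_px)
        else:                   # real object occludes the virtual one
            out.append(v_px)
    return out

video         = ["video"] * 4
virtual       = ["knot", "knot", "knot", None]   # None: knot not drawn here
phantom_depth = [2.0, 1.0, 1.5, 1.0]   # depths of the scanned sculpture
virtual_depth = [1.0, 3.0, 1.4, INF]   # depths of the synthetic knot

print(composite(video, virtual, phantom_depth, virtual_depth))
# → ['knot', 'video', 'knot', 'video']
```

The same pre-scanned geometry lets the renderer cast synthetic shadows onto the sculpture, since the shadow computation needs the receiving surface's shape.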
Real objects (playing cards) are replicated as synthetic objects. Here accurate tracking helps acquire an image of the playing card from the video camera. That image is subsequently mapped onto the computer-generated playing card. (no sound, 5.6 Megabytes)
A tabletop parody of the film "Independence Day." (with sound, 1.3 Megabytes)
This section describes some of the technical fundamentals behind this technology.
Why it's necessary. Conventional tracking technology (here: a magnetic tracker) is usually not accurate enough for such applications. The white wireframe drawing represents the computer's attempt at lining up the tabletop cuboids when only poor camera tracking information is available. (with sound, 1.5 Megabytes)
How it works. Explains the operation of the hybrid vision-assisted tracking algorithm in some detail. (with sound, 23.1 Megabytes)
Demonstration of robustness during operation. Shows how covering up fiducials, or distracting the system with color imagery other than the fiducials it knows, has no adverse effect on tracking. The colored rectangles are dynamic fiducial search areas. (with sound, 3.9 Megabytes)
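The colored rectangles reflect a common tracking idea: search for each fiducial only near where the current pose estimate predicts it, and widen the window when it is not found. A toy sketch of that logic, with illustrative names and numbers (the actual detector and window policy are not documented here):

```python
def search_window(predicted, size):
    """Axis-aligned search rectangle centered on the predicted position."""
    (px, py), h = predicted, size / 2.0
    return (px - h, py - h, px + h, py + h)

def track_fiducial(predicted, detect, size=20.0, max_size=160.0):
    """Try to detect a fiducial in a growing window around its prediction.
    `detect` stands in for the color-blob detector: it takes a window and
    returns a position or None."""
    while size <= max_size:
        hit = detect(search_window(predicted, size))
        if hit is not None:
            return hit, size
        size *= 2.0   # fiducial occluded or prediction poor: widen the search
    return None, size

# Fake detector: the fiducial is really at (107, 98); it is "seen" only if
# that point falls inside the window.
truth = (107.0, 98.0)
def detect(window):
    x0, y0, x1, y1 = window
    inside = x0 <= truth[0] <= x1 and y0 <= truth[1] <= y1
    return truth if inside else None

print(track_fiducial((100.0, 100.0), detect))  # → ((107.0, 98.0), 20.0)
```

Restricting the search this way is what makes distracting color imagery elsewhere in the frame harmless: pixels outside the predicted windows are never examined.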
Another robustness demonstration. Shows how camera motion and the resulting appearance and disappearance of fiducials do not confuse the system. The white wireframe drawing of the cuboids remains registered at all times. (no sound, 2.2 Megabytes)
A third robustness demonstration. Shows how even violent camera shake does not confuse the system. (no sound, 1.1 Megabytes)
Stereo demonstration. The head-mounted display wearer experiences a more convincing illusion if the system can operate in stereo. (with sound, 2.2 Megabytes)
Registration error analysis. Describes how the system responds to calibration and fiducial detection errors. (with sound, 3.4 Megabytes)
Ultrasound-guided needle biopsy of the breast. Needle biopsy is used to sample suspicious lesions within the breast. We hope that one day augmented reality technology will provide more accurate and faster guidance for the needle insertion, thus alleviating the trauma to the patient and improving the accuracy of the procedure. The clip shows how AR technology could help a physician see inside a patient. (with sound, 32.1 Megabytes)
Fetal ultrasound examination. This segment was generated in 1994 using off-line techniques: the pregnant patient was first filmed with a tracked camera, then an ultrasound scan of her abdomen was performed. Afterwards the 3D fetal image was reconstructed from the scans. The reconstructed fetus was then rendered from viewpoints matching the camera views and superimposed over them. Thus the segment was put together in the manner of motion picture visual effects. It was not done live within a head-mounted display because sufficient computational power and algorithms were not available, but it demonstrated the potential of visualizing echography data with augmented reality technology. (no sound, 3.2 Megabytes)
Laparoscopic visualization. Laparoscopy is a form of minimally invasive surgery. The surgeon operates through small openings and views the patient's internals via laparoscopic cameras and video screens. We hope that some day augmented reality will aid laparoscopic procedures by providing a visualization that is akin to open surgery. This clip shows preliminary experiments and demonstrations. (with sound, 33.2 Megabytes)
Hybrid laparoscopic/ultrasound/preoperative-CT visualization and phantom biopsy. This clip shows a preliminary experiment that merges live laparoscopic and ultrasound display with a preoperative CT image in which a suspicious lesion has been segmented and outlined in red. The lesion is biopsied under AR guidance, and at the end of the clip the extracted sample is shown. (no sound, 50 Megabytes)
Last modified 03/16/2006 12:23:48 AM