COMP 239: Exploring Virtual Worlds

The Televator™ Project

(Telepresence in a Real Elevator)

Overview

The Televator project is a joint COMP 239 effort to bring telepresence to a real elevator. The final project goal is to use our Sitterson Hall freight elevator as a vertical motion platform, and to provide elevator passengers a view through a virtual window (a screen with computer-generated images) into imaginary, and eventually real, environments. By matching the images that appear in the elevator "window" (the screen) to the vertical motion of the elevator, we hope to provide a compelling visual experience.

The basic framework for the project is shown below. At the center of the Televator is the elevator itself, a simulation to begin with and the real thing eventually. Next, within the green ring, is an external viewer interface that will allow a second, remotely located person to view the inhabitants of the elevator, and vice versa. The external viewer will use a workstation to view a model of the elevator shaft containing an elevator that moves in conjunction with the real thing (or the simulation), and will be able to fly up to the elevator and view its contents. The elevator passenger, in turn, will be able to see the external viewer, but in the form of an avatar. (The actual avatar form will be determined by the environment being viewed by the elevator passengers.) Finally, the elevator passenger(s) will be able to select the virtual environment they wish to view and interact with (in limited fashion). These environments are shown around the outside of the green ring.

Development Plan

The entire system will be developed in the several phases described below, each chosen with specific goals in mind.

Phase 1: The Simulation

Overview

During phase 1 we will implement the entire system as a stand-alone software application to be used in the HMD lab in Sitterson Hall. Users will wear a head-mounted display and will interact via a "Python" 6D joystick, choosing elevator floors using virtual buttons in the virtual elevator (sketched below) and interacting (in a simple manner) with the environment currently seen through the window. The virtual elevator model will effectively restrict the user's motion (in the real world) to the confines of the elevator by providing no means of flying or of opening the elevator doors. The user could physically walk through the virtual elevator walls, but that's not our problem.
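For illustration only (none of these structures is the actual implementation), floor selection might reduce to testing the joystick's tracked tip against a bounding box per virtual button:

```cpp
// Hypothetical sketch of virtual floor-button selection: the joystick's
// tracked tip is tested against each button's axis-aligned bounding box.
struct Box   { float minX, minY, minZ, maxX, maxY, maxZ; };
struct Point { float x, y, z; };

bool inside(const Box& b, const Point& p)
{
    return p.x >= b.minX && p.x <= b.maxX &&
           p.y >= b.minY && p.y <= b.maxY &&
           p.z >= b.minZ && p.z <= b.maxZ;
}

// Returns the index of the selected floor, or -1 if no button is touched.
int pickFloor(const Box buttons[], int numFloors, const Point& tip)
{
    for (int i = 0; i < numFloors; ++i)
        if (inside(buttons[i], tip))
            return i;
    return -1;
}
```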

This first phase is meant to provide a path to a complete system that does not depend on our obtaining permission to run video (etc.) to the elevator. The real-elevator tracking and interface efforts will proceed in parallel, with the hope that when they are ready, the phase 1 system can be transitioned to the phase 2 system: the real elevator system.

The Elevator

In place of the real elevator, a virtual elevator (built by Voicu Popescu) will be modeled at true dimensions underneath our Ascension tracking device. Voicu will design and provide an "elevator API" to facilitate a smooth transition to the real elevator (Phase 2 below). Among other things, the elevator API will provide simulated elevator position (vertical) and real user pose information to the individual environment applications, and will accept (in some form) the appropriate images to appear in the "window" in the back of the elevator. For example, if an environment application is told that it is the active environment (in response to a user button press), it might then begin rendering appropriate images, based on elevator and user pose, into some reserved texture memory, which the elevator API then maps onto the elevator screen (the window).
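As a very rough sketch (this is not Voicu's actual interface; every name below is hypothetical), the elevator API might look something like this:

```cpp
// Hypothetical sketch of the elevator API; types and names are
// illustrative only, not the real interface.
struct Pose {
    float x, y, z;          // position (meters)
    float qx, qy, qz, qw;   // orientation quaternion
};

class ElevatorAPI {
public:
    // Simulated (later: real) vertical position of the car, in meters.
    virtual float carHeight() const = 0;

    // Tracked pose of the passenger, relative to the car.
    virtual Pose userPose() const = 0;

    // Called each frame by the active environment application with the
    // image it rendered for the window; the elevator side maps this
    // buffer onto the screen polygon as a texture.
    virtual void submitWindowImage(const unsigned char* rgb,
                                   int width, int height) = 0;

    virtual ~ElevatorAPI() {}
};
```

An active environment application would thus, each frame, read carHeight() and userPose(), render its view, and hand the result back through submitWindowImage().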

The External Viewer Interface

During this first phase Tom Hudson will develop a system that will allow an external (remote) viewer to visualize the motion of the elevator in the elevator shaft, and to fly around the shaft (virtually), looking at the inside of the elevator. Tom will also provide the other project members with an external viewer interface API. This API will, for example, allow environment applications to obtain the pose of the external viewer (when there is one), so that an appropriate avatar in the environment the elevator passenger sees through the "window" can be made to move in conjunction with the external viewer's movements. For example, if the elevator user has selected the ocean environment, that application could use the external viewer pose to control the general motion of a shark in the ocean. When there is no external viewer, the shark would move normally. Conversely, using cameras mounted in the elevator, the external viewer will be able to see the elevator passenger through a similar interface.
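A minimal sketch of how an environment application might use such an interface, again with entirely hypothetical names, to drive the shark avatar:

```cpp
// Hypothetical sketch of an environment application mapping the external
// viewer onto its avatar (here, the ocean environment's shark).
#include <cmath>

struct Pose { float x, y, z; float heading; };  // simplified for the sketch

class ExternalViewerAPI {
public:
    // True when a remote viewer is currently connected.
    virtual bool viewerPresent() const = 0;
    // Pose of the external viewer in shaft coordinates.
    virtual Pose viewerPose() const = 0;
    virtual ~ExternalViewerAPI() {}
};

void updateShark(const ExternalViewerAPI& viewers, Pose& shark)
{
    if (viewers.viewerPresent()) {
        shark = viewers.viewerPose();   // shark shadows the remote viewer
    } else {
        shark.x += 0.1f * std::cos(shark.heading);  // no viewer: default
        shark.z += 0.1f * std::sin(shark.heading);  // wandering behavior
    }
}
```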

The Environment Applications

Each of the individual environment applications will be developed independently of the others during this phase, and relatively independently of the elevator simulation (Popescu) and the multi-user interface (Hudson). Because Popescu and Hudson will be developing APIs for their respective systems, the environment applications can be designed to simply query those APIs to determine the user's pose in their environment, as composed from the virtual elevator position and the user's position within the elevator (see the sketch below). The developers of the environment applications will of course need to keep in mind that the user will have very restricted motion and a view limited by the window in the elevator, as shown in the bird's-eye (top) view below.
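Since the car only translates vertically, that composition could be as simple as the following sketch (the coordinate conventions are assumptions for illustration):

```cpp
// Sketch: composing the user's pose in the environment from the
// elevator's vertical position and the user's tracked position in the car.
struct Vec3 { float x, y, z; };

// elevatorHeight: vertical offset of the car floor in shaft/world space.
// userInCar:      tracked head position relative to the car floor.
Vec3 userInEnvironment(float elevatorHeight, Vec3 userInCar)
{
    Vec3 p = userInCar;
    p.y += elevatorHeight;   // the car only moves vertically, so the
                             // composition is a simple vertical offset
    return p;
}
```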

Phase 2: The Real Elevator

Overview

During phase 2 we will effectively transition the entire system, minus the elevator simulator, to the real elevator. A user, i.e. a passenger in the elevator, will be able to don a pair of tracked shuttered glasses and view the virtual environments through a small "window" (a small projection screen) in the back of the elevator. The user will have the option of selecting, via some real push-buttons at the bottom of the screen, which of the environments they want to view as the Televator travels up and down between floors.

The Elevator

The real elevator will replace the virtual elevator of phase 1, although Voicu Popescu's API for the simulator will remain. The API will now acquire and supply (to the environment applications) real elevator and user position data, and will send the selected environment's images to a Digital Light Projector (DLP) mounted on the ceiling of the elevator. (See the Display section below.)
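One way to keep this transition cheap, sketched below with hypothetical class names, is to hide the two data sources behind the same abstract interface, so that the environment applications are untouched when the simulator is swapped for real sensors:

```cpp
// Hypothetical sketch of the phase 1 -> phase 2 transition: the same
// abstract interface, backed first by the simulator, later by sensors.
class ElevatorSource {
public:
    virtual float carHeight() const = 0;   // meters above the ground floor
    virtual ~ElevatorSource() {}
};

class SimulatedElevator : public ElevatorSource {
    float height_;
public:
    SimulatedElevator() : height_(0.0f) {}
    float carHeight() const { return height_; }  // scripted car motion
};

class RealElevator : public ElevatorSource {
public:
    float carHeight() const { return readShaftSensor(); }
private:
    // Placeholder for the accelerometer/rangefinder estimate
    // (see Tracking below).
    float readShaftSensor() const { return 0.0f; }
};
```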

Tracking

The user, a passenger in the elevator, will be tracked inside the elevator, and the elevator itself will be tracked (vertical position) in the elevator shaft. We might track the user's position in the elevator using a commercial off-the-shelf optical tracker, or we might use some simple (crude) video-based tracking mechanism to estimate the user's head position. In any case we will probably track only the user's position, not their orientation. The elevator position will probably be tracked with an inertial sensor (a linear accelerometer), an optical (laser-based) range finder, or both.
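If we end up using both sensors, a simple complementary filter could combine them. The sketch below (with a placeholder blend gain) integrates the accelerometer at a high rate and uses the rangefinder's absolute readings to cancel the drift that double integration accumulates:

```cpp
// Sketch of fusing a linear accelerometer with a laser rangefinder to
// track the car's height; the gain is a placeholder to be tuned.
class HeightTracker {
    float height_;    // estimated car height (m)
    float velocity_;  // estimated vertical velocity (m/s)
public:
    HeightTracker(float initialHeight)
        : height_(initialHeight), velocity_(0.0f) {}

    // High-rate update: integrate vertical acceleration (gravity removed).
    void onAccel(float accel, float dt)
    {
        velocity_ += accel * dt;
        height_   += velocity_ * dt;
    }

    // Low-rate update: an absolute height from the rangefinder pulls the
    // estimate back, cancelling the integration drift.
    void onRange(float measuredHeight)
    {
        const float k = 0.2f;  // blend gain, tuned empirically
        height_ += k * (measuredHeight - height_);
    }

    float height() const { return height_; }
};
```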

Work on tracking the user and the elevator will proceed in parallel with phase 1 so that it is ready for phase 2.

Display

Because we want to minimize the equipment a user must don to use the system, we do not want to use a head-mounted display in the elevator. Instead we will use a Digital Light Projector (DLP) to project images, and the user will wear synchronized shuttered glasses in order to see time-multiplexed stereo. The DLP will be mounted on the ceiling of the elevator, near the doors, projecting toward a screen mounted on the back wall of the elevator, as shown below in two views. The image generation is described in the next section.
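For reference, frame-sequential stereo of this sort is commonly rendered with OpenGL quad-buffered stereo; a minimal sketch (drawScene() and the eye-separation handling are placeholders) might look like this:

```cpp
// Sketch of time-multiplexed stereo via OpenGL quad buffering; the
// shuttered glasses are synchronized to the display by hardware.
#include <GL/gl.h>

// Placeholder: render the scene from an eye offset along the view x-axis.
void drawScene(float eyeOffset) { /* ... */ }

void renderStereoFrame(float interocular)
{
    glDrawBuffer(GL_BACK_LEFT);    // left-eye image
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    drawScene(-0.5f * interocular);

    glDrawBuffer(GL_BACK_RIGHT);   // right-eye image
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    drawScene(+0.5f * interocular);

    // then swap buffers via the window system (e.g. glXSwapBuffers)
}
```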

Image Generation

The video signal for the projector will not be generated locally in the elevator; instead it will be generated remotely (in Sitterson Hall) and then "piped into" the elevator via some custom wiring. (We are in the process of seeking NC approval for the additional wiring.) The user and elevator position information (see Tracking above) will be relayed over similar custom wiring to a graphics engine located (most likely) in the HMD lab, possibly the same machine used during phase 1.

The External Viewer Interface

The external viewer interface should remain the same.

The Environment Applications

The environment applications should remain the same.

The Real Thing, Baby

At the moment Bill Mark and I are trying to finalize the Televator wiring needs. John Thomas will then determine the proper sheathing, etc., and we will apply for permission to run the cables in the shaft. (We are seeking permission from UNC and the NC Dept. of Labor, and will then likely ask the UNC Physical Plant to do the wiring.)

Phase 3: Real Remote Environments

Although we do not have immediate plans to do so, we would love to be able (some day) to provide elevator passengers with views of not only synthetic imaginary environments, but also real remote environments, using remotely mounted cameras and real-time image warping. For example, cameras mounted outside Sitterson Hall could be used to give the passenger the impression of seeing through the elevator and shaft into the parking lot outside. Even better, the cameras could be mounted on several window sills of a building overlooking (say) the Eiffel Tower, giving our local elevator passengers the feeling of riding a glass elevator in Paris.

Future work for sure...



* Last revised: March 5, 1997