Head-mounted displays (HMDs) and head-tracked stereoscopic displays give the user the impression of being immersed in a simulated three-dimensional environment. To achieve this effect, the computer must continually receive precise six-dimensional (6D) information about the position and orientation (the pose) of the user's head, and must rapidly adjust the displayed image(s) to reflect the changing head pose. This pose information comes from a tracking system. We are working on wide-area systems for the 6D tracking of heads, limbs, and hand-held devices.
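To make the idea concrete, here is a minimal sketch of how a renderer might consume each tracker report. The pose format (position vector plus unit quaternion) and the function names are illustrative assumptions, not any particular HMD or tracker API:

```python
import numpy as np

def quat_to_matrix(q):
    """Rotation matrix from a unit quaternion (w, x, y, z)."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def view_matrix(position, orientation):
    """4x4 world-to-eye matrix: the inverse of the tracked head pose."""
    R = quat_to_matrix(orientation)
    V = np.eye(4)
    V[:3, :3] = R.T                  # inverse rotation
    V[:3, 3] = -R.T @ position       # inverse translation
    return V

# Each new tracker report regenerates the view matrix before rendering.
V = view_matrix(np.array([0.0, 1.7, 0.0]),       # head 1.7 m above floor
                np.array([1.0, 0.0, 0.0, 0.0]))  # identity orientation
```

The view matrix maps world coordinates into the user's eye frame, so every time the tracker reports a new pose, the scene is re-rendered from the new viewpoint.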
In April 1997 the UNC Tracker Research Group brought its latest wide-area ceiling tracker online: the HiBall Tracking System. The system (shown in the images above) uses relatively inexpensive ceiling panels housing LEDs, a miniature camera cluster called a HiBall, and the single-constraint-at-a-time (SCAAT) algorithm, which converts individual LED sightings into position and orientation estimates.
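The SCAAT idea can be sketched in a few lines: rather than waiting to collect enough LED sightings to solve for the full pose at once, each individual sighting is folded into a Kalman filter the moment it arrives. The toy state and measurement model below (a static 2D position observed one axis at a time) are simplified inventions for illustration, not the HiBall's actual 6D model:

```python
import numpy as np

def scaat_update(x, P, h, z, r):
    """One Kalman update with a single scalar constraint.

    x : state estimate (n,)
    P : state covariance (n, n)
    h : measurement row vector (n,), so predicted z = h @ x
    z : one scalar measurement (one sighting's worth of information)
    r : scalar measurement-noise variance
    """
    s = h @ P @ h + r                   # innovation variance (scalar)
    k = (P @ h) / s                     # Kalman gain (n,)
    x = x + k * (z - h @ x)             # corrected state
    P = P - np.outer(k, h @ P)          # corrected covariance
    return x, P

# Track a static 2D point from alternating single-axis observations.
true_pos = np.array([1.0, -2.0])
x = np.zeros(2)                         # initial guess
P = np.eye(2) * 10.0                    # large initial uncertainty
rng = np.random.default_rng(0)
for i in range(200):
    h = np.eye(2)[i % 2]                # one constraint at a time
    z = true_pos[i % 2] + rng.normal(0.0, 0.01)
    x, P = scaat_update(x, P, h, z, 0.01**2)
```

Each scalar update is underconstrained on its own, but the filter's covariance carries information across updates, so the estimate converges as sightings accumulate.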
The HiBall Tracker provides sub-millimeter position accuracy and resolution, and better than 2/100 of a degree of orientation accuracy and resolution, over a 500 square foot area. To our knowledge, no other tracking system of comparable accuracy covers as large an area. The HiBall Tracking System is now sold commercially.
Certain virtual-environment applications can benefit from long-range trackers. We believe that exploring an architectural model by walking is more natural and less confusing than exploration by flying or steering treadmills. For example, using the UNC wide-area optical tracking system, one can walk inside a model of Professor Fred Brooks's kitchen, where the scale of the virtual model is 1:1 with reality. We believe this method of exploration provides a better understanding of the model than unconstrained flying, which distorts the user's sense of scale and space.
Another important reason for long-range tracking is that augmented reality demands this capability. Most HMDs are closed-view, preventing the user from seeing the real world outside. In contrast, augmented reality uses see-through HMDs that let the user see the real world while simultaneously superimposing or compositing 3D virtual objects onto the real environment. Ideally, it would appear to the user that the real and virtual worlds coexist. Augmented-reality applications use the virtual objects to convey information that the user cannot detect with his or her own senses. Potential applications include giving doctors "X-ray vision" into their patients (by superimposing 3D MRI or ultrasound data onto the patient's anatomy) and aiding assembly and repair of complex 3D equipment with schematic overlays. Augmented reality, however, imposes much stricter requirements on the tracking system than virtual-environment applications do. It requires highly accurate trackers because even tiny tracker errors cause noticeable misregistration between real and virtual objects. It also demands long-range trackers because "flying" is meaningless. In a closed-view HMD, we can create the illusion of flight by translating all the virtual objects. In augmented reality, however, if users want to see the other side of a real patient, they must physically walk around the patient, carrying the HMD with them, and this requires long-range tracking. Much more work needs to be done to improve tracking systems before augmented reality becomes a practical technology.
Brian Clipp, Greg Welch, Jan-Michael Frahm, and Marc Pollefeys, “Structure From Motion via a Two-Stage Pipeline of Extended Kalman Filters,” Proceedings of the British Machine Vision Conference (BMVC 2007), September 10-13, 2007 (PDF)
Hua Yang, Marc Pollefeys, Greg Welch, Jan-Michael Frahm, and Adrian Ilie. “Differential Camera Tracking Through Linearizing the Local Appearance Manifold,” in Proceedings of the 2007 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR' 07), 2007. (PDF)
Greg Welch, B. Danette Allen, Adrian Ilie, and Gary Bishop, “Measurement Sample Time Optimization for Human Motion Tracking/Capture Systems,” Proceedings of Trends and Issues in Tracking for Virtual Environments, Workshop at the IEEE Virtual Reality 2007 Conference (Charlotte, NC USA) (Gabriel Zachmann, ed.), Shaker, March 11 2007. (PDF)
Greg Welch, Michael Noland, and Gary Bishop, “Complementary Tracking and Two-Handed Interaction for Remote 3D Medical Consultation with a PDA,” Proceedings of Trends and Issues in Tracking for Virtual Environments, Workshop at the IEEE Virtual Reality 2007 Conference (Charlotte, NC USA) (Gabriel Zachmann, ed.), Shaker, March 11 2007. (PDF)
Hua Yang and Greg Welch. “Illumination Insensitive Model-Based 3D Object Tracking and Texture Refinement,” In Proceedings of the Third International Symposium on 3D Data Processing, Visualization and Transmission (3DPVT 2006), The University of North Carolina at Chapel Hill, Chapel Hill, NC USA, June 14-16, 2006. (PDF)
Hua Yang and Greg Welch. “Model-Based 3D Object Tracking Using an Extended-Extended Kalman Filter and Graphics Rendered Measurements,” in Proceedings of 1st Computer Vision for Interactive and Intelligent Environments (CV4IIE) workshop, University of Kentucky, Lexington, KY. (PDF)
Welch, Greg and Eric Foxlin (2002). Motion Tracking: No Silver Bullet, but a Respectable Arsenal, IEEE Computer Graphics and Applications, special issue on Tracking, November/December 2002, 22(6): 24-38. (PDF)
Vallidis, Nicholas M. (2002). WHISPER: A Spread Spectrum Approach to Occlusion in Acoustic Tracking, Ph.D. dissertation under the supervision of Gary Bishop, University of North Carolina at Chapel Hill, Department of Computer Science. (PDF)
Miller, Dorian and Gary Bishop (2002). Latency Meter: a Device for Easily Monitoring VE Delay, in Proceedings of SPIE Vol. #4660 Stereoscopic Displays and Virtual Reality Systems IX, San Jose, CA, January 2002. (PDF)
Welch, Greg, Gary Bishop, Leandra Vicci, Stephen Brumback, Kurtis Keller, and D'nardo Colucci (2001). High-Performance Wide-Area Optical Tracking: The HiBall Tracking System, Presence: Teleoperators and Virtual Environments 10(1). (PDF)
Welch, Greg, Gary Bishop, Leandra Vicci, Stephen Brumback, Kurtis Keller, D'nardo Colucci. 1999. "The HiBall Tracker: High-Performance Wide-Area Tracking for Virtual and Augmented Environments," Proceedings of the ACM Symposium on Virtual Reality Software and Technology 1999 (VRST 99), University College London, December 20-22, 1999. Best paper designation. (PDF)
Welch, Greg and Gary Bishop. 1997. "SCAAT: Incremental Tracking with Incomplete Information," SIGGRAPH 97 Conference Proceedings, Annual Conference Series. ACM SIGGRAPH, August 1997, Los Angeles, CA. (PDF)
Azuma, Ronald, and Gary Bishop. "A Frequency-Domain Analysis of Head-Motion Prediction," Proceedings of SIGGRAPH'95 (Los Angeles, California, August 5-11, 1995). In Computer Graphics Proceedings, Annual Conference Series, 1995, ACM SIGGRAPH, 401-408.
Chi, Vernon L., "Noise Model and Performance Analysis Of Outward-looking Optical Trackers Using Lateral Effect Photo Diodes," TR95-012, Department of Computer Science, UNC at Chapel Hill, April 1995 (tar.gz)
Azuma, Ronald. "Predictive Tracking for Augmented Reality," Ph.D. Dissertation, University of North Carolina at Chapel Hill. Computer Science technical report TR#95-007, February 1995.
Gottschalk, Stefan and Vernon L. Chi, "Sensitivity of System Accuracy to Fabrication Tolerances in an Outward-looking Tracker," TR94-055, Department of Computer Science, UNC at Chapel Hill, October 1994. (ps.Z)
Gottschalk, Stefan, and John F. Hughes. "Autocalibration for Virtual Environments Tracking Hardware," Proceedings of SIGGRAPH'93 (Anaheim, California, August 1-6, 1993). In Computer Graphics Proceedings, Annual Conference Series, 1993, ACM SIGGRAPH, New York, 1993, 65-72.
Azuma, Ronald. "Tracking Requirements for Augmented Reality," Communications of the ACM 36, no. 7 (July 1993), 50-51. (Text, Images)
Azuma, Ronald and Mark Ward. "Space-Resection by Collinearity: Mathematics Behind the Optical Ceiling Head-Tracker," UNC Chapel Hill Department of Computer Science technical report TR 91-048 (November 1991), 23 pages. (ps.Z)
Wang, Jih-fang, Vernon Chi, Henry Fuchs, "A Real-time Optical 3D Tracker for Head-mounted Display Systems," Proc. 1990 Symposium on Interactive 3D Graphics, Snowbird, UT, 25-28 March 1990; in Computer Graphics, Vol. 24, No. 2, March, 1990, pp. 205-215.
Bishop, Gary and Henry Fuchs. 1984. "The Self-Tracker: A Smart Optical Sensor on Silicon," Proceedings, Conference on Advanced Research in VLSI at MIT (January 23-25, 1984), Artech House.
Office of Naval Research, contract no. N00014-08-C-0349, "Behavior Analysis and Synthesis for Intelligent Training (BASE-IT)," February 2008-September 2010.
Office of Naval Research, contract no. N00014-01-1-0064, "Technology for Full-Body Tracking," October 2000-September 2001, under Dr. Lawrence J. Rosenblum, Program Officer for Visualization and Computer Graphics at ONR.
National Science Foundation Cooperative Agreement no. ASC-8920219 and Advanced Research Projects Agency: "Science and Technology Center for Computer Graphics and Scientific Visualization" (R. Riesenfeld, Center Director, PIs at the 5 participating institutions are: A. van Dam, Brown Univ.; A. Barr, California Inst. of Technology; D. Greenberg, Cornell Univ.; H. Fuchs, Univ. of North Carolina at Chapel Hill; R. Riesenfeld, Univ. of Utah).