I am currently working in Prof. Henry Fuchs' research group, as part of the "Enhancing Human Capabilities through Virtual Personal Embodied Assistants in Self-Contained Eyeglasses-Based Augmented Reality (AR) Systems" project. We are envisioning a telepresence system available anywhere, anytime, using egocentric reconstruction, which enables users to reconstruct their environment and body using only eyeglasses-mounted cameras and a few body-worn inertial sensors.
I continued my thesis research as a postdoctoral research associate. I set up an indoor environment instrumented with 8 PTZ cameras and ran experiments with the participation of USMC cadets, demonstrating that my real-time camera control method could scale up in the number of cameras and targets.
My thesis research was conducted as part of the "Behavior Analysis and Synthesis for Intelligent Training" (BASE-IT) project. I worked on real-time control of Pan-Tilt-Zoom cameras for 3D reconstruction. I performed a complexity analysis of two camera control spaces, developed strategies aimed at tractable control, proposed a novel objective function and implemented a specific method for online, real-time camera control.
From Fall 2008 to Summer 2009, I worked in the "Avatar" project, under the guidance of professors Greg Welch and Henry Fuchs. We built upon the research on the Shader Lamps projection technique and used cameras and projectors to capture and map the dynamic motion and appearance of a real person onto a humanoid animatronic model. The other students working in this project were Andrew Nashel and Peter Lincoln.
The main focus of my work for the "Behavior Analysis and Synthesis for Intelligent Training" (BASE-IT) project was on real-time control of Pan-Tilt-Zoom cameras for optimal image quality, aiming to best accomplish high-level tasks such as 3D reconstruction. My approach combined global optimization with local scheduling. To demonstrate my results, I extended the functionality of the ObjectVideo Virtual Video Tool to allow movement of targets under program control using a client-server paradigm.
Starting in Fall 2007, I worked in the "Behavior Analysis and Synthesis for Intelligent Training" (BASE-IT) project, under the guidance of professor Greg Welch. Our collaborators were the Naval Postgraduate School and the Sarnoff Corporation. I worked on real-time scheduling of Pan-Tilt-Zoom cameras for optimal image quality, aiming to best accomplish high-level tasks such as 3D reconstruction. The other student working in this project was Marc Macenko.
From Fall 2006 to Summer 2007, I worked in the "Prototype for Two-station, Four-Person, Proper Eye-Gaze Telepresence System" project. We built upon the research in the Group Tele-Immersion project to provide a prototype of a two-station, four-person telepresence system. We used multiple cameras and autostereo displays that turn specially interleaved images into a 3D image. The other students working in this project were Andrew Nashel and Peter Lincoln.
In 2005 I continued working in the "3D Telepresence for Medical Consultation: Extending Medical Expertise Throughout, Between, and Beyond Hospitals" project. I continued improving my Camera Calibrator by making a common camera interface for dynamically-loaded camera libraries, so that users can extend the program's functionality by writing their own camera libraries.
I also worked on the networking architecture of our system. I wrote a flexible, multi-threaded camera server, and corresponding clients for both the reconstruction application and my Camera Calibrator.
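The client-server flow can be illustrated with a minimal, self-contained sketch. Everything below (the one-line protocol, the names, the placeholder frame payload) is invented for illustration; the real server drove actual cameras and served the reconstruction application and the Camera Calibrator concurrently:

```python
import socket
import socketserver
import threading

FRAME = b"\x00" * 16  # placeholder for one encoded camera frame

def recv_exact(sock, n):
    """Read exactly n bytes from a socket."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed early")
        buf += chunk
    return buf

class FrameHandler(socketserver.StreamRequestHandler):
    """One thread per client; each 'GET' line returns a length-prefixed frame."""
    def handle(self):
        for line in self.rfile:
            if line.strip() == b"GET":
                self.wfile.write(len(FRAME).to_bytes(4, "big") + FRAME)

class FrameServer(socketserver.ThreadingTCPServer):
    allow_reuse_address = True

# Serve on an ephemeral port in a background thread, then act as a client:
server = FrameServer(("127.0.0.1", 0), FrameHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
with socket.create_connection(server.server_address) as sock:
    sock.sendall(b"GET\n")
    size = int.from_bytes(recv_exact(sock, 4), "big")
    frame = recv_exact(sock, size)
server.shutdown()
server.server_close()
```

The threading server gives each client its own handler thread, which is the property that lets several consumers pull frames at once.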
In Fall 2004, I started working in the "3D Telepresence for Medical Consultation: Extending Medical Expertise Throughout, Between, and Beyond Hospitals" project. I continued looking into color calibration, and incorporated geometric calibration from the Open Source Computer Vision library as well as enhanced corner detection from Vladimir Vezhnevets.
The photometric calibration results are available as a tech report, and appeared in the proceedings of the ICCV 2005 conference. The software itself is described on this page, and available upon request.
In Fall 2003, continuing my work in the Electronic Books for the Tele-Immersion Age project, I started looking into color calibration as a way to improve our 3D reconstruction.
I first looked at a pure image-based approach, in which captured images are processed before reconstruction to make their colors more similar. In Spring 2004 I switched to a hardware-based approach that uses the camera registers to make sure images are similar before they are captured.
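A crude illustration of the image-based idea (a stand-in, not the method I actually used) is to linearly map each color channel of a source image so that its mean and standard deviation match those of a reference image:

```python
import numpy as np

def match_color_stats(src, ref):
    """Linearly map each channel of `src` so its mean and standard
    deviation match those of `ref`. This is a toy stand-in for
    image-based color calibration; real methods fit richer models."""
    src = src.astype(np.float64)
    ref = ref.astype(np.float64)
    out = np.empty_like(src)
    for c in range(src.shape[-1]):
        s_mu, s_sd = src[..., c].mean(), src[..., c].std()
        r_mu, r_sd = ref[..., c].mean(), ref[..., c].std()
        scale = r_sd / s_sd if s_sd > 0 else 1.0
        out[..., c] = (src[..., c] - s_mu) * scale + r_mu
    return np.clip(out, 0, 255).astype(np.uint8)
```

The hardware-based approach achieves a similar effect before capture, by adjusting per-channel gain and offset registers in the cameras instead of post-processing pixels.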
Starting Fall 2003, as part of my work in the Electronic Books for the Tele-Immersion Age project, I worked on improving the 3D reconstruction approach by Ruigang Yang.
We reconstructed several dynamic sequences showing the tying of several knots and a pretend surgery, and processed the results to make them available on the web. The results are available on this page, prepared with Andrei State.
In Fall 2002, I continued my work in the Electronic Books for the Tele-Immersion Age project, under the guidance of professor Greg Welch. The other students working in this project were Haolong Ma and Ruigang Yang.
We worked with our collaborators at Brown University and Dr. Bruce Cairns at the UNC School of Medicine on various aspects of the system, ranging from 3D reconstruction to visualization.
In May 2002, I started working in the Electronic Books for the Tele-Immersion Age project, under the guidance of professors Greg Welch, Anselmo Lastra and Henry Fuchs. The other student working in this project was Kok-Lim Low.
In Summer 2001, I worked in the Being There project, under the guidance of professors Henry Fuchs, Anselmo Lastra and Greg Welch. The other student working in this project was Kok-Lim Low.
My duties included porting the Evans (our SGI Reality Monster) version of our renderers to a cluster of PCs, and my interests included occlusion culling, shadow removal and new calibration methods for the projectors.
From Fall 2000 to Spring 2001, I worked with professor David Stotts, in the EPA Modelling project. My duties included integrating and working with the GRASS GIS, the PostgreSQL DBMS and the R programming language.
In December 2002, I had a short internship at Mitsubishi Electric Research Labs, with former UNC graduate Ramesh Raskar. The other student working in this project was Jingyi Yu.
We worked on enhancing dark images by using information from other images taken from the same viewpoint. For details, see the project page. The results were published in the ACCV 2004 and NPAR 2004 conferences.
In Fall 2003, I took COMP 256 (Computer Vision) with professor Marc Pollefeys. For the first homework, which dealt with photometric stereo, I used an implementation of the multigrid 2D integration algorithm from "Numerical Recipes in C++". See the homework report here, or download the integration program here.
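For intuition about what the integration step does: the surface gradients recovered by photometric stereo can be integrated naively by cumulative sums, as in the sketch below, though this is far less robust to noisy, inconsistent gradients than a multigrid Poisson solver:

```python
import numpy as np

def integrate_gradients(p, q):
    """Recover a height field z from gradients p = dz/dx and q = dz/dy
    by naive path integration: sum p along the top row, then sum q down
    each column. Assumes backward differences and z[0, 0] = 0."""
    h, w = p.shape
    z = np.zeros((h, w))
    z[0, 1:] = np.cumsum(p[0, 1:])          # integrate along the top row
    z[1:, :] = np.cumsum(q[1:, :], axis=0)  # then down each column
    z[1:, :] += z[0, :]                     # add the row offsets
    return z
```

Because this sketch commits to one integration path, any noise along that path accumulates; a Poisson solver instead finds the surface whose gradients best fit p and q everywhere at once.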
For the second homework in COMP 256, which dealt with epipolar geometry and stereo, I wrote a little program that allows easy setting of point correspondences between two images. Its features also include the ability to zoom in and out, add and delete correspondences, load and save them as a text file, as well as draw the epipolar lines if the locations of the epipoles are known. See the homework report here, or download the program here. The source code is available upon request.
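Computing an epipolar line is a one-liner once the fundamental matrix F is known: a point x in the first image maps to the line l' = F x (in homogeneous coordinates) in the second image. A small hypothetical helper, not the homework code itself:

```python
import numpy as np

def epipolar_line(F, x):
    """Given a fundamental matrix F and a point x = (u, v) in the first
    image, return coefficients (a, b, c) of its epipolar line
    a*u' + b*v' + c = 0 in the second image, via l' = F x."""
    l = F @ np.array([x[0], x[1], 1.0])
    return l / np.linalg.norm(l[:2])  # normalize so (a, b) is a unit normal
```

For a rectified stereo pair, F reduces to the canonical form [[0, 0, 0], [0, 0, -1], [0, 1, 0]] and every epipolar line is the horizontal scanline through the point.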
In Fall 2002, I took COMP 238 (Advanced Image Generation) with professor Anselmo Lastra. For my final project, I used the code base from NPR Quake and implemented additional rendering techniques to achieve a cartoon-like rendering. See the project report here.
In Spring 2002, I took COMP 236 (Computer Graphics) with professor Gary Bishop. For my final project, I built an image mosaic generating application. I harvested images from the web, built an image library, then wrote a program that takes an image and constructs a mosaic image with the same look using the images in the library. See the project report here.
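The core matching step can be sketched in a few lines. This toy version (a stand-in for the actual program, which worked with a much larger library) replaces each block of the input with the library tile whose mean color is closest:

```python
import numpy as np

def build_mosaic(image, tiles, cell=8):
    """Replace each cell x cell block of `image` with the tile from
    `tiles` (a list of cell x cell x 3 uint8 arrays) whose mean RGB
    color is closest to the block's mean color."""
    tile_means = np.array([t.reshape(-1, 3).mean(axis=0) for t in tiles])
    h = (image.shape[0] // cell) * cell
    w = (image.shape[1] // cell) * cell
    out = np.empty((h, w, 3), dtype=np.uint8)
    for i in range(0, h, cell):
        for j in range(0, w, cell):
            block_mean = image[i:i+cell, j:j+cell].reshape(-1, 3).mean(axis=0)
            best = np.argmin(((tile_means - block_mean) ** 2).sum(axis=1))
            out[i:i+cell, j:j+cell] = tiles[best]
    return out
```

Richer matchers compare per-pixel differences rather than means, at the cost of scanning the whole library for every block.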
From 2013 to 2016 I served as Software Consultant for KindHeart, Inc. I developed the software for a fluoroscopy simulator that used a camera to image transparent acrylic structures that simulate specific parts of the human anatomy, resulting in a training experience that did not involve radiation. The simulator was awarded a patent in 2014: Richard Feins, Hadley Wilson, Adrian Ilie, "Radiation-free simulator system and method for simulating medical procedures".
From 2010 to 2015 I served as Software Designer in a theater project. I helped stage The Uncanny Valley, "a play for two humans and one robot" that played in Chapel Hill, NC and Brooklyn, NY. We used a custom version of the RoboThespian robot. We captured the actor's movements, likeness and voice, then broke them down into sequences that could be played back on-demand by the robot when facing the actor during the play.
As part of my work on calibrating cameras, I used checkerboard patterns a lot. Later on, I also needed patterns of a specific size, to ensure that scale was preserved when camera images were shown on a specific display. Knowing the exact size of the squares was always a pain, and making a pattern of a specific size was tedious, so I wrote a program in Turbo C++ to easily make and print checkerboard patterns of a specified size. The program can be downloaded from here.
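The idea is simple to sketch; here is a minimal version in Python (the original program was written in Turbo C++). Printing the result at a known DPI yields squares of a known physical size:

```python
import numpy as np

def checkerboard(rows, cols, square_px):
    """Return a (rows*square_px) x (cols*square_px) uint8 image of a
    checkerboard whose squares are exactly square_px pixels on a side.
    E.g. square_px = 150 printed at 300 dpi gives half-inch squares."""
    r, c = np.indices((rows, cols))
    board = ((r + c) % 2).astype(np.uint8) * 255  # alternating 0 / 255 squares
    # Expand each board entry into a square_px x square_px block of pixels:
    return np.kron(board, np.ones((square_px, square_px), dtype=np.uint8))
```

Saving the array as an image and printing it at the chosen DPI is all that remains; the key point is that the square size is exact by construction instead of measured after the fact.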
I served as one of the Online Activities Chairs for the 2006 IEEE International Workshop on Projector-Camera Systems. I made several changes to the MyReview paper submission and review system, to make it suit our needs and incorporate an upload progress bar. The author of MyReview has incorporated the changes in version 1.9.3. I also helped with the 2007 IEEE International Workshop on Projector-Camera Systems, where we used an improved version of the same system.
In December 2006 and January 2007, I designed a Microsoft Access database for the document disposition and retention schedules managed by the UNC Manuscripts Department at The Wilson Library, where my wife worked as a Research Assistant.
I also wrote a front-end in Turbo C++, the then-new free C++ programming tool from Borland. The application allows editing the schedules and outputs them as formatted HTML or Word documents.
In January 2004, I designed a Microsoft Access database for the Inter-Faith Council for Social Service in Chapel Hill.
One year later, I wrote an easier-to-use front-end in Borland Delphi 3, using freeware components from Kiril Antonov and ProfGrid.Com.
Last modified Tuesday, September 14, 2021 by Adrian Ilie