To date, we have primarily focused on the problem of supporting videoconferencing with digital audio and video. In this project we will initiate next-generation research into network and operating system support for distributed shared real-time objects. There are three aspects to this work. First, we will go beyond the now mundane problem of supporting ``talking heads in a window'' to consider the shared virtual environment (VE) and audio/video objects described in the two projects above. The goal here is to support high-bandwidth, non-textual multi-user interfaces across wide-area networks. Second, we will investigate adaptive congestion control algorithms that couple collaborative working sessions to the network, ensuring ``maximally interactive'' collaborations given the level of congestion in the network connecting collaborators. The key idea is to ``tune'' the level of interactivity in a collaborative session to the level that current network conditions can sustain. Finally, we will investigate the application of the Internet Engineering Task Force's (IETF) integrated services model of network communications to collaborative environments. The goal here is to understand how to map application-level performance requirements for collaborative applications to quality-of-service levels supported by the network.
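To make the ``tuning'' idea concrete, the following is a minimal sketch, not the project's actual algorithm: a periodic adaptation loop that maps simple path measurements (round-trip time and loss rate) to the most demanding interactivity level the network appears able to sustain. All of the names, levels, and thresholds here are illustrative assumptions.

    import time

    # Hypothetical interactivity levels, from least to most demanding:
    # each level is (state-update rate in Hz, media configuration).
    INTERACTIVITY_LEVELS = [
        (2,  "low-rate updates, audio only"),
        (10, "moderate updates, audio + low-rate video"),
        (30, "full-rate updates, audio + full-motion video"),
    ]

    def estimate_congestion(rtt_ms, loss_rate):
        """Collapse path measurements into a 0..2 congestion index.
        The thresholds are placeholders, not measured values."""
        if loss_rate > 0.05 or rtt_ms > 400:
            return 2          # heavily congested
        if loss_rate > 0.01 or rtt_ms > 150:
            return 1          # moderately congested
        return 0              # lightly loaded

    def choose_level(rtt_ms, loss_rate):
        """Pick the most interactive level current conditions can sustain."""
        congestion = estimate_congestion(rtt_ms, loss_rate)
        # More congestion -> back off to a less demanding level.
        return INTERACTIVITY_LEVELS[len(INTERACTIVITY_LEVELS) - 1 - congestion]

    def adaptation_loop(measure, apply_level, period_s=1.0):
        """Periodically re-tune the session. `measure` returns
        (rtt_ms, loss_rate) and `apply_level` reconfigures the session;
        both are supplied by the caller."""
        while True:
            apply_level(choose_level(*measure()))
            time.sleep(period_s)

In practice the research question is precisely what such a loop glosses over: how congestion should be estimated, how aggressively to adapt, and how the application should degrade gracefully across levels.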
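Similarly, the third aspect can be pictured as a mapping from application requirements onto the service classes defined by the integrated services model (guaranteed, controlled-load, best-effort). The sketch below is purely illustrative; the requirement fields and cut-offs are assumptions, not part of this project.

    from dataclasses import dataclass

    @dataclass
    class AppRequirements:
        max_latency_ms: float      # end-to-end delay the interaction tolerates
        min_bandwidth_kbps: float  # sustained rate the media stream needs
        loss_tolerant: bool        # can the application conceal occasional loss?

    def select_service_class(req: AppRequirements) -> str:
        """Map requirements to an IntServ class; cut-offs are placeholders."""
        if req.max_latency_ms <= 100 and not req.loss_tolerant:
            # Hard delay bound needed, e.g. shared-VE state updates.
            return "guaranteed"
        if req.min_bandwidth_kbps >= 64:
            # Needs the network to behave as if unloaded, e.g. audio/video.
            return "controlled-load"
        return "best-effort"       # background or loss-tolerant traffic

    # Example: a shared-VE update stream vs. a loss-tolerant background stream.
    print(select_service_class(AppRequirements(50, 128, False)))   # guaranteed
    print(select_service_class(AppRequirements(500, 16, True)))    # best-effort

The open problem is deciding what the requirement vocabulary should be for collaborative applications and how such mappings should be negotiated with the network, rather than hard-coded as above.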
This project will give you experience with research in operating systems, networks, multimedia, and virtual environments. For more information, contact Kevin Jeffay or Prasun Dewan.