Systems that provide remote viewing of three-dimensional data with interactive viewpoint control must confront two key problems: latency and bandwidth. The straightforward approach of transmitting and displaying rendered images results in a delay of one round-trip between a viewpoint change and the corresponding change in the displayed image. We avoid this delay by transmitting a representation of the scene to the user's machine, which then locally closes the viewpoint-to-display loop. If the scene representation is geometry-based, the bandwidth, user-side storage, and user-side graphics rendering capability required for updates to the scene are unbounded. We show that an image-based representation can allow for arbitrary scene changes while requiring only fixed bandwidth, storage, and rendering power. We demonstrate a system that renders images on a "rendering server" and then transmits them to the user's machine, where image warping using per-pixel disparity values compensates for system latency and synthesizes stereo images for display. We also develop enhancements to the warping technique that improve its quality and speed.
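The core client-side operation described above can be illustrated with a minimal sketch of disparity-based forward warping. This is not the paper's implementation; it is a simplified, hypothetical helper (`warp_with_disparity`, assuming NumPy-style arrays) that shifts each pixel horizontally by its per-pixel disparity, the basic mechanism by which a stereo view or a latency-compensated view can be synthesized from a single transmitted image:

```python
import numpy as np

def warp_with_disparity(image, disparity):
    """Forward-warp each pixel horizontally by its per-pixel disparity.

    Pixels with larger disparity are closer to the viewer, so they are
    painted last and occlude farther pixels (a simple painter's ordering).
    Disoccluded regions remain 0 where no source pixel lands (holes that
    a full system would need to fill).
    """
    h, w = image.shape[:2]
    out = np.zeros_like(image)
    # Process pixels in order of increasing disparity so nearer pixels
    # are written last and win any occlusion conflicts.
    order = np.argsort(disparity, axis=None, kind="stable")
    ys, xs = np.unravel_index(order, (h, w))
    new_x = xs + np.round(disparity[ys, xs]).astype(int)
    valid = (new_x >= 0) & (new_x < w)
    out[ys[valid], new_x[valid]] = image[ys[valid], xs[valid]]
    return out
```

A purely horizontal shift models the stereo-synthesis case; compensating for arbitrary viewpoint changes requires a full 3D warp, but the occlusion ordering and hole problem shown here are the same.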