We demonstrate a novel method for integrating stereo content of a live act into a virtual reality (VR) environment. This enables a new level of immersion when watching the recording of a stage event through a head-mounted display. In contrast to existing solutions, we offer proper depth perception, allow for low-complexity recording and distribution, and still avoid motion sickness caused by objects close to the observer. To this end, we separate the digitization of the event location from the recording of the actor’s performance. As a main contribution of this paper, we show how to achieve high visual quality when compositing the stereo recording of the actors with the reconstructed 3D model of the event location. To do so, we have created a tool that allows us to simulate the 3D perception of the