Hey there, I'm a student at Art Center College of Design, and we've begun filming in 3D (stereoscopic) with polarized viewing. I'm doing some research on stereoscopic 3D in order to refine our workflow, from shooting, to editing, to displaying the 3D footage.

My first question concerns the projection/display of the 3D footage. Right now we're using a Windows-based program called Stereoscopic Player. Simply put, Stereoscopic Player simultaneously plays two video files on two different displays or projectors. Is there a Mac-based alternative for simultaneously playing two video files (or perhaps live video streamed from the two cameras) on two different displays/projectors? I know there are scientific applications for stereoscopic viewing of molecular structures, but I don't know how they work, and I doubt they'll work for viewing video footage. I'm also looking into Houdini Apprentice, which seems to have some sort of stereoscopic integration, but I haven't tried it yet. Anyone familiar with any of these applications wanna pitch in?

As for editing, I was wondering if anyone knew of any Final Cut Pro or Avid plugins or techniques for live previewing of stereo footage. Currently, in order to preview our footage, we have to individually export the two video tracks, boot into Winblows, and then play them through Stereoscopic Player. Kind of a hassle.

Lastly, I was wondering if anyone knew of a video splitter that would let us take the live feed from a single camera and display it on multiple displays/projectors. Ideally we'd need 3 output channels per splitter (one splitter for the left camera, the other for the right): one for the small on-board stereo display, another for the larger stereo display, and the last for the stereo projection system.

Any help or input is appreciated. Thanks!
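For the dual-display playback question above, one low-tech Mac/Linux possibility worth mentioning is launching two fullscreen instances of the open-source mpv player, one per display. This is only a sketch under assumptions: the filenames are placeholders, the screen indices 0/1 depend on your setup, and two independent player processes are not frame-locked, so it's a rough preview rather than a replacement for a real sync-playback tool like Stereoscopic Player. The script just builds and prints the two command lines; the actual launch is shown commented out.

```shell
# Hypothetical sketch: play left-eye/right-eye files fullscreen on two
# displays using mpv (e.g. `brew install mpv`). Not a frame-accurate
# sync solution -- the two processes run independently.
LEFT=left_eye.mov      # placeholder filename
RIGHT=right_eye.mov    # placeholder filename

# --fs          : fullscreen
# --fs-screen=N : which display to go fullscreen on (check your indices)
# --no-audio    : mute one instance so the soundtrack isn't doubled
CMD_LEFT="mpv --fs --fs-screen=0 --no-audio $LEFT"
CMD_RIGHT="mpv --fs --fs-screen=1 $RIGHT"

# Print the commands rather than launching video windows here:
echo "$CMD_LEFT"
echo "$CMD_RIGHT"

# To actually run, start both in the background, near-simultaneously:
# $CMD_LEFT & $CMD_RIGHT &
```

The same two-instance idea should work with any scriptable player that can be pinned to a specific display; mpv is just a convenient example because the screen can be chosen from the command line.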