
UniEdit

macrumors newbie
Original poster
Jun 13, 2013
I am sure there is a good reason for this, but I don't understand it. I am making an app that does a lot of 3D rendering, and I wanted to test my multi-screen windowing code. The only way I can get my hands on another display is to use Mavericks and AirPlay. The code works fine and I get two fully independent windows, but the system switches from my main GPU renderer (3.3 ATI-1.12.42 with an AMD Radeon HD 6750M 512 MB) to a software renderer. I thought this was strange, but I suspected it was because AirPlay encoding is done on the CPU, and it would be tough to get good FPS if the GPU had to keep sending frames back to the CPU. This is not what I want, but it would be acceptable - if the software renderer actually produced similar results. Things like texture filtering and antialiasing don't seem to work with any of the software renderers.
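For reference, here is a minimal way to confirm which renderer the context resolves to (plain C and CGL here just to keep the example self-contained; my real code is more involved):

[CODE]
#include <stdio.h>
#include <OpenGL/OpenGL.h>   // CGL
#include <OpenGL/gl3.h>      // glGetString, GL_RENDERER

int main(void) {
    // Ask for a 3.2 core profile context, same as my app does.
    CGLPixelFormatAttribute attrs[] = {
        kCGLPFAOpenGLProfile, (CGLPixelFormatAttribute)kCGLOGLPVersion_3_2_Core,
        kCGLPFADoubleBuffer,
        (CGLPixelFormatAttribute)0
    };
    CGLPixelFormatObj pix = NULL;
    GLint npix = 0;
    if (CGLChoosePixelFormat(attrs, &pix, &npix) != kCGLNoError || pix == NULL)
        return 1;

    CGLContextObj ctx = NULL;
    CGLCreateContext(pix, NULL, &ctx);
    CGLSetCurrentContext(ctx);

    // Normally this prints something like "AMD Radeon HD 6750M OpenGL Engine";
    // in my case it reports a software renderer once the context is driving
    // the AirPlay display.
    printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));

    CGLSetCurrentContext(NULL);
    CGLDestroyContext(ctx);
    CGLReleasePixelFormat(pix);
    return 0;
}
[/CODE]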

I then tested it in Lion and Mountain Lion and discovered that it is the same: I am forced onto a slow software renderer whenever I AirPlay anything, which makes everything run awfully slowly - especially when I need to use multiple framebuffers. After some lengthy and very nerdy calculations, I realized that GPU rendering, even over a worse memory bus than the one I have, would be significantly faster and smoother. Is there any way to force AirPlay to use my main GPU (or one GPU per screen) instead of the software renderer?
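The closest thing I have found so far is to ask CGL for an accelerated pixel format and forbid the software fallback outright, with kCGLPFAAccelerated plus kCGLPFANoRecovery (NSOpenGLPFAAccelerated and NSOpenGLPFANoRecovery are the NSOpenGLPixelFormat equivalents). I suspect that over AirPlay this just fails to return a pixel format rather than forcing the GPU, but it at least tells you whether the OS has any hardware path for the display. Rough sketch, with a made-up helper name:

[CODE]
#include <OpenGL/OpenGL.h>

// Returns a pixel format that is guaranteed to be hardware accelerated,
// or NULL if the system offers no GPU renderer for the current display
// configuration (which seems to be the case with AirPlay).
static CGLPixelFormatObj accelerated_only_pixel_format(void) {
    CGLPixelFormatAttribute attrs[] = {
        kCGLPFAOpenGLProfile, (CGLPixelFormatAttribute)kCGLOGLPVersion_3_2_Core,
        kCGLPFAAccelerated,   // consider hardware renderers only
        kCGLPFANoRecovery,    // never fall back to the software renderer
        kCGLPFADoubleBuffer,
        (CGLPixelFormatAttribute)0
    };
    CGLPixelFormatObj pix = NULL;
    GLint npix = 0;
    if (CGLChoosePixelFormat(attrs, &pix, &npix) != kCGLNoError || npix == 0)
        return NULL;
    return pix;
}
[/CODE]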
 
I think you're posting in the wrong section.

In either case, AirPlay isn't really meant to output high-quality graphics, mainly because of the limited Ethernet/Wi-Fi bandwidth available to carry the video feed.
 