No, if the GPU maker optimizes OpenGL, the same does not need to be done in every application that programs directly against a low-level API. They get paid for that.

That is not how it works. The driver maker can't really "optimise" OpenGL. What the driver maker can do is optimise for your particular application (which, again, is something that you should be doing yourself), if they see a purpose in it. The problem with OpenGL is that the interface it presents to the application does not really match how the hardware works, so the driver has to be very conservative in translating your OpenGL calls into work the GPU actually executes.

Another massive problem is that with OpenGL you have no guarantees about whether something is going to be fast or slow. Just changing a trivial bit of state (e.g. a blend equation) can trigger a shader recompile and a pipeline flush on some hardware, with disastrous consequences for performance. And again, there is not much a driver can do here, since the driver does not know what you are going to do next. Will you draw many objects or few? When will you use this texture you just loaded? How often will you write to this buffer? Do you still need the contents of the screen image after you've drawn to it, or can it be safely discarded?

Modern APIs are designed so that many of these questions don't even arise. That lets you express your intent more clearly, makes drivers simpler and more robust, and in the end allows one to write more efficient and better-performing software.

If you have ever programmed with Metal, you'd know that it's much easier to work with than OpenGL, and you really need to go out of your way to shoot yourself in the foot. With OpenGL, you need to go out of your way NOT to shoot yourself in the foot.
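
To make that concrete, here's a minimal Metal sketch (my own illustration, shader names are made up) of why the blend-equation problem above doesn't exist in Metal: blending is declared up front as part of the pipeline object, so switching blend modes means picking a different precompiled pipeline rather than risking a hidden recompile mid-frame.

```swift
import Metal

// Minimal sketch: assumes a default device and a compiled shader library.
let device = MTLCreateSystemDefaultDevice()!
let library = device.makeDefaultLibrary()!

let descriptor = MTLRenderPipelineDescriptor()
descriptor.vertexFunction = library.makeFunction(name: "vertexMain")     // hypothetical shader
descriptor.fragmentFunction = library.makeFunction(name: "fragmentMain") // hypothetical shader
descriptor.colorAttachments[0].pixelFormat = .bgra8Unorm

// The blend state is part of the pipeline, declared before any drawing happens.
descriptor.colorAttachments[0].isBlendingEnabled = true
descriptor.colorAttachments[0].rgbBlendOperation = .add
descriptor.colorAttachments[0].sourceRGBBlendFactor = .sourceAlpha
descriptor.colorAttachments[0].destinationRGBBlendFactor = .oneMinusSourceAlpha

// Any expensive compilation/validation happens here, once, at creation time.
let alphaBlendPipeline = try device.makeRenderPipelineState(descriptor: descriptor)
```

Want a different blend equation? You build a second pipeline object next to this one and switch between them at draw time; nothing gets silently recompiled behind your back.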
 
I am not interested in learning CUDA, Metal or Direct3D. They are not portable.
 
Dropping OpenGL will be terminal for Mac gaming until developers adopt Vulkan (in about 2-4 years). Even then, if Apple doesn't play nice with Valve, they might drop MoltenVK, which would be terminal for Mac gaming, period.
Go over to AppleInsider. The Apple buttlickers there will explain to you that the Mac is not for gaming anyway. You know, because Apple.
 
In reality, what you need for gaming is pretty much the same as what you need for what's considered pro use, so I don't see why Macs have to be bad for gaming.
 
If I had to guess, I'd bet Apple is super giddy about the idea of eGPUs. It means they can build prosumer products that get a boost from existing computer-graphics hardware. I think the next iteration of Thunderbolt will solve the eGPU question completely.
Why would you need the next iteration of Thunderbolt? What you need is apps that can make use of an eGPU, because right now most of them don't use the eGPU even when one is connected.
 
Actually, what I'm hoping for is Thunderbolt 4 with enough bandwidth to drive a 5K monitor and a GPU at once.
Imagine a new Thunderbolt Display with the option of a GPU inside. You'd take your MacBook Pro, which is quite trashy on its own, plug in one cable, and be connected to a monitor that gives you workstation performance (on the GPU side, at least) while even charging your computer. That would be simply awesome, in my opinion.

With such a thing it might become an option for some pro users, and more users would mean more programs supporting it in consequence.
 
You can do all of that right now. Just connect the monitor to the eGPU, and yes, the eGPU can charge the computer too.
 
Yeah, you can daisy-chain them. But I also think greater bandwidth and lower latency would let Macs use all or more of the horses in the eGPUs. Right now I think they are limited in their ability to max out the eGPUs, and I think the limiting factor is not the CPU, it's the fact that Thunderbolt doesn't reach the speeds of an equivalent PCIe setup.

I think Metal Performance Shaders will also shine once you can daisy-chain eGPUs together.

Maybe Apple is also dropping OpenCL because it doesn't see that much use? But I thought OpenCL is how developers can spin up threads from a thread pool... Maybe that's just for running code on a graphics card...

Either way, I bet the future is to put a pretty decent CPU in a laptop and optimize for that thermal envelope, then put the hot-running GPUs in an external enclosure... Linus Tech Tips even showed that PCIe extenders don't really impact performance even when they go out to about 5 feet.

More eGPU! They should also support Nvidia on eGPUs, in fairness to all of us. Then I could scrap my gaming PC and easily put its GTX 1060 in an eGPU.
Check out egpu.io; they know more about this stuff than the folks here. All I will say is: get an eGPU now if you'd like one, they already work pretty darn well.
 
You can do all of that right now. Just connect the monitor to the eGPU, and yes, the eGPU can charge the computer too.
Yes, but it would look way nicer than having a monitor with a little PC beside it. Just think about how much they could fit into those iMac Pros. Now imagine how great and thin this thing would look while still having a good GPU inside. Also, if Apple made this monitor, you could be sure it would be integrated into macOS with a nice and smooth plug-and-play process.
And are you sure you can run a 5K monitor and a GPU at the same time over one single cable? As far as I know, TB3 is only capable of 5K thanks to a tweak from Apple (it doesn't work on Windows laptops); I don't think the cable can take much more than that already.
 
5K is impossible through old DisplayPort; with USB-C you can have multiple 5K monitors daisy-chained. But the better option with an eGPU would be:

Mac -> connected to -> eGPU

that eGPU -> connected to -> multiple displays

same eGPU -> connected to -> power

Makes sense? That way it doesn't matter what USB-C supports; it's the eGPU that transfers the image. You could use an 8K monitor even if USB-C doesn't support it. The eGPU can be inside the monitor or wherever you want it.
 
You do know that USB-C is not what you mean... the cable is USB-C, but it's Thunderbolt 3 that makes a 5K screen possible.
"Old DisplayPort", as you call it, or what you really meant, is Thunderbolt 2.
They just share the same connector; it's not the same thing, and not every USB-C port is a Thunderbolt 3 port, just as not every Mini DisplayPort is Thunderbolt 2.
 
Ok, you made me go google. "Thunderbolt combines PCI Express (PCIe) and DisplayPort (DP) (specifically DisplayPort 1.2, same as the one used before Thunderbolt was introduced) into two serial signals" — so what that means is that Thunderbolt 2 just has Mini DisplayPort inside it, just as USB-C has Thunderbolt 3, HDMI, DisplayPort and other things inside it. So you don't have to use TB3 to transfer video; you could use HDMI, for example. But with an eGPU you aren't transferring video anyway: you transfer data for the eGPU to generate the video, which it then outputs through its own ports.

But if you aren't being an ass, all of these things are interchangeable in a conversation, ofc.
 
No, that's just wrong. The connector on a smartphone is also USB-C, but it still isn't Thunderbolt 3. Also, on the MacBook Pro 15" you have 4 USB-C connectors, but (not completely sure) only 2 of them are Thunderbolt 3.
 
You do know that USB-C is not what you mean... the cable is USB-C, but it's Thunderbolt 3 that makes a 5K screen possible.
"Old DisplayPort", as you call it, or what you really meant, is Thunderbolt 2.
They just share the same connector; it's not the same thing, and not every USB-C port is a Thunderbolt 3 port, just as not every Mini DisplayPort is Thunderbolt 2.
He's referring to the old DisplayPort protocol. I think it was 1.2 (don't quote me on that) at the time the 5K iMac came out, which is why Apple had to design their own timing chip. Dell had a 5K display at the time, and it required two DisplayPort cables because the protocol simply wasn't built for that much bandwidth; Dell engineered an inelegant but functional design.
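
Some rough back-of-the-envelope numbers (my own arithmetic, so treat them as approximate): 5120 × 2880 pixels × 24 bits × 60 Hz is about 21.2 Gbit/s before blanking overhead, while DisplayPort 1.2 (HBR2: four lanes × 5.4 Gbit/s with 8b/10b encoding) only carries about 17.28 Gbit/s of payload. That's why 5K didn't fit down one DP 1.2 cable and needed either two cables or a custom timing controller.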
 
I'm sorry if I offended you in any way; if you want to wait a decade until Thunderbolt 4 is released before you can enjoy an eGPU, that's more than fine by me. But I never said anything about USB-C ports on smartphones supporting Thunderbolt 3. Also, all 4 ports on the MacBook Pro 15" are Thunderbolt 3, and all 4 of them are completely identical; what you meant was probably the 13", where the ports on the right have reduced PCI Express bandwidth for Thunderbolt 3. I recommend you read more about Thunderbolt and USB-C, it's quite interesting stuff.
 
Yeah, then it was the 13" model.
A better example for you would be the 12" MacBook: USB-C without Thunderbolt 3 at all.
https://www.apple.com/macbook/specs/

I don't want to wait a decade. I just said that a display with an integrated GPU could make eGPUs more popular, and in consequence developers would be more likely to optimize their programs for them. Having an eGPU that works with 20% of the programs isn't really that great.

You never said phones had Thunderbolt, but you did say USB-C generally includes Thunderbolt 3, which was the point that was wrong. But whatever... we drifted a bit off the main topic.
 
I really wanted to see Vulkan support, but all Apple did was drop OpenGL support instead.

You and me both. I was hugely impressed with Doom using Vulkan on my GTX 1080, and Wolfenstein 2 also worked really well. I'd like to see more devs shift to Vulkan rather than DirectX, and if Apple supported it, that would be much more likely. Instead, the Mac will remain the vastly inferior platform for gaming.
 
On Windows or Mac?
 
That is not how it works. The driver maker can't really "optimise" OpenGL. What the driver maker can do is optimise for your particular application (which, again, is something that you should be doing yourself), if they see a purpose in it. The problem with OpenGL is that the interface it presents to the application does not really match how the hardware works, so the driver has to be very conservative in translating your OpenGL calls into work the GPU actually executes.

Another massive problem is that with OpenGL you have no guarantees about whether something is going to be fast or slow. Just changing a trivial bit of state (e.g. a blend equation) can trigger a shader recompile and a pipeline flush on some hardware, with disastrous consequences for performance. And again, there is not much a driver can do here, since the driver does not know what you are going to do next. Will you draw many objects or few? When will you use this texture you just loaded? How often will you write to this buffer? Do you still need the contents of the screen image after you've drawn to it, or can it be safely discarded?

Modern APIs are designed so that many of these questions don't even arise. That lets you express your intent more clearly, makes drivers simpler and more robust, and in the end allows one to write more efficient and better-performing software.

If you have ever programmed with Metal, you'd know that it's much easier to work with than OpenGL, and you really need to go out of your way to shoot yourself in the foot. With OpenGL, you need to go out of your way NOT to shoot yourself in the foot.
Leman, what do you think about the new Metal 2 features in Mojave? It seems that we're getting async compute and reusable command buffers. No geometry shaders, though.

https://forums.macrumors.com/attach...1/?temp_hash=e2c553feefad2588c833d4617151c7cd

Aside from the slide they showed, I could not find a list of the new Metal features on Apple's developer website.
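
From the session video, it looks like the async compute side hinges on the new MTLEvent type in macOS 10.14. Purely as a guess at the usage (an untested sketch on my part, with the actual pipeline and encoder setup elided — none of this is from Apple's docs):

```swift
import Metal

// Speculative sketch: two queues on one device, synchronized with an MTLEvent,
// so compute work can overlap independent render work instead of serializing.
let device = MTLCreateSystemDefaultDevice()!
let computeQueue = device.makeCommandQueue()!
let renderQueue = device.makeCommandQueue()!
let event = device.makeEvent()!

// Compute pass: runs on its own queue and signals the event when finished.
let computeCB = computeQueue.makeCommandBuffer()!
let compute = computeCB.makeComputeCommandEncoder()!
// ... set a compute pipeline state and buffers here (elided) ...
compute.endEncoding()
computeCB.encodeSignalEvent(event, value: 1)
computeCB.commit()

// Render pass: only work encoded after the wait depends on the compute
// results, so anything submitted earlier is free to overlap with it.
let renderCB = renderQueue.makeCommandBuffer()!
renderCB.encodeWaitForEvent(event, value: 1)
// ... encode render passes that consume the compute output (elided) ...
renderCB.commit()
```

No idea yet whether that's also what they mean by reusable command buffers, though.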
 