Well, not much, really. And it has been going on since well before 2015. Think about it: we hit 3 GHz ten years ago and are still hanging out in that neighborhood. Apple doesn't make the processors.

As far as graphics go, yes, I am sure a high-end NVIDIA SLI rig, cooled with liquid nitrogen and backed up by a Tesla, would be considerably faster. This is a laptop. If you want a high-end gaming rig, buy one.

As others have pointed out, Radeon offers advantages in Apple's OpenCL applications like Final Cut. NVIDIA's CUDA is NVIDIA-only, while OpenCL is open to any GPU maker.

As far as higher-end cards like the 480 go, there is still the power and heat budget to consider, as much as the price point. If anyone builds a better ecosystem, I am game to switch. However, I really like what I am getting from Apple. Is it the bleeding-edge fastest? No. But it gets out of the way and lets me work most of the time. That is what I want most.

Gigahertz isn't everything. The switching current through a CPU scales as the clock frequency times the capacitance switched per transistor times the total number of transistors, and the dynamic power dissipated is that current times the supply voltage (the familiar P ≈ C·V²·f). So while we've stayed at roughly 3 GHz, we've gained a hell of a lot more transistors. This is possible through shrinking the transistors, which lowers the capacitance per transistor (and hence the charge pumped per cycle).
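As a back-of-the-envelope sketch of that scaling (every number here is made up purely for illustration, not a real process figure):

```python
def dynamic_power(n_transistors, cap_per_transistor, voltage, freq_hz, alpha=0.1):
    """Total dynamic power in watts: P = alpha * N * C * V^2 * f,
    where alpha is the fraction of transistors switching each cycle."""
    return alpha * n_transistors * cap_per_transistor * voltage**2 * freq_hz

# Illustrative "old" chip: fewer transistors, larger capacitance, higher voltage.
old = dynamic_power(n_transistors=200e6, cap_per_transistor=1e-15,
                    voltage=1.3, freq_hz=3e9)

# Illustrative "new" chip: 10x the transistors, but each switches a tenth
# of the capacitance at a lower voltage, at the same 3 GHz clock.
new = dynamic_power(n_transistors=2e9, cap_per_transistor=1e-16,
                    voltage=1.0, freq_hz=3e9)

print(f"old: {old:.1f} W, new: {new:.1f} W")  # 10x the transistors, less power
```

The point is only the shape of the formula: shrinking C and V buys you transistor count at a fixed clock without blowing the power budget.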

The performance of a CPU goes as the clock speed times the work accomplished per clock (IPC). There are other ways to improve performance than raising the clock. Compared to the 3 GHz Pentium 4s, we've got massively parallel CPUs: quad-core, 8-thread mobile parts. We've seen instructions added that greatly speed up specific tasks (AVX, for example), plus better branch prediction, better pipelining, instruction fusion, and so on. All of these features raise total performance per clock and use additional transistors to do it. Hence the focus of the last decade or so has been raising total performance largely through parallelism and execution efficiency (better utilizing the transistors). As such, we've seen total performance increase not through much of a clock-speed increase, but rather through IPC and parallelism.
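A trivial sketch of that relationship, with illustrative numbers rather than real benchmarks:

```python
def throughput_gips(clock_ghz, ipc, cores):
    """Rough aggregate throughput in billions of instructions per second."""
    return clock_ghz * ipc * cores

# A Pentium 4-style chip: 3 GHz, low IPC, one core.
p4 = throughput_gips(clock_ghz=3.0, ipc=1.0, cores=1)

# A modern mobile quad-core at the very same clock but higher IPC.
modern = throughput_gips(clock_ghz=3.0, ipc=2.5, cores=4)

print(p4, modern)  # same 3 GHz clock, ten times the aggregate throughput
```

Same clock on both sides of the comparison; all of the gain comes from IPC and core count, which is the argument above in one line of arithmetic.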

I think a lot of people would prefer it if the CPU and GPU stayed separate. However, given the huge push into heterogeneous computing, it makes sense that any CPU maker capable of making a GPU would integrate the GPU into the CPU, to pave the way for a heterogeneous compute future where massively parallel instructions are sent to the GPU part automatically and the CPU takes care of the more serialized and loopy parts of the code. Such a future is coming. And if you are going to integrate massively parallel compute logic into a CPU, you might as well add ROPs, tessellation, etc., to make it a fully functional GPU, so that you are not wasting precious silicon on parallel compute units that aren't used often. I think we'll see a lot more benefits of this heterogeneous approach very soon. I am thinking that when Zen comes out, we'll see more of this Apple-AMD partnership (OpenCL and so on) bear fruit.
 
They're offering a $120 GPU in a $2,300+ machine. This is penny-pinching to the extreme. And they don't even offer a better option, no matter how much you want to spend. It's unbelievable. I thought they would give better options this time, but the only thing they've increased this year is the price.

It's still a powerhouse, and don't forget it offers Thunderbolt 3 as well, which will let users make use of external video cards in the future, graphics cards that haven't even arrived on the market yet.
And the 1060, and 1070, and 1080. I think gone are the days of Apple including an actually decent dGPU in their computers.

Exactly, gone are the days when you need a heavy integrated graphics card that only makes your laptop unnecessarily hot, and obsolete when new graphics cards enter the market each year.

Thunderbolt 3 takes care of that. Good work, Apple!
 

That would be great. Except this breaks down as follows:

13" MBP w/TouchBar 16GB RAM, 512GB SSD - $2,199
Thunderbolt to PCI-E bridge for eGPU - ~$300
Nvidia GTX 1070 - $400
Halfway decent 4k Display - $400

Total: $3,299 for a system with a dual-core mobile CPU. $3,300 can buy you one heck of a desktop workstation.
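The tally above checks out (prices exactly as quoted in the list):

```python
# Component prices as listed in the post above, in US dollars.
parts = {
    "13in MBP w/Touch Bar, 16GB RAM, 512GB SSD": 2199,
    "Thunderbolt-to-PCIe eGPU enclosure (approx.)": 300,
    "NVIDIA GTX 1070": 400,
    "Halfway decent 4K display": 400,
}

total = sum(parts.values())
print(total)  # 3299
```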
 

You are right, but... it also means that you can buy another new graphics card and start using that instead of buying a whole new computer. So in the long run you will save money.
 
Well... I'm about to give up on Tim Cook. I'd say he's just too old, but I'm older, and I am so disappointed in recent products from Apple. Yes, there are some good products, but not great ones. I can't believe Apple continues to build new systems off of old chipsets. I can't believe this laptop doesn't have some additional ports or MagSafe charging. Tim Cook is pretty much untouchable until Apple finally starts free-falling. Right now, Tim is slowly but surely killing Apple. The problem is that the top shareholders and the board of directors are all so ridiculously wealthy that they can't see where they are falling short, because they are surrounded by people who keep saying what a great job they are doing. When everyone drinks your Kool-Aid because it's bottled in a fine wine bottle... when your income from daily interest is greater than that of 95% of US citizens... you evidently just can't see. I was at the Apple Store yesterday replacing my broken iPhone. I always strike up a conversation with the Apple employees, and as diplomatic as they are, they agree that Apple is headed in the wrong direction.
 
I've said this before and I still stand by it: the only person at Apple who was able to edit Jony Ive was Steve Jobs. Ive is a visionary in industrial design; Steve was a visionary in practicality, distilling design and engineering into a product balanced in its delivery of form and function. Steve was quick to learn from his mistakes, too. The Cube was a marvel of design and engineering at the time, but it turned out to be highly impractical for all but a very narrow user group: too expensive for casual users, and not powerful enough for professional users. We didn't see anything like the Cube again until the Mac mini, and Apple got the Mini right. Same thing with the first-generation MacBook Air: horrible hinge problems, super-slow miniature hard drives, a weak battery, and a super-high price point. Within two years, we had the current form of the Air, and it has been a huge success. Since Steve's passing, it just seems like Apple keeps doubling down on the same tropes: Thinner! Lighter! Still the same all-day battery! To achieve this we've courageously eliminated a short list of absurdly practical, non-obsolete, industry-standard items from the device.

No one has the authority to say, "Jony, this is great, but we need to make everything 1.0125mm thicker so the cameras don't stick out and we can include standard USB ports and the magnetic charging port you invented that everyone loves. It might also be nice to have the thermal envelope to make use of the top-shelf components available now, which our users expect of a Pro-labeled device."
 
It's a lot of things. 'Powerhouse' is most definitely not one of those things.

Any external GPU solution comes with its own problems and challenges. In fact, I don't think Apple even supports it officially.

This would all have been so much better if Apple had simply inked a deal with NVIDIA and used their technologically superior Pascal architecture instead. NVIDIA has been way ahead of AMD for a while now, especially at the high end.
 
I hope it isn't as "anti-robust" as the 2011 MBP series was... especially with Apple's insane obsession with thinness meaning a halfway-decent cooling system is the first thing that gets compromised. Anything more than web browsing and the thing will undoubtedly overheat.
 
So that actually works with Macs now?
- If the box is too complex it might not work
- The box might not support certain cards without modification
- You might need to modify the box or replace its fans for proper cooling
- You might need to add connectors to the fans yourself
- You might need to remove the fans from the card
- You might need to reapply thermal paste to the GPU
- You can use a graphics card if Apple or NVIDIA makes a driver.
- You can use Metal if Apple makes a driver.
- You might not be able to boot Windows
- You might have to reset the SMC
- You have to disable SIP
- You have to modify kernel extension configuration files
- Drivers for two different GPUs might not load at the same time, which can be a lot of trouble
- The card operation might not be optimized
- If the card does not get enough power the Mac might crash
- If the card gets too hot the Mac might crash
- YouTube and Netflix do not decode on the card, so 4K@60Hz full screen will stutter with TB1
- Using the eGPU and displaying on the built-in screen may run out of bandwidth (especially with TB1)
 
Great response. You know a lot more than I do about the individuals running the company. My family and extended family shifted to Apple products about 5 years ago. Apple was clearly a better product, and you could feel it when you held it in your hand. All I know is I'm not being wowed, and the features I consider important, Apple removes. For at least 5 more years, I need HDMI ports, audio ports, and enough USB ports for the non-Apple products that I use with my Macs. As far as phones go, I'd prefer an iPhone a little thicker that had twice the battery life. I prefer an iPhone with flat edges, like the iPhone 4/4s, because it's easier to hold. Give me the best camera, the best and brightest (OLED) screen, and enough memory so I don't have to constantly delete pics and apps. I'm selling some of our older Apple products on eBay and was listing them today. I had an iPhone 4s fired up and updated to the latest software... man, it felt nice in my hand, and the old iOS looked great in comparison to the current one. Apple's leadership is kind of like our WW2 generals just prior to the Battle of the Bulge. It took Patton to bail the other generals out. Apple needs a Patton / Jobs to refocus the company.
 
They went for AMD because Final Cut is optimized for OpenCL, and AMD cards perform better with OpenCL.

That is only true because AMD chips don't support anything else. I feel like the people defending Apple's poor choice of graphics chip are usually quoting a Google search or an Apple brochure. Saying AMD is better for Apple or Final Cut Pro X is sometimes correct, but correct for the wrong reasons: AMD chips don't support anything else.

Ever since NVIDIA shipped the 9xx series, and now the 10xx series, AMD is dead, dead, dead. Dying, dead. It can't compete at the higher level anymore. Yes, they can make some $30,000 one-off chip that's fast, but I'm talking about their mainstream models. There was a time when they were neck and neck, but that is no more; since the 10xx series from NVIDIA, AMD is done with the cutting edge. I have worked in OpenCL and CUDA programming for video plugins and optimization, and I have worked on teams programming for these two languages/frameworks. I have delved into the coding and optimization of both, and in its current state, CUDA is far superior to OpenCL on Apple's OS X.

First off, CUDA is faster anyway, since the NVIDIA chips are faster. CUDA will always be faster than OpenCL for similar operations. Say you do a blur or some effect, the exact same effect written in both languages: CUDA will always win. If NVIDIA wanted to create an OpenCL-optimized programming environment for their chips, NVIDIA chips would be faster in OpenCL too, for the same-speed chip, without discussion.

Also, Apple doesn't care about OpenCL at all. It's funny how much people quote it. OpenCL is only one part of OS X's current graphics layer, and it is used for specific tasks, not for the OS's overall graphics effects. This is the problem with Apple and OpenCL: they use other things to implement these frameworks in their graphics architecture, and OpenCL is a very small part of the OS design.

When Apple uses OpenCL to do something for Final Cut Pro X, say for an effect or a processing step, they use it only once, when they write the code for that program's release. So if you have bad OpenCL code and some library item gets updated, or some speed optimization is designed into OpenCL, you're not going to see it with old code. If, on the other hand, you were using CUDA-based operations in a program: NVIDIA updates CUDA all the time, and if part of what you designed gets a speed bump from a new CUDA release, you're going to see it immediately. There is some gray area and overlap depending on the design, but that is the way OS X uses OpenCL.

In programs like Final Cut Pro X, After Effects, Blackmagic Resolve, and Maxwell CUDA, OpenCL and CUDA are used for specific operations, and those are operations CUDA will always render faster, since the NVIDIA chips are faster.

OpenCL is crippled by the same company that boasts about how great it is. Apple uses AMD GPUs because they can get the chips at cost and mark the crap up out of them; the choice of AMD is all about money, no other reason. Tim Cook the bean counter won; we all lost. And AMD is not "an optimized OpenCL chip"; that's not how it works. OpenCL is open source and has nothing to do with AMD.

Anyone can argue that AMD is better if they want, but at this point, doesn't everyone get it? I mean, if they don't understand why these chips are bad, that's great for Apple. It seems only about 20% of the people on these forums are saying how great the new MacBook Pros are, so maybe Apple will feel that 80% loss in sales and change their horrible thinking.

I loved Apple, and honestly I'm so bummed that we are looking at Windows 10 for our next builds, but what other choice do we have?
 
NVIDIA is stuck at OpenCL 1.2; AMD is already at 2.x.

It is Apple's fault if AMD on Mac is not up to date.

I don't care about CUDA. It is not portable.

NVIDIA fanbois are also bad.
 

I agree OpenCL is more portable; most open source is. The discussion was whether or not Apple's choice of AMD chips was the best choice, considering it's an inferior chip to put in a Pro laptop. There's a lot of stuff I love about OpenCL; I thought it was going to be our savior and our freedom from big companies, and in some ways it is. That doesn't mean AMD chips are faster.
 
I don't think the Polaris chips are bad. They perform better than comparable NVIDIA parts with DX12.
 
Inferior to what? That's the first thing. Secondly, a GPU with what thermal envelope was, or is, available from the competition, and what do they offer in that thermal envelope?

Find something from the competition with higher performance per watt (1.86 TFLOPs in a 35W TDP). Objectively. I am not saying NVIDIA does not offer anything like this. However, the picture is completely different from what most people are describing on these forums.

I will tell you: a GTX 1050 Ti brought down to a 35W TDP will have similar compute performance, because of the downclocking required to fit a 75W GPU into a 35W TDP. Apple does not care at all about CUDA because, first of all, they do not offer NVIDIA hardware, and secondly, it is NVIDIA's proprietary API. If they were to port all of their software to CUDA, they would lock themselves into CUDA-only offerings, even if the competition offered better solutions.
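A quick sanity check on that performance-per-watt claim, using only the figures quoted in this thread and naively assuming compute scales linearly with power (a rough simplification; real chips do better than linear as clocks and voltage drop):

```python
# TFLOPs and TDP figures as quoted in the thread; treat them as approximate.
radeon_pro_460 = {"tflops": 1.86, "tdp_w": 35}
gtx_1050_ti = {"tflops": 2.14, "tdp_w": 75}

def perf_per_watt(gpu):
    """Compute throughput per watt of TDP."""
    return gpu["tflops"] / gpu["tdp_w"]

# Naive linear scaling of the 1050 Ti down into a 35W envelope.
scaled_1050ti_tflops = perf_per_watt(gtx_1050_ti) * 35

print(f"Radeon Pro 460: {perf_per_watt(radeon_pro_460):.4f} TFLOPs/W")
print(f"GTX 1050 Ti:    {perf_per_watt(gtx_1050_ti):.4f} TFLOPs/W")
print(f"1050 Ti scaled to 35W (linear): {scaled_1050ti_tflops:.2f} TFLOPs")
```

On these quoted numbers the Radeon part comes out well ahead per watt; the claim that the two would end up similar at 35W implicitly assumes the 1050 Ti scales considerably better than linearly when downclocked.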

All I see on this forum lately is complaining, without even an understanding of the subject from a wide perspective.

Secondly, you say that CUDA is better because NVIDIA hardware is better. You couldn't be more wrong. Take the highest end of the last generation from both companies: Fury X vs. Titan X. One GPU has 8.6 TFLOPs of compute power; the other, slightly over 6 TFLOPs. Which one will be faster? In applications that favor compute performance, the answer is obvious. The only reason software can perform better on one piece of hardware is that it is extremely optimized for it. Lately I watched an interview with Raja Koduri, and he changed my perception of the software-hardware relationship. He said that 70% of the overall performance of any application is the software itself. If the software is good at using the whole of the hardware, you will get better results.

Apple software is very well optimized for Apple hardware. Why would Apple care about other options when they offer their own solution for their own platform? There is no logic in it.
 

When has the AMD Fury X ever been inside an Apple MacBook Pro? Let's stay on track here. We are talking about the AMD in the MacBook Pro, not AMD versus NVIDIA overall.
 
So I have asked you: which GPU is faster than the Radeon Pro 460 in a 35W thermal envelope? Which GPU will be more powerful in this thermal envelope?

I will give you a hint: the GTX 1050 Ti has 2.14 TFLOPs at 75W. In the end, both GPUs will have exactly the same performance in this thermal envelope.
 
The RX 460 provides 2.2 TFLOPs at 75W.
 

This is getting disturbing. We are talking about the MacBook Pro, one of the most expensive laptops on the market, and the GTX 1050 Ti is in the discussion?? A $100 video card. This is very, very sad.

OK, let's compare the GTX 1050 Ti, a $100 video card, to the Radeon Pro 460, whose OEM cost is unknown, except that the Radeon Pro 460 is inside a $5,000 laptop, and that is just plain depressing... But why don't we wait until the MacBook Pro with the Radeon Pro 460 is released and really see how slow it is before we do any more comparisons.
 
Well, you base your whole point on price, which is fair enough. But you are not seeing that there is an 87W power supply in the 15-inch computer, and it has to power a 45W CPU, a 35W (supposedly...) GPU, and the rest of the machine. That is why we compare the Radeon Pro 460 to the GTX 1050 Ti as a possible alternative.

Secondly, you do not know how it will perform, yet you claim it will be slow. Well, what is not good enough for you is good enough for others. Then again, you can search for other solutions if you are not satisfied with the MacBook Pro.

The RX 460, with 2.2 TFLOPs, has exactly the same compute performance as a GTX 960, and in compute applications that is reflected directly. So how will GPUs with similar compute performance perform?
 