... I am sure this will be an advantage to Apple - customers, not so much. This is about cost control, not performance.
... AFA Intel Macs, anyone paying full price for an obsolete piece of hardware isn't very smart. Intel Macs literally have no future. ...

So, which is it? If the new Macs don't out-perform the Intel Macs, then it seems they have much more of a future than if not, right?

And how likely do you think it is that Apple can beat INTEL, the primary designer and manufacturer of CPUs in the world, at their own game by 2x to 4x?

Just having simpler instructions that each execute faster doesn't necessarily make the overall computation faster (since you end up needing many more operations to get the same thing done). RISC has been around a LONG time and is pretty well understood. I'm not sure it's the panacea some here seem to think it is. Although I will admit to being intrigued by the possibility of iPad and Mac apps being somewhat interchangeable.
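That tradeoff can be sketched with the classic "iron law" of processor performance: time = instruction count × cycles per instruction × clock period. A toy model (all the numbers below are invented purely for illustration, not benchmarks of any real chip):

```python
# Iron law of CPU performance: execution time = IC * CPI * clock period.
# The figures below are made up for illustration only.

def exec_time(instruction_count, cpi, clock_ghz):
    """Seconds to run a workload: instructions * cycles/instruction / (cycles/second)."""
    return instruction_count * cpi / (clock_ghz * 1e9)

# Hypothetical CISC-style machine: fewer, more complex instructions, higher CPI.
cisc = exec_time(instruction_count=1_000_000, cpi=3.0, clock_ghz=3.0)

# Hypothetical RISC-style machine: ~40% more instructions, but simpler ones
# averaging close to one cycle each.
risc = exec_time(instruction_count=1_400_000, cpi=1.1, clock_ghz=3.0)

print(f"CISC: {cisc * 1e6:.0f} us, RISC: {risc * 1e6:.0f} us")
```

With these made-up numbers the RISC side comes out ahead despite executing more instructions; flip the CPI assumptions and it loses, which is exactly the commenter's point that "more, simpler instructions" alone guarantees nothing.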

I think it is extremely likely. I've heard from enough relative-insiders to understand Intel has a problem much bigger than technology alone. But, the bigger deal is that Intel is tied to legacy, while this frees Apple to control the whole game, including lots of specialized silicon to do things like the T2 does for video encoding. (Or, I earlier gave the example of what Sony is doing this year with the PS5 and storage speeds, and the implications for game-changing the platform... pun sort of intended.)

The bigger problem Apple faces is getting software developers on-board. But, if they can deliver the performance, I think they can pull that off. Apple is no longer the bit-player of the past with no real influence. Yes, they have a lot of real, stubborn bias to overcome. THAT is the main issue. But, I think they can do it.

As far as RISC vs CISC, I'm not a chip designer (though way back in my past, I have an electronics engineering background), but I've talked to some (and read some, even here on this thread!). In my younger days, I roomed for a while with a PhD computer scientist who taught at Ohio State and worked at Intel, among others. While maybe that knowledge is outdated, he was pretty opinionated about the RISC advantage. We used to get into conversations about it.

Every computer ever made literally has no future - it's just a matter of time frame. An Intel Mac purchased today easily has about a three-year window of viability, I think. Could be longer, but it almost certainly won't be shorter than that.

I think that is the problem, though, for those concerned (including me, a bit). When I bought my first Macs, I'm pretty sure I kept them for roughly 10 years. As time has gone on, that seems to have shrunk to a more realistic 5-ish years. The main driver of that is software/OS. Now, we're saying it might be 3? A multi-thousand dollar Mac isn't an iPhone we just swap every few years (and I don't even swap my iOS devices that often).

I think Apple has to properly support the Intel Macs for a minimum of 5 years, hopefully more. If they don't, it will be yet another black-eye. Maybe they don't care... we'll see. But, the good news is that they seem more aware of that these days (cf. keyboard, cylinder Mac Pro, etc.).
 
Steve - if the software moves to ARM, the Intel Macs are done - no future. If ARM hardware doesn't deliver what folks around here expect (and their dreams are built on a very, very slender reed), then Apple will find it nearly impossible to jump back.

What will most likely happen is that the True Believers will convince themselves that more performance isn't actually important (which a number of them are already doing.)

As an example - in the 7,1 - the I/O system is obsolete, the base video card is nearly 4 years old, and the CPU is the end of the line for 14nm+++++++++++++++++.

We have actually had people say that they don't need max performance, while dropping 5 figures on a 7,1.

Apple is exiting the PC world - they will drop support for Intel Macs just as fast as they did the G5.

Every time Apple stepped into competition, they got slapped around. In the Motorola days, they were beaten by the clone makers, who delivered faster machines sooner than Apple; now they are getting hammered by Intel-based OEMs (because Apple rarely updates its lineup), and by AMD (who are curb-stomping Intel in performance).

ARM saves Apple a lot of money in development costs and they will slowly push everyone onto buying off the Apple store (taking their 30% cut).

The move to ARM is about control and costs - not performance.
 
Steve - if the software moves to ARM, the Intel macs are done - no future.
<snip>
That’s a fundamental misunderstanding. New software doesn’t move from x64 to ARM. New support for ARM is in addition to the existing support for x64.

Devs can simultaneously target both platforms by checking a box or two in Xcode, telling it to build for both architectures. If desired, the code for both x64 and ARM can even be packaged into a single file for ease of distribution.
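As an aside on what "packaged into a single file" means: a universal app carries one Mach-O slice per architecture behind a small "fat" header, and you can recognize one by its magic number. A minimal sketch (the helper name and the synthetic sample file are mine, not part of Apple's tooling):

```python
import struct

# Mach-O "fat"/universal binary magic numbers (big-endian on disk),
# as defined in <mach-o/fat.h>:
FAT_MAGIC = 0xCAFEBABE      # 32-bit fat header
FAT_MAGIC_64 = 0xCAFEBABF   # 64-bit fat header

def is_universal_binary(path):
    """Return True if the file starts with a Mach-O fat header,
    i.e. it bundles slices for more than one architecture."""
    with open(path, "rb") as f:
        head = f.read(4)
    if len(head) < 4:
        return False
    magic = struct.unpack(">I", head)[0]
    return magic in (FAT_MAGIC, FAT_MAGIC_64)

# Demo with a synthetic file; a real check would point at an app's
# executable inside Contents/MacOS/:
with open("fake_universal", "wb") as f:
    f.write(struct.pack(">I", FAT_MAGIC) + b"\x00" * 12)

print(is_universal_binary("fake_universal"))  # True for our synthetic header
```

In practice you'd just run `lipo -info` or `file` on the binary; this only shows the file-format side of how both architectures end up in one file.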
 
And how likely do you think it is that Apple can beat INTEL, the primary designer and manufacturer of CPUs in the world, at their own game by 2x to 4x?


Primary by which measure? They are obviously the primary designer and manufacturer of x86-architecture CPUs, though even that title is slowly slipping to AMD. AMD is beating Intel on almost any metric one could care about, and the market is slowly moving that way.

In terms of CPUs shipped worldwide, ARM has dominated for a while now. Estimates for Q1 2020 PC shipments put the number between 50 and 55 million units. Assume that's 100% Intel and compare it to smartphones for the same window: Samsung alone matched that number, and in aggregate smartphones hit roughly 300 million. This doesn't include servers (both Intel and ARM benefit here, with ARM in embedded server roles), doesn't account for AMD, and doesn't include things like Raspberry Pi boards either.

Apple have hired a lot of talented CPU engineers, including former ARM, Intel and AMD employees. If you think those people aren't capable of replicating what they did at those previous employers, including Intel, then I guess we have a different understanding of how experience works. They're not beating Intel at the x86 game, though; they're defining their own path with custom silicon tied to an operating system and development environment, with the ability to leverage extra functionality in that silicon.
 
That’s a fundamental misunderstanding. New software doesn’t move from x64 to ARM. New support for ARM is in addition to the existing support for x64.

Devs can simultaneously target both platforms by checking a box or two in Xcode, telling it to build for both architectures. If desired, the code for both x64 and ARM can even be packaged into a single file for ease of distribution.

Yeah. Barely any software will be ARM-only any time soon.

(Unless you're counting iOS apps.)
And how likely do you think it is that Apple can beat INTEL, the primary designer and manufacturer of CPUs in the world, at their own game by 2x to 4x?

I think "primary designer and manufacturer" is… maybe a bit of an outdated view.

As for "2x to 4x", I don't think so. But I do think Intel will take a while to catch up.
 
Intel will be caught up by 2021 - they are moving to a Big, Little CPU layout (Golden Cove I believe is the code name).

It looks like their top consumer chip will have 8 big cores and 8 Atom cores. Those Atom cores are a lot more powerful than people think - it isn't the same as what was in netbooks a decade ago.

AMD will have 4 way SMT by then - so their top end consumer chip will probably be a 16 core/64 thread chip.

Apple will have their work cut out for them.
 
Steve - if the software moves to ARM, the Intel Macs are done - no future. If ARM hardware doesn't deliver what folks around here expect (and their dreams are built on a very, very slender reed), then Apple will find it nearly impossible to jump back.


Yeah, I agree as far as that goes. I think if Apple doesn't gain considerable performance, then it is a wasted move, at least from the user standpoint. Apple makes gains in cost savings, control, etc. We lose software compatibility. And, yes, going back would be tough.

I don't think performance dreams are on any kind of slender reed, though. I think they are incredibly likely... in fact I can't see how we won't get some performance leaps, even if they aren't in core CPU speeds. Like I've said before, consider the T2 alone, and then multiply that across other specialized areas.

Now, maybe (worst case) that won't help a 3D renderer using a specific package, counting on multiple really fast CPU cores, but it will benefit the overall Mac market in many ways, even a lot of professional users.


What will most likely happen is that the True Believers will convince themselves that more performance isn't actually important (which a number of them are already doing.)

As an example - in the 7,1 - the I/O system is obsolete, the base video card is nearly 4 years old, and the CPU is the end of the line for 14nm+++++++++++++++++.

We have actually had people say that they don't need max performance, while dropping 5 figures on a 7,1.


OK, now I'm curious. What is this system you're speaking of that the Apple faithful are ignoring, which blows away the old tech in the new Mac Pro? While I don't keep up on the latest and greatest the way I used to, I'm not aware of anything like this. (For example, I've seen attempts at real-world Intel vs AMD CPU comparisons, such as in C4D, that show they are close, with Intel often winning. And if you're talking Nvidia vs AMD GPUs, that is highly dependent on what app you're using.)


... New software doesn’t move from x64 to ARM. New support for ARM is in addition to the existing support for x64.

Devs can simultaneously target both platforms by checking a box or two in Xcode, telling it to build for both architectures. If desired, the code for both x64 and ARM can even be packaged into a single file for ease of distribution.


Yeah, but to be fair, isn't this intended to be transitional? I don't think Apple's long-term plan is for apps to stay on both architectures.


As for "2x to 4x", I don't think so. But I do think Intel will take a while to catch up.

If not, then I'm probably more with ssgbryan, that this is about Apple's benefit, not ours. Maybe that won't all come in terms of raw CPU performance, but if we don't get something like 2x-4x overall benefit, then I can't see how it isn't a waste. We lose an awful lot too, so we'd better be giving it up for something good.
Intel will be caught up by 2021 - they are moving to a Big, Little CPU layout (Golden Cove I believe is the code name).

It looks like their top consumer chip will have 8 big cores and 8 Atom cores. Those Atom cores are a lot more powerful than people think - it isn't the same as what was in netbooks a decade ago.

AMD will have 4 way SMT by then - so their top end consumer chip will probably be a 16 core/64 thread chip.

Apple will have their work cut out for them.

I'm also curious... given the performance of Apple and Intel in recent years, why you are confident Intel will do so well in carrying out their future plans, while Apple won't (based on what you see as their failed attempts in the past).

Apple's trajectory seems to be from a company without a lot of resources/control to an industry leader, while Intel is certainly the underdog at this point (i.e. great past, but increasingly showing issues). I often root for the underdog too, but there comes a point when you have to be realistic, as well.

Now, AMD, you might have a point. And, I admit I'm a bit biased by past bad experiences.
 
Intel will be caught up by 2021

That is a bold claim.

It does seem like Tiger Lake will help them catch up. But that's not out yet, and judging from how some Ice Lake parts were delayed by more than half a year, and others quietly canceled, the jury's frankly still out.

- they are moving to a Big, Little CPU layout (Golden Cove I believe is the code name).

Yes, I'm guessing Alder Lake will use the Golden Cove microarchitecture.

It looks like their top consumer chip will have 8 big cores and 8 Atom cores. Those Atom cores are a lot more powerful than people think - it isn't the same as what was in netbooks a decade ago.

We don't really know much about it at this point.

AMD will have 4 way SMT by then - so their top end consumer chip will probably be a 16 core/64 thread chip.

Apple will have their work cut out for them.

I don't think Apple is that impressed by AMD.
If not, then I'm probably more with ssgbryan, that this is about Apple's benefit, not ours. Maybe that won't all come in terms of raw CPU performance, but if we don't get something like 2x-4x overall benefit, then I can't see how it isn't a waste. We lose an awful lot too, so we'd better be giving it up for something good.

Most people don't lose anything at all, though. (In fact, most people will be happy to gain stuff like running iPhone apps out of the box.)

Some people, including me, do lose plenty.
(Perhaps the disconnect here is 2x-4x compared to… what? Compared to Ice Lake, I'm guessing not. Compared to 14nm chips like Coffee Lake, OTOH…)
 
AMD will have 4 way SMT by then - so their top end consumer chip will probably be a 16 core/64 thread chip.

There is nothing out there that indicates Zen 4 will be using SMT4. Adding 4-way multithreading doesn't automatically make a CPU better; it is a design tradeoff.
 
Also, like, hold your horses, everyone — Zen 3 isn't even out.

Zen 3 is expected to be about a 15% improvement over Zen 2, but core counts will remain the same. We do have some wild rumors that Zen 4 will have SMT4, but they are nothing more than wild rumors at this point.
 
Most people don't lose anything at all, though. (In fact, most people will be happy to gain stuff like running iPhone apps out of the box.)

Some people, including me, do lose plenty.
(Perhaps the disconnect here is 2x-4x compared to… what? Compared to Ice Lake, I'm guessing not. Compared to 14nm chips like Coffee Lake, OTOH…)

Well, most people would be fine w/ Apple staying on Intel, too. I guess the point is who benefits from such a move, Apple, we Apple users, or both (and by how much).

If it is just so Apple can save some money, and maybe make batteries last a bit longer in laptops... I'll be pretty upset at the disruption caused by something so irrelevant (at least to me or most users).

re: compared to what - the future direction of it all. It doesn't matter if Apple delivers 2x-4x over current tech if that tech itself doubles by the time Apple gets there. I'd hope that if Apple is making such a move, they are confident it puts them out ahead of the competition in 2 years, 5 years, 10 years, etc.

I suppose you're right about the average user being happy running iPhone apps, but unless that fundamentally changes something on the Mac side for the better, that is also wasted on the pro users. (It becomes like the whole Windows argument of old... we have more software. Well, all I need is one good app, not 100 crappy ones.)
 
Let’s just wait and see how executing on their roadmap works out.
Their track record is spotty lately, to say the least.

But, unlike Apple, they do actually have a track record. Intel is having its "Bulldozer" moment. These things happen in chip design.

It is going to suck for Intel for the next 12 months or so - AMD will launch 4 different architectures (server, desktop, console, APU) before Intel gets their next one out. AMD is on a 15-month cadence, and they aren't slowing down at all. Take a look at the Lenovo TR Pro systems that ship in September - that is what the 7,1 should have been.

Now, maybe (worst case) that won't help a 3D renderer using a specific package, counting on multiple really fast CPU cores, but it will benefit the overall Mac market in many ways, even a lot of professional users.

OK, now I'm curious. What is this system you're speaking of that Apple faithful are ignoring, which blows away the old tech in the new Mac Pro?

If not, then I'm probably more with ssgbryan, that this is about Apple's benefit, not ours. Maybe that won't all come in terms of raw CPU performance, but if we don't get something like 2x-4x overall benefit, then I can't see how it isn't a waste. We lose an awful lot too, so we'd better be giving it up for something good.

I'm also curious... given the performance of Apple and Intel in recent years, why you are confident Intel will do so well in carrying out their future plans, while Apple won't (based on what you see as failed attempts back in the past).

I don't think that Apple will deliver performance, because of Tim Cook. Tim is a bean counter and doesn't see the need for performance. If you look at his stewardship of Apple, the Mac division has, at best, been neglected: 2,000+ days between the launch of the 6,1 and the 7,1. How many obsolete parts are still shipping with current Macs?

The Mac mini is shipping with 8th-gen CPUs (6 cores/6 threads or 6 cores/12 threads, integrated graphics for all). Of the iMacs, only the i9 and iMac "Pro" have hyper-threaded CPUs; the rest are 6-core/6-thread budget i5 CPUs with Polaris-based video cards, although you can be only one generation back with a throttled Vega GPU in the iMac "Pro".

In spite of the current Mac lineup, the True Believers are convinced that Apple is suddenly going to design a series of CPUs that will outperform the top-of-the-line CPUs from both Intel & AMD that Apple was too cheap to put in their own lineup. They also believe that Apple will suddenly develop GPUs that will outperform both AMD and Nvidia, when 2080 Ti performance will be the baseline for performance in the next 6 months.

Apple has delivered phone CPUs, maxing out at 2 big cores and 2 little cores - Intel is delivering 28-core/56-thread CPUs today. AMD is delivering 16-core/32-thread CPUs at the consumer level, 64-core/128-thread CPUs at the HEDT level, and multi-socket 64c/128t CPUs at the server level - today.

Apple has delivered phone GPUs - cool that they can drive a 14" screen in an iPad Pro at 5K - but I haven't used a 14" screen on a computer since 1992. I have been using a dual-screen setup for well over 15 years - I am looking at going to dual 4K monitors at Xmas. Neither you nor I know if Apple can/will support that - and keep in mind, 1440p 240Hz is mid-tier right now, never mind two years from now.

The next Mac Pro won't be competing with a TR3995XT based system - it will be competing with a Zen 4 or Zen 5 AMD (5nm) based system, or a Golden Cove (10nm - they will be there) based Xeon system.

AMD-based systems are now moving from DIY, enthusiasts, and boutique vendors to tier-one OEMs. It isn't just those Lenovos I mentioned earlier - AMD has an entire range of desktop APUs that, for now, will only be available to OEMs (8 cores/16 threads with Vega 8 graphics @ 65 watts would fit very nicely in a Mac mini; take the same CPU and add a dedicated GPU, and that iMac is now a workhorse). This is why I say that the move to ARM is about control, not performance.
 
I don't think that Apple will deliver performance, because of Tim Cook. Tim is a bean counter and doesn't see the need for performance. If you look at his stewardship of Apple, the Mac division has, at best, been neglected: 2,000+ days between the launch of the 6,1 and the 7,1. How many obsolete parts are still shipping with current Macs?

<snip>


The problem with your analysis is that you are only looking at CPU core counts and GPUs, when Apple now has the luxury of integrating specialized cores (just like Afterburner) that accelerate particular relevant processes in Mac apps. That is something you really can't have with Intel- or AMD-based solutions.
 
Apple has delivered phone GPUs - cool that they can drive a 14" screen in an iPad Pro at 5K - but I haven't used a 14" screen on a computer since 1992. I have been using a dual-screen setup for well over 15 years - I am looking at going to dual 4K monitors at Xmas. Neither you nor I know if Apple can/will support that - and keep in mind, 1440p 240Hz is mid-tier right now, never mind two years from now.

Just curious, what is the difference between driving a 14" screen at 5K resolution versus a 27" screen at 5K resolution?
 
I don't think that Apple will deliver performance because of Tim Cook. Tim is a bean counter, and doesn't see the need for performance.

One, that's a very simplistic view of Cook.

And two, he doesn't make decisions about hardware performance. He might make decisions on whether to acquire additional semiconductor companies, sure, but if anything, those decisions so far would suggest that he does care about performance.

Apple's chips have been indisputably leading the competition for years, and I see no reason to believe they don't want to bring that success to the Mac. They may not be able to, but it isn't currently looking that way.
 
The problem with your analysis is that you are only looking at CPU core counts and GPUs, when Apple now has the luxury of integrating specialized cores (just like Afterburner) that accelerate particular relevant processes in Mac apps. That is something you really can't have with Intel- or AMD-based solutions.

Right,

I should just rebuild my entire workflow, and hope that software developers rewrite all of their software to accommodate Apple.

The computer works for me - I don't work for it.

Just curious, what is the difference between driving a 14" screen at 5K resolution versus a 27" screen at 5K resolution?

You are moving a LOT more pixels? Apple developed the iPad/iPhone GPU - AMD developed the GPU in the iMacs.


And chucker32n1,

If Tim Cook cared about performance, Apple computers would have been updated regularly. They wouldn't have shipped with obsolete parts on day 1. Those parts wouldn't be gimped parts, either.
 
Pretty sure 5K pixels are 5K pixels regardless of screen size.

Yup, otherwise it wouldn't be 5K at that point. 5K is 5120x2880, or 14.7 million pixels. It doesn't matter how big the display is; if it's 5K, then it's the same number of pixels. A 1440p monitor using a widescreen resolution of 16:9 would be 2560x1440, or roughly 3.7 million pixels, with two of those displays coming in at 7.4 million, or roughly half of one 5K display.
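The arithmetic above is easy to check:

```python
# Pixel counts for the resolutions under discussion. A display's diagonal
# size never changes how many pixels the GPU has to push.
def megapixels(width, height):
    return width * height / 1e6

five_k = megapixels(5120, 2880)        # 5K, whether on a 14" or 27" panel
qhd = megapixels(2560, 1440)           # one 1440p (QHD) panel
ipad_pro_129 = megapixels(2732, 2048)  # 12.9" iPad Pro panel

print(f"5K: {five_k:.1f} MP")              # 14.7 MP
print(f"dual 1440p: {2 * qhd:.1f} MP")     # 7.4 MP
print(f"12.9in iPad Pro: {ipad_pro_129:.1f} MP")  # 5.6 MP
```

So dual 1440p is roughly half the pixel load of one 5K panel, and only about 1.3x what the iPad Pro's GPU already drives - consistent with the numbers in the post.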

The iPad display runs at 120Hz, and the larger iPad Pro is 2732x2048. I find it easy to conceive that Apple could, on the desktop, double their GPU capacity to meet a dual-1440p display setup. You are correct that they haven't announced whether they'd support that, though I suspect they'd at least shoot for parity with their existing products; otherwise the reviews write themselves.
 
Right,

I should just rebuild my entire workflow, and hope that software developers rewrite all of their software to accommodate Apple.

The computer works for me - I don't work for it.

Of course, you choose your rig based on the software you use. No one says to jump on the Apple Silicon Macs right away. Or move to Windows- or Linux-based rigs if your preferred software works better there.

You are moving a LOT more pixels? Apple developed the iPad/iPhone GPU - AMD developed the GPU in the iMacs.

Go back and read what you wrote, and stop embarrassing yourself.
 
I'm guessing @ssgbryan read it as "a 12.9-inch iPad Pro's Retina Display has far fewer pixels to push than a 27-inch iMac's". Which, true, but I see no reason to believe Apple can't scale the GPU a little, given that they'll have way, way more thermal headroom.
He literally said a 27" 5K display had "a LOT more pixels" than a 14" 5K display - which is rubbish. Hence my reaction.
 