Bloomberg: Apple Working on Next-Gen Apple Silicon Chips for MacBook Pro, iMacs, and Mac Pro Due to Launch Next Year

Au contraire, for a heavily traded stock, the price moves almost instantaneously with news. Intel isn't being dumped, and for good reason. Yes, Apple's chips are awesome, but they are only available in Apple products, which are all high end. They also don't run Windows. Intel's price went up recently on news of advances in their quantum computing chips. And so on.

Here’s what I am expecting will happen to the PC market in the next couple of years.

As in many of the industries it operates in, Apple has garnered the lion’s share of profits in the PC market by aggregating the best customers. It does this by using its control over hardware, software, and services to create a unique user experience that people are willing to pay a handsome premium for, which in turn means more money to reinvest in those areas.

Meanwhile, Windows PC makers have been surviving by selling PC hardware at near cost. They have managed to lock up over 90% of the PC market due to the (still) significant price difference between Macs and PCs, but I see that as a Pyrrhic victory at best. What’s the point of having all that market share when you don’t have the profits to show for it?

I see laptop makers continuing to be squeezed by Apple’s entry-level Macs like the MBA. For just a little more, you get a significantly better product with far better performance and battery life. This in turn means less profit to funnel back into R&D at the end of the day.

In the long run, I expect the market to adopt Apple’s strategy and go with all-in-one chips in their computers. That means the value proposition of buying separate components (e.g., an Intel processor and an AMD graphics card) is not going to hold up, as it will mean worse performance at a higher cost. I am willing to go so far as to predict that building and upgrading a PC will be relegated to a very niche sub-market of folks who build for fun, not for technical or cost reasons.

I am not saying Intel can’t compete, but they do have an uphill task ahead of them, and if I were a betting man, I would start offloading any Intel stock I may currently have.
 
Sorry dude - no matter how many times you repeat it, I’m pretty sure you are the only customer for that product...
Everyone complains about MBP’s loud fan, battery life, and the dongles to monitors and peripherals that make it less than portable, stuck on your desk. Wouldn’t it be better to separate the cpu from the laptop? All the wires and heat stay on the desk, and the laptop stays cool on the couch.
 
Here’s what I am expecting will happen to the PC market in the next couple of years. […]
There is merit to your perspective here. I think that to compete with Apple, Microsoft - being the other large computing platform - should partner with a chip manufacturer (AMD, Intel, or Nvidia) to optimize a chip design for their software and offer it as an integrated package.
 
You're right - I didn't read the GP post properly. Sorry. (I guess that in the spirit of the internet I should try to bluster through with a claim that 10.14 to 10.15 is a point update... but let's skip it :) ).

Still, forcing OS updates with new/updated hardware is an issue, and the Catalina issues did affect Mac Pro users - and although it's unavoidable with the first round of Apple Silicon & 11.0, we'll probably have macOS 11.1 "SFO BART station" to contend with when the ASi Mac Pro comes out...
L'esprit d'Internet! :)

Definitely agree that forcing updates to a new OS that's on v x.1 is an issue.

Indeed, I think that's the main risk one takes being an early adopter of new Mac hardware that has a late-fall release. The issue isn't necessarily the hardware itself -- usually Apple's 1st-gen hardware is OK. The problem is that you're forced to use v x.1 of the latest OS, which is almost always a bad idea if you value smooth operation.

Back when I upgraded annually (I've had to stick with High Sierra for font rendering reasons), my SOP was to wait to upgrade until about v x.4+ (though with Catalina I would have needed to wait for v x.6).
 
Everyone complains about MBP’s loud fan, battery life, and the dongles to monitors and peripherals that make it less than portable, stuck on your desk. Wouldn’t it be better to separate the cpu from the laptop? All the wires and heat stay on the desk, and the laptop stays cool on the couch.

So the "computer" is on the desk, and the "laptop" is on the couch...

I guess you got a honking long cable from one to the other...?

Does not seem very Apple...
 
So the "computer" is on the desk, and the "laptop" is on the couch... […]

Hopefully they’ll unveil a new very Apple wireless video connection, but if it needs a USB-C cable for faster video, at least it would also charge the laptop, so that’s just one long cable to the couch. With the MBP, if you want to sit on the couch and also use a monitor, plug in a hard drive, an audio interface, etc., you need four cables to the couch and annoying dongles. And then the fan turns on. It doesn’t have to be this way, people!! Just separate the laptop from the cpu.
 
There is merit to your perspective here. […]
The question in working with Microsoft would be the value proposition. While Qualcomm is focused where Microsoft is going, AMD, Intel, and Nvidia would all MUCH rather be chosen for the myriad server farms popping up all over the place. The number of processors may pale in comparison to the consumer space, but the profit is there to soak up and make their bottom lines look better. Any unique projects they put forward are likely to be for this higher-tier market.

Plus, there’s a balancing act: whichever one Microsoft commits to will mean a sour partnership with the others. Additionally, if Microsoft wants to work with all of them together on a new motherboard standard that will be optimized for Windows, to the detriment, perhaps, of Linux or other open-source OSes... then that takes us right back to the point that they’d LIKE to do something ground-shaking like Apple, but they don’t have as much control over their market as Apple does over the Mac market.

If the Windows market had wanted and demanded something like the M1, they’d have had it, probably years ago. However, all the major players have been framing the solution choices in a way that ensured users HAVEN’T asked for it, because they felt it wasn’t even worth asking for.
 
Because there are those that want to, I suppose, do NON-serious, non-real computing, like Logic Pro and Final Cut Pro, as fast as they possibly can. If handling a complex score or a huge, media-heavy timeline is an option, they would pay more to make that happen. In some cases, it could be as simple as getting more work done and approved in a day; in others, it could be that a producer hates to “freeze” their tracks to avoid overtaxing their processor.
I can’t agree with your view here. Apple doesn’t sell Xeons to run these workloads for these people; they sell dedicated hardware accelerators for them. In the early days, they proposed hardware solutions more focused on solving I/O issues. In recent years, they have started to dictate all the software standards, and now they are moving to a full accelerator approach. If you are one of these Logic Pro or Final Cut Pro users, you are likely running the workload not on the Xeon but on their accelerator instead.
Your argument is that Apple is always right about its business. I can’t agree with that. If that were the case, they wouldn’t have given up that trash-can design that nobody bought and tried to get back to a mainstream, general-purpose design. In my opinion, if they decide to fall back on that same motto again, this product line will struggle once more, not just for their user base but ultimately for Apple’s own Mac Pro line. They have seen this, and they have felt that pain for seven years.
 
The Apple Afterburner hardware accelerator is for very specific video use, it has nothing to do with audio or Logic Pro X...

Created to transform the workflow for film and video professionals, Afterburner accelerates ProRes and ProRes RAW video codecs in Final Cut Pro X, QuickTime Player X, and supported third-party applications. A PCI Express card exclusively for Mac Pro, Afterburner can be installed in any full-length slot, but it delivers maximum capability in a PCIe x16 slot.

Compatible with Mac Pro (current generation)

PCI Express x16 card

Accelerates ProRes and ProRes RAW codec in Final Cut Pro X, QuickTime Player X, and supported third-party applications

Supports playback of up to 6 streams of 8K ProRes RAW or up to 23 streams of 4K ProRes RAW
 
I can’t agree with your view here. Apple doesn’t sell Xeons to run these workloads for these people; they sell dedicated hardware accelerators for them. In the early days, they proposed hardware solutions more focused on solving I/O issues. In recent years, they have started to dictate all the software standards, and now they are moving to a full accelerator approach. If you are one of these Logic Pro or Final Cut Pro users, you are likely running the workload not on the Xeon but on their accelerator instead.
What “accelerator” are you talking about? EDIT: If you’re talking about the Afterburner accelerator Boil mentions, no, Final Cut Pro does not “run” on the accelerator; that card just speeds up the operations that call the video codecs. The code compiled for Intel runs on the Intel processor.

Your argument is that Apple is always right about its business. I can’t agree with that. If that were the case, they wouldn’t have given up that trash-can design that nobody bought and tried to get back to a mainstream, general-purpose design. In my opinion, if they decide to fall back on that same motto again, this product line will struggle once more, not just for their user base but ultimately for Apple’s own Mac Pro line. They have seen this, and they have felt that pain for seven years.
I can’t really understand what you’re trying to say here, sorry.
 
Can someone explain a little bit about graphics and gaming? I am reading a lot about how these chips can outpace older Nvidia cards. My main question is around shading, reflections, and DirectX. Can these chips do DirectX?
 
Can someone explain a little bit about graphics and gaming? I am reading a lot about how these chips can outpace older Nvidia cards. My main question is around shading, reflections, and DirectX. Can these chips do DirectX?
DirectX is a software layer for Windows. If Microsoft decide to support Windows on ARM on Apple Silicon, they may decide to write DirectX drivers for Apple’s GPU. At the moment, there is no support, just as there is no Windows support.
 
DirectX is a software layer for Windows. If Microsoft decide to support Windows on ARM on Apple Silicon, they may decide to write DirectX drivers for Apple’s GPU. At the moment, there is no support, just as there is no Windows support.
Ok, this helps. Thank you. What do game devs for Apple use now? Is anything better than DirectX? Anything coming down the pipeline?
 
Here’s what I am expecting will happen to the PC market in the next couple of years. […]
I agree with a lot of what you say, except for a few things. I think both Intel and AMD will be furiously working away at SoC designs, and also looking at ARM designs, and probably already were.

Also, you haven't seen prices for premium Windows machines if you think Apple has the high end locked up. This isn't like phones. The vast majority of businesses and individuals use Windows, and they either want premium or budget. There is no premium Apple Windows machine, so they must use other vendors, which offer some very nice machines without the dumb design flaws that Apple stubbornly insists on these days (yes, I'm looking at you, Touch Bar) - although, to be fair, many of them have their own dumb design flaws, e.g. 15" models almost always having a numeric keypad, which horribly offsets the keyboard to the left of the machine.

Also, yes, the M1 has great performance, but even the most basic Intel chip has enough performance for the majority of users. And unlike Apple, the other vendors haven't deliberately gimped the cooling systems on them, so they don't have any of the heat and fan noise issues like the Apple Intel machines do. So all those users aren't wishing they had a cooler, faster machine. For most Windows users, the only big advantage of the M1 is the battery life (and yes, that is a huge and attractive advantage).

Another issue is the ongoing quality control issues with Apple in the last several years. Constant horrendous bugs in the software. And constant stubborn design decisions that are all about profits first, rather than about great product first.

BTW, I don't own any Intel or AMD stock, and I have no intention to, as I'm definitely not betting on them as a buy. And I would certainly buy Apple stock; I think it has a lot more upside. I'm just saying, don't write off Intel and AMD quite yet. They still have a stranglehold on the majority of the market, and that's not going to disappear whilst most of the world is locked into Windows.
 
Hopefully they’ll unveil a new very Apple wireless video connection […] Just separate the laptop from the cpu.
Then why not just use a Chromebook and run a remote cloud compute instance? If you're not taking the "computer" with you anymore, just cut out the unnecessary hardware.

Can't help but think we've seen this idea of a separate terminal to the computer before somewhere...

 
the other vendors haven't deliberately gimped the cooling systems on them, so they don't have any of the heat and fan noise issues like the Apple Intel machines do. So all those users aren't wishing they had a cooler, faster machine.
Because I’m critical of Intel :) I’d say the cooling system isn’t gimped; it’s configured for what Intel SAID they were going to produce. But Intel hasn’t met their intended goals for a while, soooo... yeah.

On the PC side, they make compromises for Intel’s shortcomings. They use desktop memory because the processor doesn’t support LPDDR in high capacities; they make the cases larger so they can include larger batteries (making the system heavier) so that the run time is the same; they configure the system so that it doesn’t use its full rated speed unless it’s plugged in; OR, at the other end, they provide a light, performance-constrained solution that still doesn’t have excellent battery life. Windows customers can complain, but there’s not much they can do about it. They can’t even “take their business elsewhere,” because all the competition is responding to the situation in the same larger-and-heavier or lighter-and-slower way.

I don’t know if Intel can follow Apple into SoCs... they are challenged just getting their CURRENT processors all on one die, and that’s JUST the CPU! Start adding to that and Intel’s yields drop significantly. AMD might be able to, but neither can drop the Intel/AMD instruction set the way Apple has, because no one is going to buy a PC that’s incompatible with the entire ecosystem. SOMEone would have to do a “Rosetta”-type software translation layer. Could Intel and AMD get Microsoft on board? Perhaps, but Microsoft is really interested in having a solution it controls that fits its needs, which Qualcomm can provide. I think there are market forces that affect how “out there” they can be on the PC side.
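The "larger batteries (making the system heavier) so that the run time is the same" point a few posts up is just energy arithmetic. A back-of-envelope sketch, with every figure invented purely for illustration:

```python
# Back-of-envelope battery sizing: energy (Wh) = average draw (W) x hours.
# All figures below are invented for illustration.

def battery_wh_for_runtime(avg_power_w, target_hours):
    """Watt-hours of battery needed to sustain a given average draw."""
    return avg_power_w * target_hours

# Hypothetical 10-hour runtime target, two hypothetical average draws:
efficient_wh = battery_wh_for_runtime(5.0, 10.0)  # efficient chip: 50 Wh
hungry_wh = battery_wh_for_runtime(8.0, 10.0)     # hungrier chip: 80 Wh

print(efficient_wh, hungry_wh)  # 50.0 80.0
```

The extra 30 Wh gets carried around as battery weight: same runtime on the spec sheet, more heft in the bag.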
 
If the higher-end Apple Silicon Macintosh systems are as far ahead of the Intel/AMD higher-end systems as these current M1-based systems are against their competitors, I would not be surprised to see a Windows on ARM port as the easiest way to support Apple Silicon. Barring that, from what we have seen, running translated code will still outperform native x86_64 hardware.

For all of these, it will be a question of how great a performance lead they have. If it is substantial, we will see ports to ARM, and if it is very substantial, we might even see native macOS versions.
With all due respect, I don't think you know what you're talking about. Quite a few of those programs are not necessarily performance-limited; they are, however, highly specialised.

The market is therefore not moving those programs to another architecture, or even supporting a single recompilation. Unlike the software engineering market, the professional hardware engineering market has very few Macintosh users.


Intel is not moving from the x86 architecture any time soon, I have worked at an Intel subsidiary so I know how the company operates...
 
I agree with a lot of what you say. […]
Can vendors make ARM CPUs as SoCs? What would happen to the BIOS? Seems like that would make for an interesting new world. Stepping away from x86 would be a good start, regardless of whether it comes from Intel, AMD, or Apple's M1...
 
Then why not just use a Chromebook and run a remote cloud compute instance? […]
The hardware still needs to be in the room, just not on my lap. I use Logic with a USB audio interface, instruments, and a second monitor, all connected to a dongle (plus another USB-C cable for charging, which adds annoying noise when connected to the charger). It’s difficult to move the laptop; it can’t be disconnected from the audio connection. Then the fans come on. Imagine if the hot cpu and all the USB cables were taken out of the laptop and moved to a little portable headless mini that connects to the now cool and light laptop, either with a single cable for the fastest graphics and charging, or wirelessly.
 
With all due respect, I don't think you know what you're talking about. Quite a few of those programs are not necessarily performance-limited; they are, however, highly specialised.
I did not claim that they would do this because they needed the performance. I do claim that one of three things is likely to happen if Apple is able to maintain this performance lead: people who need performance for other applications will want to switch to Apple Silicon systems and will push these vendors to at least recompile for Windows on ARM; Microsoft will decide that it does not want to lose customers and will make a virtualized Windows on ARM available for Apple Silicon systems; or, finally, the small number of people who need these not-very-resource-intensive, specialized applications will just run them over the net on remote Windows servers.

The market is therefore not moving those programs to another architecture, or even supporting a single recompilation. Unlike the software engineering market, the professional hardware engineering market has very few Macintosh users.
From what we have seen, Apple Silicon can run translated x86_64 applications faster than native hardware can. If that advantage continues, people will either switch and the software will follow, or it will just be run via binary translation.
Intel is not moving from the x86 architecture any time soon, I have worked at an Intel subsidiary so I know how the company operates...
I am sure that is true, and as a result they will likely fall further and further behind in performance, making my predictions even more likely.
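For readers wondering what a "Rosetta"-type translation layer conceptually does, here is a toy sketch. It is purely illustrative: the mini instruction set below is made up, and real translators like Rosetta 2 operate on actual machine code, not Python tuples. The key idea it shows is translating each guest instruction once, ahead of time, into a native operation:

```python
# Toy sketch of ahead-of-time binary translation. Purely illustrative:
# the mini instruction set is invented; real translators work on
# actual machine code.

# "Guest" program for a hypothetical source ISA: (op, operands...) tuples.
guest_program = [
    ("LOAD", "r0", 6),           # r0 = 6
    ("LOAD", "r1", 7),           # r1 = 7
    ("MUL", "r2", "r0", "r1"),   # r2 = r0 * r1
]

def translate(program):
    """Translate each guest instruction ONCE into a native callable,
    so execution pays no per-instruction decode cost afterwards."""
    compiled = []
    for instr in program:
        op = instr[0]
        if op == "LOAD":
            _, dst, val = instr
            compiled.append(lambda regs, d=dst, v=val: regs.__setitem__(d, v))
        elif op == "MUL":
            _, dst, a, b = instr
            compiled.append(
                lambda regs, d=dst, a=a, b=b: regs.__setitem__(d, regs[a] * regs[b]))
        else:
            raise ValueError(f"unknown op: {op}")
    return compiled

def run(compiled):
    """Execute the translated program against an empty register file."""
    regs = {}
    for host_op in compiled:
        host_op(regs)
    return regs

regs = run(translate(guest_program))
print(regs["r2"])  # 42
```

The translate-once, run-many design is why translated code can be competitive with native execution: the decode overhead is paid ahead of time instead of on every run.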
 
Because I’m critical of Intel :) […]
I agree that Intel has those shortcomings, but that's not to say they can't overcome them.

If you think Apple hasn't gimped the cooling systems of the Intel Macbooks, then you need to watch these vids. The short version is:
1) The heatsink is incorrectly mounted so it isn't flush against the chip, and thus is horribly inefficient. This same problem doesn't exist on the M1.
2) On the 2020 Intel MBA, the fan is offset a long way from the heatsink, which normally wouldn't be a problem, as you'd connect them with a heat pipe. But Apple took the heat pipe out! Yep, the fan spins up when the chip heats up (which it does quickly, because the heatsink isn't mounted correctly), but the fan doesn't actually cool anything because there is no heat pipe, so it just sits there noisily spinning away, achieving no actual cooling, until the user stops using the laptop. The M1 MBA, by contrast, has dumped the fan and put a massive heat spreader in its place that connects (efficiently) to the heatsink. The same solution could, of course, have been used on the Intel MBA, greatly improving its cooling and thus its performance, also without fan noise. Yes, the Intel chips do run hotter, so such a solution would throttle sooner than the M1s do, but it would throttle much later than the actual gimped design Apple used.
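The effect described above (a poor thermal path making a chip hit its temperature limit sooner) can be illustrated with a crude lumped thermal model. Every number here is invented for illustration; real laptop thermals are far more complex:

```python
# Crude lumped thermal model of a laptop chip: heat in from the CPU,
# heat out through the cooling path (heatsink / heat pipe). All numbers
# are invented for illustration only.

def seconds_until_throttle(power_w, conductance_w_per_c,
                           ambient_c=25.0, throttle_c=100.0,
                           heat_capacity_j_per_c=20.0, dt=0.1):
    """Forward-Euler integration of die temperature. Returns seconds
    until the throttle temperature is hit, or None if the chip settles
    at a steady state below it (i.e. the cooling keeps up)."""
    temp, t = ambient_c, 0.0
    while temp < throttle_c:
        # Net heating = power dissipated minus heat conducted to ambient.
        net_w = power_w - conductance_w_per_c * (temp - ambient_c)
        if net_w < 1e-6:
            return None  # settled below the throttle point
        temp += net_w / heat_capacity_j_per_c * dt
        t += dt
    return t

# Same 15 W chip, two hypothetical cooling paths: a decent one (as if
# the heat pipe were present) and a poor one (as if it were missing).
with_pipe = seconds_until_throttle(15.0, conductance_w_per_c=0.5)
without_pipe = seconds_until_throttle(15.0, conductance_w_per_c=0.1)
print(with_pipe)      # None: steady state around 55 degC, never throttles
print(without_pipe)   # a couple of minutes, then sustained throttling
```

The point is qualitative, not quantitative: with enough thermal conductance the chip reaches a steady state below its limit and never throttles, while a gimped path means the same chip heats until it must slow down.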

 
If you think Apple hasn't gimped the cooling systems of the Intel Macbooks, then you need to watch these vids. The short version is:
Is the cooling system inefficient and not fit for purpose? Yes.

Is it VERY VERY likely that Intel sold Apple a bill of goods and even confidently stated that their next processors will most certainly work well with the proposed cooling system? Yes.

SHOULD Apple have done what PC makers have done and changed their entire line to account for the fact that Intel failed them? Yeah, probably, but they didn’t. Well, not in the way PC makers did. The case was kept the same as they had intended, but they likely had to redo some of the internals with last-minute fixes, which, from the descriptions, sounds like what happened.
 