The partnership failed to deliver a faster RISC processor. Sounds like a failure in technical superiority to me.

Not at all. It was about the different priorities of the three players. Motorola (later Freescale) wanted to deliver volume, and their volume customers cared about embedded systems (they dominated that market for a while). IBM cared about Big Iron and did not care about performance per watt, just raw performance. Apple needed to build laptops, and did not have enough money or volume to get chips designed that best suited its needs.

(I explained all this earlier.)

This time things are very different. Apple has both the money and the expertise to build chips that do exactly what it needs, tied together with software that is perfectly optimized for them. Last time they had to beg; this time they are in control.
 
Yup, that is what I have said.

Today they are not, but they may very well be in the not-too-distant future. It is estimated that Sony’s manufacturing cost on the PS5 is $450. They will either sell it at a substantial loss or barely break even on it. It certainly seems possible that Apple could target one of their desktop SoCs to build an Apple TV that was competitive with those specs and sell it at a much lower price, as they would not have to pay for AMD’s profit on the chips. A competitive piece of hardware at a price casual gamers can easily afford might be very compelling.
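To put rough numbers on that, here is a back-of-envelope sketch. The $450 build cost is the estimate quoted above; the chip’s share of the bill of materials and AMD’s margin are invented assumptions purely for illustration, not real figures.

```python
# Back-of-envelope console economics. The $450 PS5 build cost is the
# estimate quoted above; every other number here is a made-up assumption.
PS5_BUILD_COST = 450      # estimated Sony manufacturing cost ($)
PS5_PRICE = 450           # roughly break-even retail, per the claim above

CHIP_COST = 150           # hypothetical share of the BOM that is the AMD SoC
AMD_MARGIN = 0.40         # hypothetical margin AMD takes on that chip

# If Apple supplies its own SoC at cost, the chip vendor's margin disappears:
apple_build_cost = PS5_BUILD_COST - CHIP_COST * AMD_MARGIN
print(f"hypothetical Apple build cost: ${apple_build_cost:.0f}")               # $390
print(f"room to undercut at break-even: ${PS5_PRICE - apple_build_cost:.0f}")  # $60
```

Under these made-up numbers the saving is real but modest; the bigger lever would be amortizing an SoC Apple is designing for its desktops anyway.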


Apple TV already attempted to be a game boxen - how did that work out again?

Go to Steam.

Look at what is available for Windows.

Look at what is available for OSX.

Look at what is available for iOS.
 
Yup, that is what I have said.



Today they are not, but they may very well be in the not-too-distant future. It is estimated that Sony’s manufacturing cost on the PS5 is $450. They will either sell it at a substantial loss or barely break even on it. It certainly seems possible that Apple could target one of their desktop SoCs to build an Apple TV that was competitive with those specs and sell it at a much lower price, as they would not have to pay for AMD’s profit on the chips. A competitive piece of hardware at a price casual gamers can easily afford might be very compelling.

I don't expect Apple to waste billions of dollars and engineering talent trying to enter the console gaming market.
 
Apple TV already attempted to be a game boxen - how did that work out again?

Actually, it is now working quite well. They added support for PS3/4 and Xbone controllers and have sold lots of subscriptions to Apple Arcade. If they decide to build a really competitive box (using their new silicon), they can certainly do so.

Go to Steam.

Look at what is available for Windows.

Look at what is available for OSX.

Look at what is available for iOS.

Irrelevant. Apple did not care about gaming and gamers did not care about Apple, hence no Steam games. Apple is now in a very different position. They can afford to build a box that is competitive with consoles at a much lower price point. If they do so, they have a good chance of success. For game studios, it is all about volume. If Apple delivers a volume platform that can support console-level games (one that may also include iPad Pros and new Macs), it may very well be compelling. We will see.

Just like every generation of the console wars is a new playing field, Apple has an actual shot if they decide they want to play (still no guarantee they will, just saying they seem to care about gaming now in a way they have not before).
 
Apple will announce their new ARM-based Macintosh systems and provide benchmarks showing them in a favorable light. Then the ARM-versus-x64 arguments will begin, with each side presenting its architecture as better. Benchmarks will be shown, architectural benefits will be presented, ad hominems will be thrown, and nothing will be resolved. Just like in 1995!

Not at all. Intel will just tell you that benchmarks are irrelevant once they start losing left and right.

https://www.pcgamer.com/intel-tiger-lake-move-away-from-benchmarks/
 
I don't expect Apple to waste billions of dollars and engineering talent trying to enter the console gaming market.

They do not have to spend billions of dollars to do it. They have a solid SoC architecture that they control completely, and they can easily repurpose work they are doing for other products. It would be a tweak to the Apple TV, not a completely new area for them. They rev these SoCs every year, while Sony and Microsoft cycle every few years. That would be their advantage.
 
Apple TV already attempted to be a game boxen - how did that work out again?

It's all about the titles, at least in terms of what we think of as console gaming. If Apple can attract them, then they will be competitive or win.

And they made huge mistakes with the initial attempt on Apple TV (and iOS) by going with MFi controllers instead of supporting the controllers many people already own (and know well).

They've done extremely well in non-console games on iOS. The key to console gaming is the controller.

... For game studios, it is all about volume. If Apple delivers a volume platform that can support console-level games (one that may also include iPad Pros and new Macs), it may very well be compelling. We will see.

Just like every generation of the console wars is a new playing field, Apple has an actual shot if they decide they want to play (still no guarantee they will, just saying they seem to care about gaming now in a way they have not before).

Yes, and great points. Though I think I agree with the others a bit in that they have some steps to go to reach the current (or, in a few months, coming) level of console gaming. They'd have to introduce an Apple TV that is in the same ballpark as the consoles.

While I'm sure it is about volume, the A-list games are also going to demand a certain level of hardware performance.
 
Just like every generation of the console wars is a new playing field, Apple has an actual shot if they decide they want to play (still no guarantee they will, just saying they seem to care about gaming now in a way they have not before).

The issue is that Apple is shooting for 10-30 W TDP SoCs for their MacBook line in the first place, not the 150-200 W you need in order to shoot for console performance. And the problem is not the CPU but the GPU, which consumes the majority of that power.
And even then, you could not possibly put such a monster into an Apple TV box; it would be quite a different product.
 
The issue is that Apple is shooting for 10-30 W TDP SoCs for their MacBook line in the first place, not the 150-200 W you need in order to shoot for console performance.
As a CPU designer, give me that problem every time. (“What? I have a >100 W power budget and all I have here is this 30 W SoC? Gee, sounds hard. Call me in a year.” .... spends year playing solitaire .... “OK, here you go. Problem solved.” ... turns up power supply voltage and speeds up clock...)
 
They do not have to spend billions of dollars to do it. They have a solid SoC architecture that they control completely, and they can easily repurpose work they are doing for other products. It would be a tweak to the Apple TV, not a completely new area for them. They rev these SoCs every year, while Sony and Microsoft cycle every few years. That would be their advantage.

That is not how it works. You can't just throw the product out there expecting developers to support it. Sony and MS aren't idiots spending billions trying to compete in that space. The Apple TV name isn't synonymous with console gaming, which means they would still have to spend tons of marketing money, provide development tools, and court developers to make games for it.
Apple might have an unlimited pool of money, but they have limited engineering talent. They are not going to waste it on an already crowded market.
 
The issue is that Apple is shooting for 10-30 W TDP SoCs for their MacBook line in the first place, not the 150-200 W you need in order to shoot for console performance. And the problem is not the CPU but the GPU, which consumes the majority of that power.

Given they are not as power limited, they could use one of their iMac SoCs. Yes, they need a competitive GPU, but they need that for their desktops as well.

And even then, you could not possibly put such a monster into an Apple TV box; it would be quite a different product.

That will be the question, will it not? Can they build a competitive box within one to two years? Again, they have the ability to cycle more often than Sony or Microsoft, so it is certainly possible.
 
The issue is that Apple is shooting for 10-30 W TDP SoCs for their MacBook line in the first place, not the 150-200 W you need in order to shoot for console performance. And the problem is not the CPU but the GPU, which consumes the majority of that power.
And even then, you could not possibly put such a monster into an Apple TV box; it would be quite a different product.

But they need to solve those issues for anything in the Mac lineup above the MacBook anyway, right? How hard that is, I can't say. Some here seem to think the GPU is a harder problem than Apple is up to, but I have no idea.
 
As a CPU designer, give me that problem every time. (“What? I have a >100 W power budget and all I have here is this 30 W SoC? Gee, sounds hard. Call me in a year.” .... spends year playing solitaire .... “OK, here you go. Problem solved.” ... turns up power supply voltage and speeds up clock...)

It's not that you can trivially use that power by just turning up the voltage and clock; that way you would miss the performance target by miles. Eventually you come to the conclusion that, in order to scale performance linearly with power, you just need a much bigger GPU and should leave the voltage at a healthier level :)
 
It's not that you can trivially use that power by just turning up the voltage and clock; that way you would miss the performance target by miles. Eventually you come to the conclusion that, in order to scale performance linearly with power, you just need a much bigger GPU and should leave the voltage at a healthier level :)
Well, GPUs are different. But for CPUs, yep, I’d turn up the voltage. (That’s actually what we typically did: we designed the CPU to run over a large range of voltages and clock multipliers to meet whatever the power budget was. Sometimes these were marketed as different chips even though they were just different bins of the same chip.)
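For anyone who wants the arithmetic behind this exchange, here is a toy model. All absolute numbers are invented; only the first-order scaling relations (dynamic power roughly proportional to C·V²·f, and attainable clock roughly proportional to voltage) are doing any work.

```python
# Toy model of the trade-off above: dynamic power scales roughly as
# C * V^2 * f, and attainable clock frequency scales roughly with voltage.
# Absolute values are invented; only the scaling relations matter.

def dynamic_power(cap, volts, freq):
    """Relative dynamic power, P ~ C * V^2 * f."""
    return cap * volts**2 * freq

base = dynamic_power(cap=1.0, volts=0.8, freq=1.0)

# Option A: 2x performance by cranking clock (and hence voltage) on the
# same GPU. With f ~ V, doubling the clock needs ~double the voltage:
option_a = dynamic_power(cap=1.0, volts=1.6, freq=2.0)

# Option B: 2x performance by doubling the GPU width (~2x switched
# capacitance) at the original voltage and clock:
option_b = dynamic_power(cap=2.0, volts=0.8, freq=1.0)

print(option_a / base)  # 8.0 -- clocking up costs ~f^3 in power
print(option_b / base)  # 2.0 -- "just need a much bigger GPU"
```

That cubic-versus-linear gap is why a GPU goes wide, while a CPU, whose single-threaded work cannot be spread across more units, gets binned and clocked up instead.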
 
I wonder how they will fragment their lineup. Usually the biggest difference was processor speed. Will they have different versions of the A14 for different Macs?
They’ll segment in much the same way they do today. I’m sure they’ll still have laptops, desktops and all-in-ones.

CPU and GPU performance; other specs like storage and RAM; cost; size; battery life... you can turn the dial on all these variables (and more) to come up with everything from an $800 MacBook Air to a $40,000 Mac Pro.
 
That is not how it works. You can't just throw the product out there expecting developers to support it. Sony and MS aren't idiots spending billions trying to compete in that space. The Apple TV name isn't synonymous with console gaming, which means they would still have to spend tons of marketing money, provide development tools, and court developers to make games for it.
Apple might have an unlimited pool of money, but they have limited engineering talent. They are not going to waste it on an already crowded market.

Oh yes, they'd have to get a lot more serious about attracting game developers. But they've already started down that path; they just have to land the more A-list titles (which they don't have the hardware for yet).

But I think the point is that they will need to make that hardware anyway, so it is more a matter of pricing... and being in control of the whole stack makes it more likely they can do it than Sony or Microsoft, if they have the will to do it.
 
Yup, that is what I have said.



Today they are not, but they may very well be in the not-too-distant future. It is estimated that Sony’s manufacturing cost on the PS5 is $450. They will either sell it at a substantial loss or barely break even on it. It certainly seems possible that Apple could target one of their desktop SoCs to build an Apple TV that was competitive with those specs and sell it at a much lower price, as they would not have to pay for AMD’s profit on the chips. A competitive piece of hardware at a price casual gamers can easily afford might be very compelling.

Games sell consoles. Consoles don’t sell games. Sony and Microsoft are willing to sell their hardware without profit because they make a fortune on software and services (PS+, PS Now, Xbox Live, and Game Pass). The gaming market is worth more than the movie and music markets combined.

If Apple ships a console without a library of killer games, it’ll just go down in history alongside the 3DO, Atari Jaguar, TurboGrafx, Sega Saturn, and more.

Current iOS games are, well, not something most customers would pay more than $1.99 for. Most customers even play free games with advertisements. None of them would ever consider purchasing several hundred dollars of hardware for dedicated gaming. They won’t even pay a dollar to remove in-game ads.

Apple is probably very aware of this. Hence their entry into the gaming market can barely be called lukewarm.
People seem to forget that one of the primary reasons Apple changed from PowerPC (RISC) to Intel (CISC) was that the RISC processors were horrible to cool. Anyone who owned an old Power Mac tower knows it functioned like a loud space heater that could easily heat up a loft on a winter’s day. The PowerBook also ran very hot for a laptop.

So I’d wait and see how things pan out with desktop-class ARM chips in the future. They might not run as cool as you’d wish. Certainly not that much cooler, at the same performance, than x86 chips of the same generation on the same manufacturing node (7 nm, for example).

Which brings me to the fact that Apple will be designing chips, but they won’t make them. They’ll most likely still be at the mercy of TSMC to innovate their manufacturing processes and keep their foundries up to date. So in that sense Apple won’t have full control. If something happens to TSMC, Apple will be severely affected.
 
Which brings me to the fact that Apple will be designing chips, but they won’t make them. They’ll most likely still be at the mercy of TSMC to innovate their manufacturing processes and keep their foundries up to date. So in that sense Apple won’t have full control. If something happens to TSMC, Apple will be severely affected.

Only if their competitors are using fabs that don’t have whatever problem TSMC is having. And, frankly, it’s not THAT hard to port a design to another fab. I once designed a chip that could be fabbed at either of two completely different fabs. Other times I designed chips that could be fabbed on either of two different process nodes. You give up a little bit in performance because you have to design to the least common denominator: you space wires and transistors a little farther apart than they otherwise could be, etc. It’s a bit of a nuisance, but certainly something Apple could do if, say, GlobalFoundries or Samsung suddenly trounced TSMC in quality.
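As a minimal sketch of that least-common-denominator idea: a layout portable to both fabs has to satisfy the stricter of each pair of rules. The rule names and values below are invented for illustration, not any fab’s real design rules.

```python
# Least-common-denominator design rules across two fabs. Rule names and
# values are invented for illustration; real PDKs have thousands of rules.
fab_a = {"metal1_pitch_nm": 40, "poly_spacing_nm": 54, "via_size_nm": 32}
fab_b = {"metal1_pitch_nm": 44, "poly_spacing_nm": 50, "via_size_nm": 36}

# A layout portable to both fabs must satisfy the looser (larger) value of
# every rule -- which is exactly the small density penalty described above.
portable = {rule: max(fab_a[rule], fab_b[rule]) for rule in fab_a}
print(portable)
# {'metal1_pitch_nm': 44, 'poly_spacing_nm': 54, 'via_size_nm': 36}
```

Each dimension ends up as loose as the looser fab requires, which is where the small performance and density give-up comes from.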
 
People seem to forget that one of the primary reasons Apple changed from PowerPC (RISC) to Intel (CISC) was that the RISC processors were horrible to cool. Anyone who owned an old Power Mac tower knows it functioned like a loud space heater that could easily heat up a loft on a winter’s day. The PowerBook also ran very hot for a laptop.

So I’d wait and see how things pan out with desktop-class ARM chips in the future. They might not run as cool as you’d wish. Certainly not that much cooler, at the same performance, than x86 chips of the same generation on the same manufacturing node (7 nm, for example).

It has nothing to do with PowerPC being RISC; ARM processors are basically proof of that. The potential limitation of Apple's A series is its ability to clock higher, due to its very wide architecture, not its ability to run cool. An architecture that is already efficient enough to run well in a smartphone/tablet isn't going to suddenly run hot in a laptop.
 
<snip>
Which brings me to the fact that Apple will be designing chips, but they won’t make them. They’ll most likely still be at the mercy of TSMC to innovate their manufacturing processes and keep their foundries up to date. So in that sense Apple won’t have full control. If something happens to TSMC, Apple will be severely affected.
I wouldn’t be surprised to see them do some kind of deal with Samsung for fab capacity. Apple doesn’t like all their eggs in one basket; however, the TSMC partnership has gone perfectly so far.
 
People seem to forget that one of the primary reasons Apple changed from PowerPC (RISC) to Intel (CISC) was that the RISC processors were horrible to cool. Anyone who owned an old Power Mac tower knows it functioned like a loud space heater that could easily heat up a loft on a winter’s day. The PowerBook also ran very hot for a laptop.

Man, what planet are you on?? RISC processors are horrible to cool? That's got nothing to do with RISC and everything to do with IBM's particular design. Plus, have you heard an air-cooled Intel or AMD processor running full bore?
 
It has nothing to do with PowerPC being RISC; ARM processors are basically proof of that. The potential limitation of Apple's A series is its ability to clock higher, due to its very wide architecture, not its ability to run cool. An architecture that is already efficient enough to run well in a smartphone/tablet isn't going to suddenly run hot in a laptop.

ARM chips in phones and tablets have been throttling consistently for years already. Put an iPad’s or iPhone’s CPU to good use (export 500 images in Lightroom mobile, for example) and witness the clocks plummet before even a minute has passed.

A phone or tablet is used very differently from a desktop- or laptop-class workstation. Small bursts of compute are easy to handle given how tablets and phones are used. Consistent high and full loads are a whole different ball game and will require capable active cooling. Power will generate heat no matter what, and that heat needs to be removed for the chip to perform consistently.

Just like car engines or electric motors.

Higher-performance engines are more and more sensitive to heat, and thus require better, more capable, and more consistent cooling. Energy is energy, and heat is heat. Energy in is energy out. Very basic and super simple stuff, really. Just like calories in and calories out: burn fewer calories than you consume and you consistently grow bigger, and vice versa.
Man, what planet are you on?? RISC processors are horrible to cool? That's got nothing to do with RISC and everything to do with IBM's particular design. Plus, have you heard an air-cooled Intel or AMD processor running full bore?

Yes. I have an air-cooled CPU right now that can consume about 125-130 W consistently, cooled by a Noctua NH-D15 with two 150 mm fans barely spinning at around 700 rpm each to dissipate the heat. The fans are basically inaudible, even if you put your ear up to the case and focus and listen. And when the CPU isn’t working hard, the fans barely spin at 300 rpm and it still sits at around 40 degrees on average.

The CPU (AMD 3900X) never goes over 80 degrees Celsius and maintains 4.2 GHz boost clocks on all 12 cores even after hours of full load.

That’s an easily attainable $90 cooler, fans included. Not rocket science. Most people use ****** OEM coolers, though, or loud AIOs with crappy fans and poor air circulation. It’s not the CPU’s or the manufacturer’s fault that most people are clueless about airflow and proper cooling. And it’s not rocket science: energy is heat, and heat needs to be dissipated with proper air circulation, otherwise things get hot and loud and start performing worse or even breaking.
 
ARM chips in phones and tablets have been throttling consistently for years already. Put an iPad’s or iPhone’s CPU to good use (export 500 images in Lightroom mobile, for example) and witness the clocks plummet before even a minute has passed.

A phone or tablet is used very differently from a desktop- or laptop-class workstation. Small bursts of compute are easy to handle given how tablets and phones are used. Consistent high and full loads are a whole different ball game and will require capable active cooling. Power will generate heat no matter what, and that heat needs to be removed for the chip to perform consistently.

Just like car engines or electric motors.

Higher-performance engines are more and more sensitive to heat, and thus require better, more capable, and more consistent cooling. Energy is energy, and heat is heat. Energy in is energy out. Very basic and super simple stuff, really. Just like calories in and calories out: burn fewer calories than you consume and you consistently grow bigger, and vice versa.
A chip can be air-cooled (with a fan) at around 10 W per square centimeter of surface area (with appropriate heat sinks, etc.). A phone gives you not a lot of surface area and not a lot of moving air. Put that same phone chip in a MacBook Pro and it will never throttle.
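A quick sanity check of that rule of thumb; the die size below is a rough public figure, so treat all of this as a ballpark estimate.

```python
# Ballpark check of the ~10 W/cm^2 air-cooling rule of thumb above.
W_PER_CM2 = 10.0                     # air-coolable power density, per the post

def coolable_watts(die_area_mm2):
    """Sustained power a fan + heatsink can plausibly remove from a die."""
    return (die_area_mm2 / 100.0) * W_PER_CM2   # 100 mm^2 = 1 cm^2

phone_soc = 100.0                    # roughly A13-class die, ~1 cm^2
print(coolable_watts(phone_soc))     # ~10 W with a proper heatsink and fan

# A phone has no fan and little volume of moving air, so its sustained
# budget is only a few watts and the chip throttles; a laptop chassis can
# sustain the full ~10 W, which is why the same die wouldn't throttle there.
```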
 
ARM ATX boards already exist. Clover can be adapted and ported. Hackintosh will still live.
Once x86 Macs are no longer supported, I'm hoping for this.

If not, worst case, I expect the already-decent ARM emulators that exist to get even better and allow ARM macOS to be fully virtualized.

Until I see more data on Apple silicon at the high end, I have more faith that my Threadripper with one or several high-end GPUs can emulate an ARM Mac better than an ARM Mac can emulate the high-end PC tasks I need to do.
 
ARM chips in phones and tablets have been throttling consistently for years already. Put an iPad’s or iPhone’s CPU to good use (export 500 images in Lightroom mobile, for example) and witness the clocks plummet before even a minute has passed.

A phone or tablet is used very differently from a desktop- or laptop-class workstation. Small bursts of compute are easy to handle given how tablets and phones are used. Consistent high and full loads are a whole different ball game and will require capable active cooling. Power will generate heat no matter what, and that heat needs to be removed for the chip to perform consistently.

Just like car engines or electric motors.

Higher-performance engines are more and more sensitive to heat, and thus require better, more capable, and more consistent cooling. Energy is energy, and heat is heat. Energy in is energy out. Very basic and super simple stuff, really. Just like calories in and calories out: burn fewer calories than you consume and you consistently grow bigger, and vice versa.


Not going to argue with the obvious laws of physics. But I expect ARM-based SoCs will perform more efficiently than their x86 counterparts given higher wattage and active cooling in a laptop/desktop chassis.
 