...
Higher-performance engines are increasingly sensitive to heat, and thus require better, more capable, and more consistent cooling. Energy is energy, and heat is heat. Energy in is energy out. Very basic and super simple stuff, really. Just like calories in and calories spent: spend fewer calories than you consume and you grow bigger consistently, and vice versa.

Hopefully the other analogies hold up better than the calorie one, heh. Because 1000 calories of broccoli doesn't equal 1000 calories of Twinkie. The body isn't nearly that simple of a system. I don't know about CPUs.

But, I think you made a good point (maybe?) that the A-chips are doing Intel-matching performance in bursts, not on the whole? But, is that a matter of cooling? And, are they using a lot more than, say, 1 watt (or whatever iOS devices draw) for that short time period?
 
Hopefully the other analogies hold up better than the calorie one, heh. Because 1000 calories of broccoli doesn't equal 1000 calories of Twinkie. The body isn't nearly that simple of a system. I don't know about CPUs.

But, I think you made a good point (maybe?) that the A-chips are doing Intel-matching performance in bursts, not on the whole? But, is that a matter of cooling? And, are they using a lot more than, say, 1 watt (or whatever iOS devices draw) for that short time period?

The reason it does bursts has nothing to do with any shortcoming of the processor. It’s simply that the cooling solution can’t transfer heat fast enough in a phone form factor. There’s nothing a CPU designer does to make one chip better than another at running longer bursts, other than reducing the overall power dissipation. If you take an A12 and water cool it, it doesn’t suddenly break; it just stops running in bursts and instead runs continuously at high speed.
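
To make that concrete, here's a toy model (every number in it is invented, and it has nothing to do with how any real SoC's power management actually works): the "chip" is identical in both runs, and only the cooler's ability to move heat decides whether it sustains full clocks or has to throttle after the initial burst.

```swift
import Foundation

// Toy thermal model with made-up numbers: same "chip", two different coolers.
struct ThermalSim {
    var fullPower = 6.0        // watts dissipated at max clocks (hypothetical)
    var conductance: Double    // watts per degree C the cooling solution can remove
    var ambient = 25.0         // degrees C
    var limit = 95.0           // junction temperature ceiling, degrees C
    var heatCapacity = 4.0     // joules per degree C of the package (hypothetical)

    // Average fraction of max clock speed held over the run.
    func averageClock(seconds: Int) -> Double {
        var temp = ambient
        var total = 0.0
        for _ in 0..<seconds {
            // Once the limit is hit, drop to whatever speed the cooler can sustain.
            let sustainable = conductance * (limit - ambient) / fullPower
            let speed = temp < limit ? 1.0 : min(1.0, sustainable)
            let netHeat = fullPower * speed - conductance * (temp - ambient)
            temp += netHeat / heatCapacity      // 1-second Euler step
            total += speed
        }
        return total / Double(seconds)
    }
}

let phone  = ThermalSim(conductance: 0.05)  // passively cooled slab of glass
let laptop = ThermalSim(conductance: 0.50)  // active cooling, bigger surface
print(String(format: "phone:  %.2f of max clocks over 5 min", phone.averageClock(seconds: 300)))
print(String(format: "laptop: %.2f of max clocks over 5 min", laptop.averageClock(seconds: 300)))
```

Run it and the passively cooled case settles well below full clocks while the better-cooled case holds 1.00 the whole time, with the exact same "silicon" in both.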
 
Hopefully the other analogies hold up better than the calorie one, heh. Because 1000 calories of broccoli doesn't equal 1000 calories of Twinkie. The body isn't nearly that simple of a system. I don't know about CPUs.

But, I think you made a good point (maybe?) that the A-chips are doing Intel-matching performance in bursts, not on the whole? But, is that a matter of cooling? And, are they using a lot more than, say, 1 watt (or whatever iOS devices draw) for that short time period?

That’s true. Our bodies ain’t that simple. Each individual can process and handle different types of calories and energy differently.

All I’m saying is that regardless of whether something is RISC or CISC or ARM or whatever, the general rule of thumb holds. Over the last 30 years I’ve messed about with all sorts of computer hardware: building, modding, delidding CPUs and properly flattening those surfaces to optimize cooling (because the manufacturers sure never did a good job of that), playing around with air, water, and LN2, and doing all sorts of unthinkable tests to see how each type of cooling solution affected performance and how that also affected energy consumption. Basically the trend is: the higher the performance, the higher the energy consumption, the more heat is generated, and the more sensitive each component becomes, until it starts throttling to prevent failure or instability.

Phones and tablets are very different from workstations. And if Apple’s performance numbers are true, then I’ll expect a warm and clammy keyboard during heavy workloads on the ARM MacBook Pro as well. Oh, and some fans. And about the same battery life and energy consumption. It’ll probably be cooler on light workloads though, which is what I think will be the biggest difference.

Basically, a four-cylinder, 600 hp turbocharged engine can consume just as much fuel as, and in many cases more than, a 600 hp V12. Push ARM chips to the limit and they’ll likely want a lot of energy, just like everything else that’s built to maximize performance.

Anyway, that’s just my opinion from experience modding CPUs, unlocking them when manufacturers tried to lock them down, modding GPUs and reactivating shaders that were simply disabled for market segmentation, hacking all sorts of firmware and controllers, and bypassing T2-like designs in many consoles and other equipment, just to see if it could be done, for the fun of it. Oh, and for the freedom of using my purchases how I want, regardless of how a corporation tried to tell me what I could and could not do with them by deliberately crippling devices with microcode, firmware, and controller code to upsell the more expensive model or to control a market. As you can probably understand, Apple has over time not exactly become my favorite company. Their sickness for control and the way they abuse developers on the App Store is the corporate equivalent of how China controls its citizens, media, and speech. I call it lunacy.
 
I want to see perf numbers, but I'm looking at the terms of a DTK, and a condition is to NOT run benchmarks. Now that's kind of silly. The A12Z isn't the final hardware, but running benchmarks on it would help provide some context.
 
I want to see perf numbers, but I'm looking at the terms of a DTK, and a condition is to NOT run benchmarks. Now that's kind of silly. The A12Z isn't the final hardware, but running benchmarks on it would help provide some context.

It wouldn’t provide context other than “it will run a lot faster than this,” but the fact that people seem to think it would provide context is exactly why they won’t allow it.

99% of the people on here will pay no attention, see the benchmark, and start shouting “see! It runs slower than Intel’s [insert chip here],” and then that narrative starts to spread, and it’s all based on ********.
 
That’s true. Our bodies ain’t that simple. Each individual can process and handle different types of calories and energy differently.

All I’m saying is that regardless of whether something is RISC or CISC or ARM or whatever, the general rule of thumb holds. Over the last 30 years I’ve messed about with all sorts of computer hardware: building, modding, delidding CPUs and properly flattening those surfaces to optimize cooling (because the manufacturers sure never did a good job of that), playing around with air, water, and LN2, and doing all sorts of unthinkable tests to see how each type of cooling solution affected performance and how that also affected energy consumption. Basically the trend is: the higher the performance, the higher the energy consumption, the more heat is generated, and the more sensitive each component becomes, until it starts throttling to prevent failure or instability.

Phones and tablets are very different from workstations. And if Apple’s performance numbers are true, then I’ll expect a warm and clammy keyboard during heavy workloads on the ARM MacBook Pro as well. Oh, and some fans. And about the same battery life and energy consumption. It’ll probably be cooler on light workloads though, which is what I think will be the biggest difference.

Basically, a four-cylinder, 600 hp turbocharged engine can consume just as much fuel as, and in many cases more than, a 600 hp V12. Push ARM chips to the limit and they’ll likely want a lot of energy, just like everything else that’s built to maximize performance.
Basically, yes. However, there are a few more specific attributes to consider:

1. From what I've read, ARM-based processing is more efficient per watt consumed (at least at some tasks) than some other instruction sets. So, if true, at max load, it should run somewhat cooler.
2. Apple may very well carry the current A-series high-efficiency-plus-high-performance core setup into their desktop and laptop CPUs, e.g., eight high-performance and four high-efficiency cores. The weaker cores could be used for lower-demand, lower-priority threads, such as notifications, cloud drive syncing, update checking, downloads, etc. (see the sketch after this list). To go with your analogy: have a 50 hp go-kart for the backyard and the 600 hp sports car for the professional races.
3. With Apple in control of the chip designs, they can test and implement whatever features pair best: asymmetric core configurations (as in #2), more or less on-die cache, further instruction set extensions, CPU versions with and without integrated GPUs, offloading specific tasks to co-processors, and more.
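
On #2: macOS and iOS already expose the hint for exactly that kind of split through Grand Central Dispatch quality-of-service classes. Here's a minimal sketch (the queue names and workloads are made up, and it's the OS scheduler, not the app, that decides whether .background work actually lands on the efficiency cores):

```swift
import Foundation

let group = DispatchGroup()

// Housekeeping work, hinted as .background so the system is free to park it
// on slow, efficient cores and run it at low clocks.
DispatchQueue(label: "example.cloud-sync", qos: .background).async(group: group) {
    print("syncing cloud drive, checking for updates...")
}

// User-visible, latency-sensitive work, hinted as .userInitiated so it gets
// the fast cores when they're available.
DispatchQueue(label: "example.export", qos: .userInitiated).async(group: group) {
    print("rendering the export the user just asked for...")
}

group.wait()  // keep this little command-line sketch alive until both blocks run
```

So a lot of the software plumbing for asymmetric cores already exists; the open question is how far Apple scales the hardware side of it.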

Will there be problems? Of course. However, optimistically, I think it will be a good move overall.
 
Yes. I have an air-cooled CPU right now that can consume about 125-130 W consistently. It’s cooled by a Noctua NH-D15 air cooler with two 150 mm fans barely spinning at around 700 rpm each to dissipate the heat. The fans are basically inaudible, even if you put your ear up to the case and focus and listen. And when the CPU isn’t working hard, the fans barely spin at 300 rpm and it still sits at around 40 degrees on average.

The CPU (AMD 3900x) never goes over 80 degrees Celsius, and maintains 4.2GHz boost clocks on all 12 cores even after hours of full load.

That’s an easily attainable $90 cooler, including fans. Not rocket science. Most people use ****** OEM coolers though, or loud AIOs with crappy fans and poor air circulation. It’s not the CPU’s or the manufacturer’s fault that most people are clueless about airflow and proper cooling. And it’s not rocket science: energy becomes heat, and heat needs to be dissipated with proper air circulation, otherwise things get hot and loud and start performing worse or even breaking.

Yeah, I've never understood why manufacturers or typical designs don't put more emphasis on this. I had an audio amp at one point that I got rid of because it had this small, super-annoying fan in it. Laptops drive me NUTS!

I run my Mac mini (2018) i7 with TurboBoost disabled, and then it runs pretty quiet (fortunately). My Blackmagic eGPU is a thing of beauty... it can run 100% 24/7 without making any noticeable noise. I just don't get why more designs like that aren't made.

Do most people just not care?

... Basically the trend is: the higher the performance, the higher the energy consumption, the more heat is generated, and the more sensitive each component becomes, until it starts throttling to prevent failure or instability. ...

Yes, but I suppose different platforms have different amounts of energy usage. But, in general, for sure.

I'm always surprised at some of the comparisons I hear between, say, an iPhone and a console or gaming PC. While there might be differences in efficiency, it isn't like that iPhone is doing the same thing in 1 watt that the gaming console is doing in 130 watts, or the gaming PC is doing in hundreds of watts.
 
I think the Mac gaming community is about to get turned on its head, to be honest. Apple is clearly building some great graphics going forward.

Obviously that will do nothing for existing games (especially on Windows).

That's not clear at all, to be honest.
Professional apps rely on AMD and nVidia for graphics acceleration. Graphics cards consuming 200W+ are not going to be beaten by whatever integrated graphics Apple is preparing. Sure, it will be good enough for most. Great? I don't think so.
So, would ARM-based Macs have support for eGPUs? Without Thunderbolt support? I doubt it, but I hope to be wrong.
ARM is interesting, but this is not yet a solution for power users, as far as I can see from the info we have at the moment.
 
That's not clear at all, to be honest.
Professional apps rely on AMD and nVidia for graphics acceleration. Graphics cards consuming 200W+ are not going to be beaten by whatever integrated graphics Apple is preparing. Sure, it will be good enough for most. Great? I don't think so.
So, would ARM-based Macs have support for eGPUs? Without Thunderbolt support? I doubt it, but I hope to be wrong.
ARM is interesting, but this is not yet a solution for power users, as far as I can see from the info we have at the moment.

I sure hope Apple isn't abandoning the eGPU. Even if Thunderbolt goes the way of the dodo, I'm guessing USB4 or other solutions will enable this kind of concept (and, yes, hopefully be backwards compatible to support the TB stuff we already have!).

I suppose Apple could produce higher-end GPUs (or eGPU units) as well as the integrated stuff. Do they really want to be in that competition as well, though, or let AMD/nVidia work on it? At some point, even Apple could spread themselves too thin.
 
I think the Mac gaming community is about to get turned on its head, to be honest. Apple is clearly building some great graphics going forward.

Obviously that will do nothing for existing games (especially on Windows).
Right now Apple has made it increasingly difficult to even play games on a Mac. Go run a poll on what you should use to play games. It would be someone's delusional fantasy to think that iPads or Macs could be partnered with some of the biggest game producers out there. Go ahead, Tim Cook, prove me wrong after so many years of neglect. :cool:
 
ARM wasn't deemed any more superior than other alternatives.

ARM's management was vastly superior at figuring out how to license the ISA to vast numbers of customers building vast numbers of SoCs, more than the other RISC ISA creators managed. Even Intel had an ARM license for a while; I think Windows Mobile Pocket PCs used those chips.

ARM did so well, it saved Apple from bankruptcy back in the dark days. Look it up.
 
That's not clear at all, to be honest.
Professional apps rely on AMD and nVidia for graphics acceleration. Graphics cards consuming 200W+ are not going to be beaten by whatever integrated graphics Apple is preparing. Sure, it will be good enough for most. Great? I don't think so.
So, would ARM-based Macs have support for eGPUs? Without Thunderbolt support? I doubt it, but I hope to be wrong.
ARM is interesting, but this is not yet a solution for power users, as far as I can see from the info we have at the moment.
It’s not clear that Apple’s not designing a GPU that sits in a separate chip package, communicating with the CPU over a north bridge or point-to-point bus. We have no idea what they’re up to. Looking forward to finding out.
 
I sure hope Apple isn't abandoning the eGPU. Even if Thunderbolt goes the way of the dodo, I'm guessing USB4 or other solutions will enable this kind of concept (and, yes, hopefully be backwards compatible to support the TB stuff we already have!).

I suppose Apple could produce higher-end GPUs (or eGPU units) as well as the integrated stuff. Do they really want to be in that competition as well, though, or let AMD/nVidia work on it? At some point, even Apple could spread themselves too thin.


I expect Apple to keep using AMD (or even Intel) for dedicated GPU solutions. Nvidia is pretty much out of the picture unless Apple allows CUDA to work again on MacOS.
 
The A12Z isn't the final hardware, but running benchmarks on it would help provide some context.

No reason to. You can build your benchmarks to run on an iPad Pro which uses the same chip. Put the iPad in contact with a dry ice pack if you're concerned about thermals causing throttling before the benchmark ends.
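
A crude timing harness for that is only a few lines in any iPad app target; the workload below is just a placeholder, not a real benchmark suite:

```swift
import Foundation

// Time a CPU-bound closure and print the elapsed wall-clock seconds.
func timeIt(_ label: String, _ work: () -> Void) {
    let start = DispatchTime.now()
    work()
    let elapsed = Double(DispatchTime.now().uptimeNanoseconds - start.uptimeNanoseconds) / 1_000_000_000
    print("\(label): \(String(format: "%.3f", elapsed)) s")
}

timeIt("integer sum") {
    var total: UInt64 = 0
    for i in 0..<50_000_000 { total &+= UInt64(i) }  // placeholder workload
    precondition(total != 0)  // keep the optimizer from deleting the loop
}
```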
 
That's not clear at all, to be honest.
Professional apps rely on AMD and nVidia for graphics acceleration. Graphics cards consuming 200W+ are not going to be beaten by whatever integrated graphics Apple is preparing. Sure, it will be good enough for most. Great? I don't think so.
So, would ARM-based Macs have support for eGPUs? Without Thunderbolt support? I doubt it, but I hope to be wrong.
ARM is interesting, but this is not yet a solution for power users, as far as I can see from the info we have at the moment.

I wouldn't be surprised if they ditched third party GPUs and offered their own if they wanted expansion that way and made it all accessible via Metal. Supporting third party GPUs means writing and supporting drivers for them, or convincing those third parties to support MacOS on ARM. I just don't see it happening.
 
I wouldn't be surprised if they ditched third party GPUs and offered their own if they wanted expansion that way and made it all accessible via Metal. Supporting third party GPUs means writing and supporting drivers for them, or convincing those third parties to support MacOS on ARM. I just don't see it happening.

Those third parties will happily write drivers if it means selling more hardware. But Apple may not be interested in making it happen. I think it *will*, at least for the late 2022 update to the Mac Pro, but we’ll see.
 
Games sell consoles. Consoles don’t sell games.

Sort of. It is a bit of a chicken/egg situation. AAA game studios will not develop for platforms for which they do not expect to recover their money.

Sony and Microsoft are willing to sell their hardware without profits because they make a fortune on the software and services (PS+, PSNow, Xbox Live and GamePass). The gaming market is worth more than the movie and music market combined.

Right. Given Apple’s own silicon, they should be able to build an Apple TV that is competitive with those products. Then they need two things: customers for the hardware and killer games.

If Apple ships a console without a library of killer games, then it’ll just go down in history alongside the 3DO, Atari Jaguar, TurboGrafx, Sega Saturn, and the rest.

Those products sold very few units, and so got very few titles. Apple is in a very different position. First, they can easily afford to sell their ”console” at a much lower price given that they already have services revenue from it (content, fitness app subscriptions, etc.). Second, if they decide they want to be in this business, they can certainly outspend both companies on platform exclusives (or, more common these days, windowed platform exclusives). Remember, Sony makes about $8-10 billion a year, while Apple makes more than that in a quarter.

An average AAA title costs $100 - 200 million to make. Apple has already shown a willingness to spend money producing video content ($1 billion - $7 billion depending on which estimates you believe). If $25 million gets one a platform exclusive for 6 months for a AAA title, $1 billion gets them 40 titles. $7 billion would get them complete ownership of 35 $200 million titles.

There is no question they could do this if they thought it mattered to them. They could make it an App Store exclusive for 6 months and have it support Macs, iPad Pros and AppleTV game systems. Make it all part of a great ecosystem whose sales support and reinforce each other and it might be worth it for them.

Who knows.
 
That's not clear at all, to be honest.
Professional apps rely on AMD and nVidia for graphics acceleration. Graphics cards consuming 200W+ are not going to be beaten by whatever integrated graphics Apple is preparing. Sure, it will be good enough for most. Great? I don't think so.
So, would ARM-based Macs have support for eGPUs? Without Thunderbolt support? I doubt it, but I hope to be wrong.
ARM is interesting, but this is not yet a solution for power users, as far as I can see from the info we have at the moment.
Indeed, the Apple GPU, or whatever they refer to it as, certainly won't match or overcome high-end graphics cards; it doesn't draw 200W+ or have a huge heatsink and fans spanning two or three PCI slots. And I do believe Apple will continue to make use of AMD GPUs. Nonetheless, based on game demos during keynotes, it's evident the Apple GPU is quite capable if properly utilized. The point being, for most users, it should be more than sufficient.
As for eGPUs... I'm not certain, though USB4 is based on TB3 and intends to offer strong backward compatibility. So I guess eGPUs will probably be supported.
Right now Apple has made it increasingly difficult to even play games on a Mac. Go run a poll on what you should use to play games. It would be someone's delusional fantasy to think that iPads or Macs could be partnered with some of the biggest game producers out there. Go ahead, Tim Cook, prove me wrong after so many years of neglect. :cool:
Sure, however, Apple's investment in Apple Arcade and third-party controller support indicates (to me) at least some sincerity in accommodating gamers. Will it be Windows or console equivalent? Probably not quite. Although, for a non-gaming-centric company, they're not doing too poorly. With 4K mostly satisfied by the current Apple TV (and 8K nowhere near commonplace yet), which is probably why we haven't seen a hardware update, my guess is there will be at least some additions and upgrades in the next Apple TV generation intended for gaming.
 
Indeed, the Apple GPU, or whatever they refer to it as, certainly won't match or overcome high-end graphics cards; it doesn't draw 200W+ or have a huge heatsink and fans spanning two or three PCI slots. And I do believe Apple will continue to make use of AMD GPUs. Nonetheless, based on game demos during keynotes, it's evident the Apple GPU is quite capable if properly utilized. The point being, for most users, it should be more than sufficient.
As for eGPUs... I'm not certain, though USB4 is based on TB3 and intends to offer strong backward compatibility. So I guess eGPUs will probably be supported.

Sure, however, Apple's investment in Apple Arcade and third-party controller support indicates (to me) at least some sincerity in accommodating gamers. Will it be Windows or console equivalent? Probably not quite. Although, for a non-gaming-centric company, they're not doing too poorly. With 4K mostly satisfied by the current Apple TV (and 8K nowhere near commonplace yet), which is probably why we haven't seen a hardware update, my guess is there will be at least some additions and upgrades in the next Apple TV generation intended for gaming.


I get the feeling that their next big hobby after turning their attention away from the massive pile of work they have for the next year or so is to focus on gaming big time. We’ll see.
 
Indeed, the Apple GPU, or whatever they refer to it as, certainly won't match or overcome high-end graphics cards; it doesn't draw 200W+ or have a huge heatsink and fans spanning two or three PCI slots. And I do believe Apple will continue to make use of AMD GPUs. Nonetheless, based on game demos during keynotes, it's evident the Apple GPU is quite capable if properly utilized. The point being, for most users, it should be more than sufficient.
The game demos they are showcasing look bad. Honestly, I don't know who exactly they are targeting by showcasing these.
For gamers, those demos look like early-2000s PC gaming, if not earlier. The Tomb Raider demo looked flat: no proper illumination, low poly count, in a confined space (basically the lowest-settings scenario for high frame rates).
For casual gamers, sure, but a mention of an Angry Birds type of game would be enough.
Apple is not a gamers' company, but it's trying to seem like one and failing. I don't understand why they even bother; it's not like this is important for Apple.
 
I don't expect Apple to waste billions of dollars and engineering talent trying to enter the console gaming market.
The more I think about this statement, the more I ask “why not?”

Apple already builds an AppleTV, just to deliver casual games, fitness apps, video content, and music, and to act as a HomeKit hub. Let us say that instead of targeting PlayStation 5/XboneX performance, they targeted PS4/Xbone. If, instead of targeting new AAA titles, they paid the porting/remastering costs and then took their standard revenue split, they could very quickly build a large high-end game collection. Charge only $15 a copy (cheaper than most remastered titles) and they should break even or make money.

That might change people’s perspective on Apple and gaming. Wait 18 months and release a new generation that is 80% of the PS5, and then pay for some windowed platform exclusives. Becomes an interesting story.

This is not a prediction, just a thought experiment.
 
Once it's done, you won't need to run the Windows versions. You'd be running them in emulation anyway, which only slows them down or uses up your system's resources needlessly, since more than likely you have a subscription license of some kind that allows you to choose which platform you use them on. Don't forget Office already exists on iPad, so that version can already run on ARM Macs without changes required.

Yes, Office exists on iPad as a mobile iOS version of Office. For people wanting the authentic, native, full Windows experience of Office, I wonder if they are going to have to use Parallels after the ARM transition, because ARM kind of puts a question mark on Boot Camp. I guess we'll have to wait and see.
 
Yeah, don’t let proof get in the way of your opinions.

Here’s my PhD thesis, if that helps? https://www.ecse.rpi.edu/frisc/theses/MaierThesis/

Not in the least. If I post a link to a high court judge with the same name as me, it does not mean I am a high court judge.

They <Intel> are 'saddled' to x86. And, that experience is a lot easier to obtain, especially if you've already got the basis of it in blowing everyone else away in other related markets.

More fanboy spin. Apple are "saddled" with zero experience of making high core count enterprise desktops.

The fact remains that for desktops, thermal envelopes are much less critical than in mobile, and whereas the Apple CPU makes sense in a phone or perhaps even in a laptop, it is highly questionable whether there is any benefit at all for desktop owners, i.e. iMac or Mac Pro. How long do you think it will be before Apple CPUs can outpace a 64-core Threadripper? Or the 128-core processor which AMD will probably be offering in a couple of years? It will take absolutely ages before Apple can catch up, if ever.

And yet they could have put a Threadripper in a Mac Pro tomorrow and retained full x86 compatibility for all the various apps and plugins, which is lost with the move to Apple silicon.

So it is clear this move is of no benefit to desktop users. It's all about profits and what's best for Apple.
 
So it is clear this move is of no benefit to desktop users. It's all about profits and what's best for Apple.

Is this a surprise? This is the company that literally said it took courage to remove a headphone jack that everyone liked, with no legitimate reason to do so. They couldn't care less about what benefits users.
 