I agree and that's my measure too.
Yeah but the environmentalists will tell you electricity is cheap and that fossil fuels are bad.

I'm sorry, but I don't know of any company that decides on computing hardware based on how much electricity it uses. That's a mobile constraint. So is heat. When it comes to desktops, the more powerful it is, the better.

Been using computers for decades now, and never once has anyone asked how much electricity the system you just spec'd out uses before approving the purchase. The desktops have been heavy-ass boxes that generated lots of heat and noise, all to shave mere seconds off render time. No one ever said, damn, I'd gladly wait longer for this to process if the machine weren't so heavy, or hot, or loud.

These new Mac Studio Ultras are already dated in that one leap in GPU power will make them yesterday's news. From the Pro side of things, being able to scale up one's hardware without actually replacing it is paramount in the business world. These M1s are essentially throwaway devices meant to be replaced more frequently than a desktop traditionally is. More for the landfill... but hey, it used less power when it was relevant for about two years.
 
Why, because ‘the Verge’? Be serious. ASA definitely wouldn’t have any issue with this.

So on the face of it, at the time of the Keynote, would those stats presented by Apple be:
  1. likely to deceive consumers; and
  2. likely to cause consumers to take transactional decisions that they would not otherwise have taken
Very debatable. The Verge can't even put a PC together.
 
RTX 3090 energy cost, including host machine: £981/year

M1 Ultra energy cost: £148/year

Even if it's half the speed, that's a win.
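For what it's worth, those two figures are roughly consistent with assuming around-the-clock load at a UK tariff of about £0.28/kWh, with the 3090 system averaging ~400 W and the M1 Ultra ~60 W. A minimal sketch, where the tariff, duty cycle, and both average draws are assumptions chosen to roughly reproduce the quoted figures, not measurements:

```python
# Rough annual energy-cost model. The tariff (~£0.28/kWh) and the average
# draws (400 W for a 3090 system, 60 W for an M1 Ultra) are assumptions
# that roughly reproduce the figures quoted above under a 24/7 load.

PRICE_PER_KWH = 0.28      # GBP, assumed UK electricity tariff
HOURS_PER_YEAR = 24 * 365

def annual_cost(avg_watts: float) -> float:
    """Annual electricity cost in GBP for a constant average draw."""
    kwh = avg_watts / 1000 * HOURS_PER_YEAR
    return kwh * PRICE_PER_KWH

print(f"RTX 3090 system (~400 W avg): £{annual_cost(400):.0f}/year")
print(f"M1 Ultra (~60 W avg):         £{annual_cost(60):.0f}/year")
```

At a lighter duty cycle (say, a 40-hour work week) both numbers drop by roughly a factor of four, which is why the question of who actually runs a GPU flat-out all the time matters.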
Who (besides miners) runs their GPU at full power all the time?
 
If these keynotes were classified as adverts in the UK (and they are, by extension, long-form marketing tools), the ASA would be coming down hard on Apple right about now.
We'd hope the ASA wouldn't be so unscientific and biased as to "come down hard" on Apple based on one benchmark.

The Verge benchmark is somewhat like loading someone down with a 25 kg backpack and then criticizing that person for not still running a 13-minute 5K.
 
I don't think Apple has been quite as disingenuous as The Verge implied in their review. Since the start of the Apple Silicon launch, they have been marketing their M series chips as being more power-efficient. There is a reason the wattage is displayed in their "graphs".
I think it's up to the consumer whether or not a power-efficient desktop computer is important to them. However, from the looks of things, the performance it's getting in real-world use, and not just benchmark scores, isn't exactly making the people using them feel as though they've been conned by a favorable graph designed for marketing purposes. Watching some videos from Linus Tech Tips has shown me that Intel isn't exactly above fudging the numbers to make themselves look better in benchmarks either.
 
Yeah but the environmentalists will tell you electricity is cheap and that fossil fuels are bad.

I'm sorry, but I don't know of any company that decides on computing hardware based on how much electricity it uses. That's a mobile constraint. So is heat. When it comes to desktops, the more powerful it is, the better.

Been using computers for decades now, and never once has anyone asked how much electricity the system you just spec'd out uses before approving the purchase. The desktops have been heavy-ass boxes that generated lots of heat and noise, all to shave mere seconds off render time. No one ever said, damn, I'd gladly wait longer for this to process if the machine weren't so heavy, or hot, or loud.

These new Mac Studio Ultras are already dated in that one leap in GPU power will make them yesterday's news. From the Pro side of things, being able to scale up one's hardware without actually replacing it is paramount in the business world. These M1s are essentially throwaway devices meant to be replaced more frequently than a desktop traditionally is. More for the landfill... but hey, it used less power when it was relevant for about two years.
Apart from the GPU and memory, what components do most people ever upgrade on their PC?

From what I can see, if you buy a good memory configuration, it's only the GPU that would meaningfully extend your system's useful life. Could an eGPU do that as well?

It's pretty hard to upgrade CPUs and such because motherboards and other bits tend to be incompatible. In essence, apart from the GPU, custom PC tower builders end up keeping the £100 case, replacing everything inside, and saying they've "upgraded" their machine.
 
I believe Nvidia and Intel really need to be worried about the benefits and merits of a unified architecture. The first batch is already impressive. The future surely seems bright on the Mac side. Can't wait to see the MP.

There's nothing new about unified memory. PS5 and Xbox Series X do it but they do it properly with GDDR instead of DDR.
 
And every article will show that Apple is lying, and that their PR is nowhere near the truth. This paints a really false image of Apple, which is not good long term. People will quickly forget that the Ultra was compared to a 3090, but they will remember that Apple was lying and that the Ultra is slow (In comparison).


They would have done those things anyway, as they always do.


Well, the 3090 is not just lying on the floor doing these things by itself.. :p

There's no doubt that the Ultra beats the crap out of the 3090 power-wise, but that's not the take-home message people get. They see Apple was completely wrong and lied.
Every article reinforces the fact that Apple has products that stomp all over the 3090 for the tasks they were designed for.

The big takeaway for Apple's target audience is that the 3090 is in fact lying there on the floor with no CPU or RAM or chassis or SSD or ports or anything else, for that matter…
 
That is why they put footnotes and legal speak in to cover themselves. I honestly don't think a presentation where you introduce a NEW CHIP (which NEEDS to be discussed from a performance perspective) wants to have 15 minutes dedicated to "Okay, so here is our test setup: running Monterey XYZ, at X% load, running software Y for Z amount of time in a room of X size, with AC cooling set to Y degrees........."

Have you watched GamersNexus? They go into so much detail on how they test things. Devoting keynote time to all that discussion is not beneficial.
Totally agree.
 
I guess that logic works if you assume that Apple's target is people that are already invested into Apple. That's some MacRumors 100 IQ thinking.
I think anyone who is considering spending several thousand dollars on a Mac Studio and $1,500 on a Studio Display is already invested in Apple, as you can find equally powerful PC alternatives for less money. So yeah. You can call it MacRumors IQ, but I call my statement "accurate."
 
This dude thinks Prius is the fastest car on the planet.
No. This dude knows that renting 200 Priuses allows you to cover more distance in a set amount of time for lower operational expenditure than buying 50 Lambos.

Also the Priuses catch fire less often than the Lambos.
 
There's nothing new about unified memory. PS5 and Xbox Series X do it but they do it properly with GDDR instead of DDR.
Nvidia and Intel are already at the crisis-room-meetings stage of the game. Apple is competing with and beating Intel's highest-end desktop CPU: a 12th-gen effort vs. version 1 from Apple. Nvidia is sweating as well, looking at the Ultra and wondering if they might have to up the power for their next-gen 4090 to 800 W to be competitive with Apple silicon.

The people who aren't looking closely have mostly dismissed Apple's performance-per-watt goals. And that was and has been the whole ball game. The reason for this obsessive focus was always tied to scaling.

All BS you say…?

Try running a 12900K + 3090 on 200 W…
 
And every article will show that Apple is lying, and that their PR is nowhere near the truth. This paints a really false image of Apple, which is not good long term. People will quickly forget that the Ultra was compared to a 3090, but they will remember that Apple was lying and that the Ultra is slow (In comparison).


They would have done those things anyway, as they always do.


Well, the 3090 is not just lying on the floor doing these things by itself.. :p

There's no doubt that the Ultra beats the crap out of the 3090 power-wise, but that's not the take-home message people get. They see Apple was completely wrong and lied.
But the Ultra does beat the 3090 in content-creation application benchmarks. That is probably what Apple is basing its charts on, not Geekbench Compute.

Would you be running Geekbench Compute all day, or actually using Lightroom, Capture One, DaVinci Resolve, and Adobe Premiere?
 

Sure, but the whole point of Geekbench is to have workloads that are representative.

Here's what their "Compute" workloads do.
 
Nvidia and Intel are already at the crisis-room-meetings stage of the game. Apple is competing with and beating Intel's highest-end desktop CPU: a 12th-gen effort vs. version 1 from Apple. Nvidia is sweating as well, looking at the Ultra and wondering if they might have to up the power for their next-gen 4090 to 800 W to be competitive with Apple silicon.

The people who aren't looking closely have mostly dismissed Apple's performance-per-watt goals. And that was and has been the whole ball game. The reason for this obsessive focus was always tied to scaling.

All BS you say…?

Try running a 12900K + 3090 on 200 W…
Thank you for being another rare voice of reason amongst the crying rabid wolves in here. What Apple is achieving is unprecedented. It blows my mind how desperately people will seek to find something to tear Apple apart over, no matter how great their products get.
 

But, two things can be true at the same time: the M1 Ultra delivers impressive performance at low power, and Apple seems to have exaggerated its GPU performance.
 
Depends on context. I honestly don’t think they did anything nefarious. However, many here are making such accusations.
 
Who (besides miners) runs their GPU at full power all the time?
No one does; the majority of the time they're idling.
RTX 3090 energy cost, including host machine: £981/year

M1 Ultra energy cost: £148/year

Even if it's half the speed, that's a win.
You're being disingenuous with your numbers there. Apple's spec sheets say the Studio uses 370 watts max. A stock, non-overclocked 3090 uses about 350 watts, and even if you pair it with a power-hungry 12900K (which draws 272 W) rather than a 5950X (which draws only 140 W), you're not going to see an eightfold increase in power usage vs. the Studio. In other words, a 3090 is twice as fast while using twice the power.
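Taking the spec-sheet figures from this post at face value (nominal maxima, not measured draw under a matched workload), the worst-case ratio works out to well under 2x, nowhere near eightfold:

```python
# Compare nominal maximum system draws using the spec-sheet numbers from
# the post above. These are rated figures, not matched-workload measurements.

STUDIO_MAX_W = 370   # Apple's stated Mac Studio maximum
RTX_3090_W   = 350   # stock, non-overclocked 3090
I9_12900K_W  = 272   # power-hungry CPU pairing
R9_5950X_W   = 140   # efficient CPU pairing

ratio_12900k = (RTX_3090_W + I9_12900K_W) / STUDIO_MAX_W
ratio_5950x  = (RTX_3090_W + R9_5950X_W) / STUDIO_MAX_W

print(f"3090 + 12900K vs Studio max: {ratio_12900k:.2f}x")
print(f"3090 + 5950X  vs Studio max: {ratio_5950x:.2f}x")
```

So even against Apple's own worst-case Studio number, the PC tower draws roughly 1.3x to 1.7x as much, not 8x.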
[Attached: Mac Studio review charts]
 

In practice, the M1 Ultra draws 87 Watts.
 
That's with HandBrake. I don't see any data yet with both the GPU and CPU going full tilt. The 12900K can pull 300 watts if you take off the power limiter, but that's stupid, as limiting it to 175 watts only reduces performance by 5% while drawing much less power. The stock 12900K power draw is 125 watts or so. The 5950X keeps up at ~140 watts with similar performance.
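Taking this post's own numbers at face value (the 5% loss and the wattages are the post's rough claims, not independent measurements), the 175 W limit is a clear efficiency win:

```python
# Perf-per-watt tradeoff for the 12900K power-limit point above. The 5%
# performance loss and both wattages are the post's rough claims.

UNLIMITED_W   = 300.0  # 12900K with the power limiter removed
LIMITED_W     = 175.0  # 12900K limited to 175 W
PERF_RETAINED = 0.95   # limited config keeps ~95% of performance

ppw_unlimited = 1.0 / UNLIMITED_W           # normalized perf per watt
ppw_limited   = PERF_RETAINED / LIMITED_W

gain = ppw_limited / ppw_unlimited
print(f"Power-limited 12900K delivers {gain:.2f}x the perf/watt")
```

Roughly a 60% perf-per-watt improvement for a 5% performance sacrifice, which is exactly the efficiency-versus-peak-power tradeoff this whole thread is arguing about.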
 