> RTX 3090 energy cost per year, including host machine: £981/year
> M1 Ultra energy cost: £148/year
> Even if it's half the speed, that's a win.
Then they should show that chart. The one they made claims it's more powerful.
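For what it's worth, the quoted per-year figures are reproducible with a back-of-envelope model. A minimal sketch, assuming roughly 400 W continuous draw for a 3090 plus host, roughly 60 W for the M1 Ultra, 24/7 operation, and a UK tariff of about £0.28/kWh (all of these are assumptions for illustration, not numbers stated in the thread):

```python
# Rough sanity check of the claimed annual energy costs.
# Assumed inputs: ~400 W for RTX 3090 + host, ~60 W for M1 Ultra,
# running 24/7, at an assumed UK tariff of ~£0.28/kWh.
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def annual_cost_gbp(watts: float, price_per_kwh: float = 0.28) -> float:
    """Annual electricity cost in £ for a constant power draw of `watts`."""
    kwh_per_year = watts * HOURS_PER_YEAR / 1000
    return kwh_per_year * price_per_kwh

print(f"RTX 3090 + host: £{annual_cost_gbp(400):.0f}/year")  # ≈ £981
print(f"M1 Ultra:        £{annual_cost_gbp(60):.0f}/year")   # ≈ £147
```

Note the figures only line up if the machines run flat out around the clock, which is exactly the assumption later replies push back on.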
> I agree and that's my measure too.
Yeah, but the environmentalists will tell you electricity is cheap and that fossil fuels are bad.
Why, because of The Verge? Be serious. The ASA definitely wouldn't have any issue with this.
> RTX 3090 energy cost per year, including host machine: £981/year
> M1 Ultra energy cost: £148/year
> Even if it's half the speed, that's a win.
Who (besides miners) runs their GPU at full power all the time?
> RTX 3090 energy cost per year, including host machine: £981/year
> M1 Ultra energy cost: £148/year
> Even if it's half the speed, that's a win.
This dude thinks the Prius is the fastest car on the planet.
> If these Keynotes, which are by extension long-form marketing tools, were classified as adverts in the UK, the ASA would be coming down hard on Apple right about now.
We'd hope the ASA wouldn't be so unscientific and biased as to "come down hard" on Apple based on one benchmark.
> Yeah, but the environmentalists will tell you electricity is cheap and that fossil fuels are bad.
Apart from the GPU and memory, what components do most people ever upgrade on their PC?
I'm sorry, but I don't know of any company that decides on computing hardware based on how much electricity it uses. That's a mobile constraint. So is heat. When it comes to desktops, the more powerful it is, the better.
Been using computers for decades now, and never once has anyone asked how much electricity the system you just spec'd out uses before approving the purchase. The desktops have been heavy-ass boxes that generated lots of heat and noise, all to shave mere seconds off render time. No one ever said, "Damn, I'd gladly wait longer for this to process if the machine weren't so heavy, or hot, or loud."
These new Mac Studio Ultras are already dated, in that one leap in GPU power will make them yesterday's news. From the pro side of things, being able to scale up one's hardware without actually replacing it is paramount in the business world. These M1s are essentially throwaway devices meant to be replaced more frequently than a desktop traditionally is. More for the landfill... but hey, it used less power while it was relevant for about two years.
I believe Nvidia and Intel really need to be worried about the benefits and merits of the unified architecture. The first batch is already impressive, and the future surely seems bright on the Mac side. Can't wait to see the Mac Pro.
> And every article will show that Apple is lying, and that their PR is nowhere near the truth. This paints a really false image of Apple, which is not good long term. People will quickly forget that the Ultra was compared to a 3090, but they will remember that Apple was lying and that the Ultra is slow (in comparison).
> They would have done those things anyway, as they always do.
> Well, the 3090 is not just lying on the floor doing these things by itself.
> There's no doubt that the Ultra beats the crap out of the 3090 power-wise, but that's not the take-home message people get. They see that Apple was completely wrong and lied.
Every article reinforces the fact that Apple has products that stomp all over the 3090 for the tasks they were designed for.
> That is why they put footnotes and legal speak in to cover themselves. I honestly don't think a presentation where you introduce a NEW CHIP (which NEEDS to be discussed from a performance perspective) wants to have 15 minutes dedicated to "Okay, so here is our test setup: running Monterey XYZ, at X% load, running Software Y for Z amount of time, in a room of X size, with AC cooling set to Y degrees…"
> Have you watched GamersNexus? They go into so much detail on how they test things. Devoting keynote time to all that discussion is not beneficial.
Totally agree.
> Who (besides miners) runs their GPU at full power all the time?
> I'm sorry, but I don't know of any company that decides on computing hardware based on how much electricity it uses.
Us lot in fintech using ML…
> I guess that logic works if you assume that Apple's target is people who are already invested in Apple. That's some MacRumors 100 IQ thinking.
I think anyone who is considering spending several thousand dollars on a Mac Studio and $1,500 on a Studio Display is already invested in Apple, as you can find equally powerful PC alternatives for less money. So yeah. You can call it MacRumors IQ, but I call my statement "accurate."
> This dude thinks the Prius is the fastest car on the planet.
No. This dude knows that renting 200 Priuses allows you to cover more distance in a set amount of time, for lower operational expenditure, than buying 50 Lambos.
> There's nothing new about unified memory. The PS5 and Xbox Series X do it, but they do it properly, with GDDR instead of DDR.
Nvidia and Intel are already at the crisis-room-meetings stage of the game. Apple is competing with and beating Intel's highest-end desktop CPU: a 12th-gen effort versus version 1 from Apple. Nvidia is sweating as well, looking at the Ultra and wondering if they might have to up the power of their next-gen 4090 to 800 W to be competitive with Apple silicon.
> And every article will show that Apple is lying, and that their PR is nowhere near the truth. This paints a really false image of Apple, which is not good long term. People will quickly forget that the Ultra was compared to a 3090, but they will remember that Apple was lying and that the Ultra is slow (in comparison).
> They would have done those things anyway, as they always do.
> Well, the 3090 is not just lying on the floor doing these things by itself.
> There's no doubt that the Ultra beats the crap out of the 3090 power-wise, but that's not the take-home message people get. They see that Apple was completely wrong and lied.
But the Ultra does beat the 3090 in content-creation application benchmarks. That is probably what Apple based their charts on, not Geekbench Compute.
> But the Ultra does beat the 3090 in content-creation application benchmarks. That is probably what Apple based their charts on, not Geekbench Compute.
Would you be running Geekbench Compute all day, or actually using Lightroom, Capture One, DaVinci Resolve, and Adobe Premiere?
> Nvidia and Intel are already at the crisis-room-meetings stage of the game. Apple is competing with and beating Intel's highest-end desktop CPU: a 12th-gen effort versus version 1 from Apple. Nvidia is sweating as well, looking at the Ultra and wondering if they might have to up the power of their next-gen 4090 to 800 W to be competitive with Apple silicon.
Thank you for being another rare voice of reason amongst the crying rabid wolves in here. What Apple is achieving is unprecedented. It blows my mind how desperately people will seek to find something to tear Apple apart over, no matter how great their products get.
The people who aren't looking closely have mostly dismissed Apple's performance-per-watt goals. And that was, and has been, the whole ball game. The reason for this obsessive focus was always tied to scaling.
All BS, you say…?
Try running a 12900K + 3090 on 200 W…
> Thank you for being another rare voice of reason amongst the crying rabid wolves in here. What Apple is achieving is unprecedented. It blows my mind how desperately people will seek to find something to tear Apple apart over, no matter how great their products get.
> But two things can be true at the same time: the M1 Ultra delivers impressive performance at low power, and Apple seems to have exaggerated its GPU performance.
Depends on context. I honestly don't think they did anything nefarious. However, many here are making such accusations.
> Who (besides miners) runs their GPU at full power all the time?
No one does.
> RTX 3090 energy cost per year, including host machine: £981/year
> M1 Ultra energy cost: £148/year
> Even if it's half the speed, that's a win.
You're being disingenuous with your numbers there. Apple's spec sheets say the Studio uses 370 watts max. A stock, non-overclocked 3090 uses about 350 watts, and even if you pair it with a power-hungry 12900K, which draws 272 W (or a 5950X, which only draws 140 W), you're not going to see an eightfold increase in power usage versus the Studio. In other words, a 3090 is twice as fast while using twice the power.
> You're being disingenuous with your numbers there. Apple's spec sheets say the Studio uses 370 watts max. A stock, non-overclocked 3090 uses about 350 watts, and even if you pair it with a power-hungry 12900K, which draws 272 W (or a 5950X, which only draws 140 W), you're not going to see an eightfold increase in power usage versus the Studio. In other words, a 3090 is twice as fast while using twice the power.
> In practice, the M1 Ultra draws 87 watts.
That's with Handbrake. I don't see any data yet with both the GPU and CPU going full tilt. The 12900K can pull 300 watts if you take off the power limiter, but that's stupid, as limiting it to 175 watts only reduces performance by 5% while drawing much less power. The stock 12900K power draw is 125 watts or so. The 5950X keeps up at ~140 watts or so with similar performance.
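Taking the thread's own numbers at face value (the 370 W spec-sheet max, the 87 W Handbrake measurement, a 350 W + 272 W 3090-plus-12900K system, and the "twice as fast" claim), here is a quick sketch of the implied performance per watt. All inputs are claims from the posts above, not independent measurements:

```python
# Performance-per-watt implied by the figures claimed in this thread.
# relative_perf is normalized so the M1 Ultra = 1.0 and the 3090 system
# = 2.0, per the "twice as fast" claim quoted above.
systems = {
    "RTX 3090 + 12900K (claimed full load)": (350 + 272, 2.0),
    "Mac Studio (Apple spec-sheet max)":     (370, 1.0),
    "Mac Studio (claimed Handbrake draw)":   (87, 1.0),
}

for name, (watts, relative_perf) in systems.items():
    print(f"{name}: {relative_perf / watts * 100:.2f} perf units per 100 W")
```

Under these claimed numbers, whether the Ultra wins on efficiency depends almost entirely on which power figure you believe (370 W spec max versus 87 W measured), which is precisely what the two posts above are arguing about.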