Depends on context. I honestly don’t think they did anything nefarious. However, many here are making such accusations.
Many here aren’t really interested in hearing the truth.

All you have to do is rewatch everything Apple has said when presenting its silicon since the M1. They pointedly describe Intel CPUs as "power hungry," "noisy," and "hot." Look at the chart The Verge uses. It is clearly measuring performance per watt, yet somehow they concluded that it was measuring performance at all costs.
 
But two things can be true at the same time: the M1 Ultra delivers impressive performance at low power, and Apple seems to have exaggerated its GPU performance.
Apple used a performance-per-watt chart. It's clearly labeled. Imagine paying someone to paint your house and instead he painted your car... would you be OK with that?
 
3090s are absolute pigs. I've got three running, mining by my feet at the moment, and managing the thermals has been a nightmare. Apple's performance without a burning-hot backplate is excellent.
 
Ever try mining with an M1? If the Ultra has double the hash rate of the M1 Max, then it's about 20 MH/s. Pretty far off the 360 MH/s you're getting with your 3090s.
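
To put those numbers side by side, here is a back-of-envelope sketch; all the figures are the rough estimates quoted in this thread, not benchmarks:

```python
# Back-of-envelope check of the numbers above (all figures are the
# rough estimates quoted in this thread, not benchmarks).
m1_max_mhs = 10                 # implied: half of the Ultra's ~20 MH/s
m1_ultra_mhs = 2 * m1_max_mhs   # assumes perfect scaling across both dies
rig_mhs = 360                   # the three RTX 3090s quoted above

print(f"M1 Ultra ~{m1_ultra_mhs} MH/s vs rig ~{rig_mhs} MH/s "
      f"({rig_mhs / m1_ultra_mhs:.0f}x)")
```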
 
For me, it beats a high-end PC, and that's enough for my workflow...
And if I use FCPX... go figure!

 
Thank you for being another rare voice of reason amongst the crying rabid wolves in here. What Apple is achieving is unprecedented. It blows my mind how desperately people will seek to find something to tear Apple apart over, no matter how great their products get.
Thanks for your kind words. I think it's genuinely out of frustration, really... the frustration comes from knowing deep down that Apple is going to change their world as they know it. They also feel frustrated because complaining to their chosen favorite tech company isn't going to do squat.

Before Apple silicon, the system was always about the next fastest top-of-the-range gaming CPU or GPU: wait for reviews from PC tech tubers and figure out which AIB model you could overclock or tune to eke out a little more oomph. Also the speediest RGB RAM sticks and SSD for those critical extra 2.5 frames.

It's a whole existing ecosystem that just blew up less than 18 months ago. The supply problems created by the pandemic didn't help either, but I think that before Apple silicon no one really looked at how these performance gains were being achieved. It's clear as day now that it was mostly pumping more power and heat into these systems, with very little actual architectural improvement. Otherwise, how could Johnny-come-lately be pwning them so badly?

Apple is the first company in the tech industry to stitch together two separate GPU clusters on different dies and present them as one. The main players have been trying to achieve this for over a decade. That's what Apple means when it talks about thinking different.

I'm rambling a bit, but this is a big change to the system, and the system is fighting back any way it can and every way it knows how.
 
And after all the pages of tribalistic bickering, it is time to get back on topic:


I just want some Houdini, Blender & Unreal Engine benchmarks. Are they that hard to come by?

Currently I am using a 3600X & a GTX 1080, but if the Ultra fits the bill, I will be thinking of switching. For my use case I basically use Blender, Houdini 18.5, Unreal Engine 4, Photoshop & Illustrator; the Adobe software isn't that graphics hungry, so it's not my concern.

But that little package & the low energy consumption of the Studio have me interested. I need to free up my desk (my tower & a Cintiq Pro 24 have already taken 90% of my desk space); I would keep my old PC for testing purposes.
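
For what it's worth, one quick way to get your own apples-to-apples Blender number is to time a headless Cycles render of the same scene on each machine. A minimal sketch, assuming the blender binary is on your PATH and "scene.blend" stands in for your test file:

```python
import subprocess
import time

# Render one frame headless with Cycles and time it; run the same file
# on each machine and compare. "scene.blend" is a placeholder, and the
# blender binary is assumed to be on your PATH.
start = time.time()
subprocess.run(["blender", "-b", "scene.blend", "-E", "CYCLES", "-f", "1"],
               check=True)
print(f"Render took {time.time() - start:.1f} s")
```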
 
This is particularly relevant for desktop-class computing. Most people are not concerned with power consumption for non-portable devices. As expected, it is going to take YEARS for Apple to truly compete. And even then, I won't be surprised if Apple eventually caves and offers dedicated graphics options.
In a few years, anyone buying a new car (or adding a second one) will be going electric, predominantly charging at home. Those not sourcing a considerable amount of their energy from renewables will see a heavy impact on their energy consumption and thus their costs.

We've already seen power grids in China and Texas suffer considerably long blackouts and rolling brownouts in the last eight months, where sections of the grid have no energy available, whatever the reasons.

It still stands that energy pricing is not going down, not for a while. At that point in the near future, people WILL be considering their computing performance per watt a lot more.
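
As a rough illustration of why that might start to matter: the wattages, hours, and electricity price below are all assumptions, not measured figures.

```python
# Illustrative yearly electricity cost at full load; the wattages,
# hours and $/kWh price are assumptions, not measured figures.
hours_per_year = 8 * 260        # an 8-hour workday, ~260 days a year
price_per_kwh = 0.15            # assumed price in $/kWh

for name, watts in [("~100 W machine", 100), ("~500 W machine", 500)]:
    kwh = watts / 1000 * hours_per_year
    print(f"{name}: {kwh:.0f} kWh/yr, about ${kwh * price_per_kwh:.0f}/yr")
```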
 
The M1 Max in the MBP is already faster than an RTX 3090 in a number of areas where the Mac Studio is marketed. The disparity will be even greater with the Ultra.



Professionals don’t sit and run Geekbench all day long.

The comparison probably should have been against workstation graphics, not the 3090.
 
I can't believe people are actually defending this.

The chart insinuates that it beats anything from Nvidia. That is an outright lie. If any other company said "Here, buy our $4,000 machine; it beats the competition" and it didn't, there would be outrage, but it's acceptable from Apple?!
 

I wonder how he was able to get a higher score on his 3090 at a higher resolution than The Verge did.

EDIT: Oh, I see, the M1 Ultra's GPU is only at 83% utilization, not 100%. Hmm.
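
If you wanted to guess what the Ultra might score at full utilization, the naive correction is a linear rescale. A sketch with a placeholder score, since linear scaling is itself a big assumption:

```python
# Naive linear rescale of a score to 100% GPU utilization. The score is
# a placeholder, and linear scaling is itself a big assumption; real
# benchmarks rarely scale this cleanly.
observed_score = 1000
utilization = 0.83
print(f"Estimated full-utilization score: {observed_score / utilization:.0f}")
```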
 
By being platform agnostic, you are working at the lowest common denominator all the time. You aren't making use of either system to its fullest extent.

I understand that, as a software house, that makes sense to you in terms of total cost saving or additional performance in particular areas per developer seat. However, in my opinion, people who have invested years in fully using the features each OS provides will find the speed and performance differences too negligible to warrant switching platforms back and forth.

For example, a typical user may have external drives that are macOS- or Windows-formatted, photos on iCloud, tooling like Final Cut Pro or Logic, etc.

Yes, using a C++ compiler and a few cross-platform tools may let you sidestep the benefits of each OS, but on the whole that is not how these systems are used by the majority of users.
I can appreciate this position for sure, but our teams are familiar with all three main OSes for the sake of development. A lot of our app work is moving to PWAs, as interest is accelerating in this area and will likely continue to do so for a number of years.

I think you'd be surprised how many software development teams look at the M1 series and realise that, though there are more powerful options out there and our workflows could likely benefit from the speed bump, the overall package, especially with the new-gen MBPs... quite simply, there's nothing like it on the market at the moment, especially in this era of mobile working.

The fact that I can pop into the office and plug in, or stay home, and get the same performance wherever I am while running on battery, and still do better than the desktops we replaced, is frankly astounding. We were running 3070s with 10th-gen Intel CPUs, and we're seeing on-par or better performance across the board.

Granted, the Mac Studio is supposed to be a desktop machine, and I realise the argument breaks down a little on power/performance, especially in the GPU department, but for offices running 10+ machines at 400-700W apiece without a use case for GPU-intensive workloads... these machines would look mighty appealing!
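
A back-of-the-envelope version of that fleet math; the wattages are midpoint guesses, not measured draws:

```python
# Sketch of the fleet math above: ten towers at ~550 W (midpoint of the
# 400-700 W range) versus ten Studios at an assumed ~150 W, over 2,000
# working hours a year.
machines, hours = 10, 2000
tower_w, studio_w = 550, 150

saved_kwh = machines * hours * (tower_w - studio_w) / 1000
print(f"~{saved_kwh:,.0f} kWh saved per year across the fleet")
```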
 


Don't compare laptops with desktops, or Mac with Windows; it's pointless. Another thing professionals don't do is chase performance across platforms or hardware types.

FWIW, I would never buy the Studio or an RTX 3090. I think they both stink.
Uh, why not? My Mac gets the job done faster than my custom-built PC. So if it gets my work done faster, why not be able to compare it?
 
Honestly, I haven't read all 17 pages of commentary on this, but if you look at the slide, it is absolutely true to the point being made.
[Image: Apple's M1 Ultra GPU performance-vs-power chart]


People are reading the plot as though the vertical axis is the important one. Usually that is the case, but as the horizontal dashed line labelled '200W less power' clearly indicates, Apple is focusing on the power consumption on the horizontal axis, not overall computational performance. Those calling for Apple's marketing team to be fired over this didn't pay adequate attention to the slide. Nothing in the slide states that the GPU couldn't draw more power.

Anyway, computers that draw hundreds of watts to achieve maximum performance strike me as very wasteful, and hence immoral at some level, if there is a more efficient alternative.
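
To make the per-watt reading concrete, here is a sketch that divides relative performance by power draw; the numbers are eyeballed from the slide and purely illustrative.

```python
# Dividing relative performance by power draw to read the slide as
# performance per watt. The (watts, relative performance) pairs are
# eyeballed from the chart and purely illustrative.
points = {"M1 Ultra": (110, 100), "RTX 3090": (320, 110)}

for name, (watts, perf) in points.items():
    print(f"{name}: {perf / watts:.2f} relative-perf per watt")
```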
 

If you have a cross-platform application that truly compares, and you don't prefer either platform over the other, have at it... :)
 

Long-time Apple customers would know this chart is BS at first glance. When the advantage is much more clear, they give you the better charts and graphs (and the multipliers, of course). This one is designed to convey an idea without outright lying.
 