They've always told little lies, twisted the truth, or used meaningless stats that weren't so easy to disprove in order to get overenthusiastic "technology" writers to give them good press. However, lately their lies have been outrageous and easy to disprove.

It took "courage" to remove the headphone jack in order to force your gullible customers into spending hundreds of dollars on crappy wireless earbuds. Everyone with half a brain cell knew this was just a money grab. Last year's M1 was supposed to beat an RTX 3080. It couldn't even beat a 3060. This year's M1 Ultra was supposed to beat a 3090...

The real problem isn't Apple; it's the people who want to be seen as cool and technology savvy (they're neither) so badly that they actually do some impressive (and embarrassing) mental gymnastics to justify getting got by Apple. Removing the fans from a computer is just plain dumb. Bluetooth doesn't sound better than wired (although it is convenient), macOS and a powerful ARM chip are worthless if they can't run the applications you need for work or school (or gaming), and power efficiency is worthless if it takes you 2-3 times as long to do the same task.
What in the world are you even talking about?!
 
The real problem isn't Apple; it's the people who want to be seen as cool and technology savvy (they're neither) so badly that they actually do some impressive (and embarrassing) mental gymnastics to justify getting got by Apple.
Maybe they know their own use cases better than you do.

Of the seven computers I currently have in working order, two are Windows, and my main Windows machine dual-boots into Linux.

It's possible -- just possible -- that I know what I'm doing, and my preference for Mac comes from decades of actual working experience on different platforms as a developer, and not merely having been flimflammed by magical mystery dust from Cupertino.

And it's just possible that, after decades of using the Mac and seeing how Apple advertises it, I can take Apple's more extreme marketing claims with a grain of salt while finding they are basically correct about most things, including the central idea that their platform is the most productive.

There are those who think that maximal frames per second in a first-person shooter is the only measure of a computer that truly matters. And good luck to them.
 
RTX3090 energy cost per year including host machine: £981/year

M1 Ultra energy cost: £148/year

Even if it’s half the speed, that’s a win.
I had a good chuckle at this. This spin on energy costs is ridiculous. No professional is going to be counting the pennies their machine uses in energy consumption.

The RTX 3090 mauls the M1 Ultra on performance. Apple's stupid marketing slides are just embarrassing and need to stop. There's no win. Just an own goal.
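For what it's worth, here is a rough sketch of how annual energy figures like the ones quoted earlier in this exchange are usually derived (average power draw × hours of use × tariff). Every wattage, duty cycle and price below is an assumed example for illustration, not a number taken from this thread:

```python
# Back-of-envelope annual electricity cost: kWh/year * price per kWh.
# All inputs are assumptions chosen for illustration, not measurements.

def annual_energy_cost(avg_watts: float, hours_per_day: float, price_per_kwh: float) -> float:
    """Yearly electricity cost in whatever currency the tariff is quoted in."""
    kwh_per_year = avg_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

PRICE_GBP_PER_KWH = 0.34  # assumed UK-style tariff

# Assumed: RTX 3090 plus host averages ~550 W, M1 Ultra ~120 W, both busy 12 h/day.
rtx_box = annual_energy_cost(avg_watts=550, hours_per_day=12, price_per_kwh=PRICE_GBP_PER_KWH)
m1_ultra = annual_energy_cost(avg_watts=120, hours_per_day=12, price_per_kwh=PRICE_GBP_PER_KWH)

print(f"RTX 3090 + host: £{rtx_box:.0f}/year")   # roughly £820/year with these assumptions
print(f"M1 Ultra:        £{m1_ultra:.0f}/year")  # roughly £180/year with these assumptions
```

Change the assumed duty cycle or tariff and the absolute numbers move a lot, but the ratio between the two machines is what the quoted figures are leaning on.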
 
Are we sure Apple isn’t comparing to the leading discrete chip that runs on macOS, which would be an AMD?
 
I had a good chuckle at this. This spin on energy costs is ridiculous. No professional is going to be counting the pennies their machine uses in energy consumption.

The RTX 3090 mauls the M1 Ultra on performance. Apple's stupid marketing slides are just embarrassing and need to stop. There's no win. Just an own goal.
I'm fed up with this naive perspective. We're not talking amateur usage here; we're talking rather large datasets and rather large numbers of GPUs. We are worried about energy costs because in the data centres where we run this stuff, that's the most expensive thing.

We probably run 200 of your assumed "professional" workloads concurrently...
 
What in the world are you even talking about?!
Apple would say that their phone/tablet/computer was 50% faster than the previous version, which was sort of meaningless when those devices were still slower than or not on par with the competition. Or Apple would compare their optimized $4000 laptops or $800-1000 phones to unoptimized $500 Windows laptops or $200 Android phones. The only time they would mention similarly priced or similarly spec'd devices was to point out something obscure that their device had an "advantage" in, like pixel count or quieter fans or 2% more battery life.

Apple is a one-trick pony, which isn't necessarily a bad thing. They've made a lot of money off of being a one-trick pony. Their devices look good and are great at video and music editing or other multimedia tasks that those devices are optimized for. They're not good at anything else, especially stuff they are not optimized for. I don't understand why they chose to blatantly lie about graphics performance, which is something EVERYONE KNOWS they are not good at.
 
I'm fed up with this naive perspective. We're not talking amateur usage here; we're talking rather large datasets and rather large numbers of GPUs. We are worried about energy costs because in the data centres where we run this stuff, that's the most expensive thing.

We probably run 200 of your assumed "professional" workloads concurrently...
Let's be serious here. Which is more important to you: the time it takes to do a task, measured in dollars, or the energy it takes to do a task, measured in dollars? It's sort of a trick question because they are both tied to each other, and time is the driving factor.

If it takes 2-3 times as long to do something with an M1 Ultra as it does with an RTX 3090, which is using 2-3 times the power, is there really any savings in switching to the M1 Ultra? Especially when you have to deal with the costs of switching and the lack of flexibility that we all know Apple products have?
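The arithmetic behind that point is worth spelling out: energy per task is just power × time, so a machine drawing a third of the power but taking three times as long uses about the same energy per task, while still costing you the extra hours. The wattages and runtimes below are illustrative assumptions, not benchmarks:

```python
# Energy per task (kWh) = average power draw (kW) * hours to complete the task.
# Both rows use assumed, illustrative numbers.

def energy_per_task_kwh(avg_watts: float, hours: float) -> float:
    return avg_watts / 1000 * hours

# Assumed: an RTX 3090 box finishes a job in 1 h at ~600 W;
# a slower, lower-power box takes 3 h at ~200 W.
fast_box = energy_per_task_kwh(avg_watts=600, hours=1.0)  # 0.6 kWh per task
slow_box = energy_per_task_kwh(avg_watts=200, hours=3.0)  # 0.6 kWh per task

# Same energy per task, but the slower machine costs two extra hours of wall-clock time.
print(fast_box, slow_box)
```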
 
I'm fed up with this naive perspective. We're not talking amateur usage here; we're talking rather large datasets and rather large numbers of GPUs. We are worried about energy costs because in the data centres where we run this stuff, that's the most expensive thing.

We probably run 200 of your assumed "professional" workloads concurrently...
You're not running Mac Studios in a data centre though, are you?
 
Where did I say this? That is not the only thing I do; I also record educational videos for software development and let's-play videos. Sometimes the long series are 10 hours of video, and the Mac cuts the export time down drastically.
Ah, so the Windows game is a side hustle then. Cool. Why can’t you make the Windows game on the Mac?
 
They've always told little lies, twisted the truth, or used meaningless stats that weren't so easy to disprove in order to get overenthusiastic "technology" writers to give them good press. However, lately their lies have been outrageous and easy to disprove.

It took "courage" to remove the headphone jack in order to force your gullible customers into spending hundreds of dollars on crappy wireless earbuds. Everyone with half a brain cell knew this was just a money grab. Last year's M1 was supposed to beat an RTX 3080. It couldn't even beat a 3060. This year's M1 Ultra was supposed to beat a 3090...

The real problem isn't Apple; it's the people who want to be seen as cool and technology savvy (they're neither) so badly that they actually do some impressive (and embarrassing) mental gymnastics to justify getting got by Apple. Removing the fans from a computer is just plain dumb. Bluetooth doesn't sound better than wired (although it is convenient), macOS and a powerful ARM chip are worthless if they can't run the applications you need for work or school (or gaming), and power efficiency is worthless if it takes you 2-3 times as long to do the same task.

Real “my use cases don’t work well on a Mac and Apple has personally wronged me, so people who defend them must be lying” vibes.

And no, nobody has ever claimed Bluetooth “sounds better”.
 
Let's be serious here. Which is more important to you: the time it takes to do a task, measured in dollars, or the energy it takes to do a task, measured in dollars? It's sort of a trick question because they are both tied to each other, and time is the driving factor.

If it takes 2-3 times as long to do something with an M1 Ultra as it does with an RTX 3090, which is using 2-3 times the power, is there really any savings in switching to the M1 Ultra? Especially when you have to deal with the costs of switching and the lack of flexibility that we all know Apple products have?
A compromise is the requirement. I mentioned this earlier. It's a function of cost, time and reliability, and the RTX cards don't do well at all on the reliability front. We lose nodes all the time due to GPU failures. Not joking: the RMA queue is pretty long for a lot of higher-power Nvidia cards. If you buy them from HPE, we're talking a 6-month lead time.

The point is we have a single vendor point of contact as well.

The key thing is our training routines usually get left overnight anyway and take 4-5 hours to run. If they take 10 hours overnight, use half as much power, and blow up 50% less often, then that's a win.
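One rough way to model that trade-off is expected cost per completed run: energy cost per attempt divided by the probability an attempt survives, so a node that fails half as often gets a real discount even if it is slower. All the wattages, runtimes, failure rates and the tariff below are assumptions for illustration, not figures from the poster's cluster:

```python
# Expected cost of one *completed* overnight training run, counting re-runs after failures.
# For independent attempts, expected attempts per success = 1 / (1 - p_fail).
# Every input here is an assumed example value.

def expected_cost_per_run(avg_watts: float, hours: float, p_fail: float,
                          price_per_kwh: float = 0.34) -> float:
    energy_cost_per_attempt = avg_watts / 1000 * hours * price_per_kwh
    expected_attempts = 1 / (1 - p_fail)
    return energy_cost_per_attempt * expected_attempts

# Assumed: a GPU node finishes in 5 h at ~800 W but fails 10% of the time;
# a slower, lower-power node takes 10 h at ~300 W and fails 5% of the time.
gpu_node = expected_cost_per_run(avg_watts=800, hours=5, p_fail=0.10)
low_power_node = expected_cost_per_run(avg_watts=300, hours=10, p_fail=0.05)

print(f"GPU node:       £{gpu_node:.2f} per completed run")
print(f"Low-power node: £{low_power_node:.2f} per completed run")
# As long as both fit inside the overnight window, the lower expected cost per
# completed run wins, which is the trade-off described above.
```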
 
Ah, so the Windows game is a side hustle then. Cool. Why can’t you make the Windows game on the Mac?
I technically could. I am using both Unity and MonoGame for my project, both of which are cross-platform. But I find Visual Studio + ReSharper on Windows to be the best for development (for me at least). So that is why I still have Windows PCs around, plus a few very very VERY old games/Windows-only games I still like to play. It is unfortunate, as I really am not a fan of Windows, which is why I do nearly everything else on my Mac (for example, if I am working on my game and need a quick image for a sprite, I will use Photoshop on my Windows PC, but I try to do all of that on my Mac).
 
These new Mac Studio Ultras are already dated in that one leap in GPU power will make them yesterday's news. From the Pro side of things, being able to scale up one's hardware without actually replacing it is paramount in the business world. These M1s are essentially throwaway devices meant to be replaced more frequently than a desktop traditionally is. More for the landfill... but hey, they used less power when they were relevant for about 2 years.
Interesting… because where I work, we have a heap of generic Windows boxes. We purchase high-spec machines now and never open them up to replace components. We just buy new machines. This whole “modular, upgradable components” narrative people are being sold just doesn’t stack up. When a new GFX card comes out, there’s invariably a new CPU/socket/RAM combination that also needs replacing - it’s much easier to just buy a whole new machine with warranty and support.
 
A compromise is the requirement. I mentioned this earlier. It's a function of cost, time and reliability, and the RTX cards don't do well at all on the reliability front. We lose nodes all the time due to GPU failures. Not joking: the RMA queue is pretty long for a lot of higher-power Nvidia cards. If you buy them from HPE, we're talking a 6-month lead time.

The point is we have a single vendor point of contact as well.

The key thing is our training routines usually get left overnight anyway and take 4-5 hours to run. If they take 10 hours overnight, use half as much power, and blow up 50% less often, then that's a win.
There's no guarantee that Apple will be any more reliable. You're just trading one demon for another, with no gains in efficiency or power consumption whatsoever (by your own admission).

I had a roommate who worked in the IT department for a large (US) government organization. He was a major Apple fan. He got excited when his organization launched a pilot program to switch some of their services and hardware over to Apple. The program failed because Apple couldn't meet his organization's demand, Apple was inflexible when it came to giving any type of control to his organization's IT department, and Apple couldn't support all of the applications his organization ran. It just wasn't cost effective to switch even a few of their machines and services to Apple.
 
Interesting… because where I work, we have a heap of generic Windows boxes. We purchase high-spec machines now and never open them up to replace components. We just buy new machines. This whole “modular, upgradable components” narrative people are being sold just doesn’t stack up. When a new GFX card comes out, there’s invariably a new CPU/socket/RAM combination that also needs replacing - it’s much easier to just buy a whole new machine with warranty and support.
I agree with the "savings" of buying new machines. However, most large (and specialized) organizations have their own IT departments that provide support. Apple doesn't like to give up control, and their machines are notoriously terrible for on-site repair.
 
I agree with the "savings" of buying new machines. However, most large (and specialized) organizations have their own IT departments that provide support. Apple doesn't like to give up control, and their machines are notoriously terrible for on-site repair.

Yes, and IT departments rarely waste time installing upgrades. They buy machines on a lease and replace them on a cycle.
 
I had a roommate who worked in the IT department for a large (US) government organization. He was a major Apple fan. He got excited when his organization launched a pilot program to switch some of their services and hardware over to Apple. The program failed because Apple couldn't meet his organization's demand, Apple was inflexible when it came to giving any type of control to his organization's IT department, and Apple couldn't support all of the applications his organization ran. It just wasn't cost effective to switch even a few of their machines and services to Apple.

Interesting… IBM swapped >100,000 of its employees to Macs and is saving $535 per Mac every 4 years in comparison to PCs. IBM says it’s 3x more expensive to support running PCs… and that was when they were on Intel CPUs.

 
Real “my use cases don’t work well on a Mac and Apple has personally wronged me, so people who defend them must be lying” vibes.

And no, nobody has ever claimed Bluetooth “sounds better”.
Apple hasn't "personally wronged me" in any way. I learned how to use a computer on their machines first (Apple IIc in school) and was an Apple fan until I started building gaming rigs.

However, over the years I worked with the government, I personally witnessed the shortcomings of Apple's products. As I mentioned in one of my other posts, they are great for audio and video editing and other multimedia tasks that they are optimized for. Once you go outside of that optimization, Apple's products are no better than Windows machines at 1/4 to 1/2 the price. I have many friends who bought Macs and didn't know what they were getting themselves into.

I think it's dumb for Apple to advertise things that they know their machines can't do. They might get a few suckers, but they will also make a lot of enemies.
 
Interesting… IBM swapped >100,000 of its employees to Macs and is saving $535 per Mac every 4 years in comparison to PCs. IBM says it’s 3x more expensive to support running PCs… and that was when they were on Intel CPUs.

IBM is a software services company now. Most organizations have large IT departments that service their HARDWARE.

Someone posted that I didn't like Apple because Apple didn't support my use case. Wouldn't that be a valid reason not to "like" a company? Why would a person or organization like or support a company that doesn't support their use case? Apple supported IBM's use case, which wasn't hard to do given the fact that IBM has moved away from hardware to software services. Apple doesn't support every person's or organization's use case. I think some of you are failing to understand that.
 
There's no guarantee that Apple will be any more reliable. You're just trading one demon for another, with no gains in efficiency or power consumption whatsoever (by your own admission).

I had a roommate who worked in the IT department for a large (US) government organization. He was a major Apple fan. He got excited when his organization launched a pilot program to switch some of their services and hardware over to Apple. The program failed because Apple couldn't meet his organization's demand, Apple was inflexible when it came to giving any type of control to his organization's IT department, and Apple couldn't support all of the applications his organization ran. It just wasn't cost effective to switch even a few of their machines and services to Apple.
I'm aware of that entirely.

We don't do anything without extensive qualification, but the point is you throw an idea on the table and test it to destruction.
 
Forget the actual results. The worrisome part for Nvidia is Apple has a chip that is quite good and might overtake them at some point, even though it is not a discrete graphics card. I mean, this stuff is built into the chip. Everyone used to laugh at built-in graphics. Not so much anymore.
I agree this is the issue.
This is basically a rerun of the argument a few years ago about whether Apple was *really* faster than Intel. Obsessing over minor benchmarking details and whether a particular comparison was "fair" or not missed the more important points:
- that Apple was providing competitive performance -- in a PHONE chip...
- that Apple was on a very different annual improvement trajectory than Intel.

We see the exact same thing here. This is Apple's *first* run at a desktop-class GPU and already it's "competitive" with Nvidia. Soon (end of this year? this time next year?) we'll be seeing M2s with the GPU that's the successor to the A15 GPU. And that A15 GPU was a substantial boost over the A14 GPU...

Most of these benchmarks are on code that was converted to Metal sometime in the past year and has not yet had the full optimization treatment applied. That, and Apple's annual trajectory, are the issue, not whether one can find a benchmark showing this curve goes higher than that curve.
 
I'm fed up with this naive perspective. We're not talking amateur usage here; we're talking rather large datasets and rather large numbers of GPUs. We are worried about energy costs because in the data centres where we run this stuff, that's the most expensive thing.

We probably run 200 of your assumed "professional" workloads concurrently...
If Apple Silicon is overall cheaper, as you imply, what is stopping cloud providers from using it themselves and providing a cheaper service (by economies of scale) than owning the hardware outright?
 