I sincerely don’t think Microsoft will ever dump Intel for Arm, just due to the sheer volume of 16-, 32- and 64-bit code written over the space of 40 years that corporations rely on and still use. One can dream of Intel finally getting their comeuppance for all the dirty sh*t they’ve pulled to stay on top, but life is seldom ever fair or just.
This is a great observation; however, I’m still amazed at how quickly my own company has been moving away from all of our custom apps and ERP software and into browser-based (cloud 🙄) and SaaS-based solutions.

Shoot, we’ve even talked about decommissioning our VDI in favor of the Workspace-type options.

x86 needs at the user’s endpoint really are diminishing quickly.
 
The point is more that Apple offers very affordable hardware that is more than capable of handling taxing workflows, and these are just their entry-level devices. You have a Mac mini with Final Cut Pro (which is a one-time payment, not subscription-based like Premiere), and you also have the iPad Air with LumaFusion. Both are more than capable of processing 4K footage with ease and then some.

It also makes one salivate at the thought of what their higher end Macs will be capable of.

Cool, but what software does it run? What hardware does it support?

Those reasons alone are where x86 wins, and AMD would have been the right choice, striking the correct balance of performance versus usability.
 
Cool, but what software does it run? What hardware does it support?

Those reasons alone are where x86 wins, and AMD would have been the right choice, striking the correct balance of performance versus usability.
Enough for me as a teacher with fairly modest software needs.

I mean, take the lowest-hanging fruit. My M1 MBA (Apple's entry-level laptop) can sustain 9-10 hours of Zoom on a full charge, while staying silent (coz no fan) and icy-cool to the touch. And when my school moved to home-based learning in April last year, I was recording screencasts on my iPad Pro and editing them in LumaFusion.

I think people are focusing too much on specs in a vacuum, and not enough on what they mean for the end user. Too much of "here's what it can do, never mind that many people don't actually use it that way" and not enough of "here's how a particular task is done that makes it better than the competition".

And this is why I think Apple is really on to something big here. Macs already do most of what people need to get done on them anyways. The key differentiator will be in how it gets those tasks done, and which offers the better experience in doing so.
 
Cool, but what software does it run? What hardware does it support?
Well, given that he was talking about video editing, I can say Apple's Final Cut Pro and Blackmagic Design's DaVinci Resolve. I am not sure what you mean by "what hardware does it support", as the systems are currently laptops and the Mac mini, which support USB and Thunderbolt peripherals?
Those reasons alone are where x86 wins, and AMD would have been the right choice, striking the correct balance of performance versus usability.
You mean worse performance with no gain in usability? Seems like an odd argument.
 
Sure it will, and it's good that it does, but does it matter that much?
I'm happy that my general-purpose M1 MBA is quiet and fanless, and I accept the throttling (haven't noticed it so far).
Know your usage, and choose wisely...
In all honesty it really doesn't, but based on what I've heard from M1 users it keeps the fans fairly quiet and performs very, very well, whereas Intel had a major throttling issue in most of their Mac setups.

I find it funny that I haven't seen that mentioned, looking at it in the sense of 'as long as we're throwing stones at the moment, Intel, how's that glass house coming along?'
 
Yes in the MacBook Air, which is passively cooled; no/very little in the MacBook Pro; and I believe none at all in the Mac mini.
In which case, with all the stones that Intel is throwing, how many are rebounding and smashing their own glass house? Lol

Fan noise is a massive arse ache when I'm running a DAW and making music, to the point where a Turbo Boost switcher is a godsend. I find this whole Intel thing quite the opposite of what the company's name suggests.
 
Well, given that he was talking about video editing, I can say Apple's Final Cut Pro and Blackmagic Design's DaVinci Resolve. I am not sure what you mean by "what hardware does it support", as the systems are currently laptops and the Mac mini, which support USB and Thunderbolt peripherals?

You mean worse performance with no gain in usability? Seems like an odd argument.

Not all hardware is supported by Macs anymore, especially $2,000 professional equipment that now only works on Windows, thanks to Apple.

What worse performance? An AMD 15W mobile CPU beats the M1 in multi-core benchmarks. And this is even an older-generation AMD CPU on 7nm. So AMD can definitely catch up to Apple's ARM CPU if they switch to 5nm at some point.

Point is, software availability on Mac OS X is not that great, which x86 solved by allowing Mac users to run Windows on the side. So that is why AMD would have been the best option if Intel's performance was the concern.
 
Hmm
 

Not all hardware is supported by Macs anymore, especially $2,000 professional equipment that now only works on Windows, thanks to Apple.

If it's just a question of software support for the hardware, a lot of that can simply be recompiled - not hard if the company is still in business and wants to do it. And frankly, if they aren't going to do it, then support would've been lost eventually anyway. If it's a professional eGPU, support will likely come later. Technically it can already work: Corellium showed an eGPU working on the M1 using their Linux port.

What worse performance? An AMD 15W mobile CPU beats the M1 in multi-core benchmarks. And this is even an older-generation AMD CPU on 7nm. So AMD can definitely catch up to Apple's ARM CPU if they switch to 5nm at some point.

Point is, software availability on Mac OS X is not that great, which x86 solved by allowing Mac users to run Windows on the side. So that is why AMD would have been the best option if Intel's performance was the concern.

Not really. Zen 2 chips have to run at around 2x the wattage to beat the M1 in multicore. Yes, those Zen 2 chips are "15W" chips, but truthfully, for x86 mobile chips the "TDP" numbers are almost meaningless - suggestions, if you will. Some of those "15W" chips peak near 60W and sustain 30W.

However, from what I've seen, Zen 3 actually running at 15-20W *can* beat the M1 in multicore - which, yes, is impressive given it's still on 7nm! But looking at single-core performance numbers, I doubt just moving up a node would help enough, though of course it would close the gap. You can downclock a Zen 3 single core to a Firestorm core's wattage and still be within 20% of its performance, but it is unclear, doubtful in fact, that the shift from 7nm to 5nm delivers the promised 15% performance gain (which still doesn't quite get you there) at that point in Zen 3's power curve - node-shift performance/power improvements aren't linear over an entire power curve (cmaier would know more about this and will no doubt correct me if I've made an error). Zen 3 is a great uarch, no question, and on top of that SMT allows it to improve performance and performance per watt for fully saturated workloads, but it's still behind in single-threaded programs. Every design has pros and cons.
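
To make the non-linearity point concrete, here's a toy sketch: a cubic power-frequency model with invented constants (not measured Zen 3 or M1 data), showing why halving a core's power budget costs far less than half its clock, and why a node's iso-power gain quoted at one operating point doesn't simply transfer to another.

```python
# Toy power-curve model: dynamic power scales roughly with C * V^2 * f,
# and voltage must rise with frequency, so power grows close to
# cubically with clock. All constants are invented for illustration;
# these are not measured Zen 3 or M1 figures.

def freq_at_power(watts: float, k: float = 0.55) -> float:
    """Highest clock (GHz) sustainable at a power budget, if P = k * f^3."""
    return (watts / k) ** (1 / 3)

f_20w = freq_at_power(20)  # ~3.31 GHz
f_10w = freq_at_power(10)  # ~2.63 GHz: half the power, only ~21% less clock

print(f"20 W -> {f_20w:.2f} GHz, 10 W -> {f_10w:.2f} GHz")

# The flip side: a node's advertised "+15% performance at iso-power" is
# quoted at one operating point; lower on the curve the gain can be
# smaller, which is why a 7nm -> 5nm shrink alone may not close a
# single-core gap.
```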

But yes, by moving to its own silicon, beyond massively improving performance and performance per watt over Intel, Apple has also increased control over its own destiny and is more tightly vertically integrated - something the company values very highly. They wouldn't get that with AMD. Also: Apple can afford for their own silicon to be on TSMC's newest node and put out a new CPU every year. AMD is still too small to guarantee that.
 
It's because drawing more power is generally a sign of an inefficient design/fabrication. You've got it back to front: AMD can achieve the same (and better in multicore) with less power - if Intel could do that, they would. It's one of the reasons (along with die size) why Intel dropped cores in its desktop line. It is true that in desktops - well, big towers/workstations anyway - power draw is less of a concern than in laptops, but it still means more/louder cooling to keep the chip from melting and, if something is meant for more continuous operation like HPC, higher costs. In smaller desktops power draw can still matter for performance, as they can be harder to cool.

Oh, and I forgot a big one: sustaining higher power through the silicon to achieve better, stable performance also means tighter binning (quality control) on that silicon, which is more expensive, so higher costs. Truthfully, Rocket Lake was never meant for 14nm; it had to be backported from 10nm because of Intel's fabrication woes (again why Intel had to drop total cores). It still wouldn't quite be as good as Zen 3, just as Tiger Lake (essentially mobile Rocket Lake) isn't as good as mobile Zen 3, but it would be better.
Are you saying AMD could run with more power, more heat and *faster single-core performance* but they’re choosing to cap their maximum performance, even in desktop, for the sake of power efficiency? If so that seems like an odd choice to me. They’re basically turning down sales. I think it’s more likely that they haven’t been able to make the high power draw work for them.
 
Just not correct. The Blackmagic Design URSA Mini Pro 12K is $10K and is a great camera. There are quite a few others at under $20K, and a Canon R5 at under $5K.

It has 40Gb/s Thunderbolt 3 and can support a nice storage array with lots of video.
That's why I said camera "setup"; this usually doesn't end with the "raw" camera body purchase. Somebody who is seriously and currently in need of 8K raw footage will need different lenses, mounts, stabilizers, lights, rails, follow focus, additional storage for camera/editing and backup, monitoring devices, additional sound recording devices, mics, and much more...

A Mac isn't even worth mentioning; it's totally outweighed by the rest of the equipment.
 
Are you saying AMD could run with more power, more heat and *faster single-core performance* but they’re choosing to cap their maximum performance, even in desktop, for the sake of power efficiency? If so that seems like an odd choice to me. They’re basically turning down sales. I think it’s more likely that they haven’t been able to make the high power draw work for them.
AMD is selling every chip they can make and the rumor mill today is that they’re announcing faster Ryzen processors next week. :) We’ll see if that actually pans out as it is very much a rumor, but they have the thermal headroom to do it relative to Intel. But remember, I also said that higher frequencies (i.e., more power with still-stable silicon) require tighter binning of the silicon. That means fewer, more expensive chips for those performance guarantees. AMD is currently supply constrained; they can't make enough to sell to everyone who wants one. Last generation, Intel actually guaranteed too much performance on its top-end chip and couldn't make enough of them, so eventually it introduced a lower-guaranteed-performance, higher-TDP chip in its real top slot that it could actually bin enough silicon for and that people could actually buy. In fact, Intel used to (and may still) auction extremely rare, superb silicon (like a tiny percent of production) that could boost just that ever so much higher to high-frequency traders. I think it's called the Black line or something, and it isn't actually possible to buy in most cases except at these special, invite-only auctions. Manufacturing for super high frequency is not easy.

Getting back to Apple ... I'm not sure it's known how tightly they have to bin their silicon to achieve their performance consistently across manufactured SoCs. My guess is nowhere near as tightly. Someone who knows more could probably make a better guess, but here's my thinking: they have to sell tens of millions of iPhones with very similarly specced cores (the M1 is clocked less than 10% higher), all of which have to hit that frequency without blowing up (figuratively ... and also literally). Given their performance-to-power curves and manufacturing needs, Apple probably has a lot of thermal headroom if they wanted to push their chips faster. And this link between thermal headroom and binning is another reason why the M1 (and potentially ARM in general) is thought to be superior to most x86 implementations at the moment: it's also cheaper to get good performance out of the silicon. Not bad for the bottom line.
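
If it helps, here's a tiny binning simulation (the frequency distribution is completely made up, not real fab data): the tighter the clock you guarantee, the smaller the fraction of dies that qualifies, which is exactly the cost pressure I mean.

```python
# Toy illustration of binning: pretend each die's maximum stable clock
# is normally distributed. Numbers are invented, not real fab data.
import random

random.seed(42)
dies = [random.gauss(4.8, 0.15) for _ in range(100_000)]  # max stable GHz per die

for guarantee in (4.6, 4.8, 5.0, 5.1):
    yield_pct = 100 * sum(f >= guarantee for f in dies) / len(dies)
    print(f"guarantee {guarantee} GHz -> {yield_pct:5.1f}% of dies qualify")
```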
 
Intel has nobody to blame but themselves. For years they got lazy while AMD struggled to release a competitive lineup on the x86 side, and they refused to put proper resources into the growing mobile market.

From the 3rd-gen Core i5/i7 processors up until around the 8th gen, one could theoretically just upgrade to an SSD, slap in 8GB of RAM, do a few GPU upgrades, and play any game on the market in full HD/1440p with ease. Only now are many jumping from Ivy Bridge/Haswell processors, as the extra cores are becoming more important.


Office computers? I’ve seen outlets still operating with 2nd-gen processors and in no rush to upgrade their systems, as software has become more efficient; they aren’t swapping out hundreds or in some cases thousands of desktops for gains that would go unnoticed by most employees.
 
That's why I said camera "setup",
Sorry, your argument is just silly. No one needs to buy an extra $90,000 in gear for a $10,000 camera just to shoot a project.
this usually doesn't end with the "raw" camera body purchase.
Aside from the reality that most people would rent much (if not all) of this gear (certainly lenses), one could purchase everything that one would need for an outdoor shoot for an additional $15K shooting on sticks or $25,000 with a gimbal. Add a few thousand more for lights if one is shooting indoors.
Somebody who is seriously and currently in need of 8K raw footage will need different lenses, mounts, stabilizers, lights, rails, follow focus, additional storage for camera/editing and backup, monitoring devices, additional sound recording devices, mics, and much more...
Having shot a two-day short film project with two cameras at 6K 60p in Blackmagic RAW just before the pandemic, I can tell you that I spent under $1,500 on gear rental and under $1,000 on additional gear purchased for the project. I would have shot at 12K, but the URSA Mini Pro 12K had not yet shipped.
A Mac isn't even worth mentioning; it's totally outweighed by the rest of the equipment.
Except that one typically purchases computers and one usually rents production gear.
 
AMD is selling every chip they can make and the rumor mill today is that they’re announcing faster Ryzen processors next week. :) We’ll see if that actually pans out as it is very much a rumor, but they have the thermal headroom to do it relative to Intel.
That’d be nice - I still haven’t picked up a new CPU (mostly because I can’t get a new GPU and I want a whole new gaming PC), so if AMD announce a new one with even better single core performance I’m in!
 
So this is worse than we think! What a disaster!! I don’t know what Intel wants to get from this... Switching to a PC to play Rocket League? That’s really weak, and if needed I can play it through GeForce Now on a Mac 😂

I hope this makes Apple accelerate the transition to its own silicon. It’s really disrespectful.
I think Apple planned all along to underpromise and overdeliver on the transition.
 
If you're flipping through your photo library smoothly, zooming in and out, rotating, adjusting... you're on a Mac with a touchpad that is a dream to use. Something no PC has equalled; they just hope that their clumsy, sticky touchscreen fills in for their crappy input devices.
 
That’d be nice - I still haven’t picked up a new CPU (mostly because I can’t get a new GPU and I want a whole new gaming PC), so if AMD announce a new one with even better single core performance I’m in!
If it does happen and if they interest you, then, given AMD’s supply issues, I’d recommend you buy ‘em quick, because they’re going to sell out *really* fast. Of course, the GPU supply issues will sadly be bad for a while too ...
 
That’s not what IBM thinks…

I’ve never worked in a large corporation that upgraded existing kit or repaired it.
 
As many of the commenters here have made clear, this campaign is totally setting them up for ridicule.

I don’t expect anything different from Intel in the short run though. They have been stagnating for quite a while already, and I doubt a new CEO will be able to just waltz in, snap his fingers and somehow magically make their transition to 7nm or even 5nm happen overnight.

In the meantime, adverts like this may be the only thing Intel can do to remind the PC world that they still exist.
 
Sorry, your argument is just silly. No one needs to buy an extra $90,000 in gear for a $10,000 camera just to shoot a project.
Sorry, that's nonsense. The Blackmagic URSA Mini Pro 12K rig alone, as advertised on their site, comes very close to that, and that's without any special additional accessories. Talking euros here! A single Zeiss Supreme lens costs €17k; if you decide to get two or three, it adds up quickly, and the full lens set is €90k. Of course not all lenses are always needed, but 1-3 is very common, which is probably why you even used two cameras: swapping takes time and breaks the workflow, and with two decent camera setups you get past that even more easily.
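
Just tallying the prices quoted in this exchange makes the point; a minimal sketch, where the accessories line is my placeholder guess rather than a quoted figure:

```python
# Sums only prices mentioned in this thread; the accessories line is a
# placeholder guess, not a quoted figure.
setup_eur = {
    "URSA Mini Pro 12K body": 10_000,       # "$10K" per the earlier post
    "3x Zeiss Supreme primes": 3 * 17_000,  # €17k each, per this post
    "rig/support/accessories": 15_000,      # placeholder estimate
}
total = sum(setup_eur.values())
print(f"Total: ~€{total:,}")  # ~€76,000; a full Supreme lens set alone is €90k
```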

Aside from the reality that most people would rent much (if not all) of this gear (certainly lenses), one could purchase everything that one would need for an outdoor shoot for an additional $15K shooting on sticks or $25,000 with a gimbal. Add a few thousand more for lights if one is shooting indoors.
Sounds like you're not speaking for a company; these kinds of things are written off and help lower the overall tax burden.
Renting is a waste of money in the long term, especially if the main business is filmmaking.
Renting might be worth it for the occasional camera guy, or for someone who "quickly" needs something missing from his inventory, or who can't put a larger amount of money on the table. Good studios own these things.

Having shot a two-day short film project with two cameras at 6K 60p in Blackmagic RAW just before the pandemic, I can tell you that I spent under $1,500 on gear rental and under $1,000 on additional gear purchased for the project. I would have shot at 12K, but the URSA Mini Pro 12K had not yet shipped.
Just because it suits your needs doesn't mean it suits all needs, and since you're renting, you're out of the game anyway. It must have been a quick, small project; a few weeks or months of renting two cameras would have exceeded that sum by far, regardless of the currency.

Except that one typically purchases computers and one usually rents production gear.
Sorry, but I don't think there are any case studies out there showing that cameras are primarily rented rather than purchased.
Anyway, the ones renting the gear out had to purchase it in the first place.

In any case, a €699 M1 Mac isn't even worth mentioning for 8K raw footage recording and editing.
It goes down like a toilet paper roll among all that higher-priced equipment.
 