Jason Snell did.
And Apple’s specs say the noise has gone from 15dB on the M1 to 6dB on the M2. That’s quite a difference.

It's a whole 6 smaller!
(Oblig Spinal Tap reference)
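Since decibels are logarithmic, a drop from 15dB to 6dB is bigger than the linear reading suggests. A rough sketch of the standard conversions (plain dB math, not Apple's measurement methodology):

```python
# A difference of D dB corresponds to a sound-power ratio of 10^(D/10).
diff_db = 15 - 6  # 9 dB quieter on paper

power_ratio = 10 ** (diff_db / 10)   # ~7.9x less sound power
print(f"{power_ratio:.1f}x less sound power")

# Perceived loudness is often approximated as halving per -10 dB,
# so a 9 dB drop is close to "half as loud" to the ear.
loudness_ratio = 2 ** (diff_db / 10)  # ~1.9x quieter perceptually
print(f"~{loudness_ratio:.1f}x quieter to the ear (rule of thumb)")
```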
I know it sounds daft, but I wish the mini and Studio came in colours.
Imagine a blue-shaded one, or gold?
A black one to match modern AV gear would be great too.

You ask, I deliver!
Is this from real-life experience? Mine begs to differ.
The M1 Ultra was at best 40% faster than the Max, and only in a few very specific situations.
Otherwise the gains were single-digit in most other tasks.
Clearly there were issues with software not utilizing it well.
Almost everyone is better off getting the Max.
I'm not getting my hopes up that this has been rectified yet.
You have to test it in many different real-life ways to really see its issues.
How many M2s are we going to get?
M2, M2 PRO, M2 MAX, M2 ULTRA, M2 HYPER!!!
[attached images]
Pivot screen is a nice idea, but useless in the real world for graphic/photo editing. There is a very noticeable color shift from landscape to portrait orientation. It's okay for word processing or spreadsheets where color isn't critical.

I got something better and useful for you.
[attached images]
If that was the goal, they would simply pull it. Problem solved, migration complete.
A plausible explanation is that it's a placeholder that hardware developers can use to write macOS drivers for PCIe hardware. It would serve the same purpose as the early "developer" Apple silicon minis. When the real thing is released with an M3 SuperDuper, the hardware will already be ready and tested for the platform.
Jason Snell did.
And Apple’s specs say the noise has gone from 15dB on the M1 to 6dB on the M2. That’s quite a difference.
And Apple already did.

Oh, come on, it was obviously inspired. But I meant that the lamp iMac was quite a lot more elegant, and every time Microsoft tries to "go Apple", it delivers weird, almost-very-cool hardware that ultimately fails.
I’m saving up for the MkUltra Mac Studio.

NetherRealm Studios or CIA?
Spoken like someone who doesn't have a clue about the actual engineering, and who thinks words matter more than the engineering.
No one in this space gives a fsck about the words that are used. What matters is the upcoming technical details: first GAA, then BSPDN, then ForkFET, then clock on the backside, then CFET. Every one of these is a huge, important, and difficult step forward, and they will be marked by a succession of names like A24, A20, A18 or whatever.
There is NOTHING stopping you from learning enough about the technology to be part of this great adventure, to understand that the significant step behind each of these new names is whatever it is: a new type of transistor, a new way to deliver power, or whatever. But you don't want to do that, do you? You'd rather sneer about "marketing" and how you're not smart enough to be "fooled" (like TSMC cares what you think? Are you buying space in TSMC's fabs?) than spend a day educating yourself about one of the most interesting adventures of our time.

Don't sugar coat it like that, kid; tell 'em straight.
Hmmm .. @User 6502 🤔 Which "professionals" are you referring to? I don't recall protests from professionals disappointed with the performance of the M1 GPU in the devices they typically purchased -- i.e., MacBook Pros with M1 Pro/Max SOCs.
In fact, I recall the opposite: many reviews from professional photographers and videographers endorsing the M1 and extolling the speed (processing, rendering and I/O), efficiency and other virtues of Apple Silicon and hailing the return of the MacBook Pro.
Net-net: The overall conclusion I recall from "professionals" was that the M1 raised the bar for performance and efficiency substantially (so much that it triggered speculation about the potential beginning of the end for Intel). Are you referring to some other type of "professional"?
Or this can be a placeholder processor for the Mac Pro until they get the engineering for an "Extreme" (x2 or even x4 Ultras) worked outApple wants to kill the Mac Pro. It doesn’t make sense any other way I can think it out.
Given it's the same process, the increased performance should come at a thermal cost.
Has anyone compared this to the M1 version to see fan speeds and noise levels?