This makes me want to buy one even though I have absolutely no use for it except for commenting on MacRumors.
Better wait for the M3 Ultra. Safari is super snappy on the M3 Ultra, which boosts your productivity when commenting on MacRumors.
Wow, some people just can’t take good news.

With respect to the "Pro" laptops: starting at 8GB in 2023 is terrible.
With respect to the iMacs: leapfrogging over the M2 is weak.
My base M2 Mini has TB4, while the base M3 has TB3; that's a hardware regression. Although 3 and 4 both support 40Gb/s ... if past Apple history proves anything, they'll use version 3 versus 4 as an artificial cutoff point for macOS upgrades in the future...
And then finally, the 150/300/400/800 GB/s bandwidth of their unified memory is kind of ridiculous. Before, it was 85-ish, 100, 200, or 400 GB/s in versions one and two. They have the die shrink available to them, but they opted not to push the boundaries, and to me that seems lazy, as they own the whole stack now.
And finally, they rarely compared anything of real value to the M2. It was all M1 or Intel stuff, so clearly they're just trying to lure all the people on old Intel hardware into an upgrade. That's a money decision, not a technology decision, and I thought we were told "we ain't seen nothing yet." They are just itching to drop support for Intel machines, and it's gonna come sooner rather than later, I think.
To me, it's clear that they artificially slowed things down since they were so far ahead of the competition; imo, a sad trombone is warranted.
And, really minor: they took away support for high-impedance headphones... Really, just give us a decent DAC/amp like the one that's already in my base M2 Mini.
I will, however, cut them slack for the lack of Wi-Fi 7...
Finally, from reading multiple websites, nobody's running out to trade in their M2 machines hand over fist, so it seems like it's initially going to be a slow roll until M4.
I absolutely could be wrong though...
Why use an OpenCL benchmark, a standard macOS doesn’t support and hasn’t for a while? That’s essentially like testing the speed of an emulator, which will never look good. Aztec Ruins Offscreen and other Metal-based tests are much better and put the GPUs far ahead of the weak OpenCL scores. The Windows equivalent would be to turn off DirectX 11 and 12 and expect Windows machines to do well in graphics performance.
That’s precisely my point. Using OpenCL benchmarks is not going to express the true abilities of the GPU cores, but rather a handicapped version that isn’t even available in macOS out of the box and has to be installed from other sources. Using OpenCL is worthless as a measurement of true graphics performance, so I wonder why anyone still runs those benchmarks. If Apple were still supporting OpenCL, then I would agree that the benchmark would be valid.

Because Metal benchmarks don't show up yet.
And OpenCL isn't fully deprecated. It's still used by many apps, including Photoshop, which Apple benchmarks themselves. There's no Metal version of Photoshop.
So comparing the OpenCL benchmarks against other Macs is fine. Just don't compare against PCs, because they use a more modern OpenCL version.
It's at the level of an AMD Radeon RX 6850M XT (a mobile part).
If it maintains the same performance improvement over the M2, the M3 Ultra will be at the level of an Nvidia GeForce RTX 2080 Ti or an AMD Radeon RX 7800 XT.
Those AMD and Nvidia cards are more powerful… only when they’re plugged in. And they consume what, 5x the power?
I think the whole Mac Pro strategy went in the bin as manufacturing was badly delayed, and design failures meant we never saw an Extreme version. I expect Apple's strategy was a Mac Pro with Ultra and Extreme chips. If they've managed to fix the design failures, we may see an M3 Extreme next year.

Guys, Apple is just maintaining the tradition of releasing a new-gen Mac Pro and then outclassing it six months later with a cheaper machine.
Holy smokes! All aboard the 3-nanometer express! A laptop nearly besting the Mac Pro, a mere 4.5 months after its debut.

Indeed, the performance jump is outstanding vs. the M2 Mac Pro. Yet you gotta think: just what on earth were Apple engineers thinking when they designed it?
It makes perfect sense given that the M1 and M2 Max chips had 12 cores (8 performance + 4 efficiency), while the M3 Max has a big jump to 16 cores (12 performance + 4 efficiency). 50% more performance cores, in addition to all the cores being faster, would logically lead to a result like this.
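For the back-of-envelope crowd, the core math alone gets you most of the way there. A minimal sketch, where the per-core uplift and the E-core weighting are my own guesses rather than anything Apple published:

```python
# Back-of-envelope multicore estimate: M2 Max -> M3 Max.
# The 15% per-core uplift and the E-core weighting are illustrative
# guesses, not Apple or Geekbench figures.
m2_p, m2_e = 8, 4            # M2 Max: performance / efficiency cores
m3_p, m3_e = 12, 4           # M3 Max: performance / efficiency cores
per_core_uplift = 1.15       # assumed M3-vs-M2 per-core speedup
e_core_weight = 0.35         # assume an E-core does ~35% of a P-core's work

m2_throughput = m2_p + m2_e * e_core_weight
m3_throughput = (m3_p + m3_e * e_core_weight) * per_core_uplift
print(f"Estimated multicore gain: {m3_throughput / m2_throughput - 1:.0%}")
# ~64% with these made-up inputs -- the same ballpark as the jumps
# people are seeing in the multicore scores.
```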
Mac Studio Ultra users just lost half of its value.

^ ? Not if the work they're doing on their Studio Ultras still pays them $$ to cover the mortgage payments, food, and other expenses with lots of leisure time, and they're getting a LOT of use out of them. Value is always perceived by both the outsider looking in and the user actually using the product/service. Sure, an inflection point will be reached, but not this soon.
I am actually wondering if this is why they never produced the Extreme version with the M2. We could see it with the M3, since this is the big jump, and there won't be another jump like this for a number of years, making an M3 Extreme make more sense.
I think you missed my point. The chips are progressing fast. Slow chip updates can no longer be used as an excuse for long gaps between hardware updates. The Mac Studio is likely to lag a long time behind the chip release again.

Long gaps to non-mobile machines were almost always due to Apple not prioritizing those systems. That still exists today. Apple went to their own chips because 80% of their sales were in mobile systems and, due to missteps by Intel, they were forced to, for example, ship laptops with desktop memory because Intel didn't support high-capacity LP RAM.
TB4 started with the M1 family. The Touch Bar MBP and the base M3 MBP 14 have Thunderbolt 3, and the same goes for the 2021 and 2023 iMacs; it wouldn't have been different if the iMac had been updated with the M2. It looks like they reserve TB4 for the higher-end MBPs and their headless desktops, so it really didn't surprise me at all when they announced it.
The M3 Ultra, assuming a 50%+ jump over the Max (doubling the cores will yield more like a 50% improvement than a true doubling), will be scary close to some VERY large PC workstations. Another 50% improvement from a hypothetical Extreme should beat ANY single-CPU workstation by quite a bit, including things running the new 96-core Threadripper and the like. The GPU will similarly beat any single card I can think of (maybe not some exotic AI-centered card - I know nothing of what they're like).
Even then, there will be PC workstations that can beat it - dual top-line Xeons or Epycs, with two to four GPUs.
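If you want to play with that scaling assumption yourself, here's a quick sketch; the 1.5x-per-doubling factor is just the rule of thumb above, not a measured number:

```python
# Speculative scaling from the M3 Max to Ultra/Extreme, using the
# rule of thumb that doubling the silicon buys ~1.5x, not 2x.
# Purely illustrative; no benchmark data behind these numbers.
def scaled_score(base: float, doublings: int, gain_per_doubling: float = 1.5) -> float:
    """Estimated score after `doublings` die doublings."""
    return base * gain_per_doubling ** doublings

m3_max = 100.0  # normalize the M3 Max to 100
print(f"M3 Ultra   (1 doubling):  ~{scaled_score(m3_max, 1):.0f}")
print(f"M3 Extreme (2 doublings): ~{scaled_score(m3_max, 2):.0f}")
# -> ~150 and ~225 on that scale, if the 1.5x rule actually holds.
```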
There's a huge difference in livability, though. M2 Max is about a 90 watt chip (CPU plus GPU), and I think M3 Max should be similar (more performance cores add power draw, but the die shrink takes it away). M3 Ultra will run on about 180 watts. A hypothetical M3 Extreme is still under 400 watts, including the GPU load. Oversize the PSU a bit to be sure, and add in the load from (potentially a lot of) PCIe storage and the like, and it could run off a 750 watt power supply, with 1 kW providing quite a bit of headroom in case someone did something unexpected with the PCIe slots. Heck, if they put it in a Mac Studio case and avoided the possibility of heavily loaded slots, you could run the thing off a 600 watt PSU quite comfortably (though you might not be able to cool it in the Mac Studio case).
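Totting up those (admittedly guessed) numbers for the hypothetical Extreme box:

```python
# Rough PSU budget for a hypothetical M3 Extreme tower. Every figure
# is an assumption carried over from the paragraph above, not a spec.
soc_watts  = 4 * 90    # four Max-class dies at ~90 W each (CPU+GPU)
pcie_watts = 6 * 25    # assume several PCIe storage/IO cards
misc_watts = 50        # fans, SSDs, RAM, conversion losses

load = soc_watts + pcie_watts + misc_watts
headroom = 1.25        # size the PSU ~25% above worst case
print(f"Worst-case load: {load} W, suggested PSU: {load * headroom:.0f} W")
# -> 560 W load, 700 W suggested: comfortably inside a 750 W supply.
```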
The fans would be nearly silent in the Mac Pro case - that thing is WELL cooled. It plugs into a household 15 amp circuit, through any 1500 VA UPS you pick up at Best Buy. It doesn't even need a dedicated circuit - the monitor (although not three monitors or a laser printer), a phone charger, and an LED desk lamp will fit just fine. It only takes a little caution to make sure the coffee maker isn't inadvertently on the same circuit.
Have you seen what big, exotic Windows workstations take for power? EACH of their two CPUS draws as much as the whole M3 Extreme (including the GPU). Then there are two to four 350 watt GPUs. The power supply is not A power supply, it is multiple power supplies, each well over 1 KW. Forget about plugging THAT into a 15 amp outlet - many of them won't fit on a 20 amp outlet. Most commercial buildings (unless they're old) have 20 amp outlets, but anything beyond that is a custom wiring job. Your top designer isn't going to love hearing that her new office is the former copier room, because the only place her workstation will plug in is the outlet the copier used to plug into. Some big workstations will run off a 2200 VA UPS, and some won't. A 2200 VA UPS will JUST fit on a 20 amp office circuit - that's why they are popular. Once you get into a 3000 VA UPS, you're into strange outlets or hardwired power.
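The circuit math behind those UPS sizes, for anyone curious (US 120 V branch circuits with the standard 80% continuous-load derating):

```python
# Why a 1500 VA UPS is fine on a 15 A outlet and a 2200 VA unit
# "just fits" a 20 A one (US, 120 V). Illustrative, not wiring advice.
def usable_va(breaker_amps: float, volts: float = 120.0, derate: float = 0.8) -> float:
    """Continuous VA a branch circuit can safely supply."""
    return breaker_amps * volts * derate

print(f"15 A circuit: {usable_va(15):.0f} VA continuous")  # 1440 VA
print(f"20 A circuit: {usable_va(20):.0f} VA continuous")  # 1920 VA
# A 1500 VA UPS rarely runs flat out, so it lives happily on 15 A;
# a 2200 VA unit sits right at the 20 A ceiling, which is why that's
# the biggest size you see on ordinary office outlets.
```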
Big Windows workstations also run HOT and NOISY. If you're Pixar, it doesn't matter (except for environmental reasons) - these things live in the server room and they just use remote rendering - the computers on employees' desks are relatively ordinary, while the big iron is in racks in dedicated space. The server room has hundreds or even thousands of amps of power, with hardwired UPS and dedicated cooling available. Heck, some of those server rooms used to have Crays in them....
What if you're a freelancer or part of a five-person firm? Your office might be in an old three-story building downtown. Wouldn't you rather just set your workstation on the floor next to your desk, instead of paying the electrician to install it and then dealing with its banshee wail?
There are three reasons why the Mac is so inferior in gaming. One is the inadequacies of Metal versus far more mature technologies like DirectX. Two is that Windows machines have sold so many more units than Macs over the decades that gaming companies simply don’t bother with the Mac. The opposite holds true in mobile gaming, where iOS is the place to be. Three is that Microsoft evangelized gaming to a much larger degree, even buying up gaming companies to do it themselves, while Apple’s efforts have been at best half-hearted. Apple needs to really step it up and buy a few gaming companies themselves if they ever want to get serious. It’s the one area where Apple simply can’t compete, but they could if they actually tried. Every time we think they are, Apple drops the ball and doesn’t follow through. Why they don’t just defies explanation. Apple is now pushing mesh shading and ray tracing, something PCs have had for eons, but will Apple follow through and take advantage? I’m not holding my breath. They’ll probably disappoint us again with three or four AAA games, but nothing more. Microsoft just closed the deal on Activision Blizzard, one of the biggest names in gaming. Here I am hoping Microsoft doesn’t shut down one of the few native Mac AAA games, World of Warcraft, and make it exclusively Windows/Xbox. I’d have to go build another gaming PC.

3. The niche borderline scenarios you’re dreaming up aren’t usually a factor. I’d wager that’s <1% of the target population. Having said that, there are limitations of the macOS environment, which is why Windows has a hold in the areas of gaming, development IDEs, etc.
As a photographer, I'm actively unhappy to see Game Mode in Sonoma, because it means that some of the instability games bring to Windows (the hooks that allow games to function, even when not playing them) may be coming to the Mac. Games want to access the hardware more directly than Photoshop or Final Cut Pro, and allowing that makes the OS less stable for other uses. I'd rather have a machine that can't game past the Candy Crush level, but doesn't have the instabilities, than one that runs AAA games but crashes more.