While I can't afford to buy a computer mouse, I would like to see, if not new designs, at least new colors for Macs of all sizes.
Bring some life into them! Keep silver and white as options if you want, but ffs add some color!
 
I keep hearing "the M2 is a stopgap, I'm waiting for the M3." I find this odd, as nobody knows what or when the M3 drops, or how it will be configured regarding performance, efficiency and GPU cores.
It's quite simply because the M3 will be the first to use the 3nm process. The M2 originally was supposed to be, but the process was not ready in time, so the resultant M2 was a last-minute "stopgap" that fell short of the original plans/roadmap.
 
Totally agree with you. I edit video with DaVinci Resolve on both a high-powered PC (RTX 4090), to use PC-only plug-ins, and on my MacBook Pro M2 Max. For editing, stability and render times with ProRes or H.265/H.264, it's the MBP hands down. The PC is limited to applications with PC-only plug-ins, but video editing on it is just not as smooth, and crashes happen about twice as often. It is what it is - you choose the right tool for the job. For video editors it makes no sense to pay through the nose for the RTX 4090 when the overall workflow is simply smoother/quicker on the Mac. I can't speak to 3D rendering or Blender, but creators who are editing video can run circles around RTX 30-series PCs with a lighter (and cheaper), more efficient M2 MacBook Air. I have yet to meet a video editor who doesn't prefer editing on an M2 Mac if they're using Resolve or Premiere.


If this is true, then you probably have a bottleneck somewhere else in your PC setup (like SSD read/write speed), or you are doing something wrong with your settings. Encoding used to be a CPU-intensive task, but for the last couple of generations Nvidia's GPUs have been designed specifically to do these tasks in hardware, and the 4090 is the current encoding king by a mile.
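If you want to sanity-check which encoder an export is actually hitting, ffmpeg exposes both vendors' hardware blocks directly. A minimal sketch, assuming an ffmpeg build with NVENC and VideoToolbox support (the file names and bitrate are just placeholders):

```python
import subprocess

# HEVC encode on the Nvidia card's dedicated NVENC block (not the CUDA cores).
subprocess.run(
    ["ffmpeg", "-i", "input.mov", "-c:v", "hevc_nvenc", "-b:v", "40M", "out_nvenc.mp4"],
    check=True,
)

# The same job on Apple silicon goes through the media engine via VideoToolbox.
subprocess.run(
    ["ffmpeg", "-i", "input.mov", "-c:v", "hevc_videotoolbox", "-b:v", "40M", "out_vt.mp4"],
    check=True,
)
```

If the hardware encoder is doing its job on either machine, stutter in the timeline usually points at decode, disk or effects rather than the export itself.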
 
That's because Apple's GPUs suck. Simple. Besides, their Metal API also sucks, to the point that a lot of software doesn't even use it unless it really has to, such as Adobe's.

Apple isn't trying to make better GPUs; they focus on the software side, such as the Metal API, which is a huge problem. The hardware itself is slow, so the whole CUDA discussion is pointless. You'd better check the TFLOPS of Apple's GPUs compared to Nvidia's. Hell, they even advertised that the M1 Max = mobile 3080 and the M1 Ultra = 3090, and yet neither of them was able to reach that.

Apple is just not good at making GPUs, that's all. Nothing new, since the Mac was never known for great GPU performance.
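Rough back-of-the-envelope numbers, for what it's worth; the ALU counts and clocks below are my own assumptions, not official figures:

```python
def fp32_tflops(shader_units: int, alus_per_unit: int, clock_ghz: float) -> float:
    # 2 FLOPs per ALU per cycle (fused multiply-add), reported in TFLOPS.
    return shader_units * alus_per_unit * 2 * clock_ghz / 1000.0

# M1 Max: 32 GPU cores x 128 ALUs, clock assumed around 1.3 GHz.
print(fp32_tflops(32, 128, 1.3))    # ~10.6 TFLOPS

# Mobile RTX 3080: 6144 CUDA cores, boost clock assumed around 1.5 GHz.
print(fp32_tflops(6144, 1, 1.5))    # ~18.4 TFLOPS
```

Peak TFLOPS isn't the whole story (bandwidth and the shared memory pool matter too), but it shows why the "M1 Max = mobile 3080" marketing never held up in raw throughput.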
This is halfway true, but then you would have to say that AMD is bad at making GPUs too, since they are basically in the same boat as Apple with inferior real-time ray tracing, etc...

Nvidia is famously horrific to work with, and Apple kicked them off their platform. Steve Jobs axed them himself, another difficult guy to work with…


Where Apple's GPU cores suffer is with ray tracing, and the M3 is supposedly addressing this... So Apple has to reach Nvidia performance, and they licensed ray-tracing technology back in 2019 to do that, which was too late for it to make it into the M1 and M2. Or, as the rumours say, the ray-tracing cores were too power hungry, causing Apple to scrap them for the M2/A16.


The A17 and M3 will see hardware ray tracing, and you won't need to say the Apple GPU sucks, because it will accelerate games and the rendering of 3D and advanced graphics work (which often uses 3D rendering engines to render frames).


To say that their GPU cores are bad is a bit of a stretch, IMHO. Apple's GPU strategy is different, and in some ways it is more powerful than AMD's/Nvidia's, i.e. access to a big shared memory pool. However, as described above, I and others who need powerful ray tracing are waiting for an M3 Max or Ultra or Extreme (if they decide to make one).


What I hear from the industry is that the M1 Ultra and M2 Max are better to work with on big 3D scenes than an Nvidia workstation, since they're not as loud and hot and have gobs of VRAM. But for rendering, the big, loud Nvidia workstation is faster and preferred.
 
What do you all think are the chances Apple discontinues the M1 Air?

I kind of like the design better than the M2's, and I'm not interested in a 15" laptop... wondering if I should order one before they're gone. But a 3+ year-old computer sounds like not a great idea at this point. It would be great if they kept it on and dropped the price another couple hundred bucks.

I don't think they will discontinue the M1 Air immediately, since these are all going to be relatively pricey machines, but it might become more available as a refurbished or used item as early adopters upgrade to a 15". For many folks the screen size, plus the light weight, plus not being as expensive as a 16" MacBook Pro, will be very compelling. $2,499 for a 16" MacBook Pro is too much for a lot of users who have zero need for the power in that machine. A $1,799 MacBook Air with a 15" screen will open a lot of wallets. (Prediction: it also comes with 16GB of RAM as the base model.)
 
As a further aside, I think GPU cores will meld with AI cores and run all graphics through AI.

I think ray-tracing cores are just a stopgap in graphics technology; it makes more sense to output from AI cores when AI can simulate infinite rays or just make it look like a picture/video.

What is even the point of 3D graphics when AI can do much better with a lot less input/work? Is the aesthetic of 3D graphics really that good? Either way, AI can make something look like 3D graphics, but I think that will quickly fall aside when people realize the possibilities.


Everything you thought you knew about computer graphics is upended by AI... I am not sure if I am excited or horrified.

AI can turn any 2D video into stereoscopic 3D video. It can conjure up the image that the stereo pair is missing based on the one frame it has.

Anyone who has touched cameras or done any 3D graphics will be floored by what this means. Doing that manually would previously have cost hundreds of millions of dollars (if it was possible at all)... AI can do it in probably a few hours with enough power, and in real time fairly soon.
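To make that concrete, here's a minimal sketch of the kind of pipeline I mean: an off-the-shelf monocular depth model (MiDaS) plus a naive pixel shift to fake the second eye. The disparity scale and file names are made up, and real tools would also inpaint the holes this leaves:

```python
import cv2
import numpy as np
import torch

# Monocular depth estimation with the small MiDaS model from torch.hub.
midas = torch.hub.load("intel-isl/MiDaS", "MiDaS_small")
midas.eval()
transform = torch.hub.load("intel-isl/MiDaS", "transforms").small_transform

frame = cv2.cvtColor(cv2.imread("frame.png"), cv2.COLOR_BGR2RGB)

with torch.no_grad():
    pred = midas(transform(frame))
    depth = torch.nn.functional.interpolate(
        pred.unsqueeze(1), size=frame.shape[:2], mode="bicubic", align_corners=False
    ).squeeze().numpy()

# MiDaS outputs inverse depth (larger = closer), so nearer pixels get shifted more.
max_shift_px = 20  # made-up disparity scale
disparity = (depth - depth.min()) / (depth.max() - depth.min()) * max_shift_px

# Naive depth-image-based rendering: shift each row's pixels to synthesize the right eye.
h, w, _ = frame.shape
right = np.zeros_like(frame)
xs = np.arange(w)
for y in range(h):
    new_x = np.clip(xs - disparity[y].astype(int), 0, w - 1)
    right[y, new_x] = frame[y, xs]

cv2.imwrite("right_eye.png", cv2.cvtColor(right, cv2.COLOR_RGB2BGR))
```

Crude, but run it per frame and you already have a stereo pair from flat footage; the serious models do the same basic idea with far better hole filling and temporal consistency.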


So, I would say Apple doing hardware ray tracing is somewhat of a non-issue; it's not the future.
 
They are also starting to use the neural engine cores. Tasks like AI-based masking, noise reduction and generative AI-based image editing are *extremely* GPU intensive and have absolutely nothing to do with 3D performance.

Your implication that a GPU only matters for 3D applications is incredibly narrow-minded and completely ignorant of how they are used broadly in the market.
Indeed, and performance will get better as the APIs and drivers mature.
 
You really think a student buying a $499 Mac mini (actually, $697, since they'll need a mouse and keyboard for it) is going to drop $1499 on the Studio Display? It would make sense to offer a reasonably priced display for that type of buyer.
Folks have been saying forever that Apple should sell really low-end stuff. Low end is not the market they choose, because there are scores of cheap competitors making displays for Minis.
 
Sure, but when Nvidia's flagship desktop GPU struggles to power high-end VR apps with 650 watts of power, the unprecedented performance you speak of just isn't feasible in a significantly smaller form factor... you're basically arguing Apple could take over the GPU market overnight, but chooses not to.
Hot boxes, 650 watts of power, etc. are just wrong on many levels. No, Apple could not take over the GPU market overnight, but it can (and IMO should) aggressively move in a new, more sustainable direction.
 
Hopefully not. The M2 was underwhelming, with very marginal improvements compared to the M1. One would hope the M3 will be a more substantial improvement.
Sorry, but I disagree. The M2 is a fine upgrade, including the Bluetooth and Wi-Fi updates as well as the performance improvement. Plus, of course, the thousands of additional Apple engineering hours made the M2 a solid upgrade over the M1.

Moving to a smaller SoC process will bring the M3 a power and efficiency upgrade, but Wi-Fi 6E and Bluetooth 5.3 are already here with the M2. And at this point the M2 is high-yield while the M3 probably is not there yet, so pricing is an issue.
 
For current M1 Max Studio owners: can the M1 Max, or whatever the best one is, handle three streams of 6K 10-bit H.265 video and fast color-grading adjustments without render times each time I adjust the color?

If so, then I'll buy the best M1 Studio when the M2 or M3 one drops, because why would I need anything better? Why not pay 50% of the original price and get very similar real-world performance?
My understanding is that you may be describing the weak point of M1 performance. You may find a substantial improvement with the M2's added graphics cores. Also, more RAM probably helps over the long term.
 
You really think a student buying a $499 Mac mini (actually, $697, since they'll need a mouse and keyboard for it) is going to drop $1499 on the Studio Display? It would make sense to offer a reasonably priced display for that type of buyer.

Exactly. Very few people need to, or are going to, spend more on a monitor than on the entire rest of the computer.

When I bought the first Mac mini (G4), I just used the monitor I had been using with my Quicksilver.

And if it had been my first computer and I didn't already have a monitor, I certainly wouldn't have spent more on one than I had on the mini itself.
 
No doubt new devices will be configurations of the M2 chip, as they have to do something to use up inventory of what, for me, was an underwhelming change from the very successful M1.

I won't be touching the M2 in any configuration, but I am looking forward to the M3. Hopefully Apple will use up its M2 inventory on phones and start thinking in terms of chips for tasks. That means a different mindset from developing chips for its major product these days, the iPhone, and then repurposing them for its computer lineup, which just doesn't cut it for some of the extensive power demands some users have.

Of course these represent a smaller market turnover than iPhones, but they still require attention if Apple is to be known as a manufacturer of good computers, rather than an iPhone manufacturer.

It needs separation, quite possibly with iPhone/iPad and accessories separated from the laptops/minis/Studio/iMac, etc.

I also suspect Apple has sufficient M3s for other products, as the timescale TSMC gave and the celebration at TSMC over M3 production seem at odds with the subsequently revised timings suggesting a lack of chips. More likely there was an overproduction of the M2, which Apple took and has yet to sell.

Some who suggest the M2 was a worthy upgrade are, in my opinion, mistaken, probably swayed by the success of the M1 iMac being such a fine little machine.

PR wasn't helped by Apple's decisions on both RAM and SSD. Although most users may not notice the slower SSD performance in some configurations, the coverage was catastrophic, with many articles poring over the perceived problem; the idea of some SSD configurations going backwards in speed was not a compelling PR success.
 
M2 is barely out of the gate and you guys want M3 already...

I want Apple to slow down, release finished products, and stop with this stupid, artificial and pointless upgrade cycle that benefits no one.

And while I am at it... I want a new Mac Studio to have ECC memory, and I want you guys and gals at Apple to get ZFS working flawlessly on Apple silicon so I can use some of the wonderful Thunderbolt storage options and retire a NAS on my network that just acts as a repository for important data that I can't trust to a less reliable and robust file system. You'll do this if you're serious about a Mac Pro and pro users.
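For anyone wondering why ZFS specifically: the appeal is the self-healing storage workflow, roughly like this. A sketch assuming an OpenZFS install, run with appropriate privileges; the device names are placeholders for whatever your Thunderbolt enclosure shows up as:

```python
import subprocess

def run(*args: str) -> None:
    # Thin wrapper around the standard OpenZFS command-line tools.
    subprocess.run(args, check=True)

# Mirror two Thunderbolt disks into one pool: every block is checksummed and stored twice.
run("zpool", "create", "tank", "mirror", "/dev/disk4", "/dev/disk5")  # placeholder devices

# A compressed dataset for the important data.
run("zfs", "create", "-o", "compression=lz4", "tank/archive")

# A scrub re-reads everything, verifies checksums, and repairs bad blocks from the mirror.
run("zpool", "scrub", "tank")
run("zpool", "status", "tank")
```

That end-to-end checksumming and self-repair of user data is the part APFS doesn't offer, which is why the NAS ends up in the picture.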
 
You really think a student buying a $499 Mac mini (actually, $697, since they'll need a mouse and keyboard for it) is going to drop $1499 on the Studio Display? It would make sense to offer a reasonably priced display for that type of buyer.
Sure... and if Apple offered a $600 'affordable' display, most of them would still buy a $400 Dell or use a TV - which are cheap because they sell in vast numbers and most of the profit comes from upselling people to extended warranties and $100 decaffeinated copper HDMI cables. Apple are not going to get out of bed to compete in that market.

Apple don't have a history of 'me too' products - they tend to focus on areas where they can offer something distinctive. In the case of the Pro Display XDR and the Studio Display, Apple have about the only ~220 ppi displays on the market, so they can charge a hefty premium (AFAIK the 5Ks from Samsung et al. announced earlier this year are still vapourware, so it's really just the LG UltraFine - and last I looked those were like hen's teeth).

You're unlikely to see a 5K display for much less than the thick end of $1000 - and while the disadvantages of a 27"+ 4K display on a Mac have been grossly exaggerated, they do exist, and Apple might not want to put their badge on one.

I'm not sure how we got the sub-$2000 entry-level 5K iMacs - which were always a steal by Apple's standards. Some of that pricing might have been based on an assumption from 2014 that 5K displays would become commodity items within a few years - it turns out that only Mac users give a wet slap about 5K because of the way macOS (doesn't) handle UI scaling. Although I doubt Apple were making a loss, I doubt they were getting the profit margin they usually enjoy.
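For anyone wondering why 5K specifically matters at 27": the arithmetic is simple, and 5K lands almost exactly on the ~218 ppi that gives macOS a clean 2x scale of the old 2560x1440 desktop, while 4K at the same size doesn't:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

print(ppi(5120, 2880, 27))  # ~218 ppi: exact 2x of a 2560x1440 "looks like" desktop
print(ppi(3840, 2160, 27))  # ~163 ppi: needs non-integer scaling (or a huge UI) on macOS
```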

Also... OK, I wouldn't personally buy a Studio Display (no alternative display input, $silly extra for a decent stand vs. the all-or-nothing VESA option, an effectively fixed mains lead, laptop charging capability I don't need, speakers I don't need, plus I prefer a dual-display setup, which would be ridiculously expensive). Your mileage & opinion may vary. However, there's no doubting that the picture quality is excellent and the resolution hits the sweet spot for macOS - so if you consider that it is something you could be using for the next 5-10 years over several computer upgrades, it might not be a bad investment.
 
Sure... and if Apple offered a $600 'affordable' display, most of them would still buy a $400 Dell or use a TV - which are cheap

A $400 display isn't cheap, though. That's already the high end for what most people spend on a display.

The problem with the Studio Display being $1600 (without a good stand) isn't that it's slightly pricier, like a lot of Apple products. It's that it's way above what people typically spend. It's as if the iPhone didn't start at $400 but at $2,000. Yes, Apple has high-end iPhone Pro Max configurations that are in that ballpark, but they also have phones that are much cheaper, and those are what most people go for.

 
M2 is barely out of the gate and you guys want M3 already...

I want Apple to slow down, release finished products, and stop with this stupid, artificial and pointless upgrade cycle that benefits no one.

The M2 was only a relatively minor update over the M1, and the M1 is two and a half years old. Seems reasonable to be curious what an M3 will bring. I do think that should happen this year.
 
A 27in. iMac?—maybe, just maybe. But not likely next week.

A 32in. iMac?—nope, not a chance. And certainly not anytime soon.

In both scenarios I suspect Apple believes they can do better by offering the Mac Studio, which customers can pair with the Studio Display or a monitor from another brand. It's a bitter pill for 27in. iMac fans, but there it is. And if the M2 is the current thing to focus on, then we could well see a revised M2 Mac Studio alongside the 15in. MacBook Air.

If M3 is referenced next week I think it will be in regard to forthcoming models by the end of the year. It would be a helluva surprise otherwise.

Consuming 4K and 8K HDR Apple TV+ content on their own branded iMac XDR displays would be a selling point that no other studio could provide. Providing a single-cord solution for those 4K/8K HDR content creators is the no-brainer component, IMHO. Paired with their single-cord XDR MacBook Pro or iPad Pro, it's the "killer app" for HDR.

Add to that... schools, libraries, colleges, businesses small and large, studios, institutions, etc., all looking for a single-power-cord computing solution, have long been Apple's, er, "niche" market, and I do not see them walking away from that iMac consumer/business base anytime soon.

Methinks folks are getting itchy to replace their 27" i9 with some Apple silicon; the pandemic is over, supply chains are coming back, and 4K/8K HDR is here...
________

Bringing full-screen HDR video content creation to the masses using their current 254 ppi XDR MacBook Pro display technology... a quad-sized 14.2" panel would yield a 28.4" 6048x3928 display (to replace/update the 27"), and a quad-sized 16.2" panel a 32.4" 6912x4468 display (to satisfy the old 32" rumor), both with 1016 dimming zones.

Kicking things up a notch to address the 8K HDR video-content folks, a 9x-sized MBP XDR display would work out to a 9072x5892 panel at a 42.6" diagonal or a 10368x6702 panel at a 48.6" diagonal, both with 2,286 dimmable zones! Again, just using and upsizing their existing MBP XDR display tech and specs.
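The arithmetic behind those figures is just tiling the existing MacBook Pro panels 2x2 or 3x3, which scales the resolution and diagonal together and leaves the 254 ppi untouched:

```python
import math

def tiled_panel(width_px: int, height_px: int, diagonal_in: float, n: int):
    # An n x n tile of the same panel multiplies each dimension and the diagonal
    # by n, so the pixel density (ppi) stays the same.
    w, h, d = width_px * n, height_px * n, diagonal_in * n
    return w, h, d, round(math.hypot(w, h) / d)

print(tiled_panel(3024, 1964, 14.2, 2))  # 6048x3928 @ ~28.4", 254 ppi
print(tiled_panel(3456, 2234, 16.2, 2))  # 6912x4468 @ ~32.4", 254 ppi
print(tiled_panel(3024, 1964, 14.2, 3))  # 9072x5892 @ ~42.6", 254 ppi
print(tiled_panel(3456, 2234, 16.2, 3))  # 10368x6702 @ ~48.6", 254 ppi
```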

________

...the time is ripe for the 27"/32" XDR iMacs, one cord to rule them all!

For the power niche who need more blocks of aluminum sitting on their desks with more power cords and connecting wires, Apple will be happy to sell them a Studio or Mac Pro and Pro Displays using those same XDR display panels! LOL

Simple. :)
 
...oop...meant to type, "...the time is ripe for the 28.4"/32.4" XDR iMacs, one cord to rule them all!"...

And, yeah, I'd buy a 32.4" XDR model in a flash! Ha! The perfect companion for my HDR-shooting Panasonic S1's and GH6! ;)
 
M2 is barely out of the gate and you guys want M3 already...

I want Apple to slow down, release finished products, and stop with this stupid, artificial and pointless upgrade cycle that benefits no one.

And while I am at it... I want a new Mac Studio to have ECC memory, and I want you guys and gals at Apple to get ZFS working flawlessly on Apple silicon so I can use some of the wonderful Thunderbolt storage options and retire a NAS on my network that just acts as a repository for important data that I can't trust to a less reliable and robust file system. You'll do this if you're serious about a Mac Pro and pro users.

If they cared about the Mac Pro, there would be one.

And at the risk of starting a semantic argument about what a "pro" user is (basically a trope here at this point), they spent time and resources porting Logic and Final Cut into GarageBand Pro and iMovie Pro...
 