Safari likely uses the high-performance cores for things like JS. It might force tasks like downloads to run on the efficiency cores, though.

JS is largely single-threaded, so you benefit more from very fast cores (which the M1 has), not many cores (since only one will truly be busy).

Also, 32 GiB RAM is way overkill for Safari.
So you're saying it can use high-performance cores for JS, but very rarely?

I thought 16 GB (and 8 and 4 and so on) was overkill just as an internet machine. My Facebook tab is taking over 1.5 GB right now and I haven't opened it for an hour or two. I have some live-view tabs that use a lot of RAM, and then a ton of others too. I almost always have swap going on with just Safari on my 16 GB machine, and I've read about all the swap issues on M1 machines, so I figure I'll play it safe and over-buy RAM... again lol
 
Safari itself isn't the problem. It's certain types of website/webapp.

I have a pinned Slack tab for a client that chews close to a GB pretty much constantly.
Yep. Pretty normal for tabs to be 500 MB-1 GB. The range I've seen is about 250 MB-1.5 GB.
 
So you're saying it can use high-performance cores for JS, but very rarely?

I would say in typical usage:

  • it will max out one performance core if a webpage is busy.
  • but unless you have multiple very-busy webpages open, it probably won't max out more than one performance core. It cannot spread most of the work that one webpage does to multiple cores, making that one core the bottleneck.
  • for downloads and other relatively lightweight tasks, it probably uses efficiency cores. It might actually force those tasks to use efficiency cores; I'm not sure.
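
For what it's worth, this is something macOS apps can express explicitly through Grand Central Dispatch quality-of-service classes, and on Apple silicon background-QoS work generally ends up on the efficiency cores. A minimal sketch of the mechanism (not Safari's actual code, just an illustration; the queue labels are made up):

```swift
import Foundation

// Latency-sensitive work (e.g. running a page's JS) gets a high QoS,
// which the scheduler prefers to place on the performance cores.
let scriptQueue = DispatchQueue(label: "page.script", qos: .userInteractive)

// Long-running, non-urgent work (e.g. a download) gets .background QoS,
// which Apple silicon typically confines to the efficiency cores.
let downloadQueue = DispatchQueue(label: "page.download", qos: .background)

scriptQueue.async {
    // busy, largely single-threaded script work runs here
}

downloadQueue.async {
    // stream the file to disk here; it doesn't need a performance core
}
```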

I thought 16 GB (and 8 and 4 and so on) was overkill just as an internet machine. My Facebook tab is taking over 1.5 GB right now and I haven't opened it for an hour or two. I have some live-view tabs that use a lot of RAM, and then a ton of others too. I almost always have swap going on with just Safari on my 16 GB machine, and I've read about all the swap issues on M1 machines, so I figure I'll play it safe and over-buy RAM... again lol
That sounds like a lot for Facebook. Do you have any extensions, maybe?
 
Safari itself isn't the problem. It's certain types of website/webapp.

I have a pinned Slack tab for a client that chews close to a GB pretty much constantly.
Sure, but you don't tend to have twenty apps of the Slack caliber.
 
Don't be fooled: the current M1 has 4 high-performance cores and 4 energy-efficient (slower) cores. The M1X would have 8 high-performance cores and only 2 energy-efficient cores, together with more RAM and double or triple the GPU cores.

My guess is that this would practically double the speed of the M1, which is huge.

And what if they are not M1 cores, but "M2" (or whatever) cores based on the A15 coming this fall — i.e., with Apple's customary 15-20% annual uplift in performance that we've seen for a decade? This could be quite big indeed.
 
Sure, but you don't tend to have twenty apps of the Slack caliber.
I don't, no, but I tend to use native apps whenever possible. I'm pretty sure there are people who would quite easily max out a 16 GB machine just through Safari tabs, though. Google Docs/Drive used to be pretty horrific for memory use too, IIRC.
 
Sure hope so, but people have been saying it's an M1 limitation (heck if I know, though). If they support multiple displays, I've got two 2016 MacBook Pros waiting to be retired.
It's unlikely this machine will be using the same M1 we have seen thus far. It'll either be an M1X or an M2.
 
I'm 99% certain that he is joking. No professional actually needs 8 TB on a laptop. Anyone with such a massive amount of storage would probably use a more redundant and reliable solution.

It’s kind of hard to tell on the site because there are a lot of developers and whatnot here. But yeah, you’re probably right.
 
And what if they are not M1 cores, but "M2" (or whatever) cores based on the A15 coming this fall — i.e., with Apple's customary 15-20% annual uplift in performance that we've seen for a decade? This could be quite big indeed.
Yeah, it'll be summer, so it's probably already the A15/M2 generation.
 
It’s kind of hard to tell on the site because there are a lot of developers and whatnot here. But yeah, you’re probably right.
There's something to be said for having it all internally, but… I would strongly advise against it at those prices. Just get a 512 GB or 1 TB model and then connect a USB-C SSD. They're plenty fast.
 
Where are you finding 128-core CPUs? Even next-gen AMD is only 96 cores.

https://wccftech.com/amd-epyc-genoa...res-12-channel-ddr5-5200-sp5-lga-6096-socket/
I didn't find a 128-core CPU, but I did see at least one CPU with 850,000 cores 😉

But it is too big; it won't fit in most of these recent Apple computers.
 
If GPU core count is proportional to GPU performance, then a 16-core GPU would be as fast as the 5600M, and a 32-core GPU would be somewhere between a 2070 Super and a 2080.
 
If GPU core count is proportional to GPU performance, then a 16-core GPU would be as fast as the 5600M, and a 32-core GPU would be somewhere between a 2070 Super and a 2080.

Yes, GPUs render images more quickly than a CPU thanks to their parallel processing architecture, so in theory you can get 4 times better performance with 32 cores. An M2 GPU would be on par with a 5700 XT, 2070 Super, 2080, or 1080 Ti in games like Borderlands 3 at 1440p Ultra. :)
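
As a back-of-the-envelope check (assuming perfectly linear scaling and the ~2.6 TFLOPS FP32 figure commonly quoted for the 8-core M1 GPU, both of which are simplifications):

```swift
// Rough scaling estimate, not a benchmark. Assumes ~2.6 TFLOPS FP32 for
// the 8-core M1 GPU and perfectly linear scaling with core count.
let teraflopsPer8Cores = 2.6

for cores in [8, 16, 32] {
    let estimate = teraflopsPer8Cores * Double(cores) / 8.0
    print("\(cores)-core GPU ≈ \(estimate) TFLOPS FP32")
}
// 16 cores lands around 5.2 TFLOPS (roughly 5600M territory), and
// 32 cores around 10.4 TFLOPS, about where the 2070 Super, 2080 and
// 1080 Ti sit on paper.
```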
 
Yes, GPUs render images more quickly than a CPU thanks to their parallel processing architecture, so in theory you can get 4 times better performance with 32 cores. An M2 GPU would be on par with a 5700 XT, 2070 Super, 2080, or 1080 Ti in games like Borderlands 3 at 1440p Ultra. :)
Considering that Apple's Mac Retina lineup always has roughly the same pixel density, any game played at 50% resolution would have about the same density as a 27-inch 1440p monitor. I'd have to assume that this 16-core GPU would achieve 60 fps in many games with max settings if we play them like that.
 
Well, yes, I'm running a 12-core (2× 3.46 GHz) fully upgraded 2009 Mac Pro right now, and certain things do scale nicely, like the aforementioned HandBrake. But there is a LOT of software that still doesn't, including quite a lot of Apple's own software. I quite frequently see far less multi-threading than I would like.

So that's really what I'm getting at here, but probably didn't articulate properly: has Apple engineered a way around the problem that poorly written software simply doesn't scale well across cores? How will this affect massively multicore GPUs (and CPUs) under macOS going forward? Will we finally start seeing all the performance potential of these machines, or will we forever be at the mercy of the coding of the specific apps themselves?

For example, if any software using the GPU will automatically take advantage of all 32 GPU cores in the (rumored) high-end GPU option, then that's going to offer very fast real-world GPU performance. If software has to be written specifically to take advantage of the GPU cores, then that's only going to offer very fast theoretical GPU performance in most cases. One is useful; the other, not so much.
This is where, IMO, the "walled garden" approach is best for consumers (perhaps with the option of going out of your way to opt out, but in a way, maybe with a disclaimer, that discourages less optimal apps and so pressures developers). Case in point: requiring 64-bit app support (which should have been done a decade ago, plenty of time after the G5 was introduced), usage of OpenCL, then Metal, etc. Once they add a new standard, developers should have only a couple of years or so before being required to adopt it where possible. I have the trash-can Mac Pro, and I wish Apple would force developers to utilize both GPUs.
 
if GPU core count is proportional to GPU performance
Basically, yes.

This is an oversimplification, but think of it like rendering an image line by line. If you have two cores, one can start at the top, and one at the middle. If you have 16 cores, just split the image into 16 parts, have each core do a part, and then all you have to do is merge the results together.
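
A minimal CPU-side sketch of that split-and-merge idea using Grand Central Dispatch (a GPU does the equivalent across far more lanes in hardware; the shade function and image size here are just placeholders):

```swift
import Foundation

let width = 1920
let height = 1080

// Flat grayscale buffer, one byte per pixel.
var pixels = [UInt8](repeating: 0, count: width * height)

// Toy per-pixel "shader" standing in for real rendering work.
func shade(x: Int, y: Int) -> UInt8 {
    return UInt8((x ^ y) & 0xFF)
}

// Split the image by row and let Dispatch spread the rows across all
// available cores. Each iteration writes only to its own row's slice
// of the buffer, so no two cores ever touch the same bytes.
pixels.withUnsafeMutableBufferPointer { buffer in
    DispatchQueue.concurrentPerform(iterations: height) { row in
        for x in 0..<width {
            buffer[row * width + x] = shade(x: x, y: row)
        }
    }
}
// No explicit merge step is needed: every row lands directly in its
// place in the shared buffer.
```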
 
And what if they are not M1 cores, but "M2" (or whatever) cores based on the A15 coming this fall — i.e., with Apple's customary 15-20% annual uplift in performance that we've seen for a decade? This could be quite big indeed.
This hasn't ever happened though, right? Someone in this thread pointed out that Apple has always done A12 > A12X, and Intel did laptop > desktop. I don't think anyone thinks they are going to release the A15/M2/M2X at the same time or whatever.
 
This hasn't ever happened though, right? Someone in this thread pointed out that Apple has always done A12 > A12X, and Intel did laptop > desktop. I don't think anyone thinks they are going to release the A15/M2/M2X at the same time or whatever.

I do, because that's what they are going to do. The new pro machines will be announced at WWDC and will have new core designs.
 