This hasn't ever happened though, right? Someone in this thread pointed out that Apple has always done A12 > A12X, and Intel did laptop > desktop.

That depends a lot on how you look at it. If you look just at the A series, and then the AX series, yes.

If you go by just A itself: the A4 was actually released on the iPad first, then the iPhone 4 later. Same for the A5: iPad 2, then iPhone 4S.

Likewise, recent versions of the S-series chips in the Apple Watch are actually the same cores as the efficiency cores in the A-series chips. So, arguably, that's a precedent of shrinking the same design down.

Is there precedent for Apple shipping a chip in the iPad Pro first, then the non-X version in the iPhone? I don't believe there is, no. But would I be shocked if they ship the A15/M2-generation cores in a Mac in late summer, then in the iPhone? Not at all.

 
I still wonder why Gurman claims the M1X has 10 high-performance cores and 2 high-efficiency cores. Wouldn't 8 high-performance cores and 4 high-efficiency cores be better in terms of battery life per charge? Meanwhile, 16 GPU cores and access to up to 64 GB of RAM are likely a given with an upgraded unified memory architecture controller.

I believe the 14" MacBook Pro will come with 16 GPU cores on the SoC; the 16" MacBook Pro may come with 32 GPU cores on the SoC as an extra-cost option.
 
can someone explain to me exactly how the 8 cores on the m1 chip works? so it has 4 efficiency cores and 4 performance cores, right? what happens if i am doing something that requires fast multicore performance? will the m1 chip use all 8 cores simultaneously? or will it just use the 4 performance cores? also, what is the speed difference between a single efficiency core and a single performance core?
 
can someone explain to me exactly how the 8 cores on the m1 chip works? so it has 4 efficiency cores and 4 performance cores, right? what happens if i am doing something that requires fast multicore performance? will the m1 chip use all 8 cores simultaneously? or will it just use the 4 performance cores? also, what is the speed difference between a single efficiency core and a single performance core?
All cores can be used simultaneously (with the exception of the Apple A10 Fusion, which could only use either the performance cores or the efficiency cores, not both at once).

I believe I saw a benchmark of the efficiency cores somewhere, but I can't find it right now. I vaguely recall something like ~1/6th of the performance of a performance core.
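To put rough numbers on that, here's a minimal back-of-the-envelope sketch of what the ~1/6 figure would imply for total multicore throughput. The ratio is the one vaguely recalled above, not an official Apple number:

```python
# Rough throughput sketch for a heterogeneous 4P+4E chip like the M1.
# The ~1/6 efficiency-core figure is an assumption from the post above,
# not an official Apple spec.
P_CORES, E_CORES = 4, 4
P_PERF = 1.0        # normalized per-core throughput of a performance core
E_PERF = 1.0 / 6    # assumed relative throughput of an efficiency core

# A multicore workload can run on all 8 cores at once:
p_only = P_CORES * P_PERF
all_cores = P_CORES * P_PERF + E_CORES * E_PERF

print(f"P-cores only: {p_only:.2f}x single-core")
print(f"All 8 cores:  {all_cores:.2f}x single-core")
# Under this assumption the E-cores add only ~17% on top of the four
# P-cores, which is why M1 multicore results look close to, but a bit
# above, 4x the single-core score.
```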
 
Especially if the rumour that they aren't going with miniLEDs this time around ends up being true.
aren't they going along with the mini led this time? I mean I don't mind as far as I get the other ports. I want to use for sound production and engineering.
 
aren't they going along with the mini led this time? I mean I don't mind as far as I get the other ports. I want to use for sound production and engineering.

Considering the production problems they've been having with the 12.9" iPad Pro due to the mini-LED display, I won't be surprised if these MacBooks do not have mini-LED, especially if they are coming out as soon as this summer. Mini-LED MacBooks may not be until 2022 (according to some rumors I've seen).
 
So is the integrated GPU in the M1(x)(2) better than a discrete GPU? Is there utility in Apple adding discrete GPUs (either Nvidia or AMD) to the ARM MBPs?
 
I still wonder why Gurman claims the M1X has 10 high-performance cores and 2 high-efficiency cores. Wouldn't 8 high-performance cores and 4 high-efficiency cores be better in terms of battery life per charge? Meanwhile, 16 GPU cores and access to up to 64 GB of RAM are likely a given with an upgraded unified memory architecture controller.

I believe the 14" MacBook Pro will come with 16 GPU cores on the SoC; the 16" MacBook Pro may come with 32 GPU cores on the SoC as an extra-cost option.

He didn’t say anything about an M1X.
 
So is the integrated GPU in the M1(x)(2) better than a discrete GPU? Is there utility in Apple adding discrete GPUs (either Nvidia or AMD) to the ARM MBPs?

What makes the GPU in the M1(x)(2) “integrated?” Just because it is in the same package as the CPU? What if it is on a separate die in the same package?

Apple won’t be adding nvidia or AMD GPUs, given that even the M1 GPU is quite serviceable; if they quadruple the number of GPU cores, the performance should be pretty competitive with all but the most expensive nvidia/AMD GPUs, and will do so while burning a fraction of the power and generating a fraction of the heat.
 
What makes the GPU in the M1(x)(2) “integrated?” Just because it is in the same package as the CPU? What if it is on a separate die in the same package?

Apple won’t be adding nvidia or AMD GPUs, given that even the M1 GPU is quite serviceable; if they quadruple the number of GPU cores, the performance should be pretty competitive with all but the most expensive nvidia/AMD GPUs, and will do so while burning a fraction of the power and generating a fraction of the heat.
If the upcoming Apple Silicon CPU architecture is new, it stands to reason that the GPU architecture will also be new, with better performance compared to the GPU cores of the M1. If that's the case, the performance will be even better than just extrapolating from the M1's GPU core count. Maybe another 10-15% improvement on top?
 
If the upcoming Apple Silicon CPU architecture is new, it stands to reason that the GPU architecture will also be new, with better performance compared to the GPU cores of the M1. If that's the case, the performance will be even better than just extrapolating from the M1's GPU core count. Maybe another 10-15% improvement on top?
Not sure about GPU year-over-year changes, but CPU gains are about 20%.

A11 vs. A10 was 25.1%, A12 vs. A11 was 20.1%, A13 vs. A12 was 19.7%, A14 vs. A13 was 19.4%.
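Those year-over-year figures compound, which is easy to underestimate. A quick sketch of the cumulative gain, using exactly the percentages quoted above:

```python
# Compounding the year-over-year single-core gains quoted above
# (A10 -> A14). Figures are the ones from the post, taken at face value.
gains = [0.251, 0.201, 0.197, 0.194]  # A11, A12, A13, A14 vs. predecessor

total = 1.0
for g in gains:
    total *= 1.0 + g

print(f"A14 vs. A10: {total:.2f}x")  # roughly 2.15x over four generations
```

So four years of "about 20%" per generation works out to more than doubling single-core performance.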
 
The truth is sometimes hard to hear, isn’t it?



There are a couple of USB4 models now (some maybe still in preorder, though), but they all require AC power.

It’s hard to see how Apple would “sell” (as in, marketing speak) a reduction in ports. I mean, sure, the whiners would probably buy one with nothing but HDMI and USB-A ports, but for anyone who actually uses it for something other than connecting to a projector, the loss of ports is gonna be a hard sell.
Do you mean it's hard for you to swallow? Well not my fault you know.
 
Do you mean it's hard for you to swallow? Well not my fault you know.
What exactly are you expecting me to... swallow?

There is a oft-repeated rumour/leak that the next MBP will feature a dedicated HDMI port. It's mentioned in the original post of this thread.


A number of people including myself have routinely said that providing a dedicated HDMI port (or any other relatively high-speed, single-use port) that provides a slightly more convenient option for some users, will almost certainly make things harder if not impossible for other users, because it's highly likely to come at the cost of physical TB3/USB4 ports, and it's possibly even coming at the expense of the number of supported display streams to whatever TB3/USB4 ports remain.


Your post then says:

Welcome back to common sense.
I just hope it's a joke about HDMI, it really should be DisplayPort.

Thus highlighting both sides of the argument, perfectly.


You consider single-use ports "common sense", but then immediately identify the problem with single-use ports: they may not suit your use.


Your post is like the poster-child for irony:
"Ah yes, that's right give me back dedicated ports. No more dongles for me. Dongles can suck it.".

"Wait what do you mean it's not the port I want. How do I plug in my display then? What do you mean I can't even get a dongle for it?"



So, let me ask again. What exactly are you expecting me to "swallow"?
 
It’s hard to see how Apple would “sell” (as in, marketing speak) a reduction in ports.

And yet they did exactly that with the 2016 MacBook Pro.

I mean, sure, the whiners would probably buy one with nothing but HDMI and USB-A ports, but for anyone who actually uses it for something other than connecting to a projector, the loss of ports is gonna be a hard sell.

I mean sure the whiners would probably buy one with nothing but four Thunderbolt ports, but for anyone who actually uses it for something other than connecting to a dock, the loss of ports is gonna be a hard sell.
 
I'm 99% certain that he is joking. No professional actually needs 8 TB on a laptop. Anyone with such a massive amount of storage would probably use a more redundant and reliable solution.

I want the fastest available storage for my Lightroom catalog, and I always want it with me.

Of course I have all this both on a local NAS with redundant drives and cloud storage, but I want the live data on my primary computer.

The Samsung X5 spec maxes out at 2800 MB/s read and 2300 MB/s write, and also is only available in a 2 TB configuration which costs over $500.

Apple doesn't publish SSD speed specs, but practical benchmarking has shown results for the low-end M1 Macs released so far that are better than that. Of course, I expect the pro M1X/M2/M2X Macs to have even faster storage.

Also, internal storage means a port that can be used for something else, fewer cables to manage, less to carry and less that can be lost, broken or stolen.

Going from 512 GB to 8 TB on the current 5K Intel iMac costs $320 per TB. This isn't entirely unreasonable for fast storage, internal or otherwise, and is actually significantly cheaper per TB than getting 1-4 TB. (And only $70 more per TB compared to the Samsung X5.)

If 8 TB is offered on the 14" MacBook Pro, and doesn't cost more than ~$400/TB, I am definitely going to get it. (This is of course assuming that it is equipped with everything I need, and Apple releases a decent monitor option. I am definitely looking forward to going back to mobile only after many years on a 5K iMac.)
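For anyone checking the math, here's the cost-per-TB arithmetic behind those figures, assuming a $2,400 price for the 512 GB to 8 TB upgrade (the figure quoted elsewhere in this thread for the 16" MBP) and a ~$500 street price for the 2 TB Samsung X5:

```python
# Sanity-checking the cost-per-TB figures above. Both input prices are
# assumptions: $2400 for the 512 GB -> 8 TB internal upgrade, and ~$500
# for the 2 TB Samsung X5 (the spec sheet says "over $500").
upgrade_price = 2400          # USD, 512 GB -> 8 TB internal upgrade
added_tb = 8 - 0.5            # 7.5 TB of additional storage

internal_per_tb = upgrade_price / added_tb
x5_per_tb = 500 / 2           # Samsung X5: 2 TB for ~$500

print(f"Internal upgrade: ${internal_per_tb:.0f}/TB")   # $320/TB
print(f"Samsung X5:       ${x5_per_tb:.0f}/TB")         # $250/TB
print(f"Premium: ${internal_per_tb - x5_per_tb:.0f}/TB")  # $70/TB
```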
 
And yet they did exactly that with the 2016 MacBook Pro.
Sorry, how could I forget the 30Hz HDMI port, and the USB2 SD card reader they dropped.

Literally one $27 portable USB hub provides more connectivity than all the HDMI, USB-A and card-reader ports on the 2015 MBP. Not an exaggeration.

Swapping USB Type-A and HDMI for ports that can trivially do those same things, OR do any number of other things, is only "less ports" if you want to whine about it.


I mean sure the whiners would probably buy one with nothing but four Thunderbolt ports, but for anyone who actually uses it for something other than connecting to a dock, the loss of ports is gonna be a hard sell.
Or connecting to a USB-C/HDMI cable. Or a USB-C/DP cable. Or a TB3 display. Or literally any other device you can think of, like they've done for the last... half a decade.


How many types of devices can you attach to an HDMI port, or an SD slot? I'll wait.
 
Going from 512 GB to 8 TB on the current 5K Intel iMac costs $320 per TB. This isn't entirely unreasonable for fast storage, internal or otherwise, and is actually significantly cheaper per TB than getting 1-4 TB. (And only $70 more per TB compared to the Samsung X5.)

If 8 TB is offered on the 14" MacBook Pro, and doesn't cost more than ~$400/TB, I am definitely going to get it.
The 13" M1 MBP's upgrade price from 512 GB to 2 TB is the same as the 16" Intel MBP's price for the same upgrade. As with a lot of things, the cost of upgrades is generally quite similar across Apple's product lines. The 16" upgrade from 512 GB to 8 TB is $2,400 (or $2,200 from 1 TB), so right around your ~$320/TB figure.
 
What makes the GPU in the M1(x)(2) “integrated?” Just because it is in the same package as the CPU? What if it is on a separate die in the same package?

Apple won’t be adding nvidia or AMD GPUs, given that even the M1 GPU is quite serviceable; if they quadruple the number of GPU cores, the performance should be pretty competitive with all but the most expensive nvidia/AMD GPUs, and will do so while burning a fraction of the power and generating a fraction of the heat.
Q: Why are Apple's GPUs more power efficient than Nvidia's and AMD's?

Is it simply due to using a more advanced process node, or is it some inherent architectural advantage akin to the CPU energy efficiency Apple gets running ARM vs. x86?
 
Q: Why are Apple's GPUs more power efficient than Nvidia's and AMD's?

Is it simply due to using a more advanced process node, or is it some inherent architectural advantage akin to the CPU energy efficiency Apple gets running ARM vs. x86?

I honestly have no idea. I design CPUs, not GPUs. I would assume that 20% of that is because they use CPU design techniques to design the GPUs (unlike nvidia or AMD, at least at the time I left AMD). Another chunk is the unified memory architecture. Another chunk is the ability to optimize for their own software/OS. Beyond that, beats me.
 
Sorry, how could I forget the 30Hz HDMI port, and the USB2 SD card reader they dropped.

Literally one $27 portable USB hub provides more connectivity than all the HDMI, USB-A and card-reader ports on the 2015 MBP. Not an exaggeration.

Cool.

And yet the number of times anyone in the history of everything has ever run into "if only I had a fifth Thunderbolt 3 port!" is zero.

The number of times people have run into "wonderful, I forgot my HDMI adapter, but the good news is, if I had thought to bring it, it would only have cost me $27!" is considerably higher.

Having four Thunderbolt 3 ports is nice. Having three USB4 ports and an HDMI port? Way nicer.

How many types of devices can you attach to an HDMI port, or an SD slot? I'll wait.

Let me know the next time you actually run into a customer whose displays take HDMI.
 
Having four Thunderbolt 3 ports is nice. Having three USB4 ports and an HDMI port? Way nicer.
Literally only if you use exactly one HDMI device.

If you use four TB3/USB-C devices now, having an HDMI port doesn't help; there are no simple/cheap solutions to let you actually use that port. But hey, at least forgetful/disorganised people will finally stop whining, huh? Yeah, I didn't think that was likely to actually happen either.

Let me know the next time you actually run into a customer whose displays take HDMI.
Huh? Did you mean "don't take" HDMI?

Plenty of displays either don't support HDMI (hint: every single display Apple has sold, their own or third parties', for the last decade) or are limited to a lower resolution/refresh rate over HDMI (hint: any display > 4K, which means any good display over about 24").

Regardless of that, plenty of use cases for TB3 ports are completely unrelated to displays, because surprise ****ing surprise, TB3 ports can connect to any number of things...
 
And yet the amount of times anyone in the history of everything has ever run into "if only I had a fifth Thunderbolt 3 port!" is zero.
How could I miss this line before.

I mean, where do I start? Do we address the simple math issue where three is less than five? Nah. Too easy.


I'm so glad you completely understand the entirety of computer use on the planet, and have deemed that no one needs any more than three TB3 ports, and that HDMI is in fact a universal standard that everyone uses everywhere.
 