> No man, it's gonna stay at the same price tag.

Especially if the rumour that they aren't going with miniLEDs this time around ends up being true.
This hasn't ever happened though, right? Someone in this thread pointed out that Apple has always done A12 > A12X, and Intel went laptop > desktop.
> Can someone explain to me exactly how the 8 cores on the M1 chip work? It has 4 efficiency cores and 4 performance cores, right? What happens if I'm doing something that requires fast multicore performance? Will the M1 use all 8 cores simultaneously, or just the 4 performance cores? Also, what is the speed difference between a single efficiency core and a single performance core?

All cores can be used simultaneously (with the exception of the Apple A10 Fusion, which could only use either the performance cores or the efficiency cores).
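If you want to see this from the software side, here's a small macOS-only sketch. The hw.perflevel0/hw.perflevel1 sysctl names and the QoS-to-cluster behaviour are my understanding of how Apple Silicon scheduling works, so treat it as illustrative rather than authoritative:

```swift
import Foundation

// Read the performance/efficiency core counts that macOS reports on Apple
// Silicon. All cores can be active at once; Quality-of-Service hints are
// what steer work toward one cluster or the other.
func sysctlInt32(_ name: String) -> Int32? {
    var value: Int32 = 0
    var size = MemoryLayout<Int32>.size
    guard sysctlbyname(name, &value, &size, nil, 0) == 0 else { return nil }
    return value
}

let pCores = sysctlInt32("hw.perflevel0.physicalcpu") ?? -1  // performance cluster
let eCores = sysctlInt32("hw.perflevel1.physicalcpu") ?? -1  // efficiency cluster
print("P-cores: \(pCores), E-cores: \(eCores)")

// Latency-sensitive work tends to land on the performance cores first...
DispatchQueue.global(qos: .userInitiated).async {
    // e.g. an export the user is waiting on
}

// ...while background-QoS work is typically kept on the efficiency cores.
DispatchQueue.global(qos: .background).async {
    // e.g. indexing, backups
}

// Give the async blocks a moment to run before the script exits.
Thread.sleep(forTimeInterval: 0.1)
```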
> Especially if the rumour that they aren't going with miniLEDs this time around ends up being true.

Aren't they going with mini-LED this time? I mean, I don't mind as long as I get the other ports. I want to use it for sound production and engineering.
I still wonder why Gurman claims the M1X is 10 high-performance cores and 2 high-efficiency cores. Wouldn't 8 high-performance cores and 4 high-efficiency cores be better in terms of battery life per charge? Now, 16 GPU cores and support for up to 64 GB of RAM are likely a given with an upgraded Unified Memory Architecture controller.
I believe the 14" MacBook Pro will come with 16 GPU cores on the SoC; the 16" MacBook Pro may come with 32 GPU cores on the SoC as an extra-cost option.
So is the integrated GPU in the M1(x)(2) better than a discrete GPU? Is there utility in Apple adding discrete GPUs (either Nvidia or AMD) to the ARM MBPs?
> What makes the GPU in the M1(x)(2) “integrated”? Just because it is in the same package as the CPU? What if it is on a separate die in the same package?

If the upcoming Apple Silicon CPU architecture is new, it stands to reason that the GPU architecture will also be new, with better performance than the GPU cores in the M1. If that's the case, the performance will be even better than just extrapolating from the M1's GPU core count. Maybe another 10-15% improvement on top?
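On the "integrated" question: one concrete angle is that on Apple Silicon the GPU shares a single memory pool with the CPU, and Metal reports that directly. A minimal sketch (macOS, assuming a Metal-capable machine; packaging details like same-die vs. same-package aren't exposed here):

```swift
import Metal

// Ask Metal whether the default GPU shares memory with the CPU, which is
// the practical meaning of "integrated"/unified memory on Apple Silicon.
if let device = MTLCreateSystemDefaultDevice() {
    print("GPU: \(device.name)")
    print("Unified memory with the CPU: \(device.hasUnifiedMemory)")
    print("Recommended max working set: \(device.recommendedMaxWorkingSetSize / 1_048_576) MB")
} else {
    print("No Metal device available")
}
```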
Apple won’t be adding nvidia or AMD GPUs, given that even the M1 GPU is quite serviceable; if they quadruple the number of GPU cores, the performance should be pretty competitive with all but the most expensive nvidia/AMD GPUs, and will do so while burning a fraction of the power and generating a fraction of the heat.
> If the upcoming Apple Silicon CPU architecture is new, it stands to reason that the GPU architecture will also be new, with better performance than the GPU cores in the M1. If that's the case, the performance will be even better than just extrapolating from the M1's GPU core count. Maybe another 10-15% improvement on top?

Not sure about GPU year-over-year changes, but CPU gains are about 20%.
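To put rough numbers on that extrapolation, here's a back-of-envelope sketch. The 2.6 TFLOPS figure is the commonly quoted FP32 number for the M1's 8-core GPU; the linear scaling and the 10-15% architectural bump are just the assumptions from this thread, not benchmarks:

```swift
import Foundation

// Back-of-envelope only: scale the M1's 8-core GPU linearly by core count,
// then layer an assumed 10-15% per-core architectural gain on top.
// Real scaling is rarely perfectly linear (bandwidth, thermals, etc.).
let m1Cores = 8.0
let m1Tflops = 2.6  // commonly quoted FP32 throughput for the M1 GPU

for cores in [16.0, 32.0] {
    for gain in [1.10, 1.15] {
        let projected = m1Tflops * (cores / m1Cores) * gain
        print(String(format: "%.0f cores, +%.0f%% arch: ~%.1f TFLOPS",
                     cores, (gain - 1) * 100, projected))
    }
}
```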
> The truth is sometimes hard to hear, isn’t it?

Do you mean it's hard for you to swallow? Well, not my fault, you know.
There are a couple of USB4 models now (some maybe still in preorder, though), but they all require AC power.
> Do you mean it's hard for you to swallow? Well, not my fault, you know.

What exactly are you expecting me to... swallow?
Welcome back to common sense.
I just hope it's a joke about HDMI; it really should be DisplayPort.
It’s hard to see how Apple would “sell” (as in, marketing speak) a reduction in ports.
I mean sure, the whiners would probably buy one with nothing but HDMI and USB-A ports, but for anyone who actually uses it for something other than connecting to a projector, the loss of ports is gonna be a hard sell.
I'm 99% certain that he is joking. No professional actually needs 8 TB on a laptop; anyone with such a massive amount of storage would probably use a more redundant and reliable solution.
> And yet they did exactly that with the 2016 MacBook Pro.

Sorry, how could I forget the 30Hz HDMI port and the USB2 SD card reader they dropped.
> I mean sure the whiners would probably buy one with nothing but four Thunderbolt ports, but for anyone who actually uses it for something other than connecting to a dock, the loss of ports is gonna be a hard sell.

Or connecting to a USB-C/HDMI cable. Or a USB-C/DP cable. Or a TB3 display. Or literally any other device you can think of, like they've done for the last... half a decade.
The 13" M1 MBP upgrade price from 512 to 2TB is the same as the 16" Intel MBP price for the same upgrade (512 to 2TB). As with a lot of things, the cost of upgrades is generally quite similar across product lines for Apple. The 16" upgrade from 512 to 8TB is $2400 (or $2200 from 1TB), so right around your ~$320/TB cost.Going from 512 GB to 8 TB on the current 5K Intel iMac costs $320 per TB. This isn't entirely unreasonable for fast storage, internal or otherwise, and is actually significantly cheaper per TB than getting 1-4 TB. (And only $70 more per TB compared to the Samsung X5.)
If 8 TB is offered on the 14" MacBook Pro and doesn't cost more than ~$400/TB, I am definitely going to get it.
Q: why are Apple's GPUs more power efficient than Nvidia's and AMD's? Is it simply due to using a more advanced process node, or is it some inherent architectural advantage, akin to the CPU energy efficiency Apple has running ARM vs x86?
Literally one $27 portable USB hub provides more connectivity than all the HDMI, USB-A and card-reader ports on the 2015 MBP. Not an exaggeration.
How many types of devices can you attach to an HDMI port, or an SD slot? I'll wait.
> Having four Thunderbolt 3 ports is nice. Having three USB4 ports and an HDMI port? Way nicer.

Literally only if you use exactly one HDMI device.
> Let me know the next time you actually run into a customer whose displays take HDMI.

Huh? Did you mean "don't take" HDMI?
> And yet the number of times anyone in the history of everything has ever run into "if only I had a fifth Thunderbolt 3 port!" is zero.

How could I miss this line before?