3GHz in what appears to be one of the most power-efficient chips; this looks promising. Whether or not you like Apple's current mobile devices, their chip development is fantastic.
 
Not science fiction at all insofar as that sort of thing has been going on with 'regular' computers for years, although there's a slight matter of network bandwidth if you want to do it wirelessly.

You don't have to do it wirelessly. The iPhone's Lightning port supports a Gigabit Ethernet adapter. Belkin makes one, IIRC.
 
  • Like
Reactions: Zdigital2015
First, like 64-bit? 64-bit OSes have been around since the '70s. Even Windows XP was 64-bit before the Mac.
Why are you bringing up OSes? This article—and presumably @I7guy’s comment—is about iPhone SoCs. Qualcomm was literally left “slack-jawed, and stunned, and unprepared” when Apple’s 64-bit A7 was announced in 2013 with the iPhone 5S.

Thanks to Apple’s famous secrecy, the entire industry was caught by surprise and the A7 “set off a panic within the industry”. (Qualcomm’s then-CMO put out a statement trying to downplay the achievement, and it didn’t go over well at all.)
 
Last edited:
  • Like
Reactions: I7guy
Yeah ... pretty much all MacBooks are throttled to all crap.

Just pointing out that the throttling has been taking place for years in the MacBook Pro, which is not a phone, does require high performance, *and* is supposed to have good thermal characteristics. When running demanding tasks, some users have placed their laptops in freezers. Imagine paying top dollar for a high-end processor only to learn that it runs at a reduced clock speed when you need it (i.e., rendering). Apple sacrifices performance for thinness and aesthetics, so getting hyped for >3GHz capabilities is pointless because it is going to be running at far less than that (even if you run something that needs it).
Just pointing out Intel’s CPUs draw far in excess of their stated TDP if allowed to run past their rated frequency. The 16” MBP, with a 45 Watt i9-9980HK, can draw in excess of 70 Watts.

That CPU, rated at 2.4GHz with an all-core 100% load, runs at about 3.1GHz when fully loaded and averages about 60 Watts.

With Apple’s excellent thermal design, the CPU in the MBP runs 30% higher than Intel’s rated frequency spec. Is that a reduced clock speed? Is that “throttled”?
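For anyone who wants to sanity-check that figure, here's a minimal sketch of the arithmetic, taking the 2.4GHz rated base and ~3.1GHz sustained all-core numbers above as given:

[CODE]
// Rough check: how far above the i9-9980HK's rated 2.4 GHz all-core base
// does a sustained ~3.1 GHz sit?
let baseGHz = 2.4        // Intel's rated all-core base frequency
let sustainedGHz = 3.1   // sustained all-core frequency quoted above for the 16" MBP
let percentAboveBase = (sustainedGHz - baseGHz) / baseGHz * 100.0
print(String(format: "%.0f%% above the rated base clock", percentAboveBase)) // prints ~29%
[/CODE]

So "30% higher" is about right (closer to 29% exactly), and either way the chip is running above Intel's rated spec under a sustained all-core load, which is the opposite of throttling in Intel's own terms.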
 
No, it’s a ridiculous flaw of physics. Keeping things in memory means that memory must be refreshed, because the memory is dynamic RAM. Refreshing memory requires power, and only horribly inefficient designs, like what Samsung does, would include more memory than is necessary. Thankfully, unlike the terrible engineers at Samsung, Apple’s engineers chose to make a much smarter design decision.
You say you are a designer of some sort, and you write stuff like this. Funny.
RAM uses power no matter what.
Saying Samsung's design is horribly inefficient because they put 16GB in their phone is objectively a huge exaggeration.
Especially since they made this jump with LPDDR5, meaning they used faster and more efficient RAM, and also moved to 5G (which allegedly uses more RAM); they didn't just desperately throw a bunch of extra RAM into their phones, as some users here would like to suggest.

The most important thing is what users gain from this much RAM and what they sacrifice.
On the sacrifice side, they mostly lose a couple of minutes of battery per charge. It's hard to quantify how much, because I was never able to notice an obvious difference in battery life between two of the same Android phone (exactly the same model) with different RAM configurations. Apps that remain in RAM don't necessarily use a lot of battery; it all depends on how the OS manages these situations, and Android has gotten a lot better here in recent years.
Now on the gaining side, users gain flexibility in phone usage (they will very rarely see an app refresh, no matter how resource-intensive the apps in use are), better consistency and performance in general use, the ability to run more active services without hurting the phone's overall consistency, advanced features like DeX mode, more multitasking options, and, lastly, more future-proofing in general.
In the long run, having more RAM than necessary is better than having exactly the necessary amount, because what's enough now is often not enough tomorrow.

Also, there is precedent with poor RAM management on iOS 11 and iOS 13. With iOS 13 they mostly managed to fix it, but with iOS 11 they needed a big OS overhaul to fix the RAM management. These situations make the weakness of having just barely enough RAM obvious.
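To put a rough number on that "couple of minutes of battery" point, here's a back-of-the-envelope sketch; the per-GB self-refresh draw and the battery size below are hypothetical placeholders for illustration, not measured LPDDR5 or vendor specs:

[CODE]
// Back-of-the-envelope: standing battery cost of keeping extra DRAM in self-refresh.
// Both figures below are assumptions for illustration, not vendor specs.
let assumedSelfRefreshMilliwattsPerGB = 1.5   // hypothetical per-GB self-refresh draw
let extraGB = 10.0                            // e.g. a 16 GB phone vs a 6 GB one
let batteryWattHours = 4.5 * 3.85             // hypothetical ~4500 mAh pack at 3.85 V

let extraWatts = assumedSelfRefreshMilliwattsPerGB * extraGB / 1000.0
let extraWhPerDay = extraWatts * 24.0
let percentPerDay = extraWhPerDay / batteryWattHours * 100.0
print(String(format: "~%.1f%% of the battery per day", percentPerDay)) // ~2% with these numbers
[/CODE]

With those assumed numbers, the standing cost of the extra RAM works out to a couple of percent of the battery per day, which is consistent with the difference being hard to notice between otherwise identical phones.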
 
With Apple’s excellent thermal design, the CPU in the MBP runs 30% higher than Intel’s rated frequency spec. Is that a reduced clock speed? Is that “throttled”?
Thank you! So many people use that word throttling and throw it around, but I don't think it means what they think it means.
 
Thank you! So many people use that word throttling and throw it around, but I don't think it means what they think it means.
Yeah some apparently think it means “can’t maintain its single-core Turbo Boost spec with all cores running at 100% load”. Or “doesn’t compare to my water cooled 95W processor that draws upwards of 250W when I overclock it” 🙄
 
Last edited:
  • Like
Reactions: firewood
Yea but 5g causes cancer so not worth it
One of the highly probable side effects of 5G is inflammation. Cancer can be called a form of inflammation, because inflammation leads to cancer. All disease is a result of inflammation. If you eliminate inflammation, disease is gone. This is why a lifestyle that does not lead to inflammation is a healthy one, i.e. low stress, no processed foods, etc. There is a reason why they want to conceal 5G towers.
 
Last edited:
Android phones have more memory largely because most apps are Java, ergo JIT, and ergo require more RAM.

Nonsense. Android supports split-window multitasking, background multitasking, PiP, desktop mode, etc., so it can take advantage of more DRAM. iOS is gimped in comparison to iPadOS, which is catching up in features to Android.
 
Still valid; that is why I said L3 and L4 caches. L3 is SRAM, L4 is DRAM. I was hesitant to get into too many technical details.

There is no standard that says L4 must be DRAM. It just happens to be something Intel did in Iris Pro. Which isn’t even a thing any more.
 
Nonsense. Android supports split-window multitasking, background multitasking, PiP, desktop mode, etc., so it can take advantage of more DRAM. iOS is gimped in comparison to iPadOS, which is catching up in features to Android.

Both are true.

Yes, if iPhones allowed split window multitasking (most of us don't have fingers small enough to make split-window multitasking practical), that would make more RAM useful.

But also, iPhones do need less RAM than Android phones because ahead-of-time compilation requires significantly less memory at runtime than just-in-time compilation.* And yes, having less RAM also requires less energy.

*) I say this as someone who predominantly writes apps that run as JIT, so no, I'm not biased.
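On the question of how much memory an app actually needs at runtime, here's a minimal sketch of how an iOS/macOS process can read its own physical memory footprint via the Mach task_info API; as far as I know, this phys_footprint value is the same figure Xcode's memory gauge reports:

[CODE]
import Darwin

// Minimal sketch: read this process's physical memory footprint (in bytes)
// via TASK_VM_INFO. Returns nil if the kernel call fails.
func currentPhysFootprint() -> UInt64? {
    var info = task_vm_info_data_t()
    var count = mach_msg_type_number_t(
        MemoryLayout<task_vm_info_data_t>.size / MemoryLayout<integer_t>.size)
    let kr = withUnsafeMutablePointer(to: &info) { infoPtr in
        infoPtr.withMemoryRebound(to: integer_t.self, capacity: Int(count)) {
            task_info(mach_task_self_, task_flavor_t(TASK_VM_INFO), $0, &count)
        }
    }
    guard kr == KERN_SUCCESS else { return nil }
    return UInt64(info.phys_footprint)
}

if let bytes = currentPhysFootprint() {
    print("Footprint: \(bytes / 1_048_576) MB")
}
[/CODE]

Comparing that number for a typical AOT-compiled app against a JIT runtime (which also has to keep bytecode, compiled code and profiling data resident) is one way to see the difference described above.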
 
3.3% higher clock than the 3GHz Microsoft SQ1 ARM mobile SoC in the Surface Pro X, but can they maintain boosted clocks? Apple SoCs tend to thermally throttle under load.
While I'm also interested to see how this new chip handles thermal throttling (most reports are that the A13 improves significantly over the A12), it hardly seems fair to compare a phone with a large tablet. I'm sure the A14X will do much better than the SQ1 even throttled!
I'm sorta wondering when Apple starts building their own lightweight high-performance low-power ARM-based servers for their server farms. They could literally make one that works exactly how they want. And be increasingly less beholden to Azure/Amazon for services.
It's Apple, they could be already and we'd never know!
Had you said this 6 months ago, I would have agreed in some form. But Amazon is moving most of AWS to ARM. This is the start of ARM on servers. It is only a matter of time before ARM repeats what x86 did in the '90s and takes over the server market.

I just thought the Mac Pro introduction was a little unfortunate if Apple is moving to ARM Macs as soon as this year.
Eh. Pretty sure it was the same (or similar) timing with the last-generation G5 Mac, which was quad-core and faster than the first-generation x86 Mac Pros. The G5 was a better platform (at first) than the Xeons of the time, but the rest of the lineup, and particularly the laptops, was suffering because the PPC consortium wasn't making G5 processors that could run in a compact form factor. We are seeing the same thing today.

I agree x86 compatibility is a big deal, but if Microsoft et al. can get some traction with Windows on ARM, the way forward will be a lot easier for Apple.
 
Last edited:
There is no standard that says L4 must be DRAM. It just happens to be something Intel did in Iris Pro. Which isn’t even a thing any more.
Well, based on currently available silicon technology, this is the only option we have a step below L3 cache.
 
Well, based on currently available silicon technology, this is the only option we have a step below L3 cache.

But we don’t really have it. Your proposal was to eliminate memory. That cuts very deep into assumptions of how processes work.
 
That doesn't mean Apple can stop making significant improvements to the A14.

Maybe.

What it does mean is that it doesn't really matter if this rumor is true. Even if they just do a 10% iterative improvement this time, the A14 will be great, because the A13 already is.
 
  • Like
Reactions: russell_314
Guess Qualcomm beat them to it with the 865+ (the SQ1 is still ARM, but it's meant for a different class of product, so I don't count it).
 