Let's see: the A12 is faster than the best Qualcomm chip, and that's last year's phone SoC. So now they have a two-year advantage; it used to be one. Did I miss something?
"Apple doesn't innovate anymore." Ummmmm.
Yeah, Apple's lead in chip power has never been greater than it is right now.
 
Could be a "chicken or the egg" causal conundrum to some degree - does investment follow profit, or does profit follow investment? Of course, overall desktop sales - both PC and Mac - have declined since the advent of smartphones, but a more telling statistic would be to compare Mac sales trends with PC sales trends, especially over the last 3 years or so. Top-of-the-line PC hardware has shown significant improvement over that period, while Mac development appears to have stagnated a bit of late; the laptops especially have had some QC issues with keyboards, displays, and overheating/throttling. Doing a Google search, I could only find relevant statistics through 2016.

Apple is finally coming out with a new Mac Pro, but it's beyond the financial reach of most Mac users. The old "cheese grater" Pros were never cheap, but they weren't nearly so expensive as the new line, where even a display stand runs about a thousand bucks. Another interesting stat would be the proportion of desktops vs. laptops for both PCs and Macs.

What are the significant improvements in the last 3 years for “top of the line PC hardware”?
 
Benchmarks are astonishing, but the A-series chips won't escape the "mobile" stigma for enthusiasts until they can demonstrate that effectiveness doing the same jobs as that Core i9 machine (like After Effects rendering or playing a new AAA game). It will ultimately have to be a like-for-like usage comparison, and that's probably not gonna come until a "hypothetical" A13X ends up in a MacBook with a relatively massive pile of RAM.

I’ll be absolutely fascinated to see what happens when that eventuality comes along, though.

Imagine if they took, say, an A13 or A12X, added a lot of power and a large active cooling solution?
That thing would FLY. Even in a 13" MBP form factor they could probably get 2x the performance just by relieving the thermal constraints of a tiny closed system.
 
Could be why (see the attached image).
And I'm not being snarky. It's just that they have quite the vested interest in that big red part.

Maybe those 9% Mac sales would be a much higher number if Apple gave us some more hardware options, don't you think? ;)

Are you suggesting they 1) buy Intel, or 2) ditch Intel?
They rely on Intel there, unfortunately :(

Apple doesn't currently rely on Intel for eGPU support (it seems Intel will release a dedicated GPU in 2020), nor for the dedicated GPUs in the current iMac or MacBook Pro line, which are all from AMD. So no, I think this is more about Apple also supporting modern Nvidia graphics cards. One good thing is that there seems to be support coming (in Catalina 10.15.1?) for AMD's relatively new Navi GPUs (5700 and 5700 XT), but some of Nvidia's current offerings are still more powerful and also more power-efficient.
 
I casually play Fortnite on weekends on my XS at 60 FPS, and the phone gets so hot that the screen dims to the point where it's nearly impossible to play unless you switch to 30 FPS, which is a completely different experience. My roommate got the Pro and I thought there might be a change, but it turns out it's identical: it gets just as hot and dims the display just as low as my XS.

Tell me what you want, but judging simply by a game, I see no major difference at all between the XS and the Pro.
 
If only Apple would put as much effort into increasing their desktop graphics performance.

not theirs, maybe 2021

This sort of thing makes me wonder how powerful these chips would be operating without power and heat constraints.

like the power stone

I thought so at first too, but after having been on the iPad Pro 12.9 for a month now, it feels like this thing has 16GB or even 32GB of RAM.

My XS Max with 4GB of RAM kicks apps out of memory so often.
 
An iOS app written in Obj-C or Swift does not use garbage collection, regardless of "generation". It uses either manual or automatic reference counting. There is no tracing collector running at runtime; under ARC the retain/release calls are inserted by the compiler, and they carry very little runtime overhead.

The confusion is that ObjC did have garbage collection on the Mac, introduced in Mac OS X 10.5, deprecated in 10.8, and removed in 10.12 (2007-2016). In either case, reference counting isn't free: it adds overhead every time a reference is retained or released, so it trades RAM for CPU, and you pay for the processing continuously instead of all at once.
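
To make that overhead concrete, here's a minimal Swift sketch of where those retain/release pairs come from and when the memory actually comes back. The Buffer type and demo() function are made up purely for illustration:

```swift
final class Buffer {
    let bytes = [UInt8](repeating: 0, count: 1_024)
    deinit { print("Buffer freed") }      // runs the instant the last strong reference goes away
}

func demo() {
    var a: Buffer? = Buffer()             // refcount 1
    let b = a                             // compiler-inserted retain -> refcount 2
    a = nil                               // compiler-inserted release -> refcount 1
    _ = b?.bytes.count                    // object still alive through `b`
}                                         // `b` leaves scope: release -> refcount 0, "Buffer freed" prints here

demo()
```

Each of those retains and releases is a small operation paid as the program runs, which is exactly the "continuous" cost described above, as opposed to a collector doing the accounting in occasional bursts.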
 
The confusion is that ObjC did have garbage collection on the Mac, introduced in Mac OS X 10.5, deprecated in 10.8, and removed in 10.12 (2007-2016).

That's true, but it was an opt-in feature and many apps didn't use it. It also never existed in iOS.

In either case, reference counting isn't free: it adds overhead every time a reference is retained or released, so it trades RAM for CPU, and you pay for the processing continuously instead of all at once.

Allocating and deallocating something doesn't take much processing. The main issue with reference counting isn't resource overhead; it's how error prone its edge cases are.
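
The classic edge case is a strong reference cycle. Here's a minimal Swift sketch, with hypothetical Parent/Child types invented for the example:

```swift
final class Parent {
    var child: Child?
    deinit { print("Parent freed") }
}

final class Child {
    // Without `weak`, Parent -> Child -> Parent would form a cycle and
    // neither deinit would ever run: a silent leak that a tracing GC would catch.
    weak var parent: Parent?
    deinit { print("Child freed") }
}

func demo() {
    let parent = Parent()
    parent.child = Child()
    parent.child?.parent = parent    // back-reference is safe only because it's weak
}

demo()    // both objects are freed here; drop the `weak` and neither ever would be
```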
 
Specifically, the explicit reason they moved to Intel was that IBM wasn't interested in making low-power chips for notebooks and Intel was. That's what led directly to the MacBook Air and Intel's focus on laptop chips and low-power performance.

Apple doesn't care about desktop-class performance, generally speaking; their focus has always been mobile performance and power. That's been true since the first PowerBooks, and it's true today.

Well yes, that's what performance per watt is all about. Mobile was the future then and is today.
 
I'm surprised that Apple doesn't name their GPU and feature it more in presentations.

They could do that (and for a Mac release they actually might!) but they know their audience. The vast majority of people have absolutely no idea what a GPU does or what its name implies about the goodness of a device. The "A10, A11, A12" branding keeps it very simple: the new chip is one better than the last one! A lot of tech gets caught up in the idea of giving everything a cool-sounding combination of numbers and letters (e.g. "RTX 2070 Super" - what on earth does that mean to a layperson?). You could argue that Apple has dipped its toe into that territory with the iPhone lately, and as far as I could tell nobody really thought it was a good look.

As long as the GPU exceeds expectations and keeps the device in "better than what I need" territory, no branding is sufficient branding. And geeks like us can still ruminate over it all anyway.
 
It's fairly easy to explain at a technical level.

iOS apps are mostly AOT-compiled code with no garbage collection (typically written in ObjC, Swift or C++ and targeting the native CPU); Android apps are mostly JIT-compiled code with a garbage collector (typically written in Java or Kotlin and targeting the Java VM). The two have different performance characteristics: generally speaking, native code is faster, especially on first execution, whereas JIT-compiled code can occasionally be faster once it has been optimized against the current workload, but is typically slower.

One isn't necessarily better than the other, but it makes sense that the latter would require more RAM: whereas the former is already compiled, runs natively against the physical CPU, and has had memory allocations and deallocations defined during compile time, the latter needs each process to run a VM, a JIT compiler, and a garbage collector.
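
As a rough Swift-side illustration of the RAM point (the Frame type below is made up): under ARC, memory comes back the moment the last reference dies, rather than piling up until a collector runs.

```swift
final class Frame {
    let pixels = [UInt8](repeating: 0, count: 1_000_000)   // ~1 MB payload
    deinit { print("Frame memory returned") }
}

func render() {
    for _ in 0..<10 {
        let frame = Frame()        // allocate ~1 MB
        _ = frame.pixels.count     // use it
    }                              // released at the end of every iteration: peak usage stays around 1 MB
}

render()
// Under a tracing GC, those ten frames could all sit around as garbage until the
// next collection, which is one reason GC'd runtimes are given more RAM headroom.
```
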
Is there any dependency of RAM on the type of chip being used, like Bionic or Snapdragon?
 
Is there any dependency of RAM on the type of chip being used, like Bionic or Snapdragon?

There's a dependency on what kinds of RAM are supported, but I guess you're asking if a well-designed chip needs less RAM, which… no. iPhones don't need less RAM due to their chips; they need less RAM because of the way the predominant app framework works.
 