It’s not ego... it’s about not letting the wolf back in the henhouse.

NVIDIA’s presumption of its own preeminence got them tossed, and hopefully they will never get back in the door at One Apple Park Way. They don’t serve their customers; they serve themselves.



:)
 
You know what I think would actually be pretty sweet?!
What if they really pioneered some sort of interface that would let stuff like hard drives and video cards be plugged in externally... but run just as fast as if they were installed internally!
Then you could add, say... a video card in a box, and voilà! You could load Boot Camp and run at the exact same speeds as any other PC gamer!

Or, you know, they could give us more value for our dollar!


No need to be a smartass.
 
Given the increased wattage, I would venture to say that next year, by using the new chip manufacturing node (5nm), Apple will go for even greater battery life to allow for 5G (power hungry by all accounts).
 
Could be why (see the attached chart).
And I'm not being snarky. It's just that they have quite the vested interest in that big red part.
Could be a "chicken or the egg" causal conundrum to some degree - does investment follow profit, or does profit follow investment? Of course, overall desktop sales - both PC and Mac - have declined since the advent of smart phones, but a more telling statistic would be to compare Mac sales trends compared to PC sales trends, especially over the last 3 years or so. Top of the line PC hardware has shown significant improvement over that period. Mac development appears to have stagnated a bit of late, and especially laptops have had some QC issues with keyboards, displays, and overheating/throttling. Doing a google search, I could only find relevant statistics through 2016. Apple is finally coming out with a new Mac Pro, but it's beyond the financial reach of most Mac users. The old "cheese grater" Pros were never cheap, but weren't nearly so expensive as the new line, where even a display stand runs about a thousand bucks. Another interesting stat would be proportion of desktops vs. laptops with both PC's and Macs.
 
Can't wait to see a Mac powered by a custom chip, although the migration likely won't be pretty, of course.
 
Apple desktops/laptops use Intel chips - look at Intel.
Well, by graphics performance, the OP is likely referring to the standard graphics processors installed in Macs - generally AMD Radeon of late rather than Nvidia. As to whether Macs would see an overall improvement by moving to ARM or AMD (or some Apple chip in the future), who knows. Except for some cooling issues with certain i9 configurations in laptops and inadequate ventilation on iMacs, the main sore point with Macs these days (in my opinion) is the stagnation and slow development of macOS, at least at the UI level. There have been some significant improvements under the hood with the APFS file system, and the underlying BSD core remains robust.
 


:)

Really hate this. Glad I can still use Boot Camp with my eGPU with an Nvidia 1070. Just wish it worked again with my Mac :(
 
If only Apple would put as much effort into increasing their desktop graphic performance.
That settles it. I am going to do all my software development on an iPhone 11 from now on. It's clearly the best-performing device they have on a power-to-weight ratio, and therefore a superior device. Or is there another reason the graphics have to be stupidly good on a pocket device?
 
I thought so at first too, but after having been on the iPad Pro 12.9 for a month now, it feels like this thing has 16GB or even 32GB of RAM. I never noticed any lags or "beachball moments". Most of the time it even feels snappier than my previous 16GB RAM MacBook Pro.

Hugely impressed with the A12 and RAM management on the iPad Pro.

Imagine how much more impressive it will be with the A13 and more RAM.
Honestly, I doubt I or anyone else will really notice a single thing with the A13. Of course it is either faster or more power efficient at times (as someone posted before, it's one or the other), so it won't make the A12X seem at all slower, but it will obviously have added features. RAM will only make a difference if developers are allowed to take advantage of all the RAM Apple puts into iDevices (currently most devs target 2GB), since that's what the mainstream, more budget-friendly iPads have been using and still use. Only some programs seem to be able to use more than 2GB, depending on what they are. So essentially an iPad Pro with 4GB is close to 8GB on a MacBook Pro, and a regular 2GB iPad is equal to a MacBook Air with 4GB.
It'll be interesting to see if Apple adds more than 4-6GB by 2021.
Personally, if Apple made an iPad Pro with micro-LED, 8GB of RAM, 1TB of storage, an A15X, Wi-Fi 6, a glass wireless-charging back, faster I/O, a ToF camera, better FaceTime and rear cameras, and improved aluminum... I'd get one of those and probably not upgrade for four years. But I'm super, super happy with my 3rd-gen iPad Pro. I was tempted to get the 6GB version, but Procreate is the only reason I'd have done it (I asked Procreate about it and they said they use under 4GB, so 6GB would be wasted on me). So yeah, I'm good for quite some time with my current iPad. 🙂✌️👌

Kallum.
 
Benchmarks are astonishing, but the A-series chips won’t escape the ‘mobile’ stigma for enthusiasts until they can demonstrate that effectiveness doing the same jobs as that Core i9 machine (like After Effects rendering or playing a new AAA game).

Oh I agree - and can’t wait to see what these chips can do outside of mobile. Not that the raw speed is wasted in mobile. Part of the battery efficiency is to get any particular unit of work done as quickly as possible so the relevant parts of the chip can go back to sleep.

Coincidentally, I recently bought and restored a 1980s Acorn Archimedes computer. These are pretty much unknown outside of the UK, but were the first computers to use an ARM processor. (ARM originally stood for Acorn RISC Machine, before ARM was spun off from its parent company.)

The Archimedes with an ARM2 processor at 8MHz performed 4 MIPS in 1986. That was crazy at the time.

It’s quite nice to think that the A13, in some small part, traces its history back to that old machine!

(Oh, and Apple worked with Acorn decades ago on the ARM6 processor that was used in the Newton.)
 
Actually, iOS apps use garbage collection, depending on the generation. With ARC there's less GC overhead, since theoretically objects get whacked once they're unused.
Absolutely not true. You are badly confusing reference counting and garbage collection. They are _not the same_. iOS apps DO NOT use garbage collection.
 
So the trade-off for faster, better performance this time around is frequent throttling?... Gross. 😉
How does the A12X line up with the A13? The A12X had a GPU equal to the Xbox One S... hmmm. 🤔

CPU-wise, the iPad Pro is still faster in everything except single-core (naturally, since it has more cores), but it hints at what a beast the A13X is going to be. On the GPU side, the 2018 Pro looks around 40% faster in Geekbench Metal, 3DMark & GFXBench offscreen tests, at least - again, sheer number of cores.
 
Android has been using ART with AOT compilation by default since version 5.0. It replaced the JIT compilation of Dalvik, the previous runtime.

Your other points remain valid, however.

You have a good point, but ART is now actually mixed AOT+JIT.

(And I would guess the main memory footprint impact comes from GC.)
Actually, iOS apps use garbage collection, depending on the generation. With ARC there's less GC overhead, since theoretically objects get whacked once they're unused.

An iOS app written in Obj-C or Swift does not use garbage collection, regardless of "generation". It uses either manual or automatic reference counting. There is no at-runtime automated memory management; reference counting is prepared at compile time and has extremely little runtime overhead.
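
To make that concrete, here is a minimal Swift sketch (the ImageBuffer class and names are hypothetical, purely for illustration). The compiler emits the retain/release calls, so the object is freed deterministically the moment its last strong reference goes away; there is no collector running in the background.

[CODE]
// Minimal ARC sketch: deallocation happens the instant the last strong
// reference disappears, because the compiler inserts the release for us.
final class ImageBuffer {
    let name: String
    init(name: String) {
        self.name = name
        print("\(name) allocated")
    }
    deinit {
        // Runs deterministically when the retain count hits zero.
        print("\(name) deallocated")
    }
}

func render() {
    let buffer = ImageBuffer(name: "frame-1")   // compiler-emitted retain
    print("rendering with \(buffer.name)")
}   // last strong reference gone: compiler-emitted release frees it here

render()                  // prints allocated, rendering, deallocated - in that order
print("after render()")   // deinit has already run; no GC pause, no collector thread
[/CODE]

A tracing garbage collector would instead reclaim the object at some later collection pass, which is exactly the distinction being argued above.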
The Android Java code is now compiled ahead of time (AOT),

That's not entirely correct. Starting in Android 7, ART includes a JIT compiler again.
Android doesn't run the JVM, it runs the Dalvik Virtual Machine.

My mistake. It does not in fact run the Java VM. It typically runs the Android Runtime (ART) now; it used to be Dalvik.
 
More than the processor, I am amazed at how bad the ultra-wide angle is compared to the competition, especially in the dark. Ugh.
 
...the A13 "essentially matched" the "best that AMD and Intel have to offer" for desktop CPUs, at least based on SPECint2006, a suite of CPU-intensive cross-platform integer benchmarks.
Wow. Just wow.
 
They will. Catalyst just needs to mature and more Catalyst apps need to emerge. And the question is whether it will run iPadOS or macOS.

Releasing a Catalyst-only macOS on ARM makes no sense at this time, and probably isn't feasible for at least five years. It would also be a major slap in the face to tell developers that 64-bit AppKit apps are the supported way forward only to make AppKit apps not work at all on ARM.
 
Releasing a Catalyst-only macOS on ARM makes no sense at this time, and probably isn't feasible for at least five years. It would also be a major slap in the face to tell developers that 64-bit AppKit apps are the supported way forward only to make AppKit apps not work at all on ARM.
What if it's running iPadOS & not macOS?
 