Even if they catch up, will they be as efficient as the A14? Those Samsung flagships have 4,000 mAh battery capacities and yet no significantly better battery life.

That's strictly a Samsung issue. Battery management is not their priority, or they have simply concluded that people charge their phones once a day, so why even bother.

The S20 with 4,000 mAh lasts a bit less than the 12 Pro with 2,900 mAh, but at the same time the Xperia 5 II with 4,000 mAh lasts 30% longer than the 12 Pro.
 
That's strictly a Samsung issue. Battery management is not their priority, or they have simply concluded that people charge their phones once a day, so why even bother.

The S20 with 4,000 mAh lasts a bit less than the 12 Pro with 2,900 mAh, but at the same time the Xperia 5 II with 4,000 mAh lasts 30% longer than the 12 Pro.
The iPhone 12 Pro actually has a 2,815 mAh battery. It is surprising that the iPhone 12 has better battery life than even the Xperia 5 II; that is in large part due to how energy-efficient the A14 CPU is.
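Normalising those figures by capacity makes the efficiency gap concrete. The arithmetic below is purely illustrative and uses only the numbers quoted above (2,815 mAh for the 12 Pro; 4,000 mAh and a ~30% longer runtime for the Xperia 5 II):

```python
# Runtime per mAh, using the capacities and the ~30% runtime claim above.
iphone_capacity = 2815        # mAh, iPhone 12 Pro
xperia_capacity = 4000        # mAh, Xperia 5 II
xperia_runtime_ratio = 1.30   # Xperia runtime relative to the 12 Pro

# Normalise the iPhone's runtime to 1.0 and compare runtime-per-mAh.
iphone_per_mah = 1.0 / iphone_capacity
xperia_per_mah = xperia_runtime_ratio / xperia_capacity

advantage = iphone_per_mah / xperia_per_mah
print(f"iPhone runtime per mAh is {advantage:.2f}x the Xperia's")  # ~1.09x
```

So even the phone that "lasts 30% longer" is extracting less runtime from each mAh than the A14-based iPhone.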

[Attached screenshots: battery test results, 2020-12-19]
 
Not really. Chip performance was never the reason for dropping support (2 years max) in the Android world, at least in the high-end segment. I'd also add that new low-end phones shipping with the newest Android version often have far slower SoCs than flagships from 5-6 years ago that haven't received OS updates in ages.

Yes really: comparing Android to iOS isn't even close to a fair comparison. Android devices drop support because the OEMs are lazy and would rather sell you a new device than continue to support an "old" one. Part of the reason Apple devices hold their value is that Apple supports them for so long. The fact that their chips are so powerful and efficient plays a pivotal role in that ongoing support. To think otherwise is willful ignorance, and comparing it to Android is quite honestly a joke.
 
Apple silicon is most definitely the future

Well, it would be if Apple licensed it to other brands, or sold the chips for others to use.
Right now, and for the foreseeable future, Apple silicon sadly does not mean much.

Naturally you and many will disagree, but if you have, say, 90-95% of the world on one silicon/platform of one type, and only 5-10% of the world on Apple silicon, it does not really matter how amazing Apple silicon is; it's never going to be important or relevant enough to make any real difference.

And please note, I wish this were different. I'd love Apple silicon to be more widespread and available in many more devices built by many more brands, so the vast majority of consumers would actually benefit from what it can do.

Sadly, for now and for the foreseeable future, there is no way this is going to happen.

As I have said a couple of times:
If the PC ceased to exist, the modern (business, medical, engineering, entertainment) world as we know it would come crashing down.
If Apple ceased to exist, a few percent of people would be upset, buy a PC/laptop instead, and carry on.
 
Well, it would be if Apple licensed it to other brands, or sold the chips for others to use.
Right now, and for the foreseeable future, Apple silicon sadly does not mean much.

Naturally you and many will disagree, but if you have, say, 90-95% of the world on one silicon/platform of one type, and only 5-10% of the world on Apple silicon, it does not really matter how amazing Apple silicon is; it's never going to be important or relevant enough to make any real difference.
Apple is big and influential enough that this will cause AMD, Intel, Qualcomm to rethink their strategies. (For Qualcomm, this may not result in much, as they don’t particularly care unless customers become more demanding.)

Competition is good for us. If it causes Intel to retake the crown, that’s also good, as it will in turn cause Apple to work harder.
 
Well, it would be if Apple licensed it to other brands, or sold the chips for others to use.
Right now, and for the foreseeable future, Apple silicon sadly does not mean much.

Naturally you and many will disagree, but if you have, say, 90-95% of the world on one silicon/platform of one type, and only 5-10% of the world on Apple silicon, it does not really matter how amazing Apple silicon is; it's never going to be important or relevant enough to make any real difference.

And please note, I wish this were different. I'd love Apple silicon to be more widespread and available in many more devices built by many more brands, so the vast majority of consumers would actually benefit from what it can do.

Sadly, for now and for the foreseeable future, there is no way this is going to happen.

As I have said a couple of times:
If the PC ceased to exist, the modern (business, medical, engineering, entertainment) world as we know it would come crashing down.
If Apple ceased to exist, a few percent of people would be upset, buy a PC/laptop instead, and carry on.
While I can’t know what he meant for sure, I’d say a fair interpretation (it's how I took it) is that Apple silicon is the exemplar for what future chips will look like, rather than “Apple will themselves be the primary chip maker across the PC and phone industries”. By being ahead in performance and performance-per-watt, they supply the model for how SoCs, even down to the CPU cores, are designed.

Also, in terms of PC units sold in the last year, their 10-15% market share, concentrated in the high end, makes them (as chucker23n1 says) very influential in PC development, and obviously even more influential in the phone/tablet space, where their share is larger; they are one of the largest companies in the world (if not the largest) for a reason. So even the other interpretation is not *that* far off if you consider total CPUs shipped to customers! 🙃 1.4 billion active devices, mostly already running on Apple silicon and the rest now to follow, is not bad for a single company ...
 
To clarify: I totally accept and believe that Apple does and will influence others when it comes to consumer chip design.
Unfortunately, other companies also copy other things Apple does that may be seen as more negative than positive.
However yes, this should all end up being excellent for the consumer at the end of the day.
I'm simplifying things a LOT here, but I feel much of what Apple has done is to look at what users of their products most often want to do with the device, then design a chip with circuitry specifically built to accelerate those tasks.

It's a bit like wanting to move cars around really fast: you wouldn't employ 20 random people who can do anything; you'd employ 20 people who are rubbish at most things but have many years of experience driving cars fast :)

A tiny drawback to this "could" be that what you want to do is something very different from what Apple thinks you should be doing, so their effort is not very strong in that particular area. But I hope in reality it does not work out like that.

Apple and GPUs, I would suggest, have been a weak area for years, so it will be interesting to see how hard they are willing to push their (at some point separate) GPUs and whether one day they will get near AMD and Nvidia.

Built-in hardware to handle all the fancy effects, though of course not ray tracing, etc.

The next 2 to 3 years should tell us a lot about what Apple wishes to do, and about their ability, once they are expected to compete in those areas.
 
Geekbench clearly ALSO needs competition!

But NOT from another synthetic benchmark; from a real-world test run on both iOS and Android devices.

Consider this: even the latest Apple mobile devices cannot decompress full-resolution 12 MP HEIC photos very fast.

If you try to view an image sequence of HEIC photos at the same rate the sequence was captured (e.g., 30 fps in high-performance camera apps), you'll clearly see the issue (on both platforms)!

Also, I doubt most people know the specifics of even a single Geekbench test.

I, for one, do NOT!

Food for thought.
 
Geekbench clearly ALSO needs competition!

But NOT from another synthetic benchmark; from a real-world test run on both iOS and Android devices.
Consider this: even the latest Apple mobile devices cannot decompress full-resolution 12 MP HEIC photos very fast.

If you try to view an image sequence of HEIC photos at the same rate the sequence was captured (e.g., 30 fps in high-performance camera apps), you'll clearly see the issue (on both platforms)!

Do feel free to write a benchmark for that operation, which in your eyes is clearly very representative of real-world use?
Also, I doubt most people know the specifics of even a single Geekbench test.

I, for one, do NOT!

Food for thought.
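A real-world decode benchmark of the kind described above is easy to sketch. The harness below times an arbitrary decode function against a target capture rate; the zlib round-trip is only a stand-in workload, since actually decoding HEIC would require a third-party library (e.g. pillow-heif), which is an assumption beyond anything in this thread:

```python
import time
import zlib

def decode_throughput(decode, payloads, target_fps=30.0):
    """Time decode() over all payloads and report achieved
    decodes/sec plus whether it keeps up with the capture rate."""
    start = time.perf_counter()
    for payload in payloads:
        decode(payload)
    elapsed = time.perf_counter() - start
    fps = len(payloads) / elapsed
    return fps, fps >= target_fps

# Stand-in "frames": compressed blobs as a placeholder for HEIC data.
frames = [zlib.compress(bytes(range(256)) * 4096) for _ in range(60)]
fps, keeps_up = decode_throughput(zlib.decompress, frames)
print(f"{fps:.0f} decodes/sec; keeps up with 30 fps capture: {keeps_up}")
```

Swapping `zlib.decompress` for a real HEIC decoder (fed real camera output) on each platform would turn this into the cross-platform test the post is asking for.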
 
Well, it would be if Apple licensed it to other brands, or sold the chips for others to use.
Right now, and for the foreseeable future, Apple silicon sadly does not mean much.

Naturally you and many will disagree, but if you have, say, 90-95% of the world on one silicon/platform of one type, and only 5-10% of the world on Apple silicon, it does not really matter how amazing Apple silicon is; it's never going to be important or relevant enough to make any real difference.

And please note, I wish this were different. I'd love Apple silicon to be more widespread and available in many more devices built by many more brands, so the vast majority of consumers would actually benefit from what it can do.

Sadly, for now and for the foreseeable future, there is no way this is going to happen.

As I have said a couple of times:
If the PC ceased to exist, the modern (business, medical, engineering, entertainment) world as we know it would come crashing down.
If Apple ceased to exist, a few percent of people would be upset, buy a PC/laptop instead, and carry on.

One of the ways to get more than 5-10% of the world on Apple silicon is for Apple silicon to continue to annihilate the competition. And if you count iOS devices as Apple silicon, the number is probably already higher than 10%.
 
To clarify: I totally accept and believe that Apple does and will influence others when it comes to consumer chip design.
Unfortunately, other companies also copy other things Apple does that may be seen as more negative than positive.
However yes, this should all end up being excellent for the consumer at the end of the day.
I'm simplifying things a LOT here, but I feel much of what Apple has done is to look at what users of their products most often want to do with the device, then design a chip with circuitry specifically built to accelerate those tasks.

It's a bit like wanting to move cars around really fast: you wouldn't employ 20 random people who can do anything; you'd employ 20 people who are rubbish at most things but have many years of experience driving cars fast :)

A tiny drawback to this "could" be that what you want to do is something very different from what Apple thinks you should be doing, so their effort is not very strong in that particular area. But I hope in reality it does not work out like that.

Apple and GPUs, I would suggest, have been a weak area for years, so it will be interesting to see how hard they are willing to push their (at some point separate) GPUs and whether one day they will get near AMD and Nvidia.

Built-in hardware to handle all the fancy effects, though of course not ray tracing, etc.

The next 2 to 3 years should tell us a lot about what Apple wishes to do, and about their ability, once they are expected to compete in those areas.

Given the relative performance of the M1 GPU, and the fact that we know they will have a discrete GPU within the next year, they should be competitive with NVIDIA and AMD pretty darned soon.
 
Given the relative performance of the M1 GPU, and the fact that we know they will have a discrete GPU within the next year, they should be competitive with NVIDIA and AMD pretty darned soon.
I would love you to be right on this.

I do struggle to understand how Apple will match years of Nvidia's work with vastly less power and less silicon.

Take a card with almost 9,000 cores, 10 GB of memory, and 28.3 billion transistors, with a power draw maxing out at around 300 W, that pumps out almost 30 TFLOPS.

I'd be overjoyed to see Apple match this.

In all honesty, do I realistically expect them to match this within the next few years? No, I don't.
But I'd love them to.

The M1 GPU can do, according to Apple, 2.1 TFLOPS, just slightly less than the 2.9 of a GeForce GTX 1650.

Whilst it's not as simple as just TFLOPS, we're in effect looking at Apple needing a roughly 15x improvement over their current offering to match the best of 2020.
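The "15x" figure follows directly from the two numbers quoted above (illustrative arithmetic only; TFLOPS across architectures are not directly comparable, as the post itself notes):

```python
# Ratio of the ~30 TFLOPS 2020 flagship to Apple's quoted M1 figure.
flagship_tflops = 30.0   # the ~9,000-core, ~300 W card described above
m1_tflops = 2.1          # Apple's stated M1 GPU throughput

gap = flagship_tflops / m1_tflops
print(f"required improvement: {gap:.1f}x")  # 14.3x, i.e. roughly 15x
```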
 
I would love you to be right on this.

I do struggle to understand how Apple will match years of Nvidia's work with vastly less power and less silicon.

Take a card with almost 9,000 cores, 10 GB of memory, and 28.3 billion transistors, with a power draw maxing out at around 300 W, that pumps out almost 30 TFLOPS.

I'd be overjoyed to see Apple match this.

In all honesty, do I realistically expect them to match this within the next few years? No, I don't.
But I'd love them to.

The M1 GPU can do, according to Apple, 2.1 TFLOPS, just slightly less than the 2.9 of a GeForce GTX 1650.

Whilst it's not as simple as just TFLOPS, we're in effect looking at Apple needing a roughly 15x improvement over their current offering to match the best of 2020.
Given that they’re going to have a “(mini) M-series Mac Pro” in the next couple of years, they’ll have to have something powerful, but who knows if it’ll be a 300 W monster. I say (mini) in parentheses because that’s what the rumor says, but it is just a rumor right now. The only thing we know for sure is that their lineup will be replaced within 2 years, and that includes Mac Pros with workstation graphics cards. So we’ll see ...

Personally as someone who does GPU scientific programming ... I miss being able to run CUDA on my Mac. ☹️
 
I would love you to be right on this.

I do struggle to understand how Apple will match years of Nvidia's work with vastly less power and less silicon.

Take a card with almost 9,000 cores, 10 GB of memory, and 28.3 billion transistors, with a power draw maxing out at around 300 W, that pumps out almost 30 TFLOPS.

I'd be overjoyed to see Apple match this.

In all honesty, do I realistically expect them to match this within the next few years? No, I don't.
But I'd love them to.

The M1 GPU can do, according to Apple, 2.1 TFLOPS, just slightly less than the 2.9 of a GeForce GTX 1650.

Whilst it's not as simple as just TFLOPS, we're in effect looking at Apple needing a roughly 15x improvement over their current offering to match the best of 2020.

I interviewed at NVIDIA once. Just by the questions they asked me it was apparent they were using a terrible design methodology. Maybe they’ve changed since then. I don’t know. But it sounded like they write RTL and run Synopsys to synthesize it. If that’s what they were doing, it will be pretty easy for Apple to clobber them. Not to mention that GPUs scale in performance fairly linearly with the number of cores, and the M1 doesn’t have very many cores (7 or 8).

I assume AMD is doing the same thing; while I worked at AMD, I only worked on CPUs, and we never interacted with the GPU folks.
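If GPU throughput really does scale roughly linearly with core count, as suggested above, a quick projection shows what Apple would need. This assumes perfectly linear scaling and the 2.1 TFLOPS / 8-core figures quoted earlier in the thread, so it is a back-of-the-envelope sketch, not a prediction:

```python
# Project how many M1-class GPU cores a 30 TFLOPS part would take,
# assuming perfectly linear scaling (a simplification).
m1_cores = 8
m1_tflops = 2.1
target_tflops = 30.0

tflops_per_core = m1_tflops / m1_cores       # ~0.26 TFLOPS per core
cores_needed = target_tflops / tflops_per_core
print(f"~{round(cores_needed)} M1-class cores")  # ~114
```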
 
What makes me sad these days, unlike perhaps in the 1970s and 1980s, is that all these companies seem to be run by the money men and not the tech people.
The Amiga, for example, would never have gotten built if there had been money men demanding profits and returns on investment.
I do feel, sadly, that we miss out on some amazing innovations that could happen if only the engineers were free to go down the paths they wanted.
Giant corporations are lumbering; they can't take those risks or move that fast.
I've always worked in tiny companies where you decide what you are going to do and get on with it right away.
I'd hate to work in a place where every single thing would feel like it takes a lifetime to get done.

I'm sure you are right that companies have a way of thinking and, worse, have some high-up people who are unwilling to change because they think they are right.
It's not until that person leaves (Ive?) that perhaps some new things can be done; they are too high up and just block certain things from ever happening.

It would be amazing if Apple could split itself into 2 or 3 divisions producing different products.
A bit like some of the giant Japanese and other companies do. OPPO?

A smaller division could move much faster and bring out more experimental products that the giant main company dare not produce.
 
Pretty well known by now that Qualcomm designs for sustained performance, with lower power requirements, particularly with the Adreno GPU. Apple has always been better at quick performance bursts, which helps with things like app launches. But Qualcomm excels at sustained performance activities like gaming, with little or no throttling of the CPU+GPU.

Slightly different approaches, but both companies are top shelf in this category.

LMAO: “Qualcomm excels at sustained performance activities like gaming, with little or no throttling of the CPU+GPU”, and yet almost ALL of the last top-tier Samsung smartphones use some sort of vapour chamber for internal cooling of the CPU+GPU.
 
There is a huge difference for the average iPhone user.
CPU power is not just how fast WhatsApp opens; it means better pictures, better AI, and a more secure platform running faster, more complex algorithms.
Think about it: smartphone camera hardware improves very little compared to how fast software computation is improving, or compared to what developers have in mind. Many times developers are simply waiting for the CPU to be powerful enough to do this or that thing to improve the resulting picture or video. Noise, light, colour accuracy, sharpness: every element in a picture is processed by the CPU to show you what you see. How fast and efficiently it can process that data determines whether or not you get HDR, night mode, deep focus, etc.
The small speed percentage you gain over the next year is enough to improve Siri's AI, to handle new image-rendering pipelines and new smooth visual effects: a better "anything" that you have in the XS but didn't have in your iPhone 4S.
Power and energy efficiency improve slowly, and those improvements are necessary generation by generation.

Think of a supercomputer in a smartphone: it could do things you can't imagine now, until hardware and software developers bring them to you (120 Hz or beyond screens, any zoom you like in your ultra-high-quality videos, pictures at night as bright as day at 1/60 shutter speed, sending huge files, exporting videos in a blink...).

We were all happy with the iPhone 3G screen until we saw the Retina screen on the iPhone 4, the best screen you could imagine, until you saw the iPhone 12 Pro Max... until you see the iPhone 14 Pro Max screen, not possible today for many reasons, one being the CPU.

I like this post ... except the ‘think of a supercomputer in a smartphone’ line. I keep hearing Steve Jobs’ voice saying that about the Power Mac G4 circa the WWDC era. Just about EVERY computer is now a supercomputer by that ’90s catchphrase; it’s tired, it’s old, and no longer really relevant, as even the Apple Watch is a supercomputer by those old standards.

Cheers.
 
I like this post ... except the ‘think of a supercomputer in a smartphone’ line. I keep hearing Steve Jobs’ voice saying that about the Power Mac G4 circa the WWDC era. Just about EVERY computer is now a supercomputer by that ’90s catchphrase; it’s tired, it’s old, and no longer really relevant, as even the Apple Watch is a supercomputer by those old standards.

Cheers.
It was always a bit of a silly line. If it’s in a generally available desktop computer, it is by definition not a supercomputer CPU any more.

(Also, while they didn’t know yet in 1999, the G4 arguably had a bit of an embarrassing evolution.)
 
Apple silicon is most definitely the future
So far it’s looking good ... but

We do recall the leaked and launch hype around Panther and the Power Mac G5, right?
Both were supremely awesome ... yet within a year the G5 couldn’t compete with Intel’s power management. I recall getting a G5 some 10 years after its debut, because yeah, I’d always wanted one. Then I watched my hydro bill skyrocket; even mostly idling, it was another CA$40/month. The thing was a beast (in its day, performance-wise) in terms of weight and power consumption. I’d bet the power consumption at max load for 4 hours could power a Tesla Model 3 for 10 miles!
 
So far it’s looking good ... but

We do recall the leaked and launch hype around Panther and the Power Mac G5, right?
Both were supremely awesome ... yet within a year the G5 couldn’t compete with Intel’s power management. I recall getting a G5 some 10 years after its debut, because yeah, I’d always wanted one. Then I watched my hydro bill skyrocket; even mostly idling, it was another CA$40/month. The thing was a beast (in its day, performance-wise) in terms of weight and power consumption. I’d bet the power consumption at max load for 4 hours could power a Tesla Model 3 for 10 miles!

PowerPC was a problem because Apple’s interests never aligned with IBM’s. (Or Motorola’s, but Motorola was just terrible at designing CPUs by that time.)

This time there is no such problem.

Until and unless Intel switches to RISC, Apple silicon will destroy Intel silicon. And even after Intel switches to RISC, that RISC will have to target thousands of customers with different needs. Apple only has to target Apple. Apple will have an inherent advantage for a long time.
 