Too bad Apple blows them out of the water.
So do you disagree?

Do you think the 888 would be anywhere near this level of performance without Apple resoundingly thrashing every Snapdragon in recent years? This is the closest they’ve been in ages, which will encourage Apple to keep pushing.

That’s good.
 
I would love for you to be right on this.

I do struggle to understand how Apple will match years of Nvidia's work with vastly less power and less silicon.

With almost 9,000 cores, 10GB of memory, and 28,300 million transistors, it has a power draw maxing out at around 300W, yet it pumps out almost 30 TFLOPS.

I'd be overjoyed to see Apple be able to match this.

In all honesty, do I realistically expect them to match this within quite a few years? No, I don't.
But I'd love them to.

The M1 GPU can do, according to Apple, 2.1 TFLOPS, just slightly less than the 2.9 from a GeForce GTX 1650.

While it's not as simple as just comparing TFLOPS, we're in effect looking at Apple needing a roughly 15x improvement over their current offering to match the best from 2020.

Supposedly the current M1 GPU draws about 15 watts. Scale that up to 300 watts and you get a 20x increase in power. If that could translate into a 10x performance increase, they would still be short of top Nvidia numbers, but they would be in the right ballpark for a new Mac Pro system.

If the performance increase is closer to linear than that, then Apple would be right in line with Nvidia.
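To make the scaling argument concrete, here's the arithmetic as a quick sketch (the wattage and TFLOPS figures are the rough numbers quoted in this thread, not measurements):

```python
# Back-of-the-envelope GPU scaling, using the thread's figures
# (assumed: M1 GPU ~2.1 TFLOPS at ~15 W; top 2020 Nvidia card ~30 TFLOPS at ~300 W).
m1_tflops, m1_watts = 2.1, 15.0
nv_tflops, nv_watts = 30.0, 300.0

perf_gap = nv_tflops / m1_tflops       # how much faster the Nvidia card is (~14.3x)
power_headroom = nv_watts / m1_watts   # how much more power budget exists (~20x)

print(f"Performance gap: {perf_gap:.1f}x")
print(f"Power headroom:  {power_headroom:.1f}x")
print(f"M1 perf/W:     {m1_tflops / m1_watts:.3f} TFLOPS/W")
print(f"Nvidia perf/W: {nv_tflops / nv_watts:.3f} TFLOPS/W")
```

By these figures, the M1's TFLOPS-per-watt is already higher than the desktop card's, so perfectly linear scaling would close the gap; in practice GPU performance scales sub-linearly with power, which is why the 10x estimate is the cautious one.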
 
Not far off, then. Hopefully Apple will improve energy efficiency in iOS devices, as that's the biggest issue with mobiles. Apple making their own 5G modem could be a huge step.
 
So far it's looking good ... but

we do recall the leaks and introduction hype of Panther and the Power Mac G5, right?
Both were supremely awesome ...

Well, the G5 was pretty good.

But IBM must have quickly realized what Motorola/Freescale already knew: it wasn't worth putting much effort into low-volume high-performance PowerPC CPUs. The early-'90s dream that they might supply CPUs for PCs that way was basically gone: Windows 2000 no longer ran on PowerPC (unlike Windows NT 4.0), BeOS and AmigaOS were going away, etc. That left… Apple, who hadn't quite yet recovered from its slump.

So Motorola/Freescale/NXP focused on the embedded market (which, it turns out, is better served by ARM, so they've since mostly moved over to that), and IBM focused on servers. Making desktop CPUs as a derivative from a server CPU is one thing; scaling it all the way down to a laptop is a whole other challenge. And IBM apparently wasn't willing or able to do so.

I'd love to know what the backroom meetings were like, though. Did IBM eventually say "oopsie daisies, we're not gonna fulfill our promise"? Presumably, they had to pay Apple some kind of contract violation fee?

Anyway, IBM's heart was apparently never quite in it.

Yet within less than a year, the G5 couldn't compete with Intel's power management.

Well, to be fair, Intel was only starting to figure that out by then. They were actually still notorious for the Pentium 4, of which there were mobile variants as well, and… let's just say it wasn't efficient. At all.

It was only with the odd Pentium M series (Pentium III-derived) that things turned around. They must have shared a roadmap with Apple around 2004, saying "we can take the Pentium M, add additional CPU cores to it, and also scale it up to the desktop".

(Note that this was also just around the time that laptop sales started overtaking desktop sales. The Power Mac G5 was far less of a mainstream product than the early Power Mac G4s were, regardless of pricing.)

So how is today different?

Well, what's similar is that Intel has problems with performance per watt. That'll probably go away again to some extent. But also, unlike the situation in the late 1990s, where Apple sort of had its own CPU architecture, together with IBM and Motorola (now Freescale/NXP), but were sort of a weird niche player who couldn't really dare make too many demands, they can totally make demands now. The volume is there. They can apply the Tim Cook school of supply contracts, wherein they secure their destiny by buying up amounts of manufacturing capacity that others aren't willing to compete with.

Will there be future competing CPUs that are closer to Apple's Mx line? I think so. But regardless, Apple has shown for a good decade now that they're interested in driving their own CPU destiny forward, naysayers be damned, and I see no reason for them to change.
 
Chips are becoming monsters these days. Imagine battery technology starting to take bigger leaps and taking more advantage of the chip efficiencies. Week-long battery life, anyone?
Reminds me of Nokia ... a week without a charge.
 
The important thing is ... it's catching up! And that's awesome!
Because Apple is screwing up a lot lately, and my latest Android phones are rock solid.
I also don't like the direction of the Mac with ARM. So GO INTEL.

I wouldn't necessarily say it's "catching up". The A14 is Apple's current processor, and the Snapdragon 888 is Qualcomm's future chip. When the next iteration of Apple's mobile processor comes out, it will widen the gap once again.
 
What, because you can't? Clearly the specs don't matter with the iPhone, because aside from a faster CPU, which is largely unused, the rest of the specs are typically far inferior to the competition.

Does it outperform in any meaningful way? Nope.

I would have to disagree with this post. Apple has proven more efficient use of memory and battery, and provides smooth performance across all the apps on their phones, something even some of the best Android phones struggle with.

Unlike in the Android space, the software is developed in conjunction with the hardware, hence Apple's real-world performance success.
 
PowerPC was a problem because Apple's interests never aligned with IBM's. (Or Motorola's, but Motorola was just terrible at designing CPUs by that time.)

This time there is no such problem.

Until and unless Intel switches to RISC, Apple silicon will destroy Intel silicon. And even after Intel switches to RISC, that RISC will target thousands of customers with different needs. Apple only has to target Apple. Apple will have an inherent advantage for a long time.

IBM worked with Apple on bringing their POWER chips (POWER5) to PowerPC ... they built a fab in Catskill, NY specifically for this, if you recall. Their future roadmap, agreed, never aligned with Apple's. Motorola definitely was crap.

Intel had a RISC CPU before, the i960 (early '90s). From what I've read, CISC instructions still get translated into RISC-like operations for more efficient execution in Intel's Core chips, but bare-metal coders cannot bypass CISC to use them directly. A horrible layman's description, but I am a layman in these waters after all.

I worry about the future ... when RISC-V is heavily used by the competition. With more computers and more smartphones there, the others may gain a long-term advantage that locks out Apple, who will only focus on themselves. It's already happened with PowerPC and with Intel ... so what's to say it cannot re-occur?
 
IBM worked with Apple on bringing their POWER chips (POWER5) to PowerPC ... they built a fab in Catskill, NY specifically for this, if you recall.

East Fishkill, yep.

It wasn't built for this purpose (IBM owned the property since the 1960s), but it was modernized for various purposes at the time — including the G5 for Apple, but also later the same CPU for the Xbox 360, the related CPU "Cell" for the PlayStation 3, and related POWER4 CPUs. (The PowerPC 970 a.k.a. G5 was sort of a downsized POWER4.)

Their future roadmap, agreed, never aligned with Apple's. Motorola definitely was crap.

Exactly. There was a brief era where it seemed to be aligned because not only Apple but all three major game consoles used PowerPC (Nintendo at the time used various offshoots of the PowerPC 750, a.k.a. Apple's G3). But the heart wasn't in it. And/or the money wasn't there.

Intel had a RISC CPU before, the i960 (early '90s).

Intel had all kinds of attempts to get rid of x86, including i960 and later on Itanium. At the end of the day, AMD's pragmatic approach of modernizing x86 while also bringing it to 64-bit won, and they licensed that.

From what I've read, CISC instructions still get translated into RISC-like operations for more efficient execution in Intel's Core chips, but bare-metal coders cannot bypass CISC to use them directly. A horrible layman's description, but I am a layman in these waters after all.
Yes, my understanding is that, internally, modern x86 is quasi-RISC.

I worry about the future ... when RISC-V is heavily used by the competition. With more computers and more smartphones there, the others may gain a long-term advantage that locks out Apple, who will only focus on themselves. It's already happened with PowerPC and with Intel ... so what's to say it cannot re-occur?
There's no reason Apple can't move to RISC-V 5-10 years from now. I don't think they will, though; if they ever change the ISA again, they'll probably simply do their own thing altogether.
 
I worry about the future ... when RISC-V is heavily used by the competition. With more computers and more smartphones there, the others may gain a long-term advantage that locks out Apple, who will only focus on themselves. It's already happened with PowerPC and with Intel ... so what's to say it cannot re-occur?
RISC-V is currently in its infancy - a lot of promise to be sure, but no ecosystem, limited tools, and no reference designs so every user has to design their own cores. If it does fulfill its promise, what competitor is better positioned to take advantage than Apple? Apple is now an experienced chip designer and Apple’s control of the vertical stack means they can turn on a dime, initiating another architecture transition as they have done several times. Much harder for others to do the same.
 
Supposedly the current M1 GPU draws about 15 watts. Scale that up to 300 watts and you get a 20x increase in power. If that could translate into a 10x performance increase, they would still be short of top Nvidia numbers, but they would be in the right ballpark for a new Mac Pro system.

If the performance increase is closer to linear than that, then Apple would be right in line with Nvidia.

Doesn’t the whole M1 draw 10W? Where does the 15W number for just the GPU come from?
 
IBM worked with Apple on bringing their POWER chips (POWER5) to PowerPC ... they built a fab in Catskill, NY specifically for this, if you recall. Their future roadmap, agreed, never aligned with Apple's. Motorola definitely was crap.

Intel had a RISC CPU before, the i960 (early '90s). From what I've read, CISC instructions still get translated into RISC-like operations for more efficient execution in Intel's Core chips, but bare-metal coders cannot bypass CISC to use them directly. A horrible layman's description, but I am a layman in these waters after all.

I worry about the future ... when RISC-V is heavily used by the competition. With more computers and more smartphones there, the others may gain a long-term advantage that locks out Apple, who will only focus on themselves. It's already happened with PowerPC and with Intel ... so what's to say it cannot re-occur?

Why worry about RISC-V? Apple could switch to RISC-V tomorrow and few people would notice. They are already diverging from the Arm specification (with new matrix-multiply extensions, etc.). They will use whatever works for them.
 
There's no reason Apple can't move to RISC-V 5-10 years from now. I don't think they will, though; if they ever change the ISA again, they'll probably simply do their own thing altogether.

I dunno, it’s open source, so it’s almost like doing your own thing already. 🙂 Also, given its state of development, even 10 years seems generous. I’m not a chip designer either, but those who are suggest that not only is *a lot* of stuff missing, but there's also more than just the ISA that would need to be standardized to ensure consistency and compatibility between CPU designs. According to Jon Masters on Twitter (who is a self-admitted ARM fanboy), certain aspects of chip design need to become “boring” to guarantee stuff “just works”. And it takes a while for that to happen.
 
Doesn’t the whole M1 draw 10W? Where does the 15W number for just the GPU come from?
The highest I’ve seen someone record for GPU power is 10W under Aztec Ruins, and the whole CPU+GPU package can draw 30+W if pushed.
 
I wonder what the point of these comparisons really is, to be fair. I own and use an iPhone because I like iOS, not because the iPhone had the faster CPU back when I got my iPhone. Likewise, I’m not seeing many Android users jumping ship just because Apple has the faster CPU. I mean, it’s great that we have fast CPUs, as it allows the phone to have a longer useful life, and it’s great Apple and Qualcomm are competing, but these comparisons are kinda pointless because the vast majority of iPhone users wouldn’t consider switching to Android, and vice versa.
 


Qualcomm today shared benchmark results for the Snapdragon 888 SoC that will be used in flagship Android phones coming out in 2021, and it's not able to keep pace with the A14 chip in the iPhone 12 models, nor the A13 in the iPhone 11.

AnandTech compared Qualcomm's benchmarks to benchmarks of Apple's devices, with the iPhone winning out in Geekbench 5 and GFXBench tests.

qualcomm-snapdragon-benchmark-1.png

The Snapdragon 888 chip earned a single-core score of 1,135 and a multi-core score of 3,794, while the iPhone 12 Pro with A14 chip earned a single-core score of 1,603 and a multi-core score of 4,187.

In the GFXBench test, which measures GPU performance, the Snapdragon 888 scored an 86 (in frames per second), compared to the iPhone 12 Pro's 102.24. Sustained performance is unknown as of yet and will depend on the chip's power consumption, but AnandTech believes the Snapdragon 888 could ultimately win out over the iPhone if its power consumption is competitive.

qualcomm-snapdragon-benchmark-2.png
The Snapdragon 888 chip isn't quite performing at the level of the A13 or A14 chips from Apple, but it is a significant improvement over prior-generation Snapdragon chips used in current flagship Android smartphones. CPU performance is up 25 percent and GPU performance is up 35 percent.
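For context, the relative gaps implied by the quoted scores work out as follows (simple arithmetic on the figures above; these are peak numbers, so sustained results may differ):

```python
# Relative performance gaps implied by the benchmark scores quoted above
# (Geekbench 5 and GFXBench peak figures; A14 score listed first in each pair).
scores = {
    "Geekbench 5 single-core": (1603, 1135),
    "Geekbench 5 multi-core":  (4187, 3794),
    "GFXBench peak (fps)":     (102.24, 86.0),
}

# Percentage by which the A14 leads the Snapdragon 888 in each test.
leads = {name: (a14 / sd888 - 1) * 100 for name, (a14, sd888) in scores.items()}
for name, pct in leads.items():
    print(f"{name}: A14 ahead by {pct:.0f}%")
```

The single-core gap (roughly 41%) is the striking one; the multi-core and GPU gaps (roughly 10% and 19%) are much closer.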

AnandTech says that as these benchmarks were provided by Qualcomm and not independently obtained, we have to trust that Qualcomm's numbers are accurate, but the site expects the figures to be "accurate and reproduced in commercial devices."

Article Link: Apple's A14 Outperforms New Snapdragon 888 Chip Coming in Future Android Phones
If the A14 is much better, then why does Samsung have more features?
 
Samsung doesn't have more features. They have some different features. Heck, they just announced they're copying another Apple feature (Spatial Audio).
they call it 3D audio, but they have more little things, like the widgets they had way before, fast charging, multi-tasking, that super zoom lol
 
they call it 3D audio, but they have more little things, like the widgets they had way before, fast charging, multi-tasking, that super zoom lol

Apple has widgets now too, and always did with A14. They also have fast charging, but presumably not as fast as Samsung's? In any case, fast charging has nothing to do with the power of a CPU. Apple has had multi-tasking forever. If you are referring to fewer limitations on multi-tasking, that may be the case, but Samsung phones pay the price for that (in terms of performance and battery life). Super-zoom is a function of the camera, not so much the CPU.

Apple has unique features, too. Correct me if I’m wrong, but Samsung has nothing like true tone? Portrait lighting? End-to-end-encrypted messaging built in? Devices that are supported by OS upgrades for more than 2 years? Continuity camera?

My information may be out of date. But I’m sure there are still many little features on iOS that aren’t yet on samsung (and vice versa).
 
Well, the G5 was pretty good.

But IBM must have quickly realized what Motorola/Freescale already knew: it wasn't worth putting much effort into low-volume high-performance PowerPC CPUs. The early-'90s dream that they might supply CPUs for PCs that way was basically gone: Windows 2000 no longer ran on PowerPC (unlike Windows NT 4.0), BeOS and AmigaOS were going away, etc. That left… Apple, who hadn't quite yet recovered from its slump.

So Motorola/Freescale/NXP focused on the embedded market (which, it turns out, is better served by ARM, so they've since mostly moved over to that), and IBM focused on servers. Making desktop CPUs as a derivative from a server CPU is one thing; scaling it all the way down to a laptop is a whole other challenge. And IBM apparently wasn't willing or able to do so.

I'd love to know what the backroom meetings were like, though. Did IBM eventually say "oopsie daisies, we're not gonna fulfill our promise"? Presumably, they had to pay Apple some kind of contract violation fee?

Anyway, IBM's heart was apparently never quite in it.



Well, to be fair, Intel was only starting to figure that out by then. They were actually still notorious for the Pentium 4, of which there were mobile variants as well, and… let's just say it wasn't efficient. At all.

It was only with the odd Pentium M series (Pentium III-derived) that things turned around. They must have shared a roadmap with Apple around 2004, saying "we can take the Pentium M, add additional CPU cores to it, and also scale it up to the desktop".

(Note that this was also just around the time that laptop sales started overtaking desktop sales. The Power Mac G5 was far less of a mainstream product than the early Power Mac G4s were, regardless of pricing.)

So how is today different?

Well, what's similar is that Intel has problems with performance per watt. That'll probably go away again to some extent. But also, unlike the situation in the late 1990s, where Apple sort of had its own CPU architecture, together with IBM and Motorola (now Freescale/NXP), but were sort of a weird niche player who couldn't really dare make too many demands, they can totally make demands now. The volume is there. They can apply the Tim Cook school of supply contracts, wherein they secure their destiny by buying up amounts of manufacturing capacity that others aren't willing to compete with.

Will there be future competing CPUs that are closer to Apple's Mx line? I think so. But regardless, Apple has shown for a good decade now that they're interested in driving their own CPU destiny forward, naysayers be damned, and I see no reason for them to change.

Agreed.

I recall the Pentium 4's issues going mobile into laptops. I used Nokia's S60 2nd Edition back in that same timeframe, and knew ARM (TI's OMAP lineup) was in mobile.

An article really needs to be done to show Apple's shrewd work here.

Working on the iPad ... yet not ready for prime time.
Designs the iPhone due to the volume, and seeing so many internal campus members and media having issues with their smartphones.
Uses the OS X (Mach/BSD) kernel and some other higher-level tech as the core OS, which today is STILL the most advanced and stable mobile OS ... dare I say computing OS.
After a few years and seeing the scale of iPhone sales and demand for chips ... I'd say the iPhone 6 was the key, where Srouji really pushed his knowledge and team to boost performance and the battery efficiencies learned from the iPhone 5 with its small battery.
Supply chain prowess of Cook now heavily implemented.
iPad launches.
Continued iOS advances, and soon iPadOS.
Apple Watch furthers their knowledge of battery efficiency and performance.
And finally the M1 launches.

I think the next wearable device they launch will exponentially increase battery and CPU efficiencies and performance. As long as Apple keeps the best staff, I think they'll be just fine.

Apple is known for slow progress - not only to not shock their end users into disarray or leaving the platforms they have, but to ensure each step forward is a firm step ... not a half-step or leading to a step backwards.
 
Agreed.

I recall the Pentium 4's issues going mobile into laptops. I used Nokia's S60 2nd Edition back in that same timeframe, and knew ARM (TI's OMAP lineup) was in mobile.

Yeah. Odd that TI decided to drop that entire line.

An article really needs to be done to show Apple's shrewd work here.

AnandTech likes to do deep dives, but they probably get far less info from Apple than they do from Intel.

Working on the iPad ... yet not ready for prime time.
Designs the iPhone due to the volume, and seeing so many internal campus members and media having issues with their smartphones.
Uses the OS X (Mach/BSD) kernel and some other higher-level tech as the core OS, which today is STILL the most advanced and stable mobile OS ... dare I say computing OS.
After a few years and seeing the scale of iPhone sales and demand for chips ... I'd say the iPhone 6 was the key, where Srouji really pushed his knowledge and team to boost performance and the battery efficiencies learned from the iPhone 5 with its small battery.
Supply chain prowess of Cook now heavily implemented.

I will say that Apple plays an unusually long game. They're willing to make long-term investments even when analysts are puzzled or critical.

I don't know if, even in 2008 when the path to Apple Silicon started, they knew that moving the Mac to it could end up being an option. Maybe not. But they did know the benefits of owning the entire stack.

There's really not much of a secret: have a decent plan (maximizing vertical integration), have a ton of money, and have a lot of patience.

iPad launches.
Continued iOS advances, and soon iPadOS.
Apple Watch furthers their knowledge of battery efficiency and performance.
And finally the M1 launches.

Yep. The crosspollination of Apple platforms is rather interesting.
 
they call it 3D audio, but they have more little things, like the widgets they had way before, fast charging, multi-tasking, that super zoom lol
Huh? The only thing Apple is lacking is split-screen multitasking, and zoom-wise that's down to having no periscope lens, which the iPhone will have in 2022. The fact that Samsung owns the rights to this tech is why Apple has struggled to add it until then.

Apple added Picture in Picture and widgets in iOS 14, so they are giving people less reason to go to Android if they are thinking about it. You can customise your Home Screen far more now than ever before, if one needed to do so.

The iPhone has fast charging and widgets, so I don't see your point here.

Do Samsung phones get more features first? Well, yes, as they get things to market very quickly no matter how polished they are. When it's great, then great, but making sure it's perfect isn't their main focus.
 