LOL, the clip you posted is a joke, he did a terrible job.
haven't even watched that clip since i've long stopped watching him, but i don't think i've ever seen a single clip from him that's not utter fanboy garbage
 
So you stand by "AMD's long term potential in chip volume is higher than Apple's"? What product does AMD make that's higher-volume than smartphones?
Of course I stand by it. I see no reason AMD specifically needs a single product that's as high-volume as smartphones, although long term anything is possible. The idea behind AMD's potential is that they make multiple chip designs, even semi-custom designs, for multiple markets. AMD's biggest problem in the last 3 years was that they couldn't get enough chips made; demand was always higher than they could supply.
 
haven't even watched that clip since i've long stopped watching him, but i don't think i've ever seen a single clip from him that's not utter fanboy garbage
He said the Radeon 780M should be 2.6 TFLOPS when the Radeon 680M is 3.379 TFLOPS.
As always, the guy makes a lot of ridiculous assumptions.

The 7940HS is a 35W CPU at stock power; I really doubt AMD increased the power to 54W in an undetailed launch presentation "just to win" benchmarks, like a lot of Apple fans are hoping. The 45-54W TDPs are mostly for gaming laptops.
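For anyone who wants to sanity-check the TFLOPS talk, here's the standard FP32 back-of-envelope math in Python (the 768-ALU count and ~2.2 GHz boost clock are the commonly reported 680M figures, not official AMD statements):

```python
# FP32 throughput: ALUs * 2 ops/clock (FMA) * clock in GHz / 1000 = TFLOPS
def tflops(alus: int, clock_ghz: float) -> float:
    return alus * 2 * clock_ghz / 1000

# Radeon 680M: 12 CUs * 64 shaders = 768 ALUs at ~2.2 GHz boost
print(f"680M: {tflops(768, 2.2):.3f} TFLOPS")  # 3.379

# A 780M at 2.6 TFLOPS with the same 768 ALUs would imply a *lower* clock:
print(f"implied 780M clock: {2.6 * 1000 / (768 * 2):.2f} GHz")  # ~1.69 GHz
```

So a 2.6 TFLOPS estimate for the 780M would mean the clock went down on a newer node, which is why that claim looks off.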
 
Of course I stand by it. I see no reason AMD specifically needs a single product that's as high-volume as smartphones, although long term anything is possible. The idea behind AMD's potential is that they make multiple chip designs, even semi-custom designs, for multiple markets. AMD's biggest problem in the last 3 years was that they couldn't get enough chips made; demand was always higher than they could supply.

OK. Name an AMD product whose volume is similar to that of the A16.
 
To show anything that comes close to Apple's volume, given your claim that "AMD's long term potential in chip volume is higher than Apple's".
I was sure this is what you were getting at. Quite a pointless direction that doesn't challenge what I wrote.
 
I was sure this is what you were getting at. Quite a pointless direction that doesn't challenge what I wrote.

Please feel free to elaborate.

(I bet it'll go something like "oh, phones don't count because they're not real computers".)
 
Please feel free to elaborate.

(I bet it'll go something like "oh, phones don't count because they're not real computers".)
I don't need to elaborate on anything when my comment was clear. I'm not going to spin in circles with you just because you didn't like what I wrote and can't prove it wrong.
 
LOL, the clip you posted is a joke, he did a terrible job. The guy thinks that when AMD showed gaming performance in the launch presentation they were talking about the iGPU. That's nonsense: the top 6000-series APU has at least 60% better iGPU gaming performance than Intel's iGPU, and the new 7000-series iGPU should be at least 40% faster than the 6000 series (some say it should reach double the iGPU performance). The Radeon 680M is 3.379 TFLOPS and that guy thinks the Radeon 780M will be 2.6 TFLOPS. That's laughably poor analysis.
So they should have shown that, if it is the case. But they didn't. I'm sure they could have found one AAA game that would work on a Mac (native Apple Silicon); here is a list:
Also, the focus of the 7940HS presentation was clearly CPU performance; the 7940HS doesn't need a huge iGPU as it will also be paired with dedicated mobile GPUs.
At what wattage? A binned M1 Pro runs at about 21 watts under load, 30-33 for the 10-core. Where are we with the AMD chip? 45? That would be roughly 114% more power draw than an 8-core M1 Pro, and about 50% more than the full M1 Pro. Either way, for an up-to-35% performance advantage in one task? I'm 35% faster than you at one task, but I need to draw 50-114% more power to do it. I'll cost half as much, though. They didn't show a chip running at their rated minimum 15 watts performing up to 35% better.
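To spell out the percentage math (the 45 W figure is my reading of the top of AMD's stated TDP range, not a measured number), a quick sketch:

```python
# Extra power draw relative to the M1 Pro wattages cited above
def pct_more(a_watts: float, b_watts: float) -> float:
    return (a_watts - b_watts) / b_watts * 100

print(f"45 W vs 21 W (binned M1 Pro): +{pct_more(45, 21):.0f}%")  # ~+114%
print(f"45 W vs 30 W (full M1 Pro):   +{pct_more(45, 30):.0f}%")  # +50%
```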
I'm fully confident I didn't give AMD any credit too soon.
I'm very sure of this.
AMD showed the 7940HS the way they did because they are very confident about the performance and efficiency of their little chip.
Against Intel. Not so much against Apple Silicon.
There's no reason to show gaming performance vs the M1 Pro because: 1) There are hardly any AAA games available for ARM Macs
Doesn't exactly mean none, and I gave a list of games they could have shown. Just take Resident Evil Village and call it even: a heavily advertised game that runs native on Apple Silicon. I'm sure they can beat it, and it would have shown how much better they are than both Intel and Apple Silicon. They didn't.
2) A 7940HS laptop with a dedicated GPU will smoke any Apple laptop in gaming anyway
AT MORE POWER!!!
3) A 7940HS without a dedicated GPU will most likely be cheap enough to be considered M2 MacBook Pro competition, and it will smoke it in gaming anyway.
Then show it.
I mean, for Christ's sake, Asus just announced new Ryzen 7940HS ROG Zephyrus G14s, so 14-inch laptops, with dedicated mobile RTX 4090 and 4080 GPUs. That will smoke any MacBook in gaming without breaking a sweat.
4090 (Nvidia). I thought this was about AMD?
 
so what?
faster (even when only in some areas) is faster
of course low power consumption is great too, but it all depends on the individual requirements.
by that logic an "Apple Watch" should be better too, should it have a better performance-per-watt ratio than an M1 Ultra

i love my M1s, but the current i3-i9 are awesome too, so are the Ryzens!

win-win for us customers 🤘
 
haven't even watched that clip since i've long stopped watching him, but i don't think i've ever seen a single clip from him that's not utter fanboy garbage
give it a watch. I'm sure he's not perfect, but he has shown Intel vs M1 and AMD vs M1. Win, lose, or draw, he shows it.
 
i won't.
he's completely unwatchable for me.
he might have some valid points sometimes, but he's twisting facts like crazy and cherry-picking stuff to the point that it's not even funny anymore
 
So they should have shown that, if it is the case. But they didn't. I'm sure they could have found one AAA game that would work on a Mac (native Apple Silicon); here is a list:
There's no reason for them to show what you want. One game result is meaningless. Also, the 7940HS will be paired with dedicated GPUs. Did you even read what I wrote?

At what wattage? A binned M1 Pro runs at about 21 watts under load, 30-33 for the 10-core. Where are we with the AMD chip? 45? That would be roughly 114% more power draw than an 8-core M1 Pro, and about 50% more than the full M1 Pro. Either way, for an up-to-35% performance advantage in one task? I'm 35% faster than you at one task, but I need to draw 50-114% more power to do it. I'll cost half as much, though. They didn't show a chip running at their rated minimum 15 watts performing up to 35% better.
Assumptions. AMD didn't state anywhere that the 7940HS CPU was running at 45W.
The 5800H at 35W will boost to 4.1 GHz all-core; the 7940HS will be able to boost much higher at 35W and has better IPC, so it can beat the 8-core M1 Pro no problem at this TDP. The 45W TDP setting is for even higher clocks if there's thermal headroom, as the 4nm process allows for crazy high clocks. There's no indication AMD used the 45W setting in their launch presentation, as they always show their chips running at stock.

Against Intel. Not so much against Apple Silicon.
Against both.

Doesn't exactly mean none, and I gave a list of games they could have shown. Just take Resident Evil Village and call it even: a heavily advertised game that runs native on Apple Silicon. I'm sure they can beat it, and it would have shown how much better they are than both Intel and Apple Silicon. They didn't.

You gave nothing; one game doesn't make MacBooks viable gaming laptops.
Also, there's no reason for you to insist on gaming; it's not an area where Apple is competitive with Windows laptops anyway.

AT MORE POWER!!!
Much higher performance is much higher performance. If somebody wants to properly play AAA games on their computer, they need: 1) games they can play, 2) performance to enjoy the games they play.
Also, I'm sure that with the power mode on a 4080/4090 laptop set to best battery life, gaming performance will still be way better than on any MacBook, like way better.
A mobile RTX 4050 (so mid-range) can supposedly beat a desktop RTX 3070 with just one-third the power use.
You are in over your head in this discussion.

Then show it.
LoL 😂

4090 (Nvidia). I thought this was about AMD?
This is the thing: you assume a lot of things, but you are unable to follow a conversation. Read my comment again and maybe you will understand this time why I mentioned the mobile 4090.
Do you need a link for the announced upcoming Zephyrus G14? Do you want an explanation for what a dedicated GPU is?
 
so what?
faster is faster
of course low power consumption is great too, but it all depends on the individual needs.
by that logic an Apple Watch should be better too, when it has a better performance-per-watt ratio than an M1 Ultra

i love my M1s, but the current i3-i9 are awesome too, so are the Ryzens
win-win for us customers 🤘
"Faster is faster" is Intel's logic; "at what cost?" is Apple's. Cost in terms of performance per watt. It matters. A Xeon is faster; an AMD Epyc is faster; a 4090 Ti is faster. And at high wattage (power draw), it determines what kind of laptop or desktop you end up with.

The knock I have against AMD or Intel (more so Intel) is that it's comparing a chip that needs 50% more to more than double the power to compete against an M1 (1, Pro, Max, whatever). If you don't care about the power draw, that's fine. But, in real terms, they are not that comparable. AMD is getting better at it, and I praise them for it. But the same apprehension shown towards Apple's M1 is not being shown against AMD (by some), when they did not, in my very humble opinion, show enough detail to prove otherwise.

Apple will lose most Cinebench tests because they don't have as high a clock rate or as many cores. But if you normalize it for performance per watt, it's a different story (aka pull the power cord out of the laptop). DaVinci Resolve has shown great improvements, even over Final Cut, in performance on M1. Where was that test on display? Instead we get a Blender render, and we KNOW Blender is not fully optimized yet for Apple Silicon. Not one game could they find to bench against? ONE!? Just for laughs, they could have shown Shadow of the Tomb Raider.
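To make the normalization concrete, here's a toy example; the scores are made-up placeholders just to show the math, and only the "up to 35%" figure and the wattages come from this thread:

```python
# Toy performance-per-watt comparison; benchmark scores are hypothetical
def perf_per_watt(score: float, watts: float) -> float:
    return score / watts

m1_pro_score, m1_pro_watts = 12000, 30        # hypothetical score at ~30 W
ryzen_score, ryzen_watts = 12000 * 1.35, 45   # "up to 35% faster", assumed 45 W

print(f"M1 Pro: {perf_per_watt(m1_pro_score, m1_pro_watts):.0f} pts/W")  # 400
print(f"7940HS: {perf_per_watt(ryzen_score, ryzen_watts):.0f} pts/W")    # 360
# Faster in absolute terms, but behind once the power draw enters the math.
```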

If AMD had shown their chip running at or near the same wattage as any M1 or M2 chip and beating it by 35%, I'd hand them the win, clear cut. They didn't.
 
There's no reason for them to show what you want. One game result is meaningless. Also, the 7940HS will be paired with dedicated GPUs. Did you even read what I wrote?
Right, so show only what makes you look good. Got it. Dedicated GPUs that cost more and draw more watts than the CPU in the laptop they come with.
Assumptions. AMD didn't state anywhere that the 7940HS CPU was running at 45W.
They do; 15-45 watts is right there on the slide they showed. They put it there. So the question is really basic: what wattage are they running at in these tests? It matters. If it's 15 watts and they beat every chip they compared against in their limited tests, that's still AMAZING. But if you see the 45-watt marker, you know it's running that way, as it's a benchmark. Plain and simple.
The 5800H at 35W will boost to 4.1 GHz all-core; the 7940HS will be able to boost much higher at 35W and has better IPC, so it can beat the 8-core M1 Pro no problem at this TDP.
Slide shows 45.
The 45W TDP setting is for even higher clocks if there's thermal headroom, as the 4nm process allows for crazy high clocks. There's no indication AMD used the 45W setting in their launch presentation, as they always show their chips running at stock.
If you show up to 35% better performance, you're showing your best performance. You're not getting that at 15 or 35 watts if your chip goes to 45.
Against both.



You gave nothing; one game doesn't make MacBooks viable gaming laptops.
Show a chart comparing a "game" against Intel at 100% and Apple at N/A, then AMD at 100+%. They could have found a game to compare it to if they were going to put Apple in the darn chart. N/A means not applicable, not available. They could have found one game.
Also, there's no reason for you to insist on gaming; it's not an area where Apple is competitive with Windows laptops anyway.
Then don't put it in the benchmark chart as N/A. Leave it out completely, or find a game you can shove into the chart to prove the point. Heck, make it look worse than Intel for laughs. Their presentation ran on a MacBook, so it's not like they didn't have one around to test with.
Much higher performance is much higher performance.
So let's praise AMD for lowering their power draw while simultaneously declaring them the best thing since sliced bread for beating an M1 Pro from a year and a half ago in one or two tests.
I'm sure Apple could make a version of the M1/M2 that uses 300 watts and beats all chips by 1%. What say you then?
If somebody wants to properly play AAA games on their computer, they need: 1) games they can play, 2) performance to enjoy the games they play.
You do know there are games that are cross-platform, right?
Also, I'm sure that with the power mode on a 4080/4090 laptop set to best battery life, gaming performance will still be way better than on any MacBook, like way better.
SHOW IT! Love to see it in low power mode off the cord. I'd absolutely love to see this.
A mobile RTX 4050 (so mid-range) can supposedly beat a desktop RTX 3070 with just one-third the power use.
You are in over your head in this discussion.


LoL 😂


This is the thing: you assume a lot of things, but you are unable to follow a conversation.
Completely wrong.
Read my comment again and maybe you will understand this time why I mentioned the mobile 4090.
Do you need a link for the announced upcoming Zephyrus G14? Do you want an explanation for what a dedicated GPU is?
Enjoy your AMD/Nvidia hybrid low-cost, high-powered gaming thick book. I'm not stopping you.
I've clearly stated I praise them for their effort, but I demand more details. I'll not believe a 15-watt part can come CLOSE to what Apple already did a year and a half ago based on two benchmarks. Please go ahead and believe what you will, no one's stopping you, but you're not proving anything, and neither has AMD in this instance.
 
I was sure this is what you were getting at. Quite a pointless direction that doesn't challenge what I wrote.
I think for me the disconnect on the volume observation is that AMD has been building chips since the 70s, whilst Apple has really only gotten into the game in the last 15 years, and in that time Apple, near as I can tell, ships more of their own designed chips than AMD could come close to.

Perhaps you don't include the smartphones and others, but for me, when we're considering chip volume, I count all of the devices for which they design their own chips: their laptops, desktops, phones, tablets, watches, the AppleTV, even the displays that now ship A15 chips, down to the headphones with their own custom wireless chips, which feature in many of their devices including the AirTags. Additionally, there are the Afterburner cards in their Mac Pro line, which speak to another angle of their explorations in chip design.

Many of these chips aren't on cutting-edge process nodes, which is why I'm curious about the source for your assertion that AMD takes over as Apple moves down the chain, given that AMD, near as I can tell, likely ships significantly fewer chips than Apple does, with Apple continuing to ship devices with silicon on those older processes (the watch chips are still on the 7nm node, for example, and they're around 60 million shipments in 2021 based on Google searches I've done).

AMD in the server space ships a few million units a year, and they've at best got 30% of the rest of the market, which near as I can tell is around the 340 million mark, giving AMD around 100 million CPU shipments and significantly fewer GPUs (like 10% of the market? 2021 numbers put this at 50 million units total, so another 5 million for AMD; worse in 2022).

This for me is the disconnect and what I want to understand. It isn't a challenge; it's just trying to understand how you came to your conclusions. But I guess it's just common sense.
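For what it's worth, here's the arithmetic behind my estimates in one place (market sizes and shares are the rough figures above; the iPhone number is a ballpark from public shipment estimates, not an official count):

```python
# Back-of-envelope annual unit volumes from the figures cited above
amd_cpus = 340e6 * 0.30   # ~30% of a ~340M-unit client CPU market -> ~102M
amd_gpus = 50e6 * 0.10    # ~10% of ~50M discrete GPUs (2021) -> ~5M

apple_watch = 60e6        # Watch shipments in 2021, as cited above
apple_iphone = 230e6      # rough annual iPhone shipments (ballpark assumption)

print(f"AMD:   ~{(amd_cpus + amd_gpus) / 1e6:.0f}M chips/year")  # ~107M
print(f"Apple: ~{(apple_watch + apple_iphone) / 1e6:.0f}M chips/year "
      f"from phones and watches alone")  # ~290M
```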
 
Much higher performance is much higher performance. If somebody wants to properly play AAA games on their computer, they need: 1) games they can play, 2) performance to enjoy the games they play.
Also, I'm sure that with the power mode on a 4080/4090 laptop set to best battery life, gaming performance will still be way better than on any MacBook, like way better.
A mobile RTX 4050 (so mid-range) can supposedly beat a desktop RTX 3070 with just one-third the power use.
You are in over your head in this discussion.


LoL 😂
"But the most concrete chart compared an unnamed Lovelace GPU to the RTX 3050 and RTX 3070, showing that the Lovelace GPU (almost certainly the RTX 4050) could perform at the same level as the RTX 3070 while consuming just 40 W instead of 120 W. RTX 3050 GPUs have been popular in thin-and-light designs like the Dell XPS 15 and Surface Laptop Studio, and bringing RTX 3070-level performance to those laptops without requiring RTX 3070-level heat, fan noise, or cooling capacity is a big deal."

40W is more than the power needed to run an M1 Pro. And if I wanted to, I could find a benchmark that makes that 3070 look like total crap, running either Intel or AMD. I can cherry-pick just like AMD can.
 
Show a chart comparing a "game" against Intel at 100% and Apple at N/A, then AMD at 100+%. They could have found a game to compare it to if they were going to put Apple in the darn chart. N/A means not applicable, not available. They could have found one game.

Go to the article at the beginning and look at the charts again. AMD didn't compare gaming performance vs Intel in one game; their chart clearly stated MULTI-GAME AVERAGE.
AMD doesn't need to do anything Apple fans want them to. MacBooks aren't viable high-end gaming computers, as there are hardly any AAA games available for them. Also, a $2000 Windows laptop with a dedicated GPU will piss all over any MacBook in gaming performance all day; it's not in Apple's best interest to have their hardware in gaming comparisons. AMD basically did them a solid.

SHOW IT! Love to see it in low power mode off the cord. I'd absolutely love to see this.

Do you really believe that a mobile 4090 that's close to a desktop 3090 in gaming won't be faster in low power mode in gaming than any Macbook? Really?

Enjoy your AMD/Nvidia hybrid low-cost, high-powered gaming thick book.

I will, thank you very much. The Zephyrus G14 will be a really nice normal-size laptop (definitely not a thick book) for those who have eyes to see and don't live in a bubble. These new AMD APUs and Nvidia GPUs are super impressive. A 7940HS + mobile RTX 4050 is a low-cost dream match that will trash any MacBook in various workloads (especially gaming).
 
"But the most concrete chart compared an unnamed Lovelace GPU to the RTX 3050 and RTX 3070, showing that the Lovelace GPU (almost certainly the RTX 4050) could perform at the same level as the RTX 3070 while consuming just 40 W instead of 120 W. RTX 3050 GPUs have been popular in thin-and-light designs like the Dell XPS 15 and Surface Laptop Studio, and bringing RTX 3070-level performance to those laptops without requiring RTX 3070-level heat, fan noise, or cooling capacity is a big deal."

40W is more than the power needed to run an M1 Pro. And if I wanted to, I could find a benchmark that makes that 3070 look like total crap.

The discussion was about gaming, so go find videos of the 3070 vs the M1 Pro in gaming and see how it goes, not some niche machine-learning test that's better suited to the M1 for its memory alone and doesn't actually say anything about general GPU performance. It's like running a CUDA-specific test on an Nvidia GPU vs the M1, which doesn't have CUDA. The End.

The M1 Pro uses around 60W in a constant CPU+GPU load like The Witcher 3; the test is on notebookcheck (they also tested Rise of the Tomb Raider at 1080p High: M1 Pro GPU 53 fps, laptop 3070 127 fps, so the M1 Pro is about 58% slower, i.e. the 3070 is roughly 2.4x faster). This is the base 14-core-GPU M1 Pro, so it looks possible for a 7840HS + RTX 4050 laptop with the CPU power limit set to 15W to use less power in games than a base M1 Pro and perform much, much better. Things are looking better and better for Windows laptops.
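If anyone wants frames-per-watt out of those numbers, the division is trivial; note that notebookcheck doesn't give one total gaming-draw figure for the 3070 laptop, so that wattage is my assumption:

```python
# Frames per watt from the Rise of the Tomb Raider numbers quoted above
m1_pro_fps, m1_pro_watts = 53, 60    # ~60 W whole-package load per notebookcheck
l3070_fps, l3070_watts = 127, 130    # 130 W is an assumed CPU+GPU gaming draw

print(f"M1 Pro:      {m1_pro_fps / m1_pro_watts:.2f} fps/W")  # 0.88
print(f"3070 laptop: {l3070_fps / l3070_watts:.2f} fps/W")    # 0.98
```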
 
I think for me the disconnect on the volume observation is that AMD has been building chips since the 70s, whilst Apple has really only gotten into the game in the last 15 years, and in that time Apple, near as I can tell, ships more of their own designed chips than AMD could come close to.

Perhaps you don't include the smartphones and others, but for me, when we're considering chip volume, I count all of the devices for which they design their own chips: their laptops, desktops, phones, tablets, watches, the AppleTV, even the displays that now ship A15 chips, down to the headphones with their own custom wireless chips, which feature in many of their devices including the AirTags. Additionally, there are the Afterburner cards in their Mac Pro line, which speak to another angle of their explorations in chip design.

Many of these chips aren't on cutting-edge process nodes, which is why I'm curious about the source for your assertion that AMD takes over as Apple moves down the chain, given that AMD, near as I can tell, likely ships significantly fewer chips than Apple does, with Apple continuing to ship devices with silicon on those older processes (the watch chips are still on the 7nm node, for example, and they're around 60 million shipments in 2021 based on Google searches I've done).

AMD in the server space ships a few million units a year, and they've at best got 30% of the rest of the market, which near as I can tell is around the 340 million mark, giving AMD around 100 million CPU shipments and significantly fewer GPUs (like 10% of the market? 2021 numbers put this at 50 million units total, so another 5 million for AMD; worse in 2022).

This for me is the disconnect and what I want to understand. It isn't a challenge; it's just trying to understand how you came to your conclusions. But I guess it's just common sense.
I will only address this.

Those 60 million 7nm Apple smartwatch chips are probably worth as much silicon as 5 million Xbox Series X chips, maybe less. The Series X SoC package is about 52.5mm x 52.5mm (the die itself is roughly 360mm²); compare that to the entire size of an Apple Watch.
It's not just about the number of chips but also their size. AMD has already won two supercomputer contracts, which will use millions of their huge server CPUs and GPUs. A contract like that is enough to cover years of Apple's small smartwatch chip production.
Most of the chips AMD makes are for enterprise, so it's hard to track the exact volume, but it's obvious that in 3 years AMD went from below 5% of overall wafer allocation at TSMC to over 10%, while Apple basically stayed the same even though they moved to their own computer chips. Apple's volume is at its peak while AMD's isn't.
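To illustrate the size point, here's the standard first-order dies-per-wafer estimate; the ~360 mm² Series X die area is widely reported, while the ~20 mm² Watch-class die area is my ballpark guess:

```python
import math

# First-order dies-per-300mm-wafer estimate (ignores defect yield and scribe lines)
def dies_per_wafer(die_mm2: float, wafer_d_mm: float = 300) -> int:
    wafer_area = math.pi * (wafer_d_mm / 2) ** 2
    edge_loss = math.pi * wafer_d_mm / math.sqrt(2 * die_mm2)
    return int(wafer_area / die_mm2 - edge_loss)

print(dies_per_wafer(360.4))  # Xbox Series X SoC die: ~161 per wafer
print(dies_per_wafer(20.0))   # assumed ~20 mm^2 Watch-class die: ~3385 per wafer
```

One big console or server die really does eat the wafer area of twenty-odd watch chips, which is the point being made above.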
 
The discussion was about gaming, so go find videos of the 3070 vs the M1 Pro in gaming and see how it goes, not some niche machine-learning test that's better suited to the M1 for its memory alone and doesn't actually say anything about general GPU performance. It's like running a CUDA-specific test on an Nvidia GPU vs the M1, which doesn't have CUDA. The End.

The M1 Pro uses around 60W in a constant CPU+GPU load like The Witcher 3; the test is on notebookcheck (they also tested Rise of the Tomb Raider at 1080p High: M1 Pro GPU 53 fps, laptop 3070 127 fps, so the M1 Pro is about 58% slower, i.e. the 3070 is roughly 2.4x faster). This is the base 14-core-GPU M1 Pro, so it looks possible for a 7840HS + RTX 4050 laptop with the CPU power limit set to 15W to use less power in games than a base M1 Pro and perform much, much better. Things are looking better and better for Windows laptops.
AMD picked niche benchmarks. I'm just proving you can do the same for the M1. A multi-game average is nice, but they could have picked games that include the M1/M2. It would have taken all of 10 minutes. Heck, they had MacBooks in the audience to do it with, LIVE.

I have yet to see any benchmarks or tests showing the CPU running at 15 watts for anything. And you can bet the 30+ hours of video playback is exactly when it would be that low. The chip is 35-54 watts per AMD's site.



And their graphs/charts show the 7940 chip for those benchmarks. So when it reads "up to 8 cores" and "up to 5.2 GHz," they are not talking about a 4-core, 15-watt chip in a thin-and-light laptop. They are showing you the best set of performance numbers for a 7940 vs Intel and the M1 Pro.

Apple M1 Pro peaks at around 31 watts under load. https://www.notebookcheck.net/Apple-M1-Pro-Processor-Benchmarks-and-Specs.579915.0.html
"The M1 Pro is manufactured in 5 nm at TSMC and integrates 33.7 billion transistors. The peak power consumption of the chip was advertised around 30W for CPU intensive tasks. In the Prime95 benchmark the chip uses in our tests (with a MBP16) 33.6W package power and 31W for the CPU part. In idle the SoC only reports 1W package power."

So, I'm not seeing what you're getting at here. That test is not at 15 watts for AMD. It is very likely done plugged into power (unlike MacBooks, which can perform the same with or without being plugged in). If you don't care about such things, that's fine. I'm still applauding AMD for beating Intel and getting closer to Apple. And let's not forget the reason Apple even bothers to make its own chips: because they can make a chip that performs better at what they want it to do. Even if Apple had gone with AMD instead of going it alone with the M1/M2, they still wouldn't have the chips shipping in a product until March 2023, and it still wouldn't be as good as the M1 Pro is now. I have AI/ML in my iPad Pro!

For some specific tasks, sure the AMD chip will win. If you want to play games, sure. You would still need a dedicated GPU to get the performance one would want, at more power and lower battery life. And if you're willing to make those tradeoffs, go for it.
 
Before the M2, the lineup was:

M1 → M1 Pro → M1 Max → M1 Ultra

M1 Pro was the second lowest level in the hierarchy.

That is "lower midrange."

The M1 Ultra is not available in a MacBook Pro, so it doesn't count in this instance. That would be like counting AMD's Threadripper as the flagship CPU when discussing laptops.
 
is very likely done plugged into power (unlike MacBooks, which can perform the same with or without being plugged in).
huh?
there's no problem at all letting pretty much any PC laptop use full power on battery. depending on the hardware configuration, the battery charge just won't last you very long.
but of course even a MacBook won't give you 20h of battery life while doing Blender renders at higher screen brightness, though i guess it would fare much better, even if it won't compete with a PC laptop with a good dedicated graphics card installed.
 