Haven't even watched that clip since I've long stopped watching him, but I don't think I've ever seen a single clip from him that's not utter fanboy garbage.
LOL, the clip you posted is a joke, he did a terrible job.
Of course I stand by it. I see no reason AMD specifically needs a single product that's as high volume as smartphones, although long term anything is possible. The idea behind AMD's potential is the fact that they make multiple chip designs, even semi-custom designs, for multiple markets. AMD's biggest problem in the last 3 years was that they couldn't get enough chips made; demand was always higher than they could supply.
So you stand by "AMD's long term potential in chip volume is higher than Apple's"? What product does AMD make that's higher-volume than smartphones?
He said the Radeon 780M should be 2.6 TFLOPS when the Radeon 680M is 3.379 TFLOPS.
Haven't even watched that clip since I've long stopped watching him, but I don't think I've ever seen a single clip from him that's not utter fanboy garbage.
What for?
OK. Name an AMD product whose volume is similar to that of the A16.
I was sure this is where you were trying to get at. Quite a pointless direction that doesn't challenge what I wrote.
To show anything that comes close to Apple's volume, given your claim that "AMD's long term potential in chip volume is higher than Apple's".
I don't need to elaborate anything when my comment was clear. I'm not going to spin in circles with you just because you didn't like what I wrote and can't prove it wrong.
Please feel free to elaborate.
(I bet it'll go something like "oh, phones don't count because they're not real computers".)
So they should have shown that if it is the case. But, they didn't. I'm sure they could have found 1 AAA game that would work on Mac (native Apple Silicon), here is a list:
LOL, the clip you posted is a joke, he did a terrible job. The guy thinks that when AMD showed gaming performance in the launch presentation they were talking about the iGPU. That's nonsense: the top 6000 series APU has at least 60% better iGPU gaming performance than Intel's iGPU, and the new 7000 series iGPU should be at least 40% faster than the 6000 series (some say it should reach double the iGPU performance). The Radeon 680M is 3.379 TFLOPS and that guy thinks the Radeon 780M will be 2.6 TFLOPS. That's laughably poor analysis.
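For what it's worth, the TFLOPS figures being argued about are easy to sanity-check: peak FP32 throughput for an RDNA2 iGPU is just shader count x clock x 2 (an FMA counts as two operations). A minimal sketch, assuming the commonly listed 768 shaders (12 CUs) and a ~2.2 GHz boost clock for the Radeon 680M; those specs are my assumption, not something stated in the posts above:

```python
def peak_fp32_tflops(shaders: int, clock_ghz: float) -> float:
    # 2 FLOPs per shader per clock (fused multiply-add), result in TFLOPS
    return shaders * clock_ghz * 2 / 1000

# Radeon 680M: 12 CUs x 64 = 768 shaders at ~2.2 GHz boost (assumed figures)
print(peak_fp32_tflops(768, 2.2))  # ~3.38, which is where the 3.379 TFLOPS number comes from
```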
At what wattage? The M1 Pro "binned" runs at about 21 watts under load, 30-33 for the 10 core. Where are we at with the AMD chip? 45? That would be roughly 120% more power draw than an 8 core M1 Pro, and just about 50% more power than the full M1 Pro. Either way, for an up to 35% performance advantage in 1 task? "I'm 35% faster than you are at one task, but I need to draw 50 to 120% more power to do it. I'll cost half as much though." They didn't show a chip running at their rated minimum 15 watts performing up to 35% better.
Also the focus of the 7940HS presentation was clearly CPU performance; the 7940HS doesn't need a huge iGPU as it will also be paired with dedicated mobile GPUs.
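To put numbers on the power-draw comparison above, here is a quick sketch using the wattages quoted in that post (21 W for the binned 8-core M1 Pro, ~30 W for the full 10-core, 45 W assumed for the 7940HS); these are the poster's figures, not independent measurements:

```python
def extra_power_pct(challenger_w: float, reference_w: float) -> float:
    """How much more power the challenger draws, as a percentage of the reference."""
    return (challenger_w / reference_w - 1) * 100

# Wattages as quoted above (assumed, not measured here)
print(extra_power_pct(45, 21))  # ~114%, roughly the "120% more" cited above
print(extra_power_pct(45, 30))  # ~50% more than the full 10-core M1 Pro
```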
I'm very sure of this.
I'm fully confident I didn't give AMD any credit too soon.
Against intel. Not so much against Apple Silicon.
AMD showed the 7940HS the way they did because they are very confident about the performance and efficiency of their little chip.
Doesn't exactly mean none, and I gave a list of games they could have shown. Just take Resident Evil Village and call it even. A heavily advertised game that runs native on Apple Silicon. I'm sure they can beat it, and would have shown how much better they are than both intel and Apple Silicon. They didn't.
There's no reason to show gaming performance vs the M1 Pro because: 1) There are hardly any AAA games available for ARM Macs
AT MORE POWER!!!
2) A 7940HS laptop with a dedicated GPU will smoke any Apple laptop in gaming anyway
Then show it.
3) A 7940HS without a dedicated GPU will most likely be cheap enough to be considered M2 MacBook Pro competition and it will smoke it in gaming anyway.
4090 (Nvidia). I thought this was about AMD?
I mean, for Christ's sake, Asus just announced new Ryzen 7940HS ROG Zephyrus G14s, so 14-inch laptops, with dedicated mobile RTX 4090 and 4080 GPUs. Those will smoke any MacBook in gaming without breaking a sweat.
Give it a watch. I'm sure he's not perfect, but he has shown intel vs M1 and AMD vs M1. Win, lose or draw, he shows it.
Haven't even watched that clip since I've long stopped watching him, but I don't think I've ever seen a single clip from him that's not utter fanboy garbage.
There's no reason for them to show what you want. 1 game result is meaningless. Also, the 7940HS will also be paired with dedicated GPUs. Did you even read what I wrote?
So they should have shown that if it is the case. But, they didn't. I'm sure they could have found 1 AAA game that would work on Mac (native Apple Silicon), here is a list:
Assumptions. AMD didn't state anywhere that the 7940HS CPU was running at 45W.
At what wattage? The M1 Pro "binned" runs at about 21 watts under load, 30-33 for the 10 core. Where are we at with the AMD chip? 45? That would be roughly 120% more power draw than an 8 core M1 Pro, and just about 50% more power than the full M1 Pro. Either way, for an up to 35% performance advantage in 1 task? They didn't show a chip running at their rated minimum 15 watts performing up to 35% better.
Against both.
Against intel. Not so much against Apple Silicon.
Much higher performance is much higher performance. If somebody needs to properly play AAA games on his computer he needs: 1) Games he can play, 2) Performance to enjoy the games he plays.
AT MORE POWER!!!
LoL 😂
Then show it.
This is the thing, you assume a lot of things but you are unable to follow a conversation.
Read my comment again and maybe you will understand this time why I mentioned the mobile 4090.
4090 (Nvidia). I thought this was about AMD?
"Faster is faster" is intel's logic. "At what cost" is Apple's; cost in terms of performance per watt. It matters. A Xeon is faster, an AMD Epyc is faster, a 4090 Ti is faster. And at high wattage (power draw), it determines what kind of laptop or desktop you end up with.
So what?
faster is faster
of course low power consumption is great too, but it all depends on the individual needs.
by that logic an Apple Watch should be better too, when it has a better performance per Watt ratio than an M1 Ultra
i love my M1s, but the current i3-i9 are awesome too, so are the Ryzens
win-win for us customers 🤘
Right, so show only what makes you look good. Got it. Dedicated GPUs that cost more per watt than the CPU in the laptop they come with.
There's no reason for them to show what you want. 1 game result is meaningless. Also, the 7940HS will also be paired with dedicated GPUs. Did you even read what I wrote?
They do; the 15-45 watt range is right there on the slide they show. They put it there. So the question is really basic: what wattage are they running at in these tests? It matters. If it's 15 watts and they beat every chip they compared against in their limited tests, that's still AMAZING. But if you see the 45 watt marker, you know it's running that way, as it's a benchmark. Plain and simple.
Assumptions. AMD didn't state anywhere that the 7940HS CPU was running at 45W.
Slide shows 45.
The 5800H at 35W will boost to 4.1 GHz all-core; the 7940HS will be able to boost much higher at 35W and has better IPC, so it can beat the 8-core M1 Pro no problem at this TDP.
If you show up to 35% better performance, you're showing your best performance. You're not getting that at 15 or 35 watts if your chip goes to 45.
The 45W TDP setting is for even higher clocks if there's thermal headroom, as the 4nm process allows for crazy high clocks. There's no indication AMD used the 45W setting in their launch presentation as they always show their chips running at stock.
Show a chart comparing a "game" against intel at 100% and Apple at N/A then AMD at 100+%. They could have found a game to compare it to if they are going to put Apple in the darn chart. N/A means not applicable. Not available. They could have found 1 game.
Against both.
You gave nothing; 1 game doesn't mean MacBooks are eligible gaming laptops.
Then don't put it in the benchmark chart as N/A. Leave it out completely. Or find a game that you can shove into the chart to prove the point. Heck, make it look worse than intel for laughs. Their presentation ran on a MacBook, so it's not like they didn't have one around to test with.
Also there's no reason for you to insist on gaming, it's not an area where Apple is competitive with Windows laptops anyway.
So let's praise AMD for lowering their power draw while simultaneously stating they are the best thing since sliced bread for beating an M1 Pro from a year and a half ago in one or two tests.
Much higher performance is much higher performance.
You do know there are games that are cross-platform?
If somebody needs to properly play AAA games on his computer he needs: 1) Games he can play, 2) Performance to enjoy the games he plays.
SHOW IT! Love to see it in low power mode off the cord. I'd absolutely love to see this.
Also I'm sure that with the power mode on a 4080/4090 laptop set to best battery life, performance in games will still be way better than on any MacBook, like way better.
Completely wrong.
Mobile RTX 4050 (so mid-range) can supposedly beat a desktop RTX 3070 with just one-third the power use.
You are well over your head in this discussion.
Nvidia unveils a broad range of efficient new laptop GPUs, from RTX 4050 to 4090
Mobile RTX 4050 can supposedly beat an RTX 3070 with just one-third the power use. (arstechnica.com)
Enjoy your AMD/Nvidia hybrid low-cost, high-powered gaming thick book. I'm not stopping you.
Read my comment again and maybe you will understand this time why I mentioned the mobile 4090.
Do you need a link for the announced upcoming Zephyrus G14? Do you want an explanation for what a dedicated GPU is?
I think for me the disconnect on the volume observation is that AMD has been building chips since the 70's, whilst Apple has really only gotten into the game in the last 15 years, and in that time Apple, near as I can tell, ships more of their own designed chips than AMD could come close to. Perhaps you don't include the smartphones and others, but for me, when we're considering chip volume, I count all of the devices for which they design their own chips: their laptops, desktops, phones, tablets, watches, the Apple TV, even the displays that now ship A15 chips, down to the headphones with their own custom wireless chips, which feature in many of their devices including the AirTags. Additionally there are the Afterburner cards in their Mac Pro line, which speak to another angle of their explorations in chip design.
Many of these chips aren't on cutting edge process nodes, which is why I'm curious about the source for your assertion that as Apple moves down the chain AMD takes over, given that AMD, near as I can tell, likely ships significantly fewer chips than Apple does, with Apple continuing to ship devices with silicon on those older processes (the watch chips are still on the 7nm node, for example, and they were around 60 million shipments in 2021 based on Google searches I've done). AMD in the server space ships a few million units a year, and they've at best got 30% of the rest of the market, which near as I can tell is around the 340 million mark, giving AMD around 100 million in CPU shipments, and significantly less for their GPUs (like 10% of the market? 2021 numbers put this at 50 million units total, so another 5 million for AMD; worse in 2022).
This for me is the disconnect and what I want to understand. It isn't a challenge, it's just trying to understand how you came to your conclusions. But I guess it's just common sense.
I was sure this is where you were trying to get at. Quite a pointless direction that doesn't challenge what I wrote.
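For what it's worth, the back-of-the-envelope shipment math in the post above can be tallied directly. The inputs below are the figures cited there (30% of a ~340M client CPU market, a few million server CPUs, ~10% of a ~50M discrete GPU market, ~60M watches), plus an assumed ~230M iPhones per year, which is my own ballpark and not a number from the thread:

```python
# Figures cited in the post above (rough, 2021-ish numbers)
amd_client_cpus = 0.30 * 340e6   # ~102M client CPUs
amd_server_cpus = 5e6            # "a few million units a year"
amd_gpus        = 0.10 * 50e6    # ~5M discrete GPUs

apple_watches = 60e6             # cited in the post
apple_iphones = 230e6            # assumed ballpark, NOT from the thread

print(f"AMD   ~ {(amd_client_cpus + amd_server_cpus + amd_gpus) / 1e6:.0f}M chips/yr")
print(f"Apple ~ {(apple_watches + apple_iphones) / 1e6:.0f}M chips/yr (phones + watches alone)")
```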
"But the most concrete chart compared an unnamed Lovelace GPU to the RTX 3050 and RTX 3070, showing that the Lovelace GPU (almost certainly the RTX 4050) could perform at the same level as the RTX 3070 while consuming just 40 W instead of 120 W. RTX 3050 GPUs have been popular in thin-and-light designs like the Dell XPS 15 and Surface Laptop Studio, and bringing RTX 3070-level performance to those laptops without requiring RTX 3070-level heat, fan noise, or cooling capacity is a big deal."Much higher performance is much higher performance. If somebody needs to properly play AAA games on his computer he needs: 1) Games he can play, 2) Performance to enjoy the games he plays.
Also I'm sure that with power mode on a 4080-4090 laptop set to best battery life, performance in games will still be way better than on any Macbook, like way better.
Mobile RTX 4050(so mid-range) can supposedly beat a desktop RTX 3070 with just one-third the power use.
You are well over your head in this discussion.
![]()
Nvidia unveils a broad range of efficient new laptop GPUs, from RTX 4050 to 4090
Mobile RTX 4050 can supposedly beat an RTX 3070 with just one-third the power use.arstechnica.com
LoL 😂
God you're dense. Show a chart comparing a "game" against intel at 100% and Apple at N/A then AMD at 100+%. They could have found a game to compare it to if they are going to put Apple in the darn chart. N/A means not applicable. Not available. They could have found 1 game.
SHOW IT! Love to see it in low power mode off the cord. I'd absolutely love to see this.
Enjoy your AMD/Nvidia hybrid low-cost, high-powered gaming thick book.
"But the most concrete chart compared an unnamed Lovelace GPU to the RTX 3050 and RTX 3070, showing that the Lovelace GPU (almost certainly the RTX 4050) could perform at the same level as the RTX 3070 while consuming just 40 W instead of 120 W. RTX 3050 GPUs have been popular in thin-and-light designs like the Dell XPS 15 and Surface Laptop Studio, and bringing RTX 3070-level performance to those laptops without requiring RTX 3070-level heat, fan noise, or cooling capacity is a big deal."
40W is more than the power needed to run an M1 Pro. And if I wanted to, I can find a benchmark that makes that 3070 look like total crap.
I will only address this.
AMD picked niche benchmarks. Just proving you can do the same for M1. Multi-game testing is nice, but they could have picked games to include the M1/M2 in. Would have taken all of 10 minutes. Heck, they had some MacBooks in the audience to do it with, LIVE.
The discussion was about gaming, so go and find videos with the 3070 vs the M1 Pro in gaming and see how it goes, not some niche machine learning test that's better suited to the M1 for the memory alone and doesn't actually tell anything about general GPU performance. It's like somebody running a CUDA-specific test on an Nvidia GPU vs the M1, which doesn't have CUDA, so The End.
The M1 Pro uses around 60W in a constant CPU+GPU load like The Witcher 3; the test is on notebookcheck (they also tested Rise of the Tomb Raider at 1080p High: M1 Pro GPU 53 fps, laptop 3070 127 fps, so roughly 2.4x faster). This is the base 14-core-GPU M1 Pro, so it looks like it's possible for a 7840HS + RTX 4050 laptop with the CPU power limit set to 15W to use less power in games than a base M1 Pro and perform much, much better. Things are looking better and better for Windows laptops.
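Quick check on the Tomb Raider numbers quoted above (53 fps vs 127 fps, as cited from notebookcheck): the gap works out to roughly 2.4x. A figure of "58%" describes the same gap the other way around, i.e. the M1 Pro being about 58% slower, not the 3070 being 58% faster. A minimal sketch of the arithmetic:

```python
m1_pro_fps, rtx_3070_fps = 53, 127  # figures as quoted above, not re-measured

speedup    = rtx_3070_fps / m1_pro_fps               # ~2.40x
faster_pct = (speedup - 1) * 100                     # 3070 is ~140% faster
slower_pct = (1 - m1_pro_fps / rtx_3070_fps) * 100   # M1 Pro is ~58% slower

print(f"{speedup:.2f}x  (+{faster_pct:.0f}% / -{slower_pct:.0f}%)")
```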
Before the M2, the lineup was:
M1 → M1 Pro → M1 Max → M1 Ultra
M1 Pro was the second lowest level in the hierarchy.
That is "lower midrange."
Huh?
...is very likely done plugged into power (unlike MacBooks that can perform the same with or without being plugged in).