The one thing that's true is they do currently have a f-tonne more compatible games. But yeah, those ads are otherwise a desperate joke.

Can't wait for the next cringeworthy Intel ad.
Uh, your PS4 games are running in PS4 mode, so of course you are not going to see a difference.

Apple really has hit the ground running… I can see a gaming-focused Apple TV owning gaming in a couple of years.
Having a PS5, all I can say is that apart from the faster storage and the lack of noise, I can't honestly say it's a drastic improvement over the PS4. I spend most of my time playing PS4 games on it via a 2TB external USB HDD anyway 😝
See this for a performance comparison: it's indeed supported in their API, but it's not hardware accelerated. Apple chips don't have dedicated hardware to accelerate the BVH traversal and intersection work needed for ray tracing.
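For context on what that dedicated hardware actually accelerates: BVH traversal boils down to running a ray/box "slab test" against many bounding boxes per ray, which RT cores do in fixed-function units while Apple GPUs run it as ordinary compute shader code. A minimal software sketch of that per-ray test (an illustrative toy, not Apple's or anyone's actual implementation):

```python
def ray_hits_aabb(origin, inv_dir, box_min, box_max):
    """Slab test: does the ray origin + t*dir (t >= 0) hit the AABB?

    inv_dir holds 1/dir per component, precomputed the way real
    traversal kernels do it. Returns True on intersection.
    """
    t_near, t_far = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
        t1, t2 = (lo - o) * inv, (hi - o) * inv
        if t1 > t2:
            t1, t2 = t2, t1  # keep slab entry/exit ordered
        t_near, t_far = max(t_near, t1), min(t_far, t2)
    return t_near <= t_far

# Ray along +x hits a unit box centered at (5, 0, 0)
# (1e30 stands in for 1/0 on the zero direction components):
print(ray_hits_aabb((0, 0, 0), (1.0, 1e30, 1e30),
                    (4.5, -0.5, -0.5), (5.5, 0.5, 0.5)))  # True
```

A BVH traversal repeats this test millions of times per frame, which is exactly why doing it in shaders instead of dedicated units costs so much performance.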
Gaming is such a ridiculous argument and topic to discuss. Is every gamer in the world running an RTX 3080? I'm not; I'm using a 5700 XT. A few of my friends are using a 1080, and a popular streamer is running a 1080. The most popular video cards on Steam are the 1060 and the 1050 Ti, which the standard M1 competes with.

It's pointless to argue that this means AAA games should come to the Mac. This power only exists in the top-end MacBook Pro. The regular M1 (which most people will likely get) only has 2.4 TF by this calculation. By the time we get to the M3/M4, perhaps the entry-level chip will offer ~10 TF, but until then I don't think there's going to be much demand for a $3k device to play games on.
I have replied to your post here: #48
Well, a 10W laptop can do sustained 2.6TF. I don't understand why this is such a difficult thing to believe.
Also, don't forget Apple is on 5nm while the consoles use 7-14nm chiplets, and Apple arguably has a more skilled engineering group than AMD at this point.

Their CPUs are efficient because of ARM64. But explain to me how their GPUs are more efficient? How do they magically pull more FP32 performance? Are they magically switching fewer transistors while producing more TFLOPs? Please don't confuse GPU performance with ARM64 vs x86 efficiency.
Baseless conjecture. Chip design matters too. Again, the M1 has 1024 shader cores operating at 1.3GHz at 10W, and it exactly hits the advertised 2.6 TFLOPs; as I said before, I have verified it myself using a long sequence of MADD operations. AMD and Nvidia need 2-3x the power to reach the same MADD throughput.

Basic physics… If you switch more transistors in your GPU, you pull more watts and can hit higher TF numbers for both FP16 and FP32. This is independent of memory power consumption (which is already very low for GDDR6) or SIMD architecture.
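The peak-FP32 figures being thrown around in this thread all come from the same back-of-the-envelope formula: ALU count x clock x 2, since one fused multiply-add (MADD) counts as two FLOPs. A quick sanity check of the numbers quoted here (clocks and ALU counts are approximate publicly reported figures, not official specs):

```python
def peak_tflops(alus, clock_ghz, ops_per_alu_per_cycle=2):
    # 2 ops per ALU per cycle: a fused multiply-add counts as two FLOPs
    return alus * clock_ghz * ops_per_alu_per_cycle / 1000.0

# Approximate, publicly reported configurations (assumptions, not official specs):
print(round(peak_tflops(1024, 1.278), 1))  # M1 8-core GPU        -> 2.6
print(round(peak_tflops(2304, 2.23), 1))   # PS5, 36 CUs x 64     -> 10.3
print(round(peak_tflops(4096, 1.63), 1))   # Vega 64              -> 13.4
```

Note this only explains where the marketing numbers come from; it says nothing about whether the chip can sustain that throughput under its power and bandwidth limits, which is exactly what the rest of this thread is arguing about.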
- What do you mean by "Apple has a process advantage", and which process nodes are you talking about exactly?
- Both AMD and NVIDIA have more experience than Apple when it comes to developing high-performance GPU architectures, and their power consumption is tied to their maximum performance and node sizes. They are designed to deliver the industry's maximum FP32 performance, including superior ray-tracing performance (which Apple lacks), GPU-accelerated tensor cores for ML applications (DLSS, for example), INT4/INT8 operations, and mesh shading. And all these features are actually accessible to developers, while Apple is not capable of any of these techniques yet…
What generates all that heat, I wonder? 🤔

Such illiterate logic 🤣. Apple's marketing and their fanbase never fail to impress me, especially on MacRumors…

10.4 TF doesn't mean it can actually use 10.4 TF if it's power limited to around 60W and probably bandwidth starved too (it lacks superior L1 and L2 caches, doesn't use a unified L3 cache, lacks a geometry engine, and doesn't support techniques such as VRS and storage APIs…).

The AMD Radeon VII is actually 14.9 TF and the AMD Vega 64 is 13.4 TF, and both are slower than the PS5 and Series X, significantly slower in fact!

Don't fall for the marketing, people. These MacBooks are by no means slow, but if you actually believe they're faster than a PS5 you need to seek help…

For consoles to consistently hit their maximum TFLOPs performance, they actually use heatsinks like this.

View attachment 1871105
Watts do not equate to TFs as directly as you think… There's a whole lot of other things: CPU/RAM/GPU timing, efficiency of calculations, the hardware features available, the intelligence to turn features on the chip on and off, the locality of cache and RAM to processing, the software support… all of that has more impact.

A 10W laptop cannot do a sustained 2.6 TF. At least not on planet earth, yet.
"Not yet" was valid until late 2020… then that "yet" became a real thing. So now the whole planet earth is saying it can sustain that…

A 10W laptop cannot do sustained 2.6TF. At least not on planet earth yet.
I like my share of games too; I'm subscribed to Game Pass Ultimate and saving for a PS5 or Series X. At one point my 2012 rMBP with Boot Camp was the best (and most pricey) machine I used to play Assassin's Creed 2. But that was just an outlier in the grand scheme of Apple things.

I think they keep talking about it for two reasons: 1) they like to have fun in their free time, and 2) a very complex game can show off the power a GPU has in a way most pro apps can't (of course, there are pro apps that do max out your GPU).
A 10W laptop cannot do sustained 2.6TF. At least not on planet earth yet.
Sort of agree, but the price of RTX 3080s and 3090s suggests there is quite a bit of demand for $3k devices to play games on.
If Apple wanted these machines to appeal to gamers they would do the logical thing and reach out to games studios to offer help and incentives for getting a few AAA games ported to macOS to get the ball rolling.
Given all this GPU power, there’s really no excuse anymore.
It depends on the kind of gaming you do, of course… games with Metal support will be on top, and the big PvP games will work just fine: CS:GO (I think it's called), Dota, LoL, Blizzard games, etc.
I don't think Apple's GPU would be the best showcase for games, because even with excellent raw power they still don't support modern gaming technologies like DLSS or Raytracing.
Cool. How big is the screen, and what is the battery life?

Look, the M1, M1 Pro, and M1 Max are really impressive, but the fact of the matter is that if you want to play AAA titles, it's not going to be on a Mac. And I'm not trying to throw shade; it's reality.
Now, as for your price example, here's one. I purchased an HP Omen 30L directly from Amazon for $1,999.99 with the following specs: i9-10850K, 32GB of RAM, 1TB NVMe, Wi-Fi 6, RTX 3080. I can play the latest games in 4K pushing 60 FPS.
Ok, but these consoles use such huge heatsinks because they run ancient x86 crap.
And that is really great news! Now, if Linux games, especially AAA titles, don't start to arrive fast and furious, that means someone, i.e. the publisher, did not want to take the risk of developing for Linux. It's the same situation Apple finds itself in.

Except, thanks to Valve's work on Proton for the Linux-powered Steam Deck, games don't need to be ported to work on Linux any more.
Post title begs to differ.

Here we were talking about the Mac, not the PS5.
Who is this a problem for? Why does this keep coming up? Why do people constantly complain that product X doesn't fit everyone on the planet?

Not everyone uses their GPU for content creation tasks. Many just want to game, and they can't on macOS, at least not like they can on Windows.
I don't play games on a laptop. I prefer to hook up my PCs, Xbox Series X, and PS5 to 50" 4K TVs.

Cool. How big is the screen, and what is the battery life?
(crickets)