> so we will now be able to get 120hz mode by default on a supported monitor?

No, that is not what this is for.
None of the content demoed in the video is current, or in active gameplay.
> There are no AAA games for Mac (Battlefield, Cyberpunk, COD, etc). It's a nice feature.... but, pointless.

Sorry, I'm quoting you again, but I'm pretty sure Apple is aiming for the future here. In the future there might be more games for Mac if Apple, now with its own GPUs, manages to get more powerful graphics into the hands of more customers than has previously been the case. Apple also seems to be improving Metal with new features and has released Metal tools for Windows.
All the people who keep being “blown away” by the M1 would turn a hilarious shade of purple at the dismal FPS they would get on M1 hardware compared to what true dedicated graphics on a PC can generate.
this will make for a great angry birds experience though!
No, that is not what this is for.
This is for varying the refresh rate depending on content. It has nothing to do with changing the default.
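The idea of varying refresh with content is easy to picture with a small sketch. The function below is purely illustrative (the name, parameters, and logic are hypothetical, not Apple's API or actual compositor behavior): it shows one way a system could pick a panel refresh rate for a given content frame rate within an adaptive-sync range.

```python
import math

def adaptive_refresh_rate(content_hz: float, min_hz: float, max_hz: float) -> float:
    """Pick a refresh rate for a variable-refresh display so that frames of
    `content_hz` content land on a whole number of panel refreshes.

    Hypothetical sketch: not Apple's API or real compositor logic.
    """
    # Content cadence inside the panel's range: match it exactly.
    if min_hz <= content_hz <= max_hz:
        return content_hz
    # Below the range: show each frame for an integer number of refreshes,
    # e.g. 24 fps film on a 48-120 Hz panel runs the panel at 48 Hz.
    if content_hz < min_hz:
        multiple = math.ceil(min_hz / content_hz)
        while content_hz * multiple > max_hz and multiple > 1:
            multiple -= 1
        return content_hz * multiple
    # Above the range: cap at the panel's maximum.
    return max_hz
```

For example, 24 fps film on a hypothetical 48-120 Hz panel would run the panel at 48 Hz (two refreshes per frame) instead of juddering against a fixed 60 Hz, which is the kind of content-driven behavior the post above describes.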
> None of the content demoed in the video is current, or in active gameplay.

Both are current, and both are active gameplay. What are you talking about? He plays both retail and Classic.
> I'm wondering if I can save this comment forever, like the ones complaining about the original iPod (thread #500), or the original iPhone, or the iPad, or Apple Music, or really, man, there are a lot of examples, aren't there?

When you feel like addressing my post, let me know. I wasn't talking about Apple's chips. I was talking about Apple not giving two **** cents about gaming. Everything from extremely poor thermals to asininely designed desktops/laptops clearly shows that Apple has not now, nor will they ever, get gaming.
Apple are making their own chips for Macs now, and based on their previous success, they seem to be very good at it. The only thing holding Apple back from matching or surpassing what is happening at AMD and nVidia is their will to do so, not their ability.
> When you feel like addressing my post, let me know. I wasn't talking about Apple's chips. I was talking about Apple not giving two **** cents about gaming. Everything from extremely poor thermals to asininely designed desktops/laptops clearly shows that Apple has not now, nor will they ever, get gaming.

You're talking about the past. I think the point @rolphi is trying to make is that the future might not be the same. And being certain that Apple will never get gaming just because it hasn't in the past doesn't make sense to me either.
> Apple's chips, since you brought them up, can't hold a candle to the ones made by Nvidia, AMD and others for gaming, and that's a fact.

You're talking about the M1 now? Of course something that sits in a MacBook Air and an iPad Pro can't compare to the GPUs from AMD and Nvidia. But what will Apple offer the rest of this year and next, graphics-hardware-wise, on the Mac side of things? We don't know that yet.
> I think I’m like one of 10 people that prefers a glossy screen instead of antiglare.

Yes... ONE. And I have one. It's pretty slick. Granted, not as big as the heavily overpriced ultrawide from ASUS.
> There are no AAA games for Mac (Battlefield, Cyberpunk, COD, etc). It's a nice feature.... but, pointless.

It's a nice feature for more than just gaming. Besides, it's a chicken-and-egg problem. Apple now doing its own silicon may very well lead to serious leaps in the GPU performance of its hardware.
> not sure what the point is unless developers make games.

Being a person always with high-end/workstation GPUs from AMD & Nvidia for my workstations (a 3090 on my non-workstation desktop as well), plus the Series X & PS5: what were you expecting from the M1, an entry-level CPU that's merely better designed for everyday computing than Intel's i5 CPUs & AMD's equivalents?
The M1 is mediocre, with GTX 1060-like performance.
This is only worth it if we get AAA games and an M1X with something like 3070-3080 performance.
Probably better off buying a PS5, if I could find one lol.
Yes, I agree, and I expect one of the next M iterations to be close to a 3070 (at least I hope so). We will see in a few months.
M1 integrated graphics should *never* be compared to the performance of bleeding-edge discrete GPUs. It's a faulty & naive comparison.
I'd wait till Apple reveals what it will do for its high-end desktop hardware before passing any verdict on Apple Silicon vs AMD & Nvidia's dedicated parts.
Note that Apple tends to have a supply-chain advantage with the same suppliers AMD & Nvidia use (Samsung & TSMC), which *can* enable Apple to catch up to both very quickly.
> If I were you, I wouldn't hold my breath :/ The only (semi-)recent AAA game on the system is XCOM 2, and even it has major quality restrictions (low-quality textures, etc.)

What about Metro Exodus, Divinity Original Sin 2? And so many others!
I dunno why this got double-quoted. iOS 15, I guess.
> I have seen several videos and followed the WoW forums, and the most I have seen is 50-60, with dips in particle-effect-heavy areas.

Then there’s something wrong, because my M1 MBP 16GB has Classic and Shadowlands pegged at 60 fps (I believe on setting 7, but I’d have to check) when using the built-in 4.5K screen.
If you have video of 120 FPS I would love to see it
15 FPS in SL on brand new, fully patched iMac
Title says it all.
24" iMac, M1, 2021
Memory: 8GB
FPS: 15 with setting at 6 and 100% render.
us.forums.blizzard.com
> What about Metro Exodus, Divinity Original Sin 2? And so many others!

None of those are DooM.
> What about Metro Exodus, Divinity Original Sin 2? And so many others!

oops, sorry, my bad - a temporary lapse of reason. I mistakenly thought for a moment this thread was about iOS and not macOS. (And on iOS, indeed, the non-XCOM 2 AAA titles are generally 3+ gens old: GTA, Max Payne, LEGO games (Star Wars, etc.), Goat Simulator, Star Wars KOTOR, Rayman, Day of the Tentacle, Full Throttle, several Final Fantasy games, older RollerCoaster Tycoon.)
Adaptive Sync appears to work in windowed mode as well.
WWDC sessions suggest that developers have to support it explicitly, but it works for RoTR and the recent Deus Ex, which are a couple of years old. Maybe it's just Feral doing an excellent job as usual. I'm curious to know if it works on other games.