Wait, did you seriously say INTEL arc would be competition? HAHA
Yes. They may not win against AMD, but they only have to "be on par with, not beat" Nvidia at the entry level. Intel has enough cash and a large incentive team to help convince many OEMs to go with them, and Nvidia may not be able to overcome Intel's spending spree on this. Intel is likely to give bigger discounts to OEMs for integrating both its CPU and GPU together. Given how thin the profit margins already are on these smaller/thin laptops, I seriously doubt OEMs will want to go with Intel/Nvidia if they earn more by having both an Intel CPU and GPU. Intel has a track record of pulling stunts like this.

Early benchmarks on a Samsung laptop with the entry-level Arc A350M (30 W TDP) already put it above Nvidia's MX450 (25 W): https://www.guru3d.com/news-story/intel-arc-350m-benchmarked.html

There are some driver issues, of course, as found here (https://hothardware.com/news/arc-a350-benchmarks-big-gains-disable-dtt): the A350M's performance doubled with Intel's tuning utility turned off, which points to driver problems.

Remember, competition isn't always about having the best performance; sometimes it just has to be good enough for the money. If I can get 90% of the performance of an Nvidia card for $100-200 less, I'll pick Intel.

Either way, I'm happy to be proven wrong because competition is almost always a good thing for customers.

Also, rumors are that Intel is already prepping to release the 2nd and 3rd gen of Arc within a year of each other, so the initial Arc product is expected to be short-lived anyway. Given their previous delay issues, I am not expecting this to pan out, but it will be interesting to see what their 3rd gen will be like, because they expect the 2nd and 3rd gen alone to be a big leap.
 
These services are really only good for certain types of games, unfortunately

Even the smallest amount of latency makes many game types unusable
I used to hate game streaming because, well, I play competitive multiplayer games. I could not STAND the lag and delay between moving the mouse and what was on screen.

But when I tried the GeForce Now 3080 tier, it was like playing natively, with zero lag at all. What GeForce Now has done here is quite amazing. It’s like I’m playing native.
 
Yes. They may not win against AMD, but they only have to "be on par with, not beat" Nvidia at the entry level. Intel has enough cash and a large incentive team to help convince many OEMs to go with them, and Nvidia may not be able to overcome Intel's spending spree on this. Intel is likely to give bigger discounts to OEMs for integrating both its CPU and GPU together. Given how thin the profit margins already are on these smaller/thin laptops, I seriously doubt OEMs will want to go with Intel/Nvidia if they earn more by having both an Intel CPU and GPU. Intel has a track record of pulling stunts like this.

Early benchmarks on a Samsung laptop with the entry-level Arc A350M (30 W TDP) already put it above Nvidia's MX450 (25 W): https://www.guru3d.com/news-story/intel-arc-350m-benchmarked.html

There are some driver issues, of course, as found here (https://hothardware.com/news/arc-a350-benchmarks-big-gains-disable-dtt): the A350M's performance doubled with Intel's tuning utility turned off, which points to driver problems.

Remember, competition isn't always about having the best performance; sometimes it just has to be good enough for the money. If I can get 90% of the performance of an Nvidia card for $100-200 less, I'll pick Intel.

Either way, I'm happy to be proven wrong because competition is almost always a good thing for customers.

Also, rumors are that Intel is already prepping to release the 2nd and 3rd gen of Arc within a year of each other, so the initial Arc product is expected to be short-lived anyway. Given their previous delay issues, I am not expecting this to pan out, but it will be interesting to see what their 3rd gen will be like, because they expect the 2nd and 3rd gen alone to be a big leap.
People already don’t buy AMD GPUs because of things like DLSS. People won’t buy a laptop if it doesn’t have Nvidia in it.

AMD doesn’t have CUDA (which is important for video developers).

Nor does it have Tensor cores for ML acceleration (or a software stack that supports them).

If anything, it will hurt AMD, not Nvidia.
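As a minimal sketch of what that Tensor core point means in practice (my own illustration, assuming PyTorch on a machine with a CUDA build; the model and sizes are just placeholders): mixed-precision autocast is the path that puts matrix math onto Tensor cores, and it is effectively a CUDA-first feature of the mainstream ML stack.

```python
# Hypothetical illustration, not a benchmark. On an Nvidia GPU, autocast runs
# the matmul in fp16, which Tensor cores accelerate; on CPU (or a GPU without
# Tensor cores) there is no comparable fast path.
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

model = nn.Linear(1024, 1024).to(device)
x = torch.randn(64, 1024, device=device)

with torch.autocast(device_type=device,
                    dtype=torch.float16 if device == "cuda" else torch.bfloat16):
    y = model(x)

print(device, y.shape)  # e.g. "cuda torch.Size([64, 1024])"
```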
 
I’ve been playing Cyberpunk, maxed out, RTX full on. All perfect on my M1 MBA and M1 iPad Pro - it looks fantastic on those. Great to have a native version, but I doubt it will make a performance difference. Totally worth it. I just got the standard version. On my Studio it’s a bit disappointing, as I have an ultrawide monitor but none of the GeForce tiers support it. With streaming this good there’s not much point in getting a souped-up PC or trying to emulate on a Mac… as long as the games you want are supported. Dying Light 2 next :) Worth paying over the free option as there are zero wait times. RTX is nice, but you have to turn it on. Tbh, if you forget, you don’t really notice.
Awesome and good to know!

Last FPS I played on a Mac was on my 2004 MacBook, and I think it was COD or COD2.

I won't give up my PC desktop anytime soon, but it's good to know there is a reliable workaround when it comes to AAA titles and playability.
 
It would be better if Apple integrated Nvidia's GPUs into their Macs, either by putting the hardware directly in their Macs or, at the very least, by allowing drivers to be written for them. As it stands, Macs are absolute garbage for gaming... no wonder Hackintoshes are so popular.
So when it comes to playing games, you are saying that having an Nvidia GPU in your Mac with no games to play is better than having this client that lets you play all the games???
 
That’s not a problem, though. Those that need CUDA still have CUDA, likely in a stable non-macOS system. Apple and others are focusing on the huge market of people that may need some performance beyond the CPU (for science, machine learning or other purposes), but not necessarily CUDA. …
The problem is, I know scientists who like macOS, use Macs for their personal machines, and would prefer to use Macs for their professional work (and can often spec their own machines), but can’t use Macs because they won’t do CUDA, even if they are very high performance.
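As a minimal sketch of where that line sits (my own example, assuming PyTorch 1.12+ and its Metal/MPS backend): framework-level code can fall back to the Apple Silicon GPU, while anything written directly against the CUDA API has no path to a Mac at all, which is exactly the lock-in being described.

```python
# Hypothetical illustration: the same framework-level code runs on an Nvidia GPU,
# on an Apple Silicon GPU (via the MPS backend), or on the CPU. Code written
# directly against CUDA kernels has no equivalent fallback on a Mac.
import torch

if torch.cuda.is_available():             # Nvidia GPU (no current Mac has one)
    device = torch.device("cuda")
elif torch.backends.mps.is_available():   # Apple Silicon GPU via Metal
    device = torch.device("mps")
else:
    device = torch.device("cpu")

x = torch.randn(2048, 2048, device=device)
print(device, (x @ x).mean().item())      # identical code path on all three backends
```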
 
Interesting, another baby step towards true Apple (non-iOS) gaming!
You do know that these are Windows games running remotely on a Windows PC, with their graphics and sound streamed via Linux-based internet servers?

The only "Apple" parts here are the window decorations around the game and the $1000+ game controller you are using.
 
You do know that these are Windows games running remotely on a Windows PC, with their graphics and sound streamed via Linux-based internet servers?

The only "Apple" parts here are the window decorations around the game and the $1000+ game controller you are using.

Yes, I am well aware of that. My point was that anything that allows for gameplay on a Mac helps the cause.
 
People already don’t buy AMD GPUs because of things like DLSS. People won’t buy a laptop if it doesn’t have Nvidia in it.

AMD doesn’t have CUDA (which is important for video developers).

Nor does it have Tensor cores for ML acceleration (or a software stack that supports them).

If anything, it will hurt AMD, not Nvidia.

Can you provide an actual source for that? "People" as in everyone in the entire market refusing to buy AMD because of DLSS? What entry-level GPU right now even offers DLSS support? The GTX 1060 and GTX 1650, the two most popular cards at this level, have neither DLSS nor Tensor cores, and yet they're doing just fine (source: https://store.steampowered.com/hwsurvey/videocard/).

The Steam Deck seems to be selling just fine, and the Xbox and PS5 seem to be selling fine without DLSS while using AMD GPUs.

I've never heard any of my friends say they won't buy AMD because of DLSS. Their only issue was finding one in stock.

Intel will have XeSS (not to mention their oneAPI stack), AMD has FSR 2.0 coming soon, and Epic has Unreal Engine 5's TSR, which looks decent (Ghostwire: Tokyo has a UE4 backport of it). DLSS is the gold standard, but it ain't the only one.

DLSS isn't a blocker for casual gamers, and the same goes for ray tracing. My sister plays games often, and she couldn't care less about DLSS as long as the laptop can run the games she plays. DLSS is becoming more popular for sure, but so are FSR and the eventual FSR 2.0/Epic TSR.

CUDA/Tensor cores for entry-level laptops? The people that need such things wouldn't be buying these kinds of laptops in the first place; they would pay up for a powerful laptop with a dGPU, and yes, there Nvidia will remain the king for the moment.
 
Has the Mac update started rolling out? My Mac is still on the old version, but my PC upgraded already.
 
I play Cyberpunk on it and it's fantastic. At 1080p I can't really tell the difference from across the room, and I got in at the tail end of the Founders program so I only pay $5 a month. It would definitely cost way more to game any other way. There is a more expensive tier that runs at a higher resolution and frame rate, but I don't want to pay for that personally, as it's a major bump.
I'm not sure you wouldn't notice a big difference in resolution due to the compression - at least not as much as when playing natively on a machine. On my PC, I notice a difference quite easily. Higher resolutions look sharper and seem to need less anti-aliasing on the lines. Even a one-tier bump makes a difference for me. However, I definitely sit closer to my monitor than I would when playing on a TV.
 
Comparing the games available on each cloud gaming platform, you can see what's going to happen. It's exactly like movie streaming: to get the full selection of what you want to play, you're going to have to subscribe to multiple platforms.
 
It would be better if Apple integrated Nvidia's GPUs into their Macs, either by putting the hardware directly in their Macs or, at the very least, by allowing drivers to be written for them. As it stands, Macs are absolute garbage for gaming... no wonder Hackintoshes are so popular.
I think using Nvidia GPUs in the Mac would be a step in the opposite direction for them, as it would hinder the development of the Apple Silicon SoC due to the thermal and power requirements of a discrete GPU.
 
you have to buy the AAA games for full price $60, $80, etc. on external stores
Not strictly true; you make it sound like you have to buy games as on Stadia, but you can buy a ton of top titles on sale for as little as $5, and of course it will play loads of games that you already have. You can use it for free, and if you bought any games from one of the top stores, you can probably play them… at no cost.
 
Nvidia faces serious competition in its core GPU business from advanced CPU+GPU SoC solutions like what Apple delivered with the M1.
Not really; Nvidia is facing some competition for some of their consumer GPUs, and that's about it.
Also, Apple can't touch Nvidia in gaming.
Nvidia's core business (their most lucrative business, to be precise) is data centers/servers.

If Intel and AMD follow suit and up their integrated GPU game then Nvidia will be the odd man out trying to sell discrete GPUs to a very small sliver of the market.

I don't think they will. AMD has been designing really fast APUs for consoles for some time but hasn't given any hint that they want to make similar chips for general PC users.
I would say discrete GPUs aren't in any danger for the foreseeable future; Intel is just entering this market, for example.

This is why they wanted to buy ARM - they knew the future direction of highly integrated SoCs and needed ARM to reach a wide market for their own solution.

Not really; I think they mainly just wanted ARM's human resources more than anything. It's not like Nvidia can't make ARM chips or SoCs with ARM cores.
 
Nvidia needed to acquire ARM to take control of the direction of the platform and the licenses for integration with their GPU IP. That's why they were willing to pay $40 billion for the company. Nvidia leads the market today; I'm talking about the future. If they lose the "socket" on future-generation systems, then they'll be out of the game.
No they won't.
Did Apple acquire ARM or something?
 
Lots of licensees make ARM CPUs. What Nvidia wanted was to control the direction of the ARM architecture, and to design in and integrate future GPUs into the ARM ecosystem so they could sell those designs at scale to the market. They need to own ARM to have that control. Again, that's why they were willing to spend $40B on the company.
You do understand that Nvidia can design, and has been designing, custom ARM CPU cores, which they can control and integrate however they want.
 
You do understand that Nvidia can design, and has been designing, custom ARM CPU cores, which they can control and integrate however they want.
Ask yourself a question - why would Nvidia be willing to pay $40B for a company they already have a license from, and whose architecture they've already been using to develop and build their own CPUs?
 
Awesome and good to know!

Last FPS I played on a Mac was on my 2004 MacBook, and I think it was COD or COD2.

I won't give up my PC desktop anytime soon, but it's good to know there is a reliable workaround when it comes to AAA titles and playability.
Check it out on your PC. It's a lot cheaper than buying a new graphics card!
 