Raw performance....nice...

But does that mean it can push the specific polygon counts that only Nvidia and AMD non-mobile GPUs can? That's what PC games have been built around.

So you'd pretty much have to rebuild a PC game from the ground up.
 
They are not compatible with Apple Silicon. All Macs with AS will have Apple GPUs.
Sorry, but I find that hard to believe, and I really don't see why they would be incompatible. Maybe with the M1, but what about future chips? Does anything prevent Apple from designing an ARM CPU that would work with third-party GPUs?
 
For people that just don't get it: go and watch YouTube videos of the latest iPad Pro regarding Photoshop/Lightroom performance or video processing (editing and rendering). In some instances it is comparable to a more expensive PC, but on a mobile chip that was designed in 2018 (albeit updated), because the software is so much more optimized as well.

The M1 is better than that iPad Pro, so for the price point, size and battery life, this is a huge offering. Maybe you can only do light gaming, but the fact that people compare it to a dedicated graphics card half the size of the laptop, when this is an SoC with that low a power usage, says a lot.

People trying to compare it to a console or a dedicated GPU are just missing the point. If anything, it's laughable that they even compare this SoC to all of that.
 
And what exactly would prevent Apple from using Nvidia or AMD discrete GPUs in other machines?
I don't think it would be possible to use these dGPUs (which have their own on-board VRAM) and still keep the unified-memory architecture that seems to be at the core of the Apple Silicon designs. I suppose some kind of hybrid might be possible, assuming Apple Silicon supports sufficient PCIe 4 lanes, but Apple has made no mention of this option.

I think a separate GPU "chiplet" on the SoC package but off the die (like the M1's RAM) may be what they do for higher-powered GPUs. This may be what "Lifuka" is. It would be similar to the AMD APUs used in the PS5 and Xbox Series X.
 
What a ridiculous comparison. That card is designed to be a low end PCI-E GPU, with all of the power stages, ports, and dedicated RAM that product requires. You might as well compare a bicycle with a car.
OK... then what is the comparison to an integrated GPU? An AMD Ryzen 4800U? Intel Tiger Lake with Xe? It compares favorably to these at (probably) a lower TDP.
 
Well, they are benchmarking against old-generation AMD/ATI cards.

Those cards were released in 2017.

Nowadays both AMD and Nvidia play much higher.

Amazing chipsets for a mobile phone, no doubt.

Xbox Series S = 4 teraflops at 300 euros (and it includes other hardware apart from the graphics card)

Xbox Series X = 12 teraflops

Custom PC = 20 teraflops (and that isn't the highest you can get with a single video card)
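For context, those teraflop figures are just arithmetic: peak FP32 throughput is ALU count × clock × 2 (one fused multiply-add per cycle). A quick sketch in Python; the M1 numbers below are the commonly reported ones (8 GPU cores × 128 ALUs at roughly 1.28 GHz), not official Apple specs, so treat them as assumptions:

```python
def fp32_tflops(alus: int, clock_ghz: float) -> float:
    """Theoretical peak FP32 TFLOPS: each ALU retires one FMA (2 flops) per cycle."""
    return alus * 2 * clock_ghz / 1000.0

# Commonly reported M1 GPU figures (assumptions, not official specs):
m1_alus = 8 * 128          # 8 GPU cores, 128 ALUs each
m1_clock_ghz = 1.278

print(round(fp32_tflops(m1_alus, m1_clock_ghz), 2))  # ~2.62
```

By that metric the M1 lands around 2.6 TFLOPS, under the Series S's 4, though raw teraflops across different architectures are a loose comparison at best.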
That's right!
The Xbox Series S is a powerful beast; it runs DaVinci Resolve and After Effects smooth as silk, and only 300€.

Nvidia and AMD play much higher with dedicated, more expensive and far less thermally efficient GPUs. It's a shame there aren't yet Apple equivalents to compare against, so let's wait.

In the meantime, it's just mind-blowing how the M1 compares to the GPUs in its direct competitive range.
 
I see a lot of silly back and forth on this. The bottom line is that this is competitive with my 2-year-old eGPU. Given it means I would have performance on par without all the annoyances of an eGPU, that’s pretty handy.

Plenty of people (like me) aren't gamers but like playing games from time to time. This is good enough. No, it's not some huge card requiring an upgraded power supply on your desktop. Who cares? I can play BioShock Remastered reasonably well with this GPU. Nice. And this is on a machine with crazy battery life.

We can debate the merits of Apple's GPUs against high-end stuff (most likely) when the new Mac Pros are available. Those are likely to be the very last Macs to move to Apple silicon.
Genuine question, can you really play Bioshock on ARM?
 
For people that just don't get it: go and watch YouTube videos of the latest iPad Pro regarding Photoshop/Lightroom performance or video processing (editing and rendering). In some instances it is comparable to a more expensive PC, but on a mobile chip that was designed in 2018 (albeit updated), because the software is so much more optimized as well.

The M1 is better than that iPad Pro, so for the price point, size and battery life, this is a huge offering. Maybe you can only do light gaming, but the fact that people compare it to a dedicated graphics card half the size of the laptop, when this is an SoC with that low a power usage, says a lot.

People trying to compare it to a console or a dedicated GPU are just missing the point. If anything, it's laughable that they even compare this SoC to all of that.
You're missing the point; the whole article is about the M1 iGPU competing with and beating a gaming GPU! When folks start pointing out that it is only about as good as a four-year-old budget gaming card, then all of a sudden there is no merit in comparing it with dedicated gaming cards?
 
Genuine question, can you really play Bioshock on ARM?
Looks like it: BioShock Remastered is one of the games Feral updated to 64-bit compatibility for Catalina, so if we go by Apple's assurances it should run just fine.

There was also a native iOS version, which was sadly discontinued many years ago (and IIRC was 32-bit only), so running on ARM is a definite possibility.
 
The thing I am more interested in is this:
If Apple can show decent improvements year over year with the M-series chips, would Intel and others move to the SoC style that Apple is using in order to compete? Samsung tends to follow Apple on phone changes (removing the headphone jack, etc.), so why not the same follow-up from the x86ers?
 
The thing I am more interested in is this:
If Apple can show decent improvements year over year with the M-series chips, would Intel and others move to the SoC style that Apple is using in order to compete? Samsung tends to follow Apple on phone changes (removing the headphone jack, etc.), so why not the same follow-up from the x86ers?

Samsung made large phones, high-resolution screens, OLED displays, wireless charging, reverse charging, multi-camera setups, face unlocking, multitasking, pen support, under-screen fingerprint sensors, waterproofing, 5G, etc. before Apple... all of which Apple now "tends to follow".
 
Samsung and Apple have different innovation levers and models.

Samsung is at the forefront of hardware components: best screens, superb craftsmanship on high-end devices, fingerprint sensors, camera sensors, lenses, etc. But they have no control over the operating system, so they tend to innovate more on the hardware side.

Apple has total control of the software and some control of the hardware. They control the performance of the CPU, but they don't make screens, camera sensors, etc., although they certainly influence those components' designs. Thus they tend to innovate more, and go further in polishing, on the software side.
 
Well, like all the time?
  1. charger
  2. display
  3. storage

I use a 2014 MacBook Pro 15 in the living room a lot; I use WiFi and just need to charge it, and I use a home NAS for storage. The biggest headache is charging, because it only lasts about five hours on battery. I have a 3x4K desktop if I need a lot of monitors. The MacBook Pro 13/AS would be perfect for battery life. Ideally it would have more ports and more RAM in case I want to repurpose it, but I could always just buy another one if they make a real Pro model. I do not see it replacing my desktop, as my desktop is Mac Pro class, but I'm open to the possibility.

I am open to replacing my desktop with a Mac if it can:

- Drive 3x4K monitors
- Take 64 GB of RAM minimum, preferably with an upper limit of at least 256 GB
- Offer dual NVMe storage, user-upgradeable
- Include USB-A connectors
- Outperform a Ryzen 5950X
 
You're missing the point; the whole article is about the M1 iGPU competing with and beating a gaming GPU! When folks start pointing out that it is only about as good as a four-year-old budget gaming card, then all of a sudden there is no merit in comparing it with dedicated gaming cards?
I do understand that, but while those GPUs are being promoted as gaming cards, the purpose of the iGPU in the M1 is not to run games specifically (that's why they showed it only for a few seconds; it's not the priority, just a side benefit) but to run applications for video and photo editing, in my opinion. I think many here comparing it to eGPUs or a console haven't experienced what it's like to edit in those apps on an iPad Pro, for example, which already runs some games great because they're optimized.

Fortnite runs on an iPad Pro at 120 fps on a two-year-old chip, and yes, I know it's Fortnite at low settings, but the point is that, first, Apple is not focused on AAA gaming (it would be interesting if they go there eventually), nor is this SoC being promoted to show you can play high-end games on it; and yet they showed Tomb Raider playing on it. Again, on the iGPU of an SoC: a chip way cheaper, lower power and smaller than a dedicated GPU.

Point is, if you can eventually play, say, Doom 2016 on it, that's already a huge win in my opinion. I mean, it runs on the Switch; software has to be optimized (that's why the benchmarks are impressive for early showings), but the fact that this forum is even having this kind of talk is impressive. Again, on an SoC (that's what some people don't understand, in my opinion).
 
It appears the article is referring to "offscreen" FPS instead of "onscreen." Can anyone please tell me the difference? Does a GPU really render frames offscreen? I've pulled up a comparison to an old RX 480, and you can see the M1 performs great offscreen, but the RX 480 performs better onscreen... :confused: What does it mean?
[attachment: sxs.png]
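My rough guess at the difference, sketched with made-up numbers (this is just an illustrative model, not how GFXBench actually works internally): offscreen tests render to a buffer at a fixed resolution and never touch the display, so they measure raw GPU throughput; onscreen tests render at the device's native resolution and, with vsync, can't exceed the panel's refresh rate.

```python
def onscreen_fps(raw_fps: float, refresh_hz: float, vsync: bool = True) -> float:
    """Onscreen: renders at native resolution and presents to the display.
    With vsync on, the reported rate is capped at the panel's refresh rate."""
    return min(raw_fps, refresh_hz) if vsync else raw_fps

def offscreen_fps(raw_fps: float) -> float:
    """Offscreen: renders to a buffer at a fixed resolution, never presented,
    so it reports the GPU's raw throughput with no display cap."""
    return raw_fps

# A hypothetical GPU that can push 90 fps raw, on a 60 Hz panel:
print(onscreen_fps(90, 60))   # capped at 60
print(offscreen_fps(90))      # full 90
```

That would also explain the screenshot: the RX 480 drives a 60 Hz monitor it can saturate, while offscreen numbers compare raw throughput at the same fixed resolution, where the M1 looks better.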
 
I do understand that, but while those GPUs are being promoted as gaming cards, the purpose of the iGPU in the M1 is not to run games specifically (that's why they showed it only for a few seconds; it's not the priority, just a side benefit) but to run applications for video and photo editing, in my opinion. I think many here comparing it to eGPUs or a console haven't experienced what it's like to edit in those apps on an iPad Pro, for example, which already runs some games great because they're optimized.

Fortnite runs on an iPad Pro at 120 fps on a two-year-old chip, and yes, I know it's Fortnite at low settings, but the point is that, first, Apple is not focused on AAA gaming (it would be interesting if they go there eventually), nor is this SoC being promoted to show you can play high-end games on it; and yet they showed Tomb Raider playing on it. Again, on the iGPU of an SoC: a chip way cheaper, lower power and smaller than a dedicated GPU.

Point is, if you can eventually play, say, Doom 2016 on it, that's already a huge win in my opinion. I mean, it runs on the Switch; software has to be optimized (that's why the benchmarks are impressive for early showings), but the fact that this forum is even having this kind of talk is impressive. Again, on an SoC (that's what some people don't understand, in my opinion).
I had been posting the APU of the XSX and PS5 as an answer to the whole "no iGPU is as powerful as the M1" thing. Yes, it is a game of semantics.

I am on team Apple, and I hope that every time there is a benchmark comparison for performance with the new hardware, it causes FromSoftware et al. to decide that maybe putting Demon's Souls (or Bloodborne, or Sekiro) on macOS is something they should be doing.

I also wish (barring the spat) that Apple's hardware were the premier hardware for UE5 (specifically Nanite and Lumen) and not the PS5.
 
I had been posting the APU of the XSX and PS5 as an answer to the whole "no iGPU is as powerful as the M1" thing. Yes, it is a game of semantics.

I am on team Apple, and I hope that every time there is a benchmark comparison for performance with the new hardware, it causes FromSoftware et al. to decide that maybe putting Demon's Souls (or Bloodborne, or Sekiro) on macOS is something they should be doing.

I also wish (barring the spat) that Apple's hardware were the premier hardware for UE5 (specifically Nanite and Lumen) and not the PS5.

Regarding that, citing my previous example: Tomb Raider was being run under Rosetta 2, not even optimized, it seems, and it looked fine (as in, that's amazing).

It seems games have two choices: either run under Rosetta 2, which isn't the optimal scenario for the hardware's potential but just works and costs companies nothing (the cheapest option), OR get ported, which costs companies money and will be up to them on a case-by-case basis.

This applies to any application right now, game or not. Software has to be optimized to run better on ARM, and whether it's worth it will depend on demand and how popular the platform is. My guess is that, if Rosetta 2 is good enough to run some software, companies won't bother optimizing for the M1, given that Apple already did the work with Rosetta 2.

In this case, I doubt FromSoftware would choose to spend money porting to Apple Silicon. Now, if UE5 is as scalable as they say and can run on anything, it's certainly a possibility (UE5 will run on the Switch, though of course not at the same performance).

And again, the fact that this conversation is even happening is huge, maybe not now but eventually. Apple Silicon will never match the performance of a current eGPU, but current eGPUs are now pushing for 4K/8K 120 fps ray tracing. If in a few years Apple Silicon can run, say, Gears 5 at modest settings, maybe even Game Pass would be available on Macs as well, which is insane.
 
The GTX 1050 series was a budget card in 2016. I don’t remember it being recommended for gaming even at the time, unless budget was a serious constraint.

In all fairness, Apple is nearly always at the root of every spat. They simply don't know how to meet other companies halfway, and while that's worked out for them, it doesn't make it right to go blaming other companies for Apple's historically poor graphics offerings.
It's integrated graphics. When Apple releases any graphics boards, maybe your point will be valid.
 
Literally none of the products that just got updated had dGPUs before, and this new GPU seems to blow the Intel iGPUs it replaces out of the water. What exactly did you want?
What most of us have wanted for a long time: a dGPU option.
 
What most of us have wanted for a long time: a dGPU option.

Intel UHD 630 Geekbench 5 OpenCL is about 5K and the M1 is about 20K, so better performance; but for what the vast majority do, the UHD 630 is fine. It's fine for what I do, actually; it's just hard to find a motherboard that provides 3x4K support on integrated graphics. So the M1 is faster, though it really doesn't matter for the vast majority.

I don't know that we're going to see a dGPU from Apple. Nvidia and AMD have big markets to sell dGPUs into, but Apple would only have itself.
 