Can someone explain to me where this sits on the console scale? Can it play games comparable to a PS4?
If you expect anything more than that from a MacBook, you're dreaming. My 2015 MBP suffocates on games from 2008.
 
How often does an entry level user make use of more than one, let alone two TB3 ports?

I consider myself a power user, but I have very rarely connected more than two USB-C / Thunderbolt 3 devices to my 2016 MacBook Pro. There are plenty of docks available for more specialized workloads.

I think Apple made the right call with only two ports for the M1 MacBooks. The only thing I might miss is the ability to connect the charger from either side.
 
Also remember that the GPUs they're comparing this to are over three years old... so I'm not really seeing why we should be so excited. It's a click-bait title with **** analysis. Sure, they're decent for mobile stuff, but they in no way compare to full-blown dGPUs from AMD or Nvidia.
That's true, but you can't fit one of those in a MacBook Air. The M1 is really up against AMD's 4xxx APU chips, and even then the M1 is running at a lower TDP.
 
Finally Macs will have good GPUs. With Nvidia's dominance, and the spat between them and Apple, it's been bad.
Not really; AMD is surpassing them and now dominates the low-to-mid range, though it's not on par at the high end. I'm sure the M1X/M2 will be impressive on the GPU side, but I suppose Apple will continue to use Radeon in high-end Macs. Either way, having an iGPU this fast is so great for many reasons.
 
I bet an M1 will catch up to the PS5 and Xbox Series X in 3-4 years, and Apple will create a stripped-down version of the Mac mini just for gaming and enter the gaming market. They could potentially come up with a program where people upgrade the chip annually and change the gaming industry forever.
You'll probably just have to upgrade the entire system every year. It would also probably be an upgraded Apple TV instead of a stripped down Mac mini.
 
And what exactly would prevent Apple from using Nvidia or AMD discrete GPUs in other machines?

I'm not aware of any technical issues, but the WWDC engineering sessions and all the Apple silicon references show they're going to develop the entire SoC, including the CPU and GPUs, in-house going forward. In fact, they seem more excited about the GPUs and the neural/ML engines than about the CPU roadmap.
 
I'm hugely impressed by this chip. Superb efficiency, and more integrated GPU performance than a PS4. Apple really knocked it out of the park. To put this into perspective, this is ~25% of the performance of the PS5 that just came out; that's no small feat, all things considered.

Can't wait to see what their future M chips are capable of, especially with the adoption of LPDDR5...
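The ~25% figure roughly checks out if you compare published peak FP32 numbers (the values below are assumed ballparks, and peak TFLOPS is only a crude proxy for real game performance):

```python
# Back-of-envelope check using published peak FP32 figures
# (assumed ballpark values, not a real-world benchmark).
m1_tflops = 2.6     # Apple M1 8-core GPU
ps5_tflops = 10.28  # PlayStation 5
ps4_tflops = 1.84   # original PlayStation 4

print(f"M1 vs PS5: {m1_tflops / ps5_tflops:.0%}")  # ~25%
print(f"M1 vs PS4: {m1_tflops / ps4_tflops:.0%}")  # ~141%, i.e. above a PS4
```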
 
"These models" also include a Mac mini, which has two USB-C ports (and two USB-A), while the old Intel Mac mini had four USB-C and two USB-A.
Many things suggest that Apple plans to split the mini line into "basic mini" model (the one you see today) and a mini pro model with the same sort of IO as the current Intel model.

The mini pro is waiting on the next step (an M1X?) that will presumably have eight large cores, support at least 32 GiB of DRAM, and support at least four USB/TB ports; i.e., something that will match the current (non-low-end) MBP, iMac, and Intel "mini pro".
 
If only any of the Apple haters could suggest what we should buy for $999 that is better than an M1 powered MacBook Air... especially when most of us don’t care about gaming. I have a PS5 for gaming.

I suppose when the M1X iMacs come out next year they will say that they are too expensive because a $300 Wintel box is cheaper.

They are the sort of people that are too poor to buy a machine that will last them, so they whinge about upgradability. When I upgrade, I buy an entirely new machine lol
 
I see a lot of silly back and forth on this. The bottom line is that this is competitive with my 2-year-old eGPU. Given it means I would have performance on par without all the annoyances of an eGPU, that’s pretty handy.

Plenty of people (like me) aren’t gamers but like playing games from time to time. This is good enough. No, it’s not some huge card requiring an upgraded power supply in your desktop. Who cares? I can play BioShock Remastered reasonably well with this GPU. Nice. And this is on a machine with crazy battery life.

We can debate the merits of Apple’s GPUs against high-end stuff (most likely) when the new Mac Pros are available. Those are likely to be the very last Macs to make the transition to Apple silicon.
 
When Apple can beat an RTX 2080 Ti or the new 3080, I'll be impressed; until then they're still playing catch-up... but I appreciate the effort all the same.
Nvidia's own 75 W GPU cards can't come close to their own 320 W GPUs in performance. So why do you expect Apple to equal or better Nvidia with an entry-level 10 W iGPU SoC?
 
I wonder how much of Apple’s stellar performance is due to their near monopoly on 5nm silicon.
Apple has another advantage. Intel, Nvidia, AMD, and Qualcomm all have to design and market their products for companies with different needs. Apple has only one customer, and that customer gets exactly what it wants.
 
For those not impressed by this, for reference: the 1050 Ti is a $140-160 dedicated GPU with a TDP of 75 W, on a PCIe x8 card that is 8 inches long. Cramming that hardware into a fanless SoC is quite impressive, to say the least.
Right, this is what everyone here is missing.

The M1 has the performance of an entry-level dGPU that is 8x more power-hungry, and, oh, it has one of the fastest CPUs on the planet on the same die and in the same thermal envelope.

This is mind-blowing. People comparing this to a 30-40-core, 200 W GPU don’t get it.

Apple could probably scale this up to a 40-core dedicated GPU die tomorrow if they wanted and blow anything AMD or Nvidia have out of the water.
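A quick performance-per-watt sketch makes the point, using assumed ballpark figures (M1 GPU roughly 2.6 TFLOPS at about 10 W; GTX 1050 Ti roughly 2.1 TFLOPS at its 75 W TDP):

```python
# Rough perf-per-watt comparison (assumed ballpark figures,
# not measured benchmark results).
m1_eff = 2.6 / 10         # ~0.26 TFLOPS per watt
gtx1050ti_eff = 2.1 / 75  # ~0.028 TFLOPS per watt

print(f"M1 efficiency advantage: ~{m1_eff / gtx1050ti_eff:.0f}x")  # ~9x
```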
 
Alright, so graphics are now plenty fast on an ordinary MacBook Air. The only thing left to see is the M1's sustained performance with passive cooling: whether there's thermal throttling and when specifically it kicks in. Then I might be able to switch from a Pro to an Air in due time.

Make me proud, Apple.
Well said. Those are exactly my concerns.
 
Also remember that the GPUs they're comparing this to are over three years old... so I'm not really seeing why we should be so excited. It's a click-bait title with **** analysis. Sure, they're decent for mobile stuff, but they in no way compare to full-blown dGPUs from AMD or Nvidia.

Look outside the direct year over year comparisons.

There may be an ample market out there still using these older graphics chips in their current systems.

Apple's marketing may be targeting this market with a pretty affordable lineup of three Macs, including a desktop. When these people are shopping for computers and hear that the new Apple machines blow away their current chipset, you've got some interest generated.

Apple knows its hardcore base has been salivating for a long time waiting for these M1 machines to be announced and released. Apple doesn't need to do a hell of a lot of marketing to get sales from them, but it does need to grow its installed Mac base to stay competitive.

Just my take on one of the many reasons why they would directly target older GPUs.
 
For those not impressed by this, for reference: the 1050 Ti is a $140-160 dedicated GPU with a TDP of 75 W, on a PCIe x8 card that is 8 inches long. Cramming that hardware into a fanless SoC is quite impressive, to say the least.
What a ridiculous comparison. That card is designed as a low-end PCIe GPU, with all of the power stages, ports, and dedicated RAM that such a product requires. You might as well compare a bicycle to a car.
 
Good performance for an entry-level laptop and Apple's first attempt at a desktop-level SoC. It's not on par with AMD's and Nvidia's best, but at least Apple is trying. In a few years Apple might catch up, if it's willing to throw a lot of money at the problem. I'd like to see Apple build a powerful GPU one-third the size of an RTX 3090 with much lower wattage requirements. It probably won't happen, because Apple won't go after such a small market if the profits aren't there. Anyway, thanks, Apple, for deciding to dump Intel processors over the next year or two. I like the idea of all of Apple's devices using ARM-based processors and interchangeable apps.
 