If anything, Apple's GPU project is just getting started. It wasn't so long ago that they were still licensing GPU designs from Imagination.

I'm not worried about that.

There's a limit, however, to how far they can go without shrinking their dies or going bigger.
 
No, it's bulky because NVIDIA had a hard time optimizing their design. We don't have to make excuses for it.
Nah, it is bulky because folks want "cool" and "quiet" GPUs, so the coolers are monstrous. The workstation-equivalent cards are smaller (and use less power).
 
Nah, it is bulky because folks want "cool" and "quiet" GPUs, so the coolers are monstrous. The workstation-equivalent cards are smaller (and use less power).
This. Also, the data centre/scientific computing L40 cards are fully passive and not any bigger than the A40 they replace. I suspect that nVidia also assumes that 'bigger seems newer/more powerful' and can hence better justify their atrocious prices.
 
What's the point? If you are committed to the old bottleneck model just get a PC workstation.

Well that’s one way to look at it. But graphics cards exist for a reason. That’s a lot of hardware to integrate. If anyone could find a way it’s Apple, but their integrated GPUs are still not as good as discrete cards, and the graphics bus has always been the fastest. I think it’s worth doing the processing on dedicated hardware and having a dedicated high-bandwidth channel.

But we’ll see what it looks like when they finally release something they call the Mac Pro. To me that’s always meant expandability. The only "Mac Pro" they ever released that had no expandability was a total failure.
 
Well that’s one way to look at it. But graphics cards exist for a reason. That’s a lot of hardware to integrate. If anyone could find a way it’s Apple, but their integrated GPUs are still not as good as discrete cards, and the graphics bus has always been the fastest. I think it’s worth doing the processing on dedicated hardware and having a dedicated high-bandwidth channel.

It's about long-term vision, really. The paradigm is shifting to heterogeneous workflows, raytracing, procedural graphics, machine learning, etc. The memory bus is absolutely going to become a problem before long. Whoever didn't take precautions will be left by the roadside very quickly. The big semiconductor companies are painfully aware of this. As I mentioned earlier, Nvidia was publicly talking about these issues years ago, which is why they research processing-in-memory and similar solutions. Intel and AMD are working on tight integration of the CPU and GPU into one heterogeneous machine, and Nvidia was interested in acquiring ARM for a similar reason. The paradigm is shifting. At least that's what I believe.

We can talk again in a couple of years :)
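To put the memory-bus argument in concrete terms, here's a rough sketch of my own (not anything from this thread, and the buffer size is arbitrary) of what unified memory buys you with Metal on Apple Silicon: a .storageModeShared buffer is a single allocation that both the CPU and GPU touch directly, so there is no upload or download step over a bus.

```swift
import Metal

// Sketch only: one shared allocation, visible to CPU and GPU alike.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal device available")
}

let count = 1_000_000
let shared = device.makeBuffer(length: count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

// The CPU fills the buffer in place...
let values = shared.contents().bindMemory(to: Float.self, capacity: count)
for i in 0..<count { values[i] = Float(i) }

// ...a compute pass could consume `shared` directly, and the CPU reads the
// results back through the same pointer afterwards. With a discrete card you
// would instead stage the data and copy it into .storageModePrivate memory
// (e.g. via a blit encoder) in each direction; that copy is the bus traffic
// being argued about above.
print(values[42])   // 42.0
```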
 
It's about long-term vision, really. The paradigm is shifting to heterogeneous workflows, raytracing, procedural graphics, machine learning, etc. The memory bus is absolutely going to become a problem before long. Whoever didn't take precautions will be left by the roadside very quickly. The big semiconductor companies are painfully aware of this. As I mentioned earlier, Nvidia was publicly talking about these issues years ago, which is why they research processing-in-memory and similar solutions. Intel and AMD are working on tight integration of the CPU and GPU into one heterogeneous machine, and Nvidia was interested in acquiring ARM for a similar reason. The paradigm is shifting. At least that's what I believe.

We can talk again in a couple of years :)

Or instead of integrating, they could just throw everything into the GPU, as computers are ever more reliant on them anyway. Some programs will even use GPU cores for regular calculations to speed things up.
 
Well that’s one way to look at it. But graphics cards exist for a reason.

They do, but fewer and fewer people buy a computer that even has internal expansion.

Or instead of integrating, they could just throw everything into the GPU, as computers are ever more reliant on them anyway. Some programs will even use GPU cores for regular calculations to speed things up.

For tasks that are highly parallelizable and don't need high precision, yes. For everything else, you still want a CPU.
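To make the "GPU cores for regular calculations" idea concrete, here is a minimal Metal compute sketch in Swift. It's my own illustration, so the kernel name scale, the array size, and the x2 factor are all arbitrary; it also assumes an Apple-family GPU (as on Apple Silicon), since dispatchThreads relies on non-uniform threadgroup support. It scales a million floats on the GPU, exactly the kind of embarrassingly parallel, precision-tolerant work described above; anything branchy or serial would still belong on the CPU.

```swift
import Metal

// Illustrative Metal Shading Language kernel: one GPU thread per element.
// The name "scale" and the whole setup are my example, not anything from the thread.
let kernelSource = """
#include <metal_stdlib>
using namespace metal;

kernel void scale(device float *data       [[buffer(0)]],
                  constant float &factor   [[buffer(1)]],
                  uint id                  [[thread_position_in_grid]]) {
    data[id] *= factor;
}
"""

guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue() else {
    fatalError("No Metal device available")
}

let library = try! device.makeLibrary(source: kernelSource, options: nil)
let pipeline = try! device.makeComputePipelineState(function: library.makeFunction(name: "scale")!)

let input: [Float] = (0..<1_000_000).map { Float($0) }
var factor: Float = 2.0

// Shared storage: on Apple Silicon the CPU and GPU see the same memory, no copies.
let dataBuffer = device.makeBuffer(bytes: input,
                                   length: input.count * MemoryLayout<Float>.stride,
                                   options: .storageModeShared)!
let factorBuffer = device.makeBuffer(bytes: &factor,
                                     length: MemoryLayout<Float>.stride,
                                     options: .storageModeShared)!

let commandBuffer = queue.makeCommandBuffer()!
let encoder = commandBuffer.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(dataBuffer, offset: 0, index: 0)
encoder.setBuffer(factorBuffer, offset: 0, index: 1)

// One thread per element; Metal derives the grid layout from these sizes.
let width = pipeline.threadExecutionWidth
encoder.dispatchThreads(MTLSize(width: input.count, height: 1, depth: 1),
                        threadsPerThreadgroup: MTLSize(width: width, height: 1, depth: 1))
encoder.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()

// Results are read straight out of the shared buffer on the CPU.
let results = dataBuffer.contents().bindMemory(to: Float.self, capacity: input.count)
print(results[3])   // 6.0
```

Because the buffers use .storageModeShared, the CPU reads the results back without an explicit transfer, which is the unified-memory angle discussed earlier in the thread.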
 
When Apple isolated themselves by deprecating OpenGL and discontinuing 32-bit and eGPU support, they started playing a dangerous game: their bet is that their new architecture is so much better that Apple Silicon would be adopted en masse, which would make everyone play into their hands.

That bet is misguided even if the PC alternative is indeed much worse. But if Apple's platform starts lagging behind and its performance stops being so impressive over time, the argument for an Apple Silicon machine gets much weaker. Why bother spending extra on a machine that is only slightly better, or not better at all, than an AMD or Intel system?

Sure, right now they have the edge. But if they don't keep up, that edge will eventually disappear.
People that are expecting the kind of jump we saw from Intel to the M1 every year frankly have insanely unrealistic expectations. It was never going to happen.

Don't forget many people buy Macs because they prefer macOS over Windows. Who cares if the chip is only comparable to an Intel or AMD chip when the experience is still solid? The "extra" you pay for a Mac can be for many things other than chip performance: for example, the longer life expectancy of the hardware, less malware/viruses, better optimised software, etc.

Also, Apple is not an outright chip-design company like Intel or AMD; you would expect the latter to beat Apple every time, considering it's their speciality.
 
You need to reseat that heatsink. If done properly you won't get anywhere close to 100°C and throttling.
With a BIOS-set 288 W power limit, 95°C, and a -0.4 core voltage offset, mine doesn't go over 90°C and still scores above 30k in Cinebench.
You're kinda burying the lede about undervolting the CPU there, champ. This is how the i7 and i9 run stock with TVB; that is how they're designed to run out of the box.

 

Attachments

  • cpu-temperature-blender.png
Don't forget many people buy Macs because they prefer macOS over Windows. Who cares if the chip is only comparable to an Intel or AMD chip when the experience is still solid? The "extra" you pay for a Mac can be for many things other than chip performance: for example, the longer life expectancy of the hardware, less malware/viruses, better optimised software, etc.

There will always be people willing to buy Bitcoin or Ethereum, but it doesn't mean they are a success.
Do you get my point?

Of course there will always be people that will love Apple products no matter what Apple does, but I'm approaching this from a logical point of view.

The only area where it makes more sense to talk about longer life expectancy is smartphones and laptops, which are more closed systems. But if you are really willing to spend extra money, you CAN buy PC hardware with a longer lifespan too.

Malware and viruses are also much less of a problem on PCs now than they were 15 years ago. In most cases, you should be fine if you just stay away from pirated software on a PC.

And let's be frank: Apple is losing their grip on software quality. iTunes is a mess; Apple Music is not good enough; the default iPad file manager is a joke (forget about, e.g., opening a virtual machine on an external disk).

What Apple does very well on their side is color management. But you also CAN get monitors on the PC side that rival what Apple offers, and you should absolutely be fine if you spend money on calibration hardware.
 
I think at this point, we know enough about Apple's custom silicon endeavours.

Apple sells more laptops than desktops. It stands to reason that they would focus their processor design on power efficiency and sustained performance, because these are what matter more to laptop users. We also saw the biggest jump from Intel integrated CPUs to the M1 in part because of how crappy the former had been all this while. Overnight, battery life doubled or tripled and performance leapfrogged.

There likely isn't enough demand for desktop computers to justify Apple creating a custom chip that can rival the top Intel / AMD / Nvidia cards in terms of raw performance, and maybe the point is that Apple doesn't need to. I think there is merit in these M1x chips allowing Apple to compete in areas other than performance (such as form factor and power consumption), and maybe that's how Apple wins the PC wars in the future. Not by taking on the competition on their terms, but by changing the rules of the game altogether. Make people care more about what you are offering, and less about what the other side has.

Who knows - maybe one day, when Intel unveils a new processor that offers minor performance gains at the expense of exponentially greater power consumption, instead of cheering and boasting about it on Apple forums, consumers might go - cool, but can that fit into a case the size of a Mac Mini? That's when the competition knows that they are truly and royally screwed.
 
There likely isn't enough demand for desktop computers to justify Apple creating a custom chip that can rival the top Intel / AMD / Nvidia cards in terms of raw performance, and maybe the point is that Apple doesn't need to.

Or maybe Tim Cook is resting on his laurels and not capitalizing on the buzz and success that the Apple Silicon line brought. The last Apple innovation was the Apple Pencil, around four years ago. If it hadn't been invented, the iPad would be a flop (because it's the one thing that REALLY sets it apart from other tablets).
 
Or maybe Tim Cook is resting on his laurels and not capitalizing on the buzz and success that the Apple Silicon line brought. The last Apple innovation was the Apple Pencil, around four years ago. If it hadn't been invented, the iPad would be a flop (because it's the one thing that REALLY sets it apart from other tablets).
I think that's what I like about Apple these days - they are being extremely disciplined.

I believe that Apple has a very clear product roadmap and they know well enough to stick to it, rather than try to capitalise on all this buzz and get suckered into a meaningless arms race with other chip manufacturers just for the sake of grabbing a headline on The Verge. Apple doesn't need a chip that beats the competition in raw benchmarks. They need a chip that enables the best user experience in their products, given the constraints of said products. And I feel that's something Apple has accomplished admirably.

It makes sense when you think about it. The M1 chip is a great baseline, and it being used in a range of products from the iPad Air to the M1 MBA means that Apple saves a lot in terms of manufacturing and economies of scale. It doesn't sell its chips to other OEMs, and because the M1 continues to leapfrog the rest of the pack in terms of performance and efficiency, Apple doesn't need to come up with a dozen different variations of said chip in order to cater to differing price points or use cases.

You then have the M1 Max and Ultra being used in their more powerful laptops and desktops, and I don't think it's unreasonable that Apple upgrades them every 1.5 to 2 years as well, given that people usually hold on to their laptops for at least 3 years.

Seeing what Apple has announced / released / refreshed this year, I don't get the impression that Apple is resting on their laurels at all. They are lapping the competition, and I believe this lead will only continue to grow over time.
 
I hope Apple keeps the silicon development going. Shrinking it for longer battery life sounds good to me; making it faster at the cost of higher power draw, not so much. I hope the prices remain on par with the performance gains.
Basically I am waiting to switch from Intel to Apple Silicon, but will wait longer now. If I had to buy now I would go M1.
 
Maybe Apple decided to cripple its lineup in 2022/2023 with a 5nm M2, until 2024's 4nm M3 (built in Arizona)?
Just releasing a 3nm M4 in spring 2023 in the Mac Pro to give it a head start? They only have a few 3nm chips, so they can charge an arm and a leg if they just put them in Mac Pros.
 
I think that's what I like about Apple these days - they are being extremely disciplined.

I believe that Apple has a very clear product roadmap and they know well enough to stick to it, rather than try to capitalise on all this buzz and get suckered into a meaningless arms race with other chip manufacturers just for the sake of grabbing a headline on The Verge. Apple doesn't need a chip that beats the competition in raw benchmarks. They need a chip that enables the best user experience in their products, given the constraints of said products. And I feel that's something Apple has accomplished admirably.

They just need to give users more versatile products. For example, allowing for eGPUs in their MacBooks or M1 Mac virtualization would require zero changes to the form factor while giving users many more use-case options.
 
This was always to be expected. The M2 is not meant to be a big performance leap over the M1, much like how Intel chips of a similar class were never a huge performance increase from one generation to the next. The M2 exists to keep moving development forward for new users. It's not meant as an upgrade for current M1 users. It's there to keep performance competitive and not stagnant for new users who wouldn't want to buy a new computer with two- or three-year-old chip technology. That's it.

M1 users need to be looking at a realistic upgrade cycle of a couple of years and set their sights on an M3 or M4.

Apple may make it exciting to upgrade every device every year but that has traditionally been a rather pointless exercise with very little gains.

We are also spoiled by the massive gains from the introduction of the M1, but neither Apple nor anybody else can keep that level of gains going for every generation of chip. These are completely unrealistic expectations fueled by online personalities who hyped everyone up into thinking the M2 would improve as much as, if not more than, the M1 did over its Intel counterparts. Once the transition was made from Intel, many of us suspected the next generation would be more incremental.

Again, this is there strictly for brand-new users who can go out and buy a new Mac and get a bit more for the same money. Hopefully it really is the same money, and the price doesn't increase as well.
 
Maybe Apple decided to cripple its lineup in 2022/2023 with a 5nm M2, until 2024's 4nm M3 (built in Arizona)?

Apple doesn't deliberately make a worse product in order to later sell a better one. They'd rather take the profits right now.
 
They just need to give users more versatile products. For example, allowing for eGPUs in their MacBooks or M1 Mac virtualization would require zero changes to the form factor while giving users many more use-case options.

I’d be all for that.
 
I really hope the Apple Silicon performance ramp will be more significant between generations in the years ahead. The A-series had a lot of impressive jumps. I hope this will continue on the M-series — otherwise I fear this whole transition, including the mediocre graphics performance of Apple Silicon, will be a dead end in a few years.
 