right... but with much higher quality ray tracing on the PC, at a much higher resolution, you still get double the framerate of the "most powerful", most expensive Mac Apple has ever made.
What this demonstrates isn't how badly the PC handles ray tracing, but how ridiculously well it does non-ray-traced graphics (while still being ~3x as powerful for ray tracing as the most expensive Mac).
What's the point of being more efficient at doing something if you can't actually clear the minimum bar to do the thing? That's like arguing about how fuel-efficient a car is when it can't physically climb a hill on mountainous roads at the speed limit without holding up traffic (I live this).
So I think you make at least two points there. Both are arguable, but I think the first is your stronger one. Those being (1) that the PC does non-ray-traced graphics way faster, and (2) that with ray tracing on, the PC still plays at much higher frame rates while the Mac runs at unplayable rates.
From the tests I've seen, the Mac on the M4 Max runs ray tracing at 30-50fps for the more modern, complicated AAA games, while some tests, like Linus's, showed the PC dropping to unplayable rates like 20fps. The reports on all this ray tracing are very choppy, and I don't think they are being done on an apples-to-apples basis, and that's part of the problem in discerning what the true throughput is.
Also, there's a bigger debate here as to what counts as "playable". For me, 30fps and up is very playable for most games. On my Steam Deck I limit games to 30-40fps on purpose to extend battery life, and I always favor fidelity/quality at the cost of frame rate, because I just don't care about frame rates over 60fps. For most games, this is fine, for me. I totally get that for more pro players, even 60fps is not nearly enough.
But I totally get you, and your hill climb is a great example. If it falls below your minimum threshold for enjoyment, that is totally fair and legit, and Assassin's Creed is a great example of that too, where for many 30fps will look like a joke. Plus, while we don't know for sure that the Assassin's Creed comparison is apples-to-apples on its graphics settings, it seems more apples-to-apples than many other tests I've seen, and I think it tends to support your argument.
One reason I'm interested in what seems a disproportionately good showing by Apple with respect to ray tracing is that I hope they change the proportions of their chip layout in the future. The gains from more CPU cores won't be appreciated by many Mac users, IMO, as much as giving more die real estate to the GPU. If Apple were to double or triple the GPU's proportion of the die relative to other computational units, i.e., freeze the space for the current efficiency/performance cores but use all additional space on the 2nm die for GPU, they may get to a very interesting place. A place where they (1) do compete on non-ray-traced graphics, but (2) significantly outperform on ray-tracing output.