Somehow, you've quoted another person's post with my name in the quote header.
oops. my bad.
Anyway, back to the topic: it's a huge exaggeration to say that equipping Macs with high-end consumer GPUs is 'following the PCs'. It's just common sense. Game devs are not to blame here. You can't seriously expect game devs to swarm around Macs when most Macs ship with integrated GPUs and the rest with mobile ones. Bring the hardware and they'll come. Give them underpowered, expensive, shiny-but-mediocre hardware and they'll go. It's common sense.
so say Apple makes a dedicated gaming machine following all the definitions here.. how good of a move is that?
at best, it's mediocre and i bet barely anyone would switch to Macs for gaming.. i mean, why would they? they seem to be doing just fine on PCs.
i think most people (here) would love for Apple to make a gaming machine, but the underlying ask is swappable off-the-shelf GPUs in a Mac rather than any desire for a gMac type of thing.
Why is iOS so successful in mobile gaming (and not only in that)? Because Apple delivers the hardware along with it. Heck, the new iPad Pro beats a few current Macs in performance. For real.
i don't think i'm being very clear.
iOS is the future of Apple gaming.. and gaming in general.. desktop gaming is going to be a thing of the past relatively soon (desktop gaming meaning sitting in a chair looking at fixed 2D panels).
that's what all/most of my posts on this page have been about.
I apologise if I've misread the thread or the intention of:
I also apologise if my attempt to be specific about why that's a problematic statement seemed patronising.
ok. thnx
Possibly because people suspect that the pace at which developers push this emerging field forward, and at which GPU companies can ship hardware upgrades, will outstrip what the people most likely to buy this sort of gear can practically afford if keeping up with GPU progress means replacing an entire machine. The threat in the Apple model is that customers can't afford to keep up with the pace of development that developers are capable of. Crysis was a great example: brilliantly innovative in its reality simulation, but unplayable on most computers because the GPU requirements were too high.
i think maybe the GPU requirements aren't too high.. it's just that the way the GPUs are built and the way the software is written carries far too much overhead.. the whole setup is inefficient and burns way too much power for the results it gives.
Apple's approach seems to be addressing exactly this: they're aiming for far greater results while consuming much less energy.
it's likely we'll see real-time raytracing on iPhones soon enough through next-generation software/hardware, with the software being optimized for the particular hardware rather than general-purpose software aimed at general-purpose hardware.
real-time raytracing isn't even a thing yet on some whiz-bang PCIe PC/GPU combo.. and funnily enough, i think we'll see it happen first on iOS devices.
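to make the 'software written for the particular hardware' point a bit more concrete, here's a rough sketch of a trivial Metal compute pass in Swift (Metal being the Apple API i'm assuming this would be built on).. the kernel name, buffer size, and scale factor are all made up for illustration, not anything Apple ships:
[CODE]
// Rough sketch: a minimal Metal compute pass in Swift.
// "scaleValues", the 1024-float buffer and the 2.0 factor are placeholders.
import Metal

// Metal Shading Language source, compiled at runtime for whatever GPU is present.
let shaderSource = """
#include <metal_stdlib>
using namespace metal;

kernel void scaleValues(device float *data     [[buffer(0)]],
                        constant float &factor [[buffer(1)]],
                        uint index             [[thread_position_in_grid]])
{
    data[index] *= factor;   // one GPU thread per element
}
"""

guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue() else {
    fatalError("No Metal-capable GPU found")
}

let library  = try! device.makeLibrary(source: shaderSource, options: nil)
let pipeline = try! device.makeComputePipelineState(function: library.makeFunction(name: "scaleValues")!)

// Placeholder data: 1024 floats to be scaled by 2.0 on the GPU.
var input  = [Float](repeating: 1.0, count: 1024)
var factor: Float = 2.0
let dataBuffer   = device.makeBuffer(bytes: input,   length: input.count * MemoryLayout<Float>.stride, options: [])!
let factorBuffer = device.makeBuffer(bytes: &factor, length: MemoryLayout<Float>.stride, options: [])!

let commandBuffer = queue.makeCommandBuffer()!
let encoder = commandBuffer.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(dataBuffer,   offset: 0, index: 0)
encoder.setBuffer(factorBuffer, offset: 0, index: 1)

// Threadgroup size comes from the pipeline, i.e. it's tuned to the actual GPU;
// this sketch assumes the element count is a multiple of threadExecutionWidth.
let width = pipeline.threadExecutionWidth
encoder.dispatchThreadgroups(MTLSize(width: input.count / width, height: 1, depth: 1),
                             threadsPerThreadgroup: MTLSize(width: width, height: 1, depth: 1))
encoder.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()
[/CODE]
the point being: the app compiles its shader and picks its threadgroup sizes against whatever GPU is actually in the device, instead of going through a big general-purpose abstraction layer aimed at every GPU under the sun.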
"8K gaming on 4 x TITAN X: 7680x4320 never looked better
The only YouTuber streaming in 8K uses 4-way TITAN X graphics cards
"
http://www.tweaktown.com/news/58263/8k-gaming-4-titan-7680x4320-never-looked-better/index.html
Is any flavor of Mac hardware currently capable of 8K streaming? I think not.
and what?
so what, you know?
do you need that? no?
then why worry about it?
why not talk about your needs instead.. it's a better conversation.
reading about some hypothetical college student who wants to watch TV, or some YouTube gamer, is eye-roll territory.