"When have Macs EVER been good for gaming?"

We had our day. There was a time in the '90s when I preferred gaming on my Mac. Hell, Halo almost made it to the iMac first. That was before Bungie was consumed by M$.
As a game dev, this is actually scary. We tested OpenGL vs. Metal on several systems with a new game (which is both standard 3D and VR) and found OpenGL and Metal to be NO DIFFERENT in performance whatsoever. When we saw the announcement at WWDC it was rather upsetting, because it gave off the bad vibe that this move is motivated by totally different reasons than the ones they're revealing.
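For context on what a comparison like that involves, here is a minimal sketch of timing frames against two renderer backends. It is not the poster's actual test harness; the Renderer protocol and the backend names in the usage comments are hypothetical stand-ins.

```swift
import Foundation
import QuartzCore  // for CACurrentMediaTime()

// A hypothetical backend interface; an engine's OpenGL and Metal
// renderers would each conform to this.
protocol Renderer {
    func drawFrame()  // renders one frame of the test scene
}

/// Times `frames` consecutive frames and returns the average frame time in milliseconds.
func averageFrameTimeMs(_ renderer: Renderer, frames: Int = 1000) -> Double {
    let start = CACurrentMediaTime()
    for _ in 0..<frames {
        renderer.drawFrame()
    }
    let elapsed = CACurrentMediaTime() - start
    return elapsed / Double(frames) * 1000.0
}

// Usage, with hypothetical backend implementations:
// let glMs    = averageFrameTimeMs(openGLBackend)
// let metalMs = averageFrameTimeMs(metalBackend)
// print("OpenGL: \(glMs) ms/frame, Metal: \(metalMs) ms/frame")
```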
This "need" for Apple to move in this direction is truly fake and unfounded.
Apple really does not care about gaming, apparently. Every game I've run in Metal so far has been a disaster.
Delusional. iOS gaming is on the rise, to the detriment of Windows gaming.
"When have Macs EVER been good for gaming?"

Since Blizzard started making games in the early '90s. Since MacPlay started porting PC games for Mac in 1990.
I got my first Mac in college back in 1995; it was a Power Mac 6400. Two of my roommates at the time owned PCs... one was a Pentium 166 and the other a Pentium 200 (top of the line at the time). My 6400 came with a software bundle in the box, and one of those pieces of software was the game Descent (a pretty big game at the time). We had played Descent on the two PCs on a regular basis and loved it. When I popped that "MacPlay" version of Descent into my new Power Mac for the first time, everyone in the room who had initially scoffed at the fact that I'd bought a new Mac instead of a Pentium was instantly silenced by my 6400's vastly superior graphics, sound, and much better frame rate.
The same thing happened the following year with Command & Conquer: first released on the PC, followed by a Mac release. The performance of my 6400 vs. the PCs was ridiculous. The PCs would always bog down when a lot was happening on screen; my Mac NEVER did.
I say ALL of that because you asked "When have Macs EVER been good for gaming?" As far as I'm concerned the answer is since ALWAYS.
In Apple's "perfect iWorld", all computers would be Macs, all tablets would be iPads, all smartphones would be iPhones, all wearables would be Apple Watches, all TV boxes would be Apple TVs, all smart speakers would be HomePods, all code would be written in Swift, and all app developers in the world would pay the 30% Apple tax. But this isn't Apple's "perfect world" no matter how much they'd love that scenario.
I have no problem with advancement. I'm not a game dev so I can't speak to whether Metal is a good or a bad API. But I can question why Apple can't continue to offer support for industry-wide standards along with their own. Is it "more work to maintain?" Sure it is. But is the effort not worth it? It keeps your platform more relevant. It keeps your app library growing (as developers for other platforms can apply the knowledge of the open frameworks they already know to the Mac). It keeps the "Switchers" interested too, as more apps and games they like can be brought to the platform. Surely the effort can be justified?
Sometimes I feel like Apple does things with the hope - maybe even the expectation? - that it will bring them closer to Apple's Perfect iWorld. They ride quite high on the success of the mobile market, but even there they are not the majority leader anymore. Mac as a platform has always had a relatively minuscule market share. Switching to Intel was actually a good move - it brought native Windows support, making the decision of "buy a Mac or a PC" a lot easier, since Windows users could get native PC performance. But now it seems Apple is trying to go the other way - moving away from open standards to bring everything in house. AFAIK they no longer use Apache for the HTTP server or Samba for the Windows file server. There's constant talk about moving Macs to ARM processors - which would alienate that entire crowd of Windows users who use Boot Camp or virtualization.
Come on Apple. You make great stuff. Stop making it less enticing by being arrogant. Make great, beautiful products that show off your awesomeness while still supporting the rest of the world. Your fanboys aren't going anywhere - but you can and should strive for more than your cult followers. You do that by embracing other standards and integrating them with your system, not "deprecating" and booting them out in favor of your own proprietary stuff.
I don't really like Windows, but I don't think I want Apple to be king of gaming when they are continually finding ways to break old apps. They are breaking 32-bit apps and will likely be doing the same for OpenGL in a few years. This isn't the first time, either. Before this it was the PPC to Intel switch, and before that it was Classic to OS X. After Apple drops OpenGL, the next move will be breaking all x86 apps in a move to ARM :/
I just want all my apps to work forever and don't want to have to worry about how long it will be until Apple breaks them.
Ever tried running a Blizzard game in Metal? StarCraft is the worst of the bunch: nearly unplayable, with graphical glitches. I went back to OpenGL after seeing how bad Metal was. The game devs don't support it the same way.
Who cares about iOS games for the Mac?
The problem is that not only games but also a lot of scientific, simulation, and 3D/CAD software depends on both OpenGL and OpenCL. So in the near future the Mac will be a barren land for that software.
Deprecate is an odd word choice, I think. I imagine the intent was for it to mean depreciate, and if so, why not use that word?
In general English usage, the infinitive "to deprecate" means "to express disapproval of (something)". It derives from the Latin verb deprecari, meaning "to ward off (a disaster) by prayer". In current technical usage, for one to state that a feature is deprecated is merely a recommendation against using it. It is still possible to produce a program or product without heeding the deprecation.
While a deprecated software feature remains in the software, its use may raise warning messages recommending alternative practices; deprecated status may also indicate the feature will be removed in the future. Features are deprecated rather than immediately removed, to provide backward compatibility, and to give programmers time to bring affected code into compliance with the new standard.
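To make the warning behaviour concrete, here is a minimal Swift sketch (not from any of the posts above) of marking an API as deprecated; the function names are made up for illustration.

```swift
import Foundation

// The old entry point still compiles and runs, but callers get a compiler
// warning that points them at the replacement.
@available(*, deprecated, message: "Use renderWithMetal() instead")
func renderWithOpenGL() {
    print("rendering via the legacy path")
}

func renderWithMetal() {
    print("rendering via the current path")
}

// Calling the deprecated function still works, but produces a compile-time
// warning: "'renderWithOpenGL()' is deprecated: Use renderWithMetal() instead"
renderWithOpenGL()
renderWithMetal()
```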
Even now, look at the garbage that is Chrome. Apple should make macOS Safari-only as well.