Even when transcoding it stays silent, as it's not pushing the CPU all that hard. At least not to 100% load.

I'm guessing you mean Plex? Does Plex just set transcoding to a really low level of usage? If I use something like Handbrake or Screenflow, they tend to max out the CPU cores (which means fan noise for me).
 
For other people it might be that they can GPU enable their laptop for gaming, and not have to own a desktop at all.
This! This is why I have the eGPU in general: so I have close to desktop-level performance at the office, but can take the MBA to go. One system, all the things. Works well, and it reduces clutter and complexity.

I originally used my Blackmagic eGPU to game (Dota 2 exclusively) and did some pretty solid benchmarking with the plan to publish it. But I took a break from the game and just haven't finished that project.

Now I am doing some video work for a client and use Premiere to render a bunch. It is handy there too.

Finally, I can say anecdotally that the 2018 MBA macOS experience is just more responsive running at 4K on an external display. Even fully loaded, the eGPU makes a difference.
 
And, well, Windows.
I run my setup with Boot Camp, so I'm well aware of how both sides of the fence work (including decades of work in IT, much of it with Windows).

I cringe every time I need to do much work in Windows. Sure, once you're in a game, I suppose there isn't a ton of difference (unless you're trying to hook a controller up or something like that), but they aren't even close in terms of productivity overall (unless you're just talking raw hardware performance possibilities).
That doesn't mean anything.
And you are grossly exaggerating with the productivity part.

To many of us, the OS matters a lot too!
That's not a very strong argument (but I guess it's among the only plausible ones, together with "I like Macs more").
Macs were barely able to make a dent in Microsoft's market share after decades of existence. This is the simple reality.
 
I think you lost me at the part about Apple not being able to lawfully... why not? They can reject anything they want to, afaik.

No they couldn't. A platform can't reject drivers or apps lawfully if there are no issues, the code is functional, and the developer is offering tech support for the devices the driver is supposed to enable. In this case, Nvidia has been enabling devices that it doesn't officially support on macOS and the bugs have been widely reported for a few years. It would be better for Nvidia to hand over the driver source code to Apple and let them bake the drivers into macOS in the future.
 
Plausible for you perhaps. But every reason was great for me. Almost all the games I play have Mac clients. And no, it's not noticeably worse. I don't care for Windows, so I want a Mac. Again, I've built computers for decades. I have many PCs here now, including my gaming PC. I still prefer Macs. And it goes beyond simply gaming.
Nope, not for me, but in general.

And no, it's not noticeably worse

Yes it is. Game optimization on Mac is clearly inferior; there are tons of YouTube videos that prove it. Install Windows on a Mac and the same game will run at a higher and more consistent frame rate. Example.
It's obvious you don't care for the truth, but unlike you, not everybody has a dogmatic way of seeing these things.

PS. I've also built computers for decades (since I was 10, actually) and I consider myself a true computer hardware enthusiast. You won't be seeing me constantly attempting to exaggerate things and present narrow opinions as absolute truths.
 
A platform can't reject drivers or apps lawfully if there are no issues, the code is functional, and the developer is offering tech support for the devices the driver is supposed to enable.

Lawfully is NOT the word you want to use here. Try again.

There are no laws that force companies to accept drivers from another company. None.
 


The Core X Chroma also has a 700W power supply so it supports more powerful graphics cards than the previous model. You can use the Core X Chroma to transform a MacBook Pro or MacBook Air into a desktop-class machine with a single cable, which is handy.

That is not correct.
Both the new and old enclosures support a maximum of 500W for the graphics card, but the new model has a higher total wattage to support the extra ports (USB/Ethernet).
 
Nope, not for me, but in general.



Yes it is. Game optimization on Mac is clearly inferior; there are tons of YouTube videos that prove it. Install Windows on a Mac and the same game will run at a higher and more consistent frame rate. Example.
It's obvious you don't care for the truth, but unlike you, not everybody has a dogmatic way of seeing these things.

PS. I've also built computers for decades (since I was 10, actually) and I consider myself a true computer hardware enthusiast. You won't be seeing me constantly attempting to exaggerate things and present narrow opinions as absolute truths.
I have Boot Camped and gamed. I've owned multiple Macs. I've built Hackintoshes. ALL I care about is truth. It's people like you who exaggerate. OMG, it's 1% slower, it's so much worse! The sky is falling! In my experience some games do run a tad better on Windows than macOS, sure. But it's not all games and it's not by a large margin. So Boot Camp if you want. You do have that option.

And people use eGPUs for more than just gaming. My original post stated that a reason a person may want an eGPU for their Mac rather than buying/building a PC is because THEY find Macs to be better. Which is absolutely true for them and their needs. I didn't make a blanket statement that Macs are better. Don't try to turn this into something it's not.
I'm guessing you mean Plex? Does Plex just set transcoding to a really low level of usage? If I use something like Handbrake or Screenflow, they tend to max out the CPU cores (which means fan noise for me).
Transcoding isn't sustained 100% usage; it's spikes in usage. Up to 100% for a few seconds, then back down to nothing, then back up. So it never really ramps up the fans.
Handbrake is at 100% for the entire encode.
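(If anyone wants to see that difference for themselves, here's a rough sketch in Python. It assumes the third-party psutil package is installed; it isn't specific to Plex or Handbrake, it just samples overall CPU load while an encode runs so you can compare the peak to the average.)

```python
# Rough sketch: sample total CPU load once a second for a minute while a
# transcode or encode is running, then compare the peak to the average.
# Assumes the third-party `psutil` package is installed (pip install psutil).
import psutil

samples = [psutil.cpu_percent(interval=1.0) for _ in range(60)]

avg = sum(samples) / len(samples)
peak = max(samples)
print(f"peak CPU load: {peak:.0f}%   average CPU load: {avg:.0f}%")

# The spiky on-the-fly transcode described above shows a high peak but a much
# lower average (so the fans stay quiet), while a Handbrake-style encode keeps
# both numbers pinned near 100% for the whole run.
```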
 
You can physically put Nvidia cards in there, but you can't run Mojave properly.

I don't know what Apple's problem with Nvidia is, but it makes me have a problem with Apple.
Most software uses Metal and OpenCL now, even the apps that used only CUDA for years. Also, Radeon cards have many advantages for eGPU use in both macOS and Windows. Best low-end cards: Radeon; best mid-range: Radeon. Sure, at the very high end Nvidia still wins, but very few users need an overpriced 2080 Ti. Radeon VII support is included in macOS 10.14.5 and it already runs very well, giving you a mix of gaming and professional performance.
I've made this video (Italian only, sorry) testing the first beta.
Can you guys advise:

What is the best GPU that can be put in this that runs in Mojave?
What is the best GPU that can be put in this that runs via Boot Camp?

Thanks
With 10.14.5, the Radeon VII; with 10.14.4, the Vega 64.
 
Um, no. The specs for a Vega 56 recommend a 650W power supply; with a Vega 64 it's 750W. Not sure if you realize these things have *thousands* of processors; power-wise it's not cheap to run them. Why do you think the electricity usage for cryptocurrency mining is so absurd?

--Eric

Just as an FYI, video card recommended PSU specs aren't stating the outright GPU power requirements; they try to take into account some overhead for the rest of the system.

Most GPUs, without the overhead of the rest of the system, only use up to around 350W of power.

A 2080 Ti at stock clocks is rated at 378W.
A Vega 64 is rated at 414W.

The video card manufacturer in these cases is estimating that the card will be paired with the rest of a system, for a total of 650W. But given that an eGPU breakout box is not powering a CPU, RAM, storage, etc., a lot of that overhead is not required. A 400W PSU in an eGPU would be more than sufficient for the bulk of cards out there; 450W max if you want to power a few other things inside the enclosure.

Heck, even my computer, a decently loaded R7 1700 @ 3.9GHz, 16GB DDR4 @ 3200MHz, 1x NVMe, 1x SSD, and an overclocked 1070, pulls less than 450W total. Even if I kept it at 90% load, I'd need less than 500W.

We in the computer-building industry have gotten very, VERY carried away with how much wattage we throw at our systems. We tend to go overkill (for good or bad).
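To make that arithmetic concrete, here's a quick back-of-the-envelope sketch (Python, purely illustrative). The card figures are the ones quoted above; the GTX 1070 TDP, the laptop-charging figure, and the enclosure-overhead number are my own rough assumptions, not manufacturer specs.

```python
# Back-of-the-envelope eGPU PSU sizing, following the reasoning above: the
# enclosure only feeds the card plus a little overhead, not a whole system.
CARD_PEAK_W = {
    "RTX 2080 Ti (stock)": 378,   # figure quoted above
    "Radeon Vega 64": 414,        # figure quoted above
    "GTX 1070": 150,              # reference TDP
}
ENCLOSURE_OVERHEAD_W = 25   # fans, Thunderbolt controller, etc. (assumption)
LAPTOP_CHARGE_W = 87        # only if the enclosure also charges the laptop (assumption)

def enclosure_budget(card: str, charges_laptop: bool = False) -> int:
    """Rough wattage the enclosure PSU needs to cover for a given card."""
    extra = LAPTOP_CHARGE_W if charges_laptop else 0
    return CARD_PEAK_W[card] + ENCLOSURE_OVERHEAD_W + extra

for card in CARD_PEAK_W:
    print(f"{card}: ~{enclosure_budget(card)} W (card + enclosure overhead only)")
```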
 
Just as an FYI, video card recommended PSU specs aren't stating the outright GPU power requirements; they try to take into account some overhead for the rest of the system.

Most GPUs, without the overhead of the rest of the system, only use up to around 350W of power.

A 2080 Ti at stock clocks is rated at 378W.
A Vega 64 is rated at 414W.

The video card manufacturer in these cases is estimating that the card will be paired with the rest of a system, for a total of 650W. But given that an eGPU breakout box is not powering a CPU, RAM, storage, etc., a lot of that overhead is not required. A 400W PSU in an eGPU would be more than sufficient for the bulk of cards out there; 450W max if you want to power a few other things inside the enclosure.

Heck, even my computer, a decently loaded R7 1700 @ 3.9GHz, 16GB DDR4 @ 3200MHz, 1x NVMe, 1x SSD, and an overclocked 1070, pulls less than 450W total. Even if I kept it at 90% load, I'd need less than 500W.

We in the computer-building industry have gotten very, VERY carried away with how much wattage we throw at our systems. We tend to go overkill (for good or bad).
Something else to consider is that you don't get all of the advertised power solely for the video card. For example, the Razer Core X has a 650W PSU, but only 500W of that is available for the video card. Most enclosures average about 300W, give or take, for the actual video card regardless of the advertised total wattage. Something to look for when shopping for an enclosure.
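If it helps, here's the same idea as a tiny lookup sketch (Python). The figures are examples taken from this thread; treat them as illustrations to verify against the manufacturers' spec sheets, not authoritative numbers.

```python
# The number that matters when shopping is the enclosure's GPU power budget,
# not the headline PSU rating. Example figures from this thread -- verify
# against the manufacturer's spec sheet before buying.
ENCLOSURE_GPU_BUDGET_W = {
    "Razer Core X (650 W PSU)": 500,
    "Razer Core X Chroma (700 W PSU)": 500,
    "Typical smaller enclosure": 300,
}

def card_fits(enclosure: str, card_peak_w: int) -> bool:
    """True if the card's peak draw fits within the enclosure's GPU budget."""
    return card_peak_w <= ENCLOSURE_GPU_BUDGET_W[enclosure]

print(card_fits("Razer Core X (650 W PSU)", 414))    # Vega 64 figure from above -> True
print(card_fits("Typical smaller enclosure", 414))   # -> False
```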
 
Something else to consider is that you don't get all of the advertised power solely for the video card. For example, the Razer Core X has a 650W PSU, but only 500W of that is available for the video card. Most enclosures average about 300W, give or take, for the actual video card regardless of the advertised total wattage. Something to look for when shopping for an enclosure.

It's a good point as well. Just like with any PC customization, it always helps to read the product descriptions/manuals and ensure the compatibility of the components you are purchasing.
 
One example is Red Dead Redemption 2, or do you not consider a game A-list unless it's also out for Windows?

One game hardly makes a case, but that is also Rockstar's MO. I swore I had seen it on Steam, though. I will look again.

* edit * Yes, you are correct, it isn't out for PC yet.
So tell Nvidia to list official support for the cards and squash the bugs. I was the first guy to use these drivers with Maxwell and Pascal cards and gave thorough test results to the Mac Pro forum for a long time. But the bugs were bad. The driver performance isn't optimal. There is no HDR support in these drivers so video editors won't benefit either.

GeForce gaming performance in Windows is great, but the macOS performance is mixed, sometimes not much better than a Radeon 580. Radeons are generally better in productivity apps. Radeon also supports 10-bit HDR on macOS, which is very important for content creation.


The drivers that people are using with Maxwell and Pascal cards are the Kepler drivers downloadable from Nvidia's site. These feature unofficial, unlisted support for those cards. If you were an employee at Apple responsible for certifying these buggy, unsupported drivers for use with Mojave and new GeForce cards, you would probably get in a lot of trouble. They should never have been signed for Sierra or High Sierra either.

If you want proper official drivers that list support for Maxwell, Pascal, Turing and RTX cards you need to put the spotlight on Nvidia. People blaming Apple are misguided and not using their heads.

You are probably right on the HDR; * edit * I believe High Sierra and above automatically set the output to HDR if a compatible display device is detected. HDR is probably more constrained by whether or not the output port on your GPU supports it. I do not have an HDR display to test.

But I have not had problems in Premiere or After Effects on my hack using CUDA, OpenCL, or Metal (8-bit, 16-bit, or 32-bit processing). In fact, the 1080 Ti pretty much wipes the floor with anything currently used in a shipping Mac. (I will dig up the numbers from Geekbench; I posted them here a while ago.)

* edit * Here are the Geekbench scores for the 1080 Ti in my hack:

OpenCL = 215873
Metal = 227638
CUDA = 245960

You are correct, though: Nvidia's site only lists cards that had "Mac" versions released.
 
Just as an FYI, video card recommended PSU specs aren't stating the outright GPU power requirements; they try to take into account some overhead for the rest of the system.
Yes, I thought that went without saying, but aside from archer75's point, I was responding to the notion that GPUs don't use much power. Over half of the computer's total power supply is quite a lot.

--Eric
 
That doesn't mean anything.
And you are grossly exaggerating with the productivity part.

That's not a very strong argument (but I guess it's among the only plausible ones, together with "I like Macs more").
Macs were barely able to make a dent in Microsoft's market share after decades of existence. This is the simple reality.

I'm sure everyone's primary OS/system gets tuned to make it most productive for them, and then any other platform feels less productive. That's true. But I'm also trying to look at things more objectively, like refinement of the UI, quality of the tools and software, etc. IMO, Mac has the edge. Windows sometimes has the edge in raw performance (in certain areas), but that's not the same as productivity.

Of course, if there is something you can only do on one platform or the other, then it's kind of 100% vs 0% productivity, but that will vary depending on workflow.

And, of course, market share is kind of irrelevant to the argument we're looking at here. If I had a dollar for every time I saw an IT exec or company decide to go Windows over Mac (or even outright exclude Macs, even in a department) based off total ignorance, well you know how the saying goes...

(BTW, I used to do IT consulting from mom & pop shops to Fortune 100 companies, so I've run the gamut. I also worked in Sr IT at a Fortune 100 for over a half-decade.)

No they couldn't. A platform can't reject drivers or apps lawfully if there are no issues, the code is functional, and the developer is offering tech support for the devices the driver is supposed to enable. In this case, Nvidia has been enabling devices that it doesn't officially support on macOS and the bugs have been widely reported for a few years. It would be better for Nvidia to hand over the driver source code to Apple and let them bake the drivers into macOS in the future.

So, you're saying that if Nvidia were willing to produce proper drivers and/or code for Apple to produce them, that Apple would be willing to incorporate them back into macOS?

(That runs counter to about every story I've ever heard on the state of things. But, if true, then we really need to start petitioning Apple heavily. Not that it would matter, I suppose. But, if this isn't just some childish spat, then it is a solvable problem!)

Yes it is. Game optimization on Mac is clearly inferior; there are tons of YouTube videos that prove it. Install Windows on a Mac and the same game will run at a higher and more consistent frame rate. Example.

In general, I'd agree here. Since a lot of games are based on technologies where Apple hasn't kept up with the latest, there is often a performance difference. This doesn't just apply to gaming, but also to Windows-centric 3D/CAD apps, etc. But the same applies to some apps that are optimized for Mac.

But, unless you're really down to the wire, crunching massive projects where minutes/seconds count in terms of money... I think people are often going to pick the OS/app that they work best in, not the one with the best raw performance.

Transcoding isn't sustained 100% usage; it's spikes in usage. Up to 100% for a few seconds, then back down to nothing, then back up. So it never really ramps up the fans.
Handbrake is at 100% for the entire encode.

Cool (if I understand)... Plex doesn't run 100% when transcoding, which is good to hear if we can't get them to change their behavior. It's just a huge pain in terms of time then, more than the impact on the server.

Yeah, most video apps I've used that output/encode run at 100% (including Handbrake).
Thanks for the info!
 
One game hardly makes a case, but that is also Rockstar's MO. I swore I had seen it on Steam, though. I will look again.

* edit * Yes, you are correct, it isn't out for PC yet.

You are probably right on the HDR; * edit * I believe High Sierra and above automatically set the output to HDR if a compatible display device is detected. HDR is probably more constrained by whether or not the output port on your GPU supports it. I do not have an HDR display to test.

But I have not had problems in Premiere or After Effects on my hack using CUDA, OpenCL, or Metal (8-bit, 16-bit, or 32-bit processing). In fact, the 1080 Ti pretty much wipes the floor with anything currently used in a shipping Mac. (I will dig up the numbers from Geekbench; I posted them here a while ago.)

* edit * Here are the Geekbench scores for the 1080 Ti in my hack:

OpenCL = 215873
Metal = 227638
CUDA = 245960

You are correct, though: Nvidia's site only lists cards that had "Mac" versions released.

Re: HDR

GeForce cards only output 8-bit color on Windows and macOS, so they can't do genuine HDR anyway. Radeon does 10-bit color. Nvidia expects customers to step up to Quadro just to get what is now a mainstream feature. They have great tech but a ****** business model. For people defending them based on brand tribalism, let's not forget Nvidia deliberately crippled OpenCL performance for a couple of years in order to push CUDA.
 
Re: HDR

GeForce cards only output 8-bit color on Windows and macOS, so they can't do genuine HDR anyway. Radeon does 10-bit color. Nvidia expects customers to step up to Quadro just to get what is now a mainstream feature. They have great tech but a ****** business model. For people defending them based on brand tribalism, let's not forget Nvidia deliberately crippled OpenCL performance for a couple of years in order to push CUDA.

You're confusing two different technologies.

HDR and 10-bit colour aren't the same tech.

ALL 10-series Nvidia GPUs support HDR in Windows and macOS.

GeForce-series cards can do 10-bit colour in DirectX titles only.
Quadro cards can do 10-bit colour in professional programs.

Given that the Quadro and GeForce cards are essentially the same hardware, the limitation here for application support of 10-bit colour isn't a hardware limitation, but Nvidia's software/business decision to lock it down to the professional grade.

I believe that a lot of your "hate" here for Nvidia is probably justified because of their business practices. But you're kind of wrong on the tech claims you're making.
 
Re: HDR

GeForce cards only output 8-bit color on Windows and macOS, so they can't do genuine HDR anyway. Radeon does 10-bit color. Nvidia expects customers to step up to Quadro just to get what is now a mainstream feature. They have great tech but a ****** business model. For people defending them based on brand tribalism, let's not forget Nvidia deliberately crippled OpenCL performance for a couple of years in order to push CUDA.

Nvidia seems to be ahead on performance and power consumption for cost:
http://barefeats.com/vega-vii-versus-other-gpus.html

CUDA is the big deal for most people that need it. I don't need CUDA, so that isn't even a factor for me. But, I'm not sure what Apple is thinking with Metal. Are the software devs of all the apps we need (especially professional apps, or more niche 3D apps and tools) going to support Metal? If they do, I'll be surprised.

There were some tricky gotchas I've read about on the 10-bit color thing, but maybe LordVic has covered it (i.e., it's about the pro drivers). Anyway, in the threads I was reading there seemed to be a lot of confusion, and a lot of Nvidia users seemed to think they had 10-bit color when they actually didn't, or something like that.
 
Semitism?

Haha, that wasn't the word I meant to go for; it sounded right, though. Self-entitled snobbery works better, I guess. Stupid logic either way.
No rock here... you may or may not know, but there are eGPUs on the market that are sold complete with the GPU, and they're not meant to be upgraded with a different card. The Blackmagic lineup is an example. It's good for those who have an LG 27" 5K Thunderbolt 3 monitor, for instance, and want to accelerate DaVinci Resolve. In fact, it's not good, it's great. Which is awesome because there are no other options.

But yeah, I found the article to be a strange read, since the author kept calling the enclosure an eGPU. (I actually didn't notice its use in the title.) To me, that's like calling a computer case a computer, even though there's no CPU or motherboard.

Maybe that's a thing; that's pretty much why I was asking. If people are actually calling the empty box an eGPU nowadays, I'm fine with that and I can easily adjust my thinking. So maybe one with a bundled GPU is an "eGPU with GPU"? ¯\_(ツ)_/¯

No, you're totally right, it is misleading in a literal writing and wording sense, but the 'rock' statement (sorry if that was a bit insulting) comes from just about everybody knowing of Razer's eGPU chassis. In fact, I believe they were the ones to more or less pull an Apple and pioneer the concept for mass-market use and adoption.

Regardless, it is an empty shell, and anyone new to the game would be incredibly misled. Good catch, and sorry if I sounded like I needed a Snickers.
 
Haha, that wasn't the word I meant to go for; it sounded right, though. Self-entitled snobbery works better, I guess. Stupid logic either way.

No, you're totally right, it is misleading in a literal writing and wording sense, but the 'rock' statement (sorry if that was a bit insulting) comes from just about everybody knowing of Razer's eGPU chassis. In fact, I believe they were the ones to more or less pull an Apple and pioneer the concept for mass-market use and adoption.

Regardless, it is an empty shell, and anyone new to the game would be incredibly misled. Good catch, and sorry if I sounded like I needed a Snickers.
No worries. Dang I could use a Snickers right about now myself :)
 
Nvidia seems to be ahead on performance and power consumption for cost:
http://barefeats.com/vega-vii-versus-other-gpus.html

CUDA is the big deal for most people that need it. I don't need CUDA, so that isn't even a factor for me. But, I'm not sure what Apple is thinking with Metal. Are the software devs of all the apps we need (especially professional apps, or more niche 3D apps and tools) going to support Metal? If they do, I'll be surprised.

There were some tricky gotchas I've read about on the 10-bit color thing, but maybe LordVic has covered it (i.e., it's about the pro drivers). Anyway, in the threads I was reading there seemed to be a lot of confusion, and a lot of Nvidia users seemed to think they had 10-bit color when they actually didn't, or something like that.

I hate brand tribalism with the same disdain I have for all forms of us-vs-them arguments. I am a shareholder in all of these chip companies. Having been clear on that, I proceed.

Productivity apps are where it matters for most Mac users. In the Barefeats test, the Radeon VII wins most productivity tests, and that is also seen in reviews of the card elsewhere.

The Radeon VII is a prosumer card that can be used for professional applications without issue. It supports double precision and 10-bit HDR. It costs $600-700.

The GeForce 2080 Ti is a gaming card that can be used for pro apps, but with some weaknesses, such as the lack of true 10-bit HDR and a cost of around $1000-1100.

This video is fun btw. Just an example of how things don't always go as expected...

 
These eGPUs are confusing to me. Sure, having a laptop that you can game on would be cool, and sure, a laptop that you can do heavier editing on would be cool as well. But at what point should we just get a desktop? The eGPU is $400 (if I understand correctly, that does not include the actual GPU...), another $300-400 for a great GPU, and a decent screen is another $500. That's almost $1,500. Seems crazy to me. Maybe I'm missing something?
That's Apple's vision of modularity, I suppose. I guess the idea is that you have an ultra-portable laptop which you can carry around with you, and then when you are at a desk, you can hook up a 5K display and an eGPU to it and get some semblance of a Mac desktop.
 
I have Boot Camped and gamed. I've owned multiple Macs. I've built Hackintoshes. ALL I care about is truth. It's people like you who exaggerate. OMG, it's 1% slower, it's so much worse! The sky is falling! In my experience some games do run a tad better on Windows than macOS, sure. But it's not all games and it's not by a large margin. So Boot Camp if you want. You do have that option.

It's very simple: if you care only for the truth, you shouldn't lie.
The clip I posted showed differences in performance above 10% in most cases. And when I say above 10%, I mean 12, 14, 16, 18, 20%, etc. Also, FPS isn't everything; games on Windows also generally run smoother because they are much better optimized.
Anyway, you only proved my point; it's not me who is exaggerating. You are acting like games that run better on Windows are the exception, when it's basically the other way around: it's the general rule.

And people use eGPUs for more than just gaming. My original post stated that a reason a person may want an eGPU for their Mac rather than buying/building a PC is because THEY find Macs to be better. Which is absolutely true for them and their needs. I didn't make a blanket statement that Macs are better. Don't try to turn this into something it's not.
Again with the anecdotal and pointless "some people consider Mac to be better".
It's a very weak argument. I could just say many people find Windows to be better and we are back to square one. But I guess this is the best you have.
I'm sure everyone's primary OS/system gets tuned to make it most productive for them, and then any other platform feels less productive. That's true. But I'm also trying to look at things more objectively, like refinement of the UI, quality of the tools and software, etc. IMO, Mac has the edge. Windows sometimes has the edge in raw performance (in certain areas), but that's not the same as productivity.

Of course, if there is something you can only do on one platform or the other, then it's kind of 100% vs 0% productivity, but that will vary depending on workflow.

And, of course, market share is kind of irrelevant to the argument we're looking at here. If I had a dollar for every time I saw an IT exec or company decide to go Windows over Mac (or even outright exclude Macs, even in a department) based off total ignorance, well you know how the saying goes...

(BTW, I used to do IT consulting from mom & pop shops to Fortune 100 companies, so I've run the gamut. I also worked in Sr IT at a Fortune 100 for over a half-decade.)
You needed quite a lot of words to basically not say and not prove anything.
 
It's very simple: if you care only for the truth, you shouldn't lie.
The clip I posted showed differences in performance above 10% in most cases. And when I say above 10%, I mean 12, 14, 16, 18, 20%, etc. Also, FPS isn't everything; games on Windows also generally run smoother because they are much better optimized.
Anyway, you only proved my point; it's not me who is exaggerating. You are acting like games that run better on Windows are the exception, when it's basically the other way around: it's the general rule.
Why are you so mad? I didn't lie at all. You're putting words in my mouth. In my experience that is not always the case. Some games run a little better in Windows on the same machine. Not all, and not by all that much. Depends on the game. Some games are optimized very well for Mac. And it's true that some are not. If a game isn't optimized well for Mac, then Boot Camp. Simple. Best of both worlds.
You see, I have done this. When I had my iMac, I gamed in OS X and booted into Windows and gamed in it. Same games, similar performance. So I speak from real-world experience. Have you?

Again with the anecdotal and pointless "some people consider Mac to be better".
It's a very weak argument. I could just say many people find Windows to be better and we are back to square one. But I guess this is the best you have.
And yet, that's all a person needs. What they prefer is all that matters. They don't need an argument.
You could indeed say some people find Windows to be better, for them, and you'd be right. You simply cannot argue with what a person finds better for themselves.

Which goes back to my original statement, which still holds true: a person may want a Mac because they find it better. That's it. You really can't argue that, so I don't know why you're trying.
 
Why are you so mad? I didn't lie at all. You're putting words in my mouth. In my experience that is not always the case. Some games run a little better in Windows on the same machine. Not all, and not by all that much. Depends on the game. Some games are optimized very well for Mac. And it's true that some are not. If a game isn't optimized well for Mac, then Boot Camp. Simple. Best of both worlds.
You see, I have done this. When I had my iMac, I gamed in OS X and booted into Windows and gamed in it. Same games, similar performance. So I speak from real-world experience. Have you?

And yet, that's all a person needs. What they prefer is all that matters. They don't need an argument.
You could indeed say some people find Windows to be better, for them, and you'd be right. You simply cannot argue with what a person finds better for themselves.

Which goes back to my original statement, which still holds true: a person may want a Mac because they find it better. That's it. You really can't argue that, so I don't know why you're trying.
Yes, you lied and you continue to lie.
"Your experience" means nothing in the face of the actual proof that I provided.
Deny it all you want; game optimization and availability on Mac are a huge mess, and there's no way around it.
Have a nice day.
 