And this concludes the argument: for you, no matter what ports a PC might have, a Mac will ALWAYS be better than a Windows PC.

You have every right to hold that opinion. Just as you have every right to believe you know a pro photographer's workflow from your own experiences with a Sony A6000, as per your argument from yesterday.

To me, the claim that a Windows PC is a priori unable to show a presentation is absurd in so many ways that I choose not to participate any further in the discussion.

I am not sure whether you are unaware of basic facts or just trolling, but either way I do not find this debate going anywhere meaningful. I wish you all the best.

I agree! Windows might be irritating at times, and I find Windows 10 less stable for me. However, I do this all day long: people want my recommendations on what systems to get. Sometimes I mention Windows PCs, sometimes I mention Macs, and sometimes I mention Chromebooks.

Dells generally have the best hardware on the Windows side. I have never had good experiences with anyone else - including Microsoft's Surface line. I have had the occasional issue with Dell systems, but nothing as bad as with other manufacturers.
The ironic thing is that for VGA projectors at the very least, you are likely already using an adapter to begin with, be it Mini DisplayPort or HDMI to VGA. You aren't any better or worse off using a USB-C to VGA adapter in its place.

Well, yeah. It would be highly unprofessional of me not to have adapters. Not every computer in the world has an SD card reader. What if a client brings their own equipment and I need to read an SD card on a computer that has no SD card reader?

And as I said, the dozens of adapters in my laptop bag are not adding pounds of weight or making it horribly bulky. I also have Ethernet, HDMI, and DisplayPort to USB-C cables in there too.
 
Yet my issues with Windows 7 were very, very few. I have had very few problems with my Windows 7 installs, but nothing but problems with my Windows 10 installs, as mentioned in my post.

Windows 7 never BSODed on me. Windows 7 never broke all my desktop and Start menu shortcuts. Windows 7 never broke itself after a disk cleanup. Yet I have experienced all of these issues on Windows 10 across dozens of systems, even at home. I loved Windows 7. Windows 10 is pretty buggy.

And not everyone will experience these issues. How is it fair to call High Sierra buggy but not Windows 10? How is it that I have not experienced any major bugs from Leopard to Sierra while other people have?

In my experience, Windows 10 is a lot less stable than Windows 7. Your experience might be different. You could have had horrible experiences with Windows 7 even though it is generally regarded as the best, or second best after XP.


Yes, that is understandable. I was mostly targeting the people that state "Apple has crappy GPUs" or "At least this one has a decent GPU". And their argument is about gaming. Really?

I agree. If I wanted a gaming computer, I would look at a Windows PC. If I were helping someone pick out a computer and they said they want to game, I would recommend a Windows PC over a Mac. You buy what you need.

And actually if you include servers, Linux is the most popular.

.... for specific use cases like gaming. These are NOT gaming systems. Try comparing a $1,350 NVIDIA Quadro to a $500 GTX 1080 and you will see that it is only about 80% as powerful in gaming.

These AMD cards are better than NVIDIA for what I use. Do not assume everyone plays games on their Mac and needs an NVIDIA 1080 for it, or uses software that can only run on CUDA.

I humbly apologize. I assumed that NVIDIA cards could be used for other applications besides just games. I was unaware that if you turned on a computer and used Word or any non-gaming app, the computer would black screen. I now know it was wrong to think you could edit movies with an NVIDIA card. I also didn't realize that only gaming systems are appropriate for games and/or NVIDIA cards. I'm grateful for the instruction. I didn't realize that what YOU use your computer for is what everyone else uses it for. I'm so, so sorry.

Also, this debate is rendered moot by the move to Thunderbolt 3, affordable eGPU enclosures, and support for eGPU in Mac. So, whatever, brah! :D
 

Not yet.

On macOS High Sierra, NVIDIA eGPUs are a non-starter right now. Sierra requires some work (but at least one can get it working!), and they are advising people who need an eGPU not to upgrade. Hopefully it will be fixed by spring 2018, but right now it is a colossal mess. It's the classic Apple software promise: support is coming... hopefully, one day, maybe.

The drivers are there. The people over at egpu.io document the issues in this thread. If I'm understanding all the talk about kext files and drivers, they say it's an Apple issue, not an NVIDIA issue. NVIDIA would love to get everything working so they can sell a lot more hardware to Mac users.
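
If it helps anyone decode the kext talk, here's a minimal sketch (Python calling macOS's kextstat; the "Web" naming I mention is just what I'd expect from the web-driver bundles and may vary by driver version) that lists whatever NVIDIA-related kexts are actually loaded:

Code:
# Minimal sketch: list whatever NVIDIA-related kexts are currently loaded by
# shelling out to kextstat (macOS). On systems using the NVIDIA web driver,
# the relevant bundles typically have "Web" in the name (e.g. NVDAStartupWeb);
# exact identifiers vary by driver version, so treat this as illustrative.
import subprocess

def loaded_nvidia_kexts():
    out = subprocess.run(["kextstat"], capture_output=True, text=True).stdout
    return [line for line in out.splitlines()
            if "nvidia" in line.lower() or "nvda" in line.lower()]

if __name__ == "__main__":
    kexts = loaded_nvidia_kexts()
    if kexts:
        print("\n".join(kexts))
    else:
        print("No NVIDIA-related kexts appear to be loaded.")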

I hope eGPUs become straightforward to plug in and just work, like things on the Mac should. This is the stuff I criticize Windows for: hardware and driver issues that suck away time like a vampire sucks blood. Even Mac eGPU setups with AMD cards have issues. You have to pick hardware configs very carefully. If I wanted to deal with that headache I'd build a PC again, except I remember my PC build going smoother than what people are describing in the forum on egpu.io.

I'd use a couple of NVIDIA cards right now for rendering on this new iMac and be in heaven.
 
I humbly apologize. I assumed that NVIDIA cards could be used for other applications besides just games. I was unaware that if you turned on a computer and used Word or any non-gaming app, the computer would black screen. I now know it was wrong to think you could edit movies with an NVIDIA card. I also didn't realize that only gaming systems are appropriate for games and/or NVIDIA cards. I'm grateful for the instruction. I didn't realize that what YOU use your computer for is what everyone else uses it for. I'm so, so sorry.

Also, this debate is rendered moot by the move to Thunderbolt 3, affordable eGPU enclosures, and support for eGPU in Mac. So, whatever, brah! :D

And? The same can be said about AMD GPUs. We get our Macs to perform work, not to play games, so the complaints that they are not as good as a GTX 1050 for gaming miss the point.

It cannot work only one way. You need NVIDIA for your stuff, I need AMD GPUs for mine. It is a shame we cannot have both choices to make both sides happy. But it has been known for a very long time now that AMD and Apple go together.

And if you do heavy 3D work, you are probably looking at the Quadro cards, which are not that good for gaming either.
 
Count me as a chosen one too.

So... I recently updated to the Fall Creators Update.
My newly acquired BeatsX disconnect from my iPhone/iPad whenever my Windows laptop is on (because the Windows laptop has made it a rule that since I paired the BeatsX once, the headphones are its and its only). NO other device at my place disrupts my Bluetooth headphone experience... but the Windows PC.

Turns out the connection with the BeatsX is jumpy on the PC anyway. Unusable. What a surprise. It's OK as long as I pair it all over again every 3 hours, so I guess the Windows crowd will call it "not an issue", 'cause you know, you can waste time sorting out Windows ****. That's part of the Windows fun, right?

There is no such issue on my Samsung TV, nor on my phones or tablets, just on the one Windows PC.

So I wanted to shut the PC's Bluetooth down. You know, so it would let me listen to music on my phone. I had to deal with Windows' stupidity. I selected the Bluetooth icon and accidentally erased the icon from the quick-access taskbar... because ****ing Windows skipped a beat or something and the icon self-deleted.

So I type "Bluetooth" into Cortana, right? Cortana ****s up. Cortana won't let me type a U inside it. For reasons, no doubt. If you hold a key while typing inside the Cortana bar, fun fact, the bar will shut down. Why.

BTW, all the privacy settings were reset to their defaults by the update, so I discovered that Cortana has been listening in on me for weeks although I made it clear several times I wanted the option OFF. Nice.

It's a rabbit hole. It starts with something annoying, and you look it up and try to solve it and the mess unfurls.

Windows is pure ****. Has always been, will always be.
I need it only for work but I swear the PC would fly out the window (get it) if I could only get rid of it.

If not for work, I would never touch a piece of **** PC again.

EDIT: Mark me, I didn't have a problem with USING Windows; the EXISTENCE of a nearby PC RUINED my experience with using OTHER, NON-WINDOWS devices. That is how FORKED UP Windows is.
 
Sounds remarkably like an old Mac commercial:



Yeah, that's how I felt xD
I actually use a cheap little ASUS (running Linux, because Windows' background processes were hogging the CPU) for basic on-the-go tasks. I usually recommend Chromebook-like laptops to those who are mainly desktop users, as it isn't worth having such a powerful laptop if it's not the machine you use at home.

That being said, the 2016/2017 MacBook Pro is perfect within a desktop setup; and on the go, what is one really going to be doing with it? If you're worried about desktop-level productivity whilst you aren't even home, buy one of those 3-inch-thick, 9-pound monster laptops with a 17" screen. You'd get so much done on a lawnmower with a mobile GTX 9xx/10xx card. When you get home, you can make use of all of those ports... because you have to.

Again, one port can do it all for someone needing a desktop setup with which a laptop is integrated. Two, if you want storage. And, if you rely on tons of USB devices and such, bite the bullet and get a hub. I've even had to use hubs on desktop computers. If it's integrated well with the setup, it shouldn't matter.

I know of a producer who daisy-chains roughly 20TB of storage, through USB-C, and it's all connected to his MacBook Pro. The same goes with his display and hub. Takes THREE measly cables to do all of that.



BTW, I have had experience with a Nikon D80. Of course, in those days, it would have been stupid to not have an SD card slot. Sooner or later, cameras will have USB-C ports on them for fast transfer. SD card transfer speeds aren't super hot anyhow.
 
And? The same can be said about AMD GPUs. We get our Macs to perform work, not to play games, so the complaints that they are not as good as a GTX 1050 for gaming miss the point.

It cannot work only one way. You need NVIDIA for your stuff, I need AMD GPUs for mine. It is a shame we cannot have both choices to make both sides happy. But it has been known for a very long time now that AMD and Apple go together.

And if you do heavy 3D work, you are probably looking at the Quadro cards, which are not that good for gaming either.

Nvidia is better than AMD in almost all aspects; the only issue is they cost more. I honestly wish Apple went with Nvidia in at least their laptops, since Nvidia cards are much more efficient than AMD cards right now. In a desktop? Meh, throw whatever you want in there, since power consumption isn't as important. Apple used the GT 750M in the 2013 MBPr, so it all comes down to pricing.
 

Not necessarily. My Radeon 7950 beats my GTX 980 in FCPX. So I stick with AMD.

Look at the FCPX benchmarks here (and that is with a 1080 Ti) - http://barefeats.com/imac5K_vs_pros.html

Also, the upcoming Vega 64 in the iMac Pro is very close to the 1080 Ti in this test, while the 2017 iMac's Radeon 580 was a little farther behind: http://barefeats.com/vega_resolve.html

Again, it sucks there is no choice. It has been clear for a while now that we will not be seeing NVIDIA in our Macs. So get a Windows PC if you need it, or hopefully we will see better eGPU support now that Apple is pushing that.

There are some tasks that I need to use CUDA for, that is why I have a Windows PC along with my iMac 2017.
 

Ahh, OK, fair enough. Nvidia has also cut down compute performance on all but their Quadro cards since, I think, the 900 series. Then again, I'm assuming that if Apple worked together with Nvidia, they could incorporate CUDA support in their software. Or AMD could just make more power-efficient cards :)!
 
Try again. The top three banks use Linux, Windows, and mainframes for the backend. They rarely use macOS/iOS except for QA testing of client access to web banking services.

Did you even read what I wrote at all? Because I never even mentioned OSX/iOS and specifically spoke about how banks use Linux and mainframes in the backend (mentioning IBM's mainframe OS by name) and Windows for their employee terminals.


Where the hell did you get that I said financial services are using OSX, when I didn't even mention the OS in the whole post? You're the second person to have read that post and started going on about how I'm wrong about banks using OSX even though I made no mention of it.

Financial services are, however, a somewhat different beast, seeing how in the U.S. at least, data is commonly transferred from organization to organization in Excel sheets sent via email. In terms of requirements on transaction rates, correctness, logging, and accountability, actual banks (not the people who manage your company pension accounts) are on a whole different level. From the level of incompetence with data and downright stupidity that I've seen, the financial services industry is nothing like the gold standard for data processing that banks are.
 
While that would be a nice selling point, the better model wouldn't really need it, as it has a 1060 in there already. But I agree that it probably should have come with it.
I think you meant to quote the other guy, but I agree with your statement 100%. Why not have options in ports? They always say, "Better to have and not need, than to need and not have."

random: You look like my uncle! I bet you're a really cool guy.
Only when I'm playing banjo and doing bluegrass stuff! I first got into computers in 1973 as a student after getting out of the service. It eventually became a career from which I retired 10 years ago (yeah, I'm getting old). Anyway, with computers and other tech gadgetry (like mobile phones), I tend to judge a product as a price point balance between convenience, functionality, and cost. Aesthetics are in there to some degree - Apple seems to value that over everything else - but aren't really a significant concern with me. I always valued the speed of mainframes, the stability of servers, and the functionality of laptops and desktops when I was a work slave. I love beautiful hardware, more for its function than its looks - but that comes from a background in working in and around machine rooms, behind the scenes, often with weird hours - just to make sure things worked smoothly. Felt I might want to finally respond to your compliment :).
 
Eh... an 8th-generation quad-core CPU vs an older-generation dual-core? Plus you forgot (for your own convenience) the Nvidia 1050/1060 vs Intel Iris? That could easily be 2x as powerful. Seems there is not much to get pissed off about.

The 1060 model goes for $2,500, so it's the 7920HQ vs the 8650U…which runs 1 GHz slower. They both have four cores. That's not 2x faster, it's actually slower!

Also, my post was meant in jest, I’m not really pissed off, lol!

The 1060 vs 560 is an entirely different comparison with the Nvidia part being nearly 3x faster.

Let's see how it handles real-life tasks. I'd love a 1060 in a MacBook Pro.

I would rather have the 8650U/1060 combo vs the 7920HQ/560 setup in the MBP. I wonder how loud and hot that 1060 will make the SP2 tho.

It's not just CPU but also GPU performance, which is now commonly used for compute, machine learning, password-strength testing, etc. So the 2x performance figure is correct, considering the GTX 1060 is faster than the Radeon Pro 570 and the Radeon 560 is about half the performance of the 570. Doesn't matter anyway, since there's hardly any software on macOS that utilizes the GPU.

Radeon Pro 570 > 560

They mentioned the CPU and GPU. 7920HQ vs 8650U isn’t 2x as powerful in the 1060 model.
 
Those two aren't appreciably different.

Comparing based on clock speed is meaningless. However, the HQ has an edge on multi-core, and the next generation will likely come with six cores.
Geekbench is a benchmark. It's a quick benchmark where the 8650U will be able to turbo through the entire run. If you're doing something CPU-intensive where the fan kicks up and it takes a while, then that 8650U will be running a GHz slower than the 7920HQ. This isn't the days of Bloomfield to Sandy Bridge; the IPC of Coffee Lake isn't much of a jump over Kaby Lake. So in this case, clock speed means everything. We also have no idea of the cooling solution in the Surface Book 2, but I highly doubt it's going to be as robust as the 15" MBP's.
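
If you want to see the burst-vs-sustained difference on your own machine, here's a rough sketch (my own, assuming Python with the psutil package installed; on some OSes psutil only reports the nominal clock, so treat the numbers as illustrative) that pins every core and logs the reported frequency while the fan spins up:

Code:
# Rough sketch: pin all cores with busy loops and log the reported CPU clock,
# so you can watch a short "turbo" burst settle down under sustained load.
# Assumes the psutil package is installed; on some OSes psutil.cpu_freq()
# only reports the nominal clock, so treat the output as illustrative.
import multiprocessing
import time

import psutil

def burn():
    while True:
        pass  # keep one core fully busy

if __name__ == "__main__":
    workers = [multiprocessing.Process(target=burn, daemon=True)
               for _ in range(psutil.cpu_count(logical=True))]
    for w in workers:
        w.start()
    try:
        for second in range(60):  # sample for about a minute
            freq = psutil.cpu_freq()
            if freq is not None:
                print(f"{second:3d}s  {freq.current:.0f} MHz")
            time.sleep(1)
    finally:
        for w in workers:
            w.terminate()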

Not in all tasks. In fact a Radeon 7950 beats the crap out of my GTX 980.

http://barefeats.com/imac5K_vs_pros.html

Look at the FCPX tests. NVIDIA is significantly slower. I wish there were a choice for those who want NVIDIA, but with my Mac, AMD smokes any NVIDIA I have used.
One test. For most tasks and all games, the 1060 will be significantly faster than the 560 in the MBP.
 

Did you look at that? There were multiple FCPX tests. I do not play games in macOS on my Mac. And guess what? Gaming performance with my Radeon Pro 580 in Boot Camp is VERY GOOD.

DaVinci Resolve was 12 seconds with AMD and 11 seconds with NVIDIA. The After Effects software render was faster with AMD too. OpenCL and Metal were about even across all systems. Motion was about double the frame rate with AMD. In Photoshop, the 2017 iMac with AMD was way faster than the 2010 Mac Pro with a GTX 1080. Lightroom was about the same performance too.

Seriously, we get it. You guys use software that uses CUDA or play games. Not all of us use our Macs like that. I use FCPX, After Effects, Affinity Photo, and Affinity Designer on my iMac. People around here treat AMD like you will only get 5 FPS in any game and your computer will be SOOOOOOOOOO SLOW if it does NOT have an NVIDIA GPU.

I do hope that with eGPU support we will start seeing more NVIDIA eGPU options to cut down on all of these complaints. More options in general would be nice. I need AMD, so I would like Apple to keep AMD internally and let people use NVIDIA as an eGPU.

Don't get me wrong. I love both AMD and NVIDIA. I specifically purchased the GTX 1080 when it was brand new. I got the Founders Edition for around $850 and it was definitely worth it! When it comes to iMacs, the thermals are really not suited for heavy-duty gaming even if one had a GTX 1080 in it. Same with the laptops. So I consider gaming on my Mac a secondary "bonus". I do not care if it takes an additional few minutes to render something in Premiere Pro on AMD vs an NVIDIA GPU. I do most of my rendering at night anyway, when I am sleeping.
 
Thanks for posting all that. I see video card and think gaming because it's all I've ever really done. I didn't even know what FCPX meant until I saw it here. I'm sure a lot of others will appreciate the info!
 
It really is a shame there are no NVIDIA options for those who prefer those GPUs. Like I said, hopefully this is changing with eGPU.
 
Why does everything have to turn into such a hassle on this site?

What do you do when the projector in that meeting room has VGA? What would you say? "I don’t believe in dongles as it makes me look unprofessional so I can’t present with this projector. "

The ironic thing is that for VGA projectors at the very least, you are likely already using an adapter to begin with, be it Mini DisplayPort or HDMI to VGA. You aren't any better or worse off using a USB-C to VGA adapter in its place.

This is correct. I would prefer not to have any dongles, but VGA is one that I bring; it's a single display dongle that has several different ports on it.
 
The 1060 model goes for $2,500, so it's the 7920HQ vs the 8650U…which runs 1 GHz slower. They both have four cores. That's not 2x faster, it's actually slower!

Also, my post was meant in jest, I’m not really pissed off, lol!

The 1060 vs 560 is an entirely different comparison with the Nvidia part being nearly 3x faster.



I would rather have the 8650U/1060 combo vs the 7920HQ/560 setup in the MBP. I wonder how loud and hot that 1060 will make the SP2 tho.



They mentioned the CPU and GPU. 7920HQ vs 8650U isn’t 2x as powerful in the 1060 model.

Clock speed comparisons have been speculative at best for quite some time now. When Lenovo released the first generation of the X1 Carbon, real work would cause the i7 version to heat up and SpeedStep would throttle it back faster than the i5 in the exact same notebook, so the i5 was actually both cheaper and faster for anything that mattered.

It's all about heat - if you make too much of it and can't get rid of it, you have to get off the gas pedal.
 
Not to mention, one of the most useful things about Windows is the snap-to-half-screen feature; macOS's default window management is horrendous. The only saving grace is the Mac trackpad. By far the best.

Yes, that particular feature is handled much better in Windows 10. Most other window-management-related stuff (Mission Control, the app switcher showing apps and not all windows, being able to switch between windows in the frontmost app) I think is better handled in macOS. Well, except that double-clicking a window's title bar in macOS doesn't always behave the same way (zoom to fill the entire screen), although it mostly does.
It really is a shame there are no NVIDIA options for those who prefer those GPUs. Like I said, hopefully this is changing with eGPU.

Pretty certain it will change. NVIDIA has drivers for their current GPU generation available for macOS, which can be used if you have one of the older tower Mac Pros with a PCI Express slot.
 

I just searched, and I do not see macOS listed when I look for drivers for my GTX 1080 Ti.
 