What are you guys talking about? 3D? Video editing? Motion graphics? VFX? Music? Software development?
Just some points…
I see AMD as best for video editing, and that's where Apple is focused. Final Cut Pro X doesn't support CUDA.
On my machine (iMac 27" Late 2013 with an NVIDIA GeForce GTX 775M 2GB) I can smoothly play back a scene with 3D text, lights, reflections, emitters, particles, etc. in Motion 5; the latest version of After Effects isn't even close! Software development?! I believe it would run even better if I had an AMD card. After Effects has CUDA support, but that doesn't seem to be helping its performance…
I've always heard that AMD is best for video and NVIDIA for 3D; I could be wrong, but…
And please, don't show us comparison tables full of game benchmarks from the Windows side; that means nothing to a Mac user.
 
I doubt they're going to adopt good gaming graphics cards anyway; that's just not what they aim for.

They certainly used to have better video cards (for their day). Christ, my 2011 MacBook Pro with the 6750 in it still outperforms the iGPU in the $2,000 MacBook Pro they're selling here in 2016, five years later.
But nowadays all Apple cares about is thinness, not performance. "PRO" my ass.
 
http://forums.anandtech.com/showpost.php?p=38148706&postcount=29 One.
http://forums.anandtech.com/showpost.php?p=38148709&postcount=30 Two.
http://forums.anandtech.com/showpost.php?p=38147252&postcount=1126 Three.
http://forums.anandtech.com/showpost.php?p=38147272&postcount=1129 Four.

If you want to speak about something, you had better know what you're talking about. The only thing that lets Nvidia GPUs keep up with AMD is proprietary software: CUDA, Iray, GameWorks.

I genuinely suggest that people on this forum educate themselves about GPUs by reading this guy's posts. He posts on many forums under the same nick.

He only lightly touches on some of AMD's advantages, but the real one is that AMD is all in on heterogeneous compute. That dictates some of the design decisions that go into their discrete GPUs and, as highlighted in the posts you referenced, gives AMD chips significant advantages.

Anybody dismissing AMD's hardware simply doesn't have a clue what is happening inside their chips. There are good reasons the various vendors have been choosing AMD of late when the decisions are based on a technical evaluation of the hardware. We can argue about AMD's drivers, but that doesn't make any difference on a Mac, where Apple's GPU drivers universally suck.

What I really want to see, though, is an AMD APU built on 14nm tech in a Mac Mini replacement. That has the potential to give us a really fine machine in a compact package.
 
They certainly used to have better video cards (for their day). Christ, my 2011 MacBook Pro with the 6750 in it still outperforms the iGPU in the $2,000 MacBook Pro they're selling here in 2016, five years later.
But nowadays all Apple cares about is thinness, not performance. "PRO" my ass.
You're saying your 6750 is better than the Iris Pro 6200?
 
They certainly used to have better video cards (for their day). Christ, my 2011 MacBook Pro with the 6750 in it still outperforms the iGPU in the $2,000 MacBook Pro they're selling here in 2016, five years later.
But nowadays all Apple cares about is thinness, not performance. "PRO" my ass.
By what measure? The HD 6750M is nowhere near the Iris Pro 5200 in either gaming or compute performance.
 
The combination of Apple's closed (as in non-publishing) research environment, and lack of interest in 3D gaming rigs has led to them being way behind the ball in providing users with new products for machine learning and deep neural networks, one of the hottest areas in computing these days. That's why Facebook and Google are getting all the PR (Image captioning, Go, TensorFlow, etc.). And someone else (IBM, Nvidia, etc.) is selling all the highly-profitable high-end GPGPU hardware.
 
Prior to my 2010 MBP, I upgraded every 3 years on average. For my Mac Minis I upgraded every 6 years, ending in 2012. Since then, not one upgrade; Apple is not making pro or top-level machines anymore. The last ones, IIRC, were the 2011 17-inch MBP and the 2012 Mini. I am waiting for this year's WWDC, and if we don't see pro hardware, then I'll have to go elsewhere. Not sure where, but elsewhere. I am sure Apple does not care.
This is complete nonsense. Seriously, the MBPs are very much pro-quality machines. Further, had Intel gotten its act together we would have seen MBP updates months ago. You can't blame Apple for the lack of MBP updates when Intel hasn't delivered viable chips.

The same thing applies to the Mac Pro: there have been no viable chip updates for that machine. The GPU industry has been stuck on the same process node for around 5 years now, thus no real GPU upgrades. That, in fact, is part of what makes this article interesting, as we will finally see a new generation of chips this year that will eventually make it into all device categories. Besides that, it makes a lot of sense in the Mac Pro's case to wait until they can deliver all the new port technology that is coming online.
Since Apple outsourced Swift, I think they are moving to become only a phone and tablet maker.
My god, don't you even have a clue what OPEN SOURCE is? Seriously, guy, they didn't outsource it, they open sourced it; there is a massive difference. They open sourced Swift in the hope that it would see wide adoption and thus avoid the Apple-only problem that Objective-C has.
I think they will move their development tools to Linux or Windows and stop making real computers completely. That seems to be the trend. Remember, they moved all their data center hardware to Linux and greatly reduced OS X Server. While that makes some sense, since OS X was never a server OS, it is still troublesome.

Clearly you have no idea what you are talking about. Apple runs Linux servers for the same reasons everybody else does.
 
The good thing about this AMD vs. NVIDIA argument is that I can change alliances whenever I want. I love the Mac Pro, but the proprietary GPU form factor really was the ultimate deciding factor when I finally bought my workstation.

Desktop users should really get interested in building hackintoshes.

Or, god forbid, use Windows. The horror. :rolleyes:
 
What are you guys talking about? 3D? Video editing? Motion graphics? VFX? Music? Software development?
Just some points…
I see AMD as best for video editing, and that's where Apple is focused. Final Cut Pro X doesn't support CUDA.
On my machine (iMac 27" Late 2013 with an NVIDIA GeForce GTX 775M 2GB) I can smoothly play back a scene with 3D text, lights, reflections, emitters, particles, etc. in Motion 5; the latest version of After Effects isn't even close! Software development?! I believe it would run even better if I had an AMD card. After Effects has CUDA support, but that doesn't seem to be helping its performance…
I've always heard that AMD is best for video and NVIDIA for 3D; I could be wrong, but…
And please, don't show us comparison tables full of game benchmarks from the Windows side; that means nothing to a Mac user.


Basing this on FCP is a bad idea. A: Apple doesn't support the hardware universally (MBPs... for those of us who can't throw down the bigger money on a Mac Pro, or who just don't want a tower system), so CUDA isn't even in play there.

B: It's an application Apple is all but saying it just doesn't really care about anymore. It's getting the same treatment Aperture did before its public execution: weak patches released once a year. That's not counting the more frequent RAW patches... Apple clearly did those primarily for the basic Photos app in OS X, not Aperture. The one time Aperture got mentioned at WWDC in years... they put a bullet in its head. Ugly trend repeating?

It's not even getting updates for Apple's own tech that we know of, never mind Nvidia/CUDA. I personally was hoping for FCP... the Swift edition... by this time.

Great marketing chance there: look at our new application, built on our new frameworks. This, Apple developers... this is what can be done. Instead it gets lame updates yearly. Stuff I'd rate as 3-6 month minor patching, really; some I'd even class as hot fixes. Drop them on the server, get 'em if you need 'em; don't need 'em... don't pull them down.

Apple can't even be bothered to have an update ready on release days. The 12" iPad Pro and iPhone 6 took months to get FCP patches that recognized them (that year-long cycle thing). The 9.7" release day... crickets. The industry standard is support on release day. Even Nikon with NX2 did this on their body and lens release days, and they are massively bad at software updating. Third parties lag only because of NDAs, or because they have to get samples in to test and build up support.

Gaming... all work and no play makes Johnny a dull boy. For good or bad, gaming helps sell systems. Look at iOS: when it's not being used in meetings or for work, most people will squeeze some game time onto it. That's why you see its graphics improve over the years; most of that isn't needed to make a note-taking or office app look nicer. To draw a parallel from the PC world, office apps look the same on the cheapest Dell with a crap GPU as on the latest custom Titan monster build. iPads and iPhones have improved to make some irate birds look good as they're flung across the screen, and that helps sell them.
 
The combination of Apple's closed (as in non-publishing) research environment, and lack of interest in 3D gaming rigs has led to them being way behind the ball in providing users with new products for machine learning and deep neural networks, one of the hottest areas in computing these days. That's why Facebook and Google are getting all the PR (Image captioning, Go, TensorFlow, etc.). And someone else (IBM, Nvidia, etc.) is selling all the highly-profitable high-end GPGPU hardware.

3D failed long ago. If you mean VR, it's still in its infancy, and Apple usually waits for maturity before entering. Google has entered many fields and failed in nearly all of them. PR does not mean much if you never have a final product release, as it usually gets canceled long before that. Remember Google Glass?
 
Words I hope never to hear from my wife - "yes dear, you have a mini" :D

More seriously, what is your use case that you use no computer other than the Mini? Or do you have non-Apple computers as well? It's just my personal preference, but I like the all-in-one versus component computers like the Mini or Mac Pro (aka ashtray); they are equally good computers, though. As you know, I am hoping to move to the iPad Pro since I don't do much that requires a desktop anymore. All my work stuff I access through Citrix, so even the more intensive stuff is just web access at this point. Right now, however, that sucks on the iPad because Windows-based software that expects a mouse does not translate well to the iPad via Citrix; that requires a computer or laptop with a mouse/trackpad. That should change at some point, especially if companies ever decide to migrate to Windows 10.
Personally, I'd rather hear "you have a mini" than "Hey honey, can I get your credit card?" :eek:
Yeah, I have several computers for different use cases. Falcon NW Fragbox - gaming at home. Alienware 15" laptop - gaming on the road. HP EliteBook - work. MBA '11 - play. Chromebook - play. Three Raspberry Pis that I have no idea what to do with... yet. ;) About 3-4 other homebrew computers in various states of completion. <-- That last hobby is why I have an aversion to all-in-ones. I like to tinker.

My company is a mixed bag of voodoo and witchcraft on the IT side. We have some proprietary software that is only XP-compliant. Our main everyday stuff is a goofy mix of Win7 & 8. Scuttlebutt says Win10 is in the works and will be fully integrated by late 2017 or early 2018. Can't wait; I actually like 10.
 
My god, don't you even have a clue what OPEN SOURCE is? Seriously, guy, they didn't outsource it, they open sourced it; there is a massive difference. They open sourced Swift in the hope that it would see wide adoption and thus avoid the Apple-only problem that Objective-C has.

... and the thing is ... nobody cares.

I'm sure Chris Lattner is a great guy, but Swift exists to further close down Apple's platforms.

The day will come when you can only code in Swift and use Swift frameworks... and then iOS becomes a free-to-play platform where only big companies can make money selling apps as a UI for their services.

Apple needs this as a security measure: they control most of the phone's communication by encapsulating HTTP APIs, and they cannot risk millions of phones having an attack vector through some open-source library used as a communication protocol.

Swift exists to ensure that everyone publishing on iOS uses a unified toolchain built on a strict set of frameworks.
At that point it will be extremely limiting to code for iOS, which will kill off the rest of the independent devs, who can't make money today anyway.
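
For what it's worth, the "sanctioned path" already exists today: URLSession in Swift, with App Transport Security applied by default, instead of a bundled open-source networking stack. A minimal sketch (the URL is just a placeholder):

```swift
// Rough illustration of going through Apple's own HTTP stack (URLSession)
// rather than a third-party networking library. The URL is a placeholder.
import Foundation

let url = URL(string: "https://api.example.com/v1/status")!
let task = URLSession.shared.dataTask(with: url) { data, response, error in
    if let error = error {
        // Plain-HTTP requests, for example, get blocked by App Transport Security.
        print("Request failed: \(error)")
    } else if let http = response as? HTTPURLResponse, let data = data {
        print("Status \(http.statusCode), \(data.count) bytes received")
    }
}
task.resume()
// (In a command-line context you'd need to keep the process alive for the callback.)
```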
 
So you are saying that AMD is better than Nvidia in almost every way?

I have no idea what he is saying, but Apple has made the right technical choice in going with AMD, because AMD's hardware designs align with the direction Apple has taken with its operating system and supporting software. Look into some of AMD's papers on heterogeneous computing and you will understand better.

By the way, making the right technical choice doesn't mean that a GPU is perfect. It means the positives outweigh the negatives.

Beyond all of that, there seem to be two camps against AMD in Macs. One is the Nvidia fanboy crowd, which adds no value to the discussion. The other is the frequent posts from CUDA users. CUDA is a dying technology for a number of reasons. For one, it is proprietary, which is fairly stupid considering the areas where CUDA gets used. The second issue is that discrete GPU cards are quickly becoming a thing of the past. As the industry moves to 14nm and smaller chip geometries, GPUs integrated into the CPU become far better compute resources, and the problem there is that Nvidia doesn't have a GPU solution with an x86-compatible core.

The rise of the APU means that the vast majority of compute code will be written for hardware that doesn't support CUDA. It should be noted that Intel's GPUs already do amazingly well as compute resources for certain code bases, and that will only improve as Intel learns to do GPUs better. The same can be said for AMD, as having the GPU on the same chip as the CPU makes for a compute platform that is viable for a wider array of uses. So anybody arguing about CUDA needs to start considering a future where Nvidia GPUs are rare, and CUDA-based apps rarer still, as the incentive to use proprietary software like CUDA evaporates.
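
To make the vendor-neutral point concrete, here is a rough sketch in Swift of GPU compute through Metal, which doesn't care whose silicon is underneath. The kernel name, data, and sizes are made up for illustration, and error handling is trimmed:

```swift
// Minimal sketch of vendor-neutral GPU compute on the Mac via Metal.
// Illustrative only: kernel name, buffer size, and thread counts are arbitrary.
import Metal

// A tiny Metal Shading Language kernel that doubles each element of a buffer.
let source = """
#include <metal_stdlib>
using namespace metal;

kernel void double_values(device float *data [[buffer(0)]],
                          uint id [[thread_position_in_grid]]) {
    data[id] = data[id] * 2.0;
}
"""

guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue() else {
    fatalError("No Metal-capable GPU found")
}

// Compile the kernel at runtime and build a compute pipeline.
let library = try! device.makeLibrary(source: source, options: nil)
let function = library.makeFunction(name: "double_values")!
let pipeline = try! device.makeComputePipelineState(function: function)

// Upload some data for the GPU to chew on.
var input: [Float] = Array(repeating: 1.5, count: 1024)
let buffer = device.makeBuffer(bytes: &input,
                               length: input.count * MemoryLayout<Float>.stride,
                               options: [])!

// Encode the dispatch and run it.
let commandBuffer = queue.makeCommandBuffer()!
let encoder = commandBuffer.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(buffer, offset: 0, index: 0)
let threadsPerGroup = MTLSize(width: 64, height: 1, depth: 1)
let groups = MTLSize(width: input.count / 64, height: 1, depth: 1)
encoder.dispatchThreadgroups(groups, threadsPerThreadgroup: threadsPerGroup)
encoder.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()

// Read the results back on the CPU.
let results = buffer.contents().bindMemory(to: Float.self, capacity: input.count)
print(results[0]) // 3.0
```

The same handful of calls drives an Intel iGPU, the AMD FirePros in a Mac Pro, or an Nvidia card; nothing in it is tied to CUDA.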
 
How often do people upgrade their Macs?
I haven't since 2011 and 2014 (besides upping the RAM in the iMac)...

Should I be upgrading? :p

I've been on the Mac since 1984, and in the beginning I always updated everything ASAP.
128K to 512K to 1024K, etc. LOL, for a 9" b/w Mac.

My latest "upgrading system" has however saved me a lot of $$$ and works better (for me)

Bought a 2008 17" MBP and kept upgrading it until it hit its max.
That is no longer possible.

For what I do, if my MBP is waiting on me to enter something or kick off processing, it's still good enough.
(I no longer do desktops.)

If I get too many beach balls and I'm the one waiting on the MBP, it's time to get a faster one.
(Which I did: a 15.4", mid-2014.)

Even then I don't buy the latest, just one generation back, but in the top configuration, which will be the bottom configuration in three years.

There have been some good deals on eBay so far; it requires patience and knowing what you want. (I usually do not bid; I decide what I want to pay and then pounce.) Or Apple's refurb store.

Without knowing what you actually need, I would say every 4-5 years brings plenty of advancement.
 
I was just thinking the other day how it has been forever since Nvidia updated their GPUs. Now if Intel would hurry up, we could see some serious CPU performance increases. My mid-2012 Retina MBP benchmarks not much lower than the current models for CPU-intensive tasks. However, it could definitely use a faster GPU. This is probably why Apple has been holding out on upgrading the Mac Pros.
 
Means nothing to me until they allow the 21" iMac to have a dedicated graphics card. You want me to pay $2000 for a 4K computer with only integrated graphics? The 27" option is simply too big for me--the 21" would be perfect, but I won't be buying it without a dedicated graphics card.

And considering that the new MacBooks are supposed to be "hyper-thin," I wonder how they're going to fit these graphics cards in there, if they use them at all.
 
How often do people upgrade their Macs?
I haven't since 2011 and 2014 (besides upping the RAM in the iMac)...

Should I be upgrading? :p

We have a 2010 mini we use as a home theater. Waiting for the Mini to adequately handle 4k video before we replace. (Hopefully within a couple of years)

Use a 2011 iMac. Waiting until iMac with SSD and discrete gpu are available for under $1200 before replacing. (Not likely to happen)

Use a 2009 MacBook (the white plastic one) with SSD upgrade. Waiting until it dies before we upgrade. (Probably get another 3 or 4 years out of it).
 