Prior to my 2010 MBP, I upgraded every 3 years on average. For my Mac Minis I upgraded every 6 years, ending in 2012. Since then, not one upgrade: Apple isn't making pro or top-level machines anymore. The last ones, IIRC, were the 2011 17-inch MBP and the 2012 Mini. I'm waiting for this year's WWDC, and if we don't see pro hardware, I'll have to go elsewhere. Not sure where, but elsewhere. I'm sure Apple doesn't care.

Since Apple open-sourced Swift, I think they are moving toward becoming only a phone and tablet maker. I think they will move their development tools to Linux or Windows and stop making real computers entirely. That seems to be the trend. Remember, they moved all their data center hardware to Linux and greatly cut back OS X Server. While that makes some sense, since OS X was never a server OS, it is still troublesome.

I really do not think that is the case. They support the professional market less and less now, but with Mac sales increasing I really do not see them leaving that market. Also remember they have to actually do their work on computers. Is Apple going to walk away from a profitable business (one where they make billions every year) and put themselves in a position where they have to do their work on a Windows machine made by HP or Dell? Do you think they want to walk into coffee shops and see tons of young folks with booted-up Chromebooks? Are they going to abandon the billions they've spent on OS X? No way.

Also, if you go out and start buying PC hardware, you may find yourself very disappointed. Sure, the CPU and GPU will have the latest specs, but if your experience is like mine you will find that the actual integrated machine has issues and serious durability problems. My company bought a bunch of nice Toshibas and most have failed in one way or another within less than two years. Considering the lack of profit PC manufacturers are making these days, I suspect even more corners are being cut now in 2016. If the PC market continues to shrink, I don't know what value I'll actually get when I buy a $1,300 PC laptop in the future.
 
You're comparing the last-gen GTX 980 vs the new Fury Nano, I think. Shouldn't we compare the 1080 vs the Fury Nano?
In terms of price and thermal envelope, both are similar. The cheapest GTX 980 on Newegg goes for $469.99; the cheapest Nano goes for $479.99 on the same site. The GTX 980 has a 186W power draw, almost exactly the same as the Nano (184W).

But compute power and overall performance is different.
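To make that difference concrete, here's a rough sketch in Python using the cards' theoretical peak FP32 throughput (shaders × 2 FLOPs per cycle × clock). The shader counts and reference clocks below are the vendors' published specs, and the TDPs are the figures quoted above; actual sustained performance will differ.

```python
# Theoretical peak FP32 throughput: shaders * 2 FLOPs/cycle * clock (GHz) -> GFLOPS.
def peak_gflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz

# Vendor reference specs: GTX 980 (2048 CUDA cores, ~1126 MHz boost),
# R9 Nano (4096 stream processors, up to 1000 MHz). TDPs as quoted above.
cards = {
    "GTX 980": {"shaders": 2048, "clock_ghz": 1.126, "tdp_w": 186},
    "R9 Nano": {"shaders": 4096, "clock_ghz": 1.000, "tdp_w": 184},
}

for name, c in cards.items():
    gflops = peak_gflops(c["shaders"], c["clock_ghz"])
    print(f"{name}: {gflops / 1000:.2f} TFLOPS peak, {gflops / c['tdp_w']:.1f} GFLOPS/W")
```

On paper, the Nano's wider shader array gives it roughly 8.2 TFLOPS to the 980's 4.6 TFLOPS in a near-identical power envelope, which is the gap being pointed at here; real-world results depend heavily on clocks under load and the workload itself.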

Recently Nvidia made efforts to put their desktop 980 into laptops. If that trend continues, Apple should have a few more high-end options for the iMac while keeping mobile-form-factor GPUs.
The only problem with that GPU is that even with lower voltage it still has a 150W TDP.

And before someone calls me AMD-biased: I am an Nvidia fan. But I am also a fan of hardware. The best hardware must win, not proprietary software.
 
New GPUs always worry me a bit. Apple traditionally has a terrible record with GPUs -- particularly with MacBook Pros. I've lost two MBPs to GPU issues since 2009. That's really pathetic for machines that each cost me in excess of $2,500. By the time Apple finally admitted the GPU issues with the 2011 MacBook Pro, I had already replaced it with a 2014 MBP months earlier since my work doesn't stop just because Apple can't seem to get the proper GPUs in its notebooks.
 
Did you even bother to read the posts? Did you even bother to educate yourself on the most recent benchmarks for both DX11 and DX12?

You know why Maxwell is so efficient? I know the answers to these questions. Do you?

Wow, you sure are an aggressive fellow. I did read the posts of the fanatical guy you linked. I don't really care what technology AMD/nVidia say they're going to use, what I care about is the products that actually get launched and how they perform; particularly in overall performance and performance per watt. If AMD magically pulls out a new line of graphics cards that outperform nVidia's offerings and don't run stupendously hot and require liquid cooling for their top end models then fine, all will be forgiven. Unfortunately their releases in recent years have been lacklustre so I don't hold out much hope. Their recent track record is poor. I'd rather AMD surprise me - I honestly wish they would particularly in the CPU field but that's a whole other game - as healthy competition is only good for the technology and the end users. I'm not anti AMD - I have probably bought more AMD/ATI products than I have nVidia and at the high end too (dual gpu cards).
 
How often do people upgrade their Macs?
I haven't since 2011 and 2014 (besides upping the RAM in the iMac)...

Should I be upgrading? :p
My new approach to all devices (iPhones, Macs, etc.) is to upgrade when the device is no longer usable. I am so done chasing technology and its associated costs. My iPhone 5 is like the Energizer Bunny. That doesn't mean I won't invest in new technology like my Apple Watch, but I won't trade just to trade. So far this approach has yielded significant monetary gains. Not for everyone, but it is working nicely for me.
 
Wow, you sure are an aggressive fellow. I did read the posts of the fanatical guy you linked. I don't really care what technology AMD/nVidia say they're going to use, what I care about is the products that actually get launched and how they perform; particularly in overall performance and performance per watt. If AMD magically pulls out a new line of graphics cards that outperform nVidia's offerings and don't run stupendously hot and require liquid cooling for their top end models then fine, all will be forgiven. Unfortunately their releases in recent years have been lacklustre so I don't hold out much hope. Their recent track record is poor. I'd rather AMD surprise me - I honestly wish they would particularly in the CPU field but that's a whole other game - as healthy competition is only good for the technology and the end users. I'm not anti AMD - I have probably bought more AMD/ATI products than I have nVidia and at the high end too (dual gpu cards).
I'm sorry that you felt offended, but I was not being aggressive. Believe me :).

One more question: if you say that perf/W is most important, then calculate the performance per watt of compute power for these GPUs. For example, take the FirePro W8100 and compare it to its Nvidia counterpart ;). Or calculate the performance of a single Fiji from the S9300 x2, which has a 300W TDP and an 850 MHz core clock ;).

I am pretty sure you will be extremely surprised by the results.
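For what it's worth, the arithmetic being asked for can be sketched like this. The shader count and clock are Fiji's published specs; treating a single GPU from the dual-GPU S9300 x2 as drawing half of the 300W board TDP is my assumption, not a vendor figure.

```python
# Theoretical peak FP32 throughput: shaders * 2 FLOPs/cycle * clock (GHz) -> GFLOPS.
def peak_gflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz

# Single Fiji GPU from the FirePro S9300 x2: 4096 stream processors at 850 MHz.
# The 300 W TDP covers both GPUs on the board, so one GPU is assumed to get half.
fiji_gflops = peak_gflops(4096, 0.850)
fiji_watts = 300 / 2

print(f"Fiji (S9300 x2, single GPU): {fiji_gflops / 1000:.2f} TFLOPS peak")
print(f"Perf/W: {fiji_gflops / fiji_watts:.1f} GFLOPS/W")
```

That works out to roughly 7 TFLOPS at about 46 GFLOPS/W on paper, which is presumably the "surprising" number the poster has in mind; again, peak figures ignore sustained clocks and workload behavior.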
 
Everyone ignores the Mac Pro, including, apparently, Apple.

I agree. I switched to an iMac workstation because Apple updates them far more frequently, and I can deal with the speed and lack of upgrades for at least two years.

I'd like to see some real advancement in the line of the Mac Pro and, as many of us have been asking for decades now, better GPU options and support.
 
Unfortunately, Apple will likely choose to use AMD's inferior products and mobile versions of those even in desktops so at best these future graphical improvements will mean that new iMacs will finally be able to compete with PCs from late 2014. Yay.

Lol wut? Nvidia's OpenGL and Async Compute performance are garbage. AMD is better than Nvidia at literally every price point, except the very top end. Fortunately, you buy Macs, and will never see top-end GPUs in a Mac.
 
I've been very excited for a new Mac Pro; the latest one is crazy fast, and newer ones using these graphics are going to be great. I can't wait to get my hands on this stuff!!!!

I can't even imagine what these devices are going to look like in 10 years; the power and portability are already out of control. This is a really fun time to be seeing all of this.
 
These new GPUs are going to be great, but I don't think we'll be seeing them in many Mac products, as Apple has moved towards Intel integrated graphics. A new AMD or Nvidia GPU will most likely pop up in the 5K iMac and Mac Pro when they are next updated, and maybe the 15" MBP, but I can just as easily see them switching to Skylake-based Iris graphics (GT4e) on the MBP.
 
I'm still using my 2011 iMac at home. Nice machine with the integrated DVD drive. Every five years is about my desktop upgrade cycle. I'm going to hold off until the end of the year to see if I can get these better GPUs.
Me too, actually. The last iMac with an integrated IR sensor, too. That matters, as I use the silver remote with my iMac a lot. I'll miss that when I upgrade to a new iMac. I see no need to upgrade anytime soon. I think I'll wait till Apple makes a new OS X that isn't compatible with our 2011 iMacs.
http://forums.anandtech.com/showpost.php?p=38148706&postcount=29 One.
http://forums.anandtech.com/showpost.php?p=38148709&postcount=30 Two.
http://forums.anandtech.com/showpost.php?p=38147252&postcount=1126 Three.
http://forums.anandtech.com/showpost.php?p=38147272&postcount=1129 Four.

If you want to speak about something, you had better know something about it. The only thing that lets Nvidia GPUs keep up with AMD is proprietary software: CUDA, Iray, GameWorks.

I genuinely suggest that people on this forum educate themselves about GPUs by reading this guy's posts. He posts on many forums under the same nick.
I agree 100%. I've always been more of an AMD fan, but only because the research I found told me that in most circumstances they were the better choice.
 
I doubt we'll see discrete GPUs in most of Apple's notebooks. For one, integrated graphics just keep getting better; for two, Apple is still on a spree to make everything as thin as possible, and that goes against leaving proper ventilation for anything that runs warmer than a mid-range GPU (or at least they'll stick to mobile GPUs only, as in the 5K iMac).
 
But won't it still be the same? Nvidia has better driver support but fails more; AMD has worse driver support but better reliability?
It's hard to tell, given that both makers' chips have failed in Macs, although I wonder if that has more to do with Apple's poorly designed cooling solutions than anything else these days.
 
How often do people upgrade their Macs?
I haven't since 2011 and 2014 (besides upping the RAM in the iMac)...

Should I be upgrading? :p
I upgraded to the new-format iMac when it first came out in 2012. I am still holding out to upgrade to an iPad Pro. For sure no desktop in my future, maybe a laptop, but I am hoping that one or two more releases of the iPad Pro will give me the few things I need that aren't there yet.
 
I'm sorry that you felt offended, but I was not being aggressive. Believe me :).

One more question: if you say that perf/W is most important, then calculate the performance per watt of compute power for these GPUs. For example, take the FirePro W8100 and compare it to its Nvidia counterpart ;). Or calculate the performance of a single Fiji from the S9300 x2, which has a 300W TDP and an 850 MHz core clock ;).

I am pretty sure you will be extremely surprised by the results.

Workstation cards get a bit tricky to compare as they tend to go with quite different 'load outs' - processor type, memory etc to each other but from what I can tell, the closest competitor to the W8100 looks like the K5000? Those seem to perform about the same in benchmarks but with a 100W difference between AMD and nVidia, the nVidia card's performance per watt is quite a bit improved. Finding real world compute benchmarks is a little tricky but from what I can see, they perform about the same - AMD usually a bit better in OpenCL stuff and nVidia better in stuff that has a CUDA build versus an OpenCL one. It's been years since I was coding with CUDA/OpenCL so I don't know the differences between the two. Still, the nVidia cards have lower TDP for those comparisons.
 
The next MBP needs to be truly great or I'm going to consider it abandonware. A $2,500 ask for not-the-latest everything is a crime.

I agree, but so many vendors bundle the best and latest of everything in a box and then wonder why it doesn't work properly for at least 6-9 months while owners wait for adequate drivers and software to become available. Apple needs to find a middle ground and, more importantly, offer more frequent updates. I think two or more minor refreshes in a year isn't too much to ask or expect.
 
The only problem with that GPU is that even with lower voltage it still has a 150W TDP.

And before someone calls me AMD-biased: I am an Nvidia fan. But I am also a fan of hardware. The best hardware must win, not proprietary software.

They're already fitting 125W mobile cards in there. The issue may simply have been that Apple didn't want to foot the bill for a custom mobile card with a higher-power GPU for the iMac.
 
Come oooon Apple, refresh those MacBook Pros!

I've got a mid-2011 13" MacBook Air with two keys that don't work and a battery at 60% of its original capacity. I'm desperate to upgrade to something like the current 13" MBP, but not if that line is soon to be updated.
 
Workstation cards get a bit tricky to compare as they tend to go with quite different 'load outs' - processor type, memory etc to each other but from what I can tell, the closest competitor to the W8100 looks like the K5000? Those seem to perform about the same in benchmarks but with a 100W difference between AMD and nVidia, the nVidia card's performance per watt is quite a bit improved. Finding real world compute benchmarks is a little tricky but from what I can see, they perform about the same - AMD usually a bit better in OpenCL stuff and nVidia better in stuff that has a CUDA build versus an OpenCL one. It's been years since I was coding with CUDA/OpenCL so I don't know the differences between the two. Still, the nVidia cards have lower TDP for those comparisons.
The closest competitor in terms of raw compute power is the K6000: 5.1 TFLOPS vs 5.1 TFLOPS. The problem is that the AMD GPU draws 185W of power under load ( http://www.tomshardware.com/reviews/firepro-w8100-workstation-graphics-card,3868-15.html ) while the Nvidia draws around 230W. The K5000 is nowhere near the W8100 in compute power; with that pairing you'd effectively be comparing a GTX 770 with an R9 290, just at different clocks. Also, the W8100 has much higher double-precision performance. Compare the compute power of the GPUs to get the best picture of performance per watt.
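Plugging the figures quoted here into a quick performance-per-watt calculation (5.1 TFLOPS for both cards; 185W measured load power for the W8100 and roughly 230W for the K6000, as stated above):

```python
# Perf/W from the figures quoted in this post: (TFLOPS FP32, measured load watts).
cards = {
    "FirePro W8100": (5.1, 185),
    "Quadro K6000": (5.1, 230),
}

for name, (tflops, watts) in cards.items():
    print(f"{name}: {tflops * 1000 / watts:.1f} GFLOPS/W")
```

With equal peak throughput, the 45W difference in measured draw puts the W8100 at roughly 27.6 GFLOPS/W to the K6000's 22.2, which is the point being made; double-precision workloads would widen the gap further given the W8100's higher FP64 rate.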
 