It means that the relative power of previous-gen GPUs can be achieved at much lower power consumption.
 
I upgraded to the new-format iMac when it first came out in 2012. I am still holding out to upgrade to an iPad Pro. There's definitely no desktop in my future, maybe a laptop, but I am hoping that one or two more releases of the iPad Pro will add the few things I need that aren't there yet.
Still rockin' a 2011 MBA. Rarely used, but it still works like a charm. All I need Apple to do is update the Mac mini (and not kick us in the nuts like they did with the last mini :oops:). They'd get guaranteed money; the wife already said I could have it. :) If they don't come correct with the mini, they won't see a dime from me until my daughter needs a new phone (if she goes with iPhone again). Nothing else in their lineup is even remotely compelling to me. Maybe the Mac Pro, but that is beyond overkill for what I plan to do with the mini. Daddy wants a proper mini. :mad:
 
The closest competitor in terms of raw compute power is the K6000: 5.1 TFLOPs vs 5.1 TFLOPs. The difference is that the AMD GPU draws 185W of power under load ( http://www.tomshardware.com/reviews/firepro-w8100-workstation-graphics-card,3868-15.html ) while the Nvidia draws around 230W. The K5000 is nowhere near the W8100 in compute power. And we are still essentially comparing a GTX 770 with an R9 290, just at different clocks. The W8100 also has much higher double-precision performance. Compare the compute power of the GPUs to get the best picture of performance per watt.
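If you want that perf-per-watt picture at a glance, here's a quick back-of-the-envelope in Python using only the figures quoted above (treat the TFLOPs and wattages as my quoted numbers, not verified specs):

[CODE]
# Back-of-the-envelope perf-per-watt from the figures quoted above
# (5.1 TFLOPs for both cards; ~185 W for the W8100, ~230 W for the K6000).
# These are the numbers as quoted in this thread, not verified specs.
cards = {"W8100": (5.1, 185.0), "K6000": (5.1, 230.0)}
for name, (tflops, watts) in cards.items():
    print(f"{name}: {tflops * 1000 / watts:.1f} GFLOPS per watt")
[/CODE]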

The page you linked says that the W8100 and K6000 draw about the same power under load: 188W for graphics and 190W for pure compute on the W8100, compared to the K6000's 202W. From what I've seen, a CUDA implementation on a K6000 usually hugely outperforms an OpenCL implementation on a W8100. Furthermore, AMD's claims of strong double-precision floating-point performance seem to be false. It's likely all driver stuff, though, and AMD has been far weaker on that front for a while.

I'm probably comparing these workstation cards wrong, though - the last time I looked into this in detail was when I was coding with OpenCL/CUDA and opted for a 790 for home development of stuff to run on the Tesla cluster at work. OpenCL wasn't fun to work with; CUDA just seemed to work better, had more active support, and was simply more mature. That was the last time I switched from AMD to Nvidia.

I don't do that kind of coding anymore; these days I just look at FPS and TDP. Having used dual-GPU solutions from AMD, I found the heat to be just ridiculous. Nvidia seems to do more work on the driver side than AMD, and these days I'm more interested in VR anyway, where Nvidia is way ahead of AMD. We'll see what the new architectures from both sides produce, but until they're actually ready to order I consider them vapourware, and until non-synthetic benchmarks are produced by independent parties I consider the tech from both sides to be largely bluster and hype.

In recent years, AMD has disappointed me with their designs and implementation philosophy regarding TDP, so I expect them to let me and the tech world down. Nvidia has surprised me several times now with hugely efficient designs, so I'm far more inclined to expect that Nvidia's next gen will be better. Looking at the current iMac, though, Apple could have put a 970M or 980M in there as an option - they were available - and they would have been so much nicer.
 
How often do people upgrade their Macs?
I haven't since 2011 and 2014 (besides upping the RAM in the iMac)...

Should I be upgrading? :p

My wife and I purchased mid-2010 MBPs (upgraded the RAM in 2013). It's time. We are looking forward to updated MBPs or MBAs.
 
Me too, actually. The last iMac with an integrated IR sensor, too. That matters because I use the silver remote with my iMac a lot; I'll miss that when I upgrade to a new iMac. I see no need to upgrade anytime soon. I think I'll wait till Apple makes a new OS X that's not compatible with our 2011 iMacs.

The only problem with mine is that it has an HDD instead of an SSD, and the HDD is the bottleneck. If I got it upgraded to an SSD, I'd probably try to stick with it. But a big SSD will cost a few hundred dollars, and the installation will either cost more or be something I do myself. Doing it myself would come with some risk, as I'm not experienced in this stuff.

If you wait long enough, I suspect Macs will be voice controlled before too long. There's really no reason I can think of that Siri isn't supported on Macs. And if Apple threw in the always-on chip from the iPhone 6s, you could control your Mac with "Hey Siri" - though I know some folks would disable that feature for privacy reasons.
 
I agree. I switched to an iMac workstation because Apple updates them far more frequently, and I can live with the speed and the lack of upgradeability for at least two years.

I'd like to see some real advancement in the Mac Pro line and, as many of us have been asking for decades now, better GPU options and support.

Define "workstation"! It's not just any machine you work on. By the usual standards, the Mac Pro is a workstation; the iMac is not.
One detail: RAM. Workstations use ECC; the others use non-ECC.
 
It's up to Apple whether they want to add discrete GPUs back to the MacBook Pro.

The vector of progress in integrated circuits is increased integration. Reducing integration by switching from integrated GPUs to discrete GPUs would be going backwards, which is not the Apple way.

Next the luddites will start dreaming of how fast a discrete FPU could be with 2010s technology, despite the fact that discrete FPUs died in the 1980s.

Skylake CPUs will be replaced by Kaby Lake, which will differ from Skylake mainly in having a better integrated GPU. MacBook Pro users who are dreaming of faster graphics should be salivating over Kaby Lake, which Apple will put into the MacBook Pro, rather than over AMD or Nvidia discrete GPUs, which Apple has been phasing out of the MacBook Pro.
 
Can't wait for more native AMD and Nvidia driver support built into Macs so I can build a nice Skylake hackintosh...
You absolutely do not need Apple's native drivers when Nvidia themselves provide monthly driver updates for their GPUs. I am running an i7-6700K on an Asus Maximus VIII Ranger (Z170) with a GTX 970, with working HDMI audio and Intel I219-V ethernet. I even managed to get the 10.11.5 beta up and running on it. Easy peasy.
 
The page you linked says that the W8100 and K6000 draw about the same power under load: 188W for graphics and 190W for pure compute on the W8100, compared to the K6000's 202W. From what I've seen, a CUDA implementation on a K6000 usually hugely outperforms an OpenCL implementation on a W8100. Furthermore, AMD's claims of strong double-precision floating-point performance seem to be false. It's likely all driver stuff, though, and AMD has been far weaker on that front for a while.

I'm probably comparing these workstation cards wrong, though - the last time I looked into this in detail was when I was coding with OpenCL/CUDA and opted for a 790 for home development of stuff to run on the Tesla cluster at work. OpenCL wasn't fun to work with; CUDA just seemed to work better, had more active support, and was simply more mature. That was the last time I switched from AMD to Nvidia.

I don't do that kind of coding anymore; these days I just look at FPS and TDP. Having used dual-GPU solutions from AMD, I found the heat to be just ridiculous. Nvidia seems to do more work on the driver side than AMD, and these days I'm more interested in VR anyway, where Nvidia is way ahead of AMD. We'll see what the new architectures from both sides produce, but until they're actually ready to order I consider them vapourware, and until non-synthetic benchmarks are produced by independent parties I consider the tech from both sides to be largely bluster and hype.

In recent years, AMD has disappointed me with their designs and implementation philosophy regarding TDP, so I expect them to let me and the tech world down. Nvidia has surprised me several times now with hugely efficient designs, so I'm far more inclined to expect that Nvidia's next gen will be better. Looking at the current iMac, though, Apple could have put a 970M or 980M in there as an option - they were available - and they would have been so much nicer.
Ask any game developer which company has better hardware for VR, and why driver optimization is becoming meaningless at the moment. They will tell you exactly the same thing as the one "crusader" you dismissed already - Mahigan. Read, educate yourself.

The whole point of VR and low-level APIs is that the application drives the hardware without software in between (drivers), so no extra latency is added. What is important for VR? Compute power and asynchronous shading and compute capabilities. The only way to achieve this on Nvidia hardware is through CUDA - which adds latency. What is even more important for VR is hardware scheduling, and have you seen a hardware scheduler on any Nvidia architecture since Fermi? No. The CPU does all of the scheduling work - more latency.

The hardware that is currently much better for VR is GCN, and that is a fact. Ask any game developer who is not connected to Oculus. All of this has been discussed on forums for the last eight months or so, and all of it has been debunked. Educate yourself and you will know why the R9 390X ties with the GTX 980 Ti in DX12 games: because it has similar compute power. It will also be blatantly obvious why an 8.6 TFLOPs GPU (Fury X) is much faster in DX12 games than a 6.1 TFLOPs GPU. Why do I bring up DirectX 12? Because it is based on the API that underlies every meaningful API out there right now: Mantle. And Mantle is in DX12, Vulkan, LiquidVR, everything.
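To make the async point concrete, here's a toy Python model of why overlapping graphics and compute queues cuts frame time. It's pure arithmetic, not real GPU scheduling, and the millisecond figures are made up:

[CODE]
# Toy model only: if graphics work costs g ms and compute work costs c ms
# per frame, a serialized pipeline pays g + c, while ideal async overlap
# pays max(g, c). The numbers below are hypothetical.
def serial_frame(g_ms, c_ms):
    return g_ms + c_ms

def async_frame(g_ms, c_ms):
    return max(g_ms, c_ms)

g, c = 9.0, 4.0  # made-up per-frame costs in milliseconds
for label, t in [("serial", serial_frame(g, c)), ("async", async_frame(g, c))]:
    print(f"{label}: {t:.1f} ms/frame -> {1000 / t:.0f} fps")
[/CODE]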
 
Desktop users should really get interested in building hackintoshes. Apple doesn't listen to you - I have yet to see them ever do so. You listen to Apple.

I get that it can be hard if you've never built one, though there are tons of resources on picking the right parts to make it run smoothly.

Some people care about performance more than design. Apple keeps dumbing hardware down. So much for premium quality.

I'm running one just fine with a Core i3 and a GTX 750 Ti SC.
 
How often do people upgrade their Macs?
I haven't since 2011 and 2014 (besides upping the RAM in the iMac)...

Should I be upgrading? :p

I just upgraded mine after about four years. It was actually still going strong (a 2011 Mac mini). I'd dropped an SSD into it a couple of years ago, which was a dramatic upgrade in performance - but I was really craving a big Retina display, so I bit on a refurbished 27" 5K iMac and dropped 24GB of RAM into it. I'd been doing more photography and some video editing, and although the mini was up to the task, I was honestly looking for an excuse to buy a new Mac. I didn't NEED it, but man, is it nice.

Also, you can always use the strong resale value of an old Mac to take some of the pain out of buying a new one.
 
How often do people upgrade their Macs?
I haven't since 2011 and 2014 (besides upping the RAM in the iMac)...

Should I be upgrading? :p

Don't upgrade unless you have performance or reliability problems. Why would you upgrade otherwise - just for the hell of it?

The reality is that machines from 2011, and even more so from 2014, aren't that bad performance-wise when it comes to the CPU; it's the GPUs that have seen massive performance increases in that time. So ask yourself how much of your software depends on the GPU.

The reality is that if you have to ask, you most likely don't need an upgrade.
 
Apple seriously needs to get rid of AMD graphics cards. They are rubbish!

Actually, they remain the best option available to Apple. You need to consider what Apple has to consider from an engineering point of view. To put it simply (because in this case I have to be simple), AMD's design goals align with the direction Apple is going.
 
Still rockin' a 2011 MBA. Rarely used, but it still works like a charm. All I need Apple to do is update the Mac mini (and not kick us in the nuts like they did with the last mini :oops:). They'd get guaranteed money; the wife already said I could have it. :) If they don't come correct with the mini, they won't see a dime from me until my daughter needs a new phone (if she goes with iPhone again). Nothing else in their lineup is even remotely compelling to me. Maybe the Mac Pro, but that is beyond overkill for what I plan to do with the mini. Daddy wants a proper mini. :mad:
Words I hope never to hear from my wife - "yes dear, you have a mini" :D

More seriously, what is your use case that you use no computer other than the mini? Or do you have non-Apple computers as well? It's just my personal preference, but I like the all-in-one versus component computers like the mini or the Mac Pro (aka ashtray) - though they are equally good computers. As you know, I am hoping to move to the iPad Pro since I don't do much that requires a desktop anymore. All my work stuff I access through Citrix, so even the more intensive stuff is just web access at this point - right now, however, that sucks on the iPad, because Windows-based software that expects a mouse does not translate well to the iPad via Citrix. That requires a computer or laptop with a mouse/keyboard. That should change at some point, especially if companies ever decide to migrate to Windows 10.
 
How often do people upgrade their Macs?
I haven't since 2011 and 2014 (besides upping the RAM in the iMac)...

Should I be upgrading? :p
I just upgraded my 2011 27" iMac by replacing the internal HDD with an SSD plus a much larger HDD, both internal (OWC did it for me). I had already bumped the memory up to 32GB earlier. It's like a new machine for only $500. I like this machine far more than anything I've seen in the newer iMacs released since 2011, and I do use the internal DVD drive fairly often.
 
How often do people upgrade their Macs?
I haven't since 2011 and 2014 (besides upping the RAM in the iMac)...

Should I be upgrading? :p

I upgrade my MacBook Pro every year. Best Buy has deals on them for ~$300 off during the college shopping season. I buy five or so and resell my old machine plus the other four on eBay/Craigslist, so it doesn't cost me anything but a bit of time making the listings, etc.
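Rough math on that flip, with made-up numbers - the MSRP, resale values, and ~10% marketplace fee below are all assumptions, not my actual figures:

[CODE]
# Hypothetical numbers only: $300 off a $1,500 MSRP, five units bought,
# four resold near retail, plus last year's machine sold used.
msrp, discount, units = 1500.0, 300.0, 5
fee = 0.10           # assumed ~10% eBay/marketplace fee
old_resale = 1000.0  # assumed resale value of last year's model
outlay = units * (msrp - discount)
income = (units - 1) * msrp * (1 - fee) + old_resale * (1 - fee)
net = outlay - income
print(f"net cost of the yearly upgrade: {net:+,.0f} USD (negative = profit)")
[/CODE]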
 
It means that the relative power of previous-gen GPUs can be achieved at much lower power consumption.

No, it doesn't; it shows that the power of previous-generation GPUs can be achieved with lower power consumption (by implication, since it shows the new GPUs delivering more performance while using less power). It doesn't claim "much lower" at all.

Please, MR, why on earth would you waste screen real estate on such a pathetic graph?
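Just to spell out what the graph can and can't claim, in numbers (the wattages here are invented placeholders, not anything from the chart):

[CODE]
# If performance stays equal while power drops, perf-per-watt improves in
# exact proportion to the power saving. The wattages below are invented.
perf = 1.0                          # normalised previous-gen performance
old_power, new_power = 100.0, 70.0  # hypothetical load power in watts
gain = (perf / new_power) / (perf / old_power)
print(f"perf/W improvement at equal performance: {gain:.2f}x")
[/CODE]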
 
Doesn't matter to me as long as Apple stubbornly refuses to put those into at least one version of the 13" MacBook Pro.

Agreed. The 13-inch has never been a "Pro" as far as I'm concerned. They really should name it the "13-inch MacBook Mid-Level", but that doesn't roll off the tongue like "Pro", so they'll continue to use misleading names for their product lines.
 