This is cool and all but it will never impact me :(

The Retina MacBook Pro 15" is out of my budget range, and that's without the discrete graphics, which I believe is a $500 jump in price.

I'm looking at a low- to mid-range 13" rMBP... I'll be stuck with integrated graphics. Not that it would really hit me; I'm not much of a PC gamer. I don't do anything that requires something nuts like a GTX 980 Ti, for example, hah..

I'm just far more comfortable knowing I have more capable graphics, in case the occasional small game comes up or something.
 
Anyone doing professional audio or video should not touch Apple hardware (without being paid to), because you cannot depend on Apple to stick to anything other than satisfying the teen crowd. The Mac Pro is 3 years old with no upgrade or update. There's no hardware between the Mini and the Mac Pro. No top-end laptops. I could go on. When these people move away from Apple, it no longer makes sense to stay in the Apple ecosystem, except of course for the phone, which is now a commodity with limited growth. I am one of the few professionals I know in my field who still uses Apple hardware; the others have moved to Windows.
Funny. I work at ESPN, all Macs in video production. But hey, they're not pros, apparently.

And you know what's funny? I've never heard one video editor ever mention anything about the Macs. Hell, for all they know their screens are plugged into toasters. You know why they don't care? They're busy getting content on air for over 30 channels and millions of viewers each and every day. You know, working.

Don't confuse your tech snobbery with professionalism.
 
How are Nvidia GPUs “useless pieces of garbage”?
It's nice to see Apple continuing to make MacBook Pros with dedicated GPUs, but AMD isn't the best option. AMD might cost less, but it generates a lot more heat than a comparable Nvidia chip, and that's a big problem in a laptop, especially a thin laptop like the Retina MacBook Pros. The problem is only worsened by the lead-free solder, which eventually shorts out the logic board. Otherwise, external graphics is the only route left for MacBook Pro users.

Apple's not concerned about the long term. As long as the chip is cool enough during normal use over a 2-3 year period, they will cover any lawsuit/replacement-program costs that crop up, since they will likely still save money by going with the cheaper chip to begin with.
I have preferred AMD GPUs for a number of years now. In general, they have been much more power efficient than comparable Nvidia GPUs.

It’s nice to see Apple continuing to make MacBook Pros with dedicated GPUs, but AMD isn’t the best option. AMD might cost less, but it generates a lot more heat than a comparable Nvidia chip and that’s a big problem in a laptop, especially a thin laptop like the Retina MacBook Pros.

But who is more right?
 
I'm willing to believe that Apple will use a discrete GPU in the 2016 iMacs, but I'm very skeptical that Apple would use a discrete GPU in any 2016 MacBook Pro. Apple have been incrementally transitioning from discrete to integrated GPUs in the MacBook Pro line for years. That process is all but complete.
 
Contrary to popular belief, AMD never left the GPU game. They in fact have very good GPUs with slightly different strengths than Nvidia, and often far better performance than Nvidia.

When building my Oculus Rift gaming beast, I avoided AMD graphics cards for two reasons: higher power requirements and temperatures at the same performance level. But seeing that these are basically budget GPUs/laptop chips, they're fine for what they are. I built an HTPC using a low-cost AMD chip with integrated graphics, and it does fine with 1080p video. But 4K, I don't know.
When I upgrade my Nvidia 970 this year, we will see who has the best next-gen video card: Nvidia or AMD.
 
Anyone doing professional audio or video should not touch Apple hardware (without being paid to), because you cannot depend on Apple to stick to anything other than satisfying the teen crowd. The Mac Pro is 3 years old with no upgrade or update. There's no hardware between the Mini and the Mac Pro. No top-end laptops. I could go on. When these people move away from Apple, it no longer makes sense to stay in the Apple ecosystem, except of course for the phone, which is now a commodity with limited growth. I am one of the few professionals I know in my field who still uses Apple hardware; the others have moved to Windows.

The margins aren't high enough in the market pros and semi-pros are in.
They are the ones who buy based on cold hard facts. They see HP make a desktop that makes their workflow faster for the same money, they shop HP.

Apple doesn't want anything to do with them, because they simply aren't thrilled by Apple's ever-growing need to maximize margins.

If I were you, I'd quickly find an exit strategy, if you don't already have one.
The writing is on the wall:

Mac Pro - 2 years, no update. (Before that, they didn't even bother to adapt the old Mac Pro to EU standards and actually halted sales until the new Mac Pro launched many months later. That's unacceptable in the professional market.)

Aperture - Use Apple Photos! Yeah right...

Garage Band getting Logic Pro features and loops - Hey, I remember that... Shared libraries between Aperture and iPhoto anyone?

Well, the list goes on... Long story short: I'm amazed by your loyalty, I truly am.
Apple got through its toughest times only because of professionals who, back then, could at least still get top-notch gear from Apple.
I'm sure a mass exodus would have spurred innovation on the PC side to mature at a faster rate, too, but yes indeed, the fashion crowd is where it's at today.
The casual home user who benchmarks their new machine by how amazing the holiday pictures on Facebook look.
Now I'm not one to sneeze at less demanding users, but what I find irritating is how they get absolutely screwed these days price-wise.
Apple's long left the territory of reasonable surcharge for superior experience, especially when you have to deal with the hardware outside of Apple's warranty.

Another day, another rant about Apple's downfall from me (QUALITY-wise, not MARKETSHARE or financial performance-wise, many people seem to only see Apple's financial health as a parameter for Apple's well-being, trusty sheeple they are!)

Glassed Silver:mac
 
Too bad, Nvidia is really doing badly as a company... the future of Nvidia is bleak... this is probably why Apple took the AMD route. AMD now has ties to Microsoft, Sony, Intel, and Apple.
 
I'm willing to believe that Apple will use a discrete GPU in the 2016 iMacs, but I'm very skeptical that Apple would use a discrete GPU in any 2016 MacBook Pro. Apple have been incrementally transitioning from discrete to integrated GPUs in the MacBook Pro line for years. That process is all but complete.

I remember this comment from quite a while back. I'm eagerly awaiting Apple's next move on this matter too! I still hope you're wrong :p
 
I'm willing to believe that Apple will use a discrete GPU in the 2016 iMacs, but I'm very skeptical that Apple would use a discrete GPU in any 2016 MacBook Pro. Apple have been incrementally transitioning from discrete to integrated GPUs in the MacBook Pro line for years. That process is all but complete.

That may be so, but I doubt it. I think Apple would lose a lot of high-paying customers if they did not have at least one portable machine capable of decent graphics processing. The whole point of the MBP (or at least the main point) over the MB is computational power, and the vast majority of the computational power in these kinds of computers is in the GPU.

I find it so amusing how much people can care about who makes their graphics card, NVIDIA or AMD. For something hidden away inside a computer, with no outward effect other than relative performance, why does it matter?

This is why I love WCCFTech so much. The level of obscenity and hostility in the comments sections over why red/green is so much better than green/red, and that people who like green/red are morons, is hilarious.

It makes sense that people should do what Apple does: choose what gives the best performance, at the lowest power, for the best price. What colour the sticker is should not be relevant at all.
 
Don't confuse your tech snobbery with professionalism.

I'm reminded of the hundreds of comments I've seen on this site along the lines of "Macs are not pro machines because they are no good for gaming." I've never seen any reliable data about how many pro gamers there are. Real pros just want to get their work done. Macs allow people to concentrate on real work, rather than messing about with administration of the computer.

I find it very difficult to believe that pros know or care which chips are inside a computer. It either does the job satisfactorily or it doesn't.
 
In a nutshell...
 
Funny. I work at ESPN, all Macs in video production. But hey, they're not pros, apparently.

And you know what's funny? I've never heard one video editor ever mention anything about the Macs. Hell, for all they know their screens are plugged into toasters. You know why they don't care? They're busy getting content on air for over 30 channels and millions of viewers each and every day. You know, working.

Don't confuse your tech snobbery with professionalism.

We use them at the BBC; that does not mean they are the best tool for the job anymore. Ironically, none are 2013 or newer. The editors admit they are working with old hardware because the department is so heavily bought into Apple for its workflow. There was a lot of resistance to moving away from Macs, as die-hard management supporters still had faith that Apple would support the pros. The 2013 Mac Pro, plus an awful update schedule and a very uncertain future, is making that choice very easy. A naive person might walk past and believe all is fine; on the contrary, they are all about to get replaced in the near future, when the 2012 models become too slow. Many places are holding onto their 2012 Pros in the hope that Apple will bring back the cMP.

Don't confuse a major media organisation that still uses Macs with being in a healthy/happy place; it takes a long time for these big media companies to jump ship. Apple shot themselves in the foot with poor support and updates for the pro range.

One must not confuse brand loyalty with professionalism.

I have a 2013 Mac Pro, great for myself and home use, but my PC kills it at video editing. Just because I love the brand does not mean I am going to waste my time when I can get a job done faster on my PC.
 
I'm reminded of the hundreds of comments I've seen on this site along the lines of "Macs are not pro machines because they are no good for gaming." I've never seen any reliable data about how many pro gamers there are. Real pros just want to get their work done. Macs allow people to concentrate on real work, rather than messing about with administration of the computer.

I find it very difficult to believe that pros know or care which chips are inside a computer. It either does the job satisfactorily or it doesn't.

Gaming performance can translate into work performance. If a GPU does really good anti-aliasing for a game, it will be good at dealing with jaggy lines in other apps. The color rendering that makes cut scenes beautiful makes other video look nice as well.

Nvidia offers CUDA. If an application supports it, it can shift processing onto the GPU. In short, you can get more work done much faster. The science apps I use support CUDA, hence my interest. When I run an analysis, it ties up the MBP for quite a while. I'd love to shift some of that processing to the GPU to speed it up; since it just sits there doing nothing, I may as well put it to work.
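That kind of offload is easy to sketch. Here's a minimal, hypothetical Python example (CuPy is just one CUDA wrapper library, and the function and sizes are made up for illustration) that uses the GPU when CUDA is available and falls back to NumPy on the CPU otherwise:

```python
import numpy as np

# Illustrative sketch: run the same array code on GPU (via CUDA/CuPy)
# when an Nvidia card is present, otherwise on CPU with NumPy.
try:
    import cupy as xp  # GPU path: arrays live in GPU memory
    on_gpu = True
except ImportError:
    xp = np            # CPU fallback: same array API, runs everywhere
    on_gpu = False

def heavy_analysis(n=512):
    """Stand-in for a GPU-friendly workload: a big matrix product."""
    a = xp.arange(n * n, dtype=xp.float64).reshape(n, n) / n
    b = xp.eye(n, dtype=xp.float64)
    c = a @ b  # executes on GPU or CPU depending on which xp was imported
    # Copy the result back to host memory if it lives on the GPU
    return xp.asnumpy(c) if on_gpu else c

result = heavy_analysis()
print(result.shape)  # (512, 512)
```

The point is that the same numerical code path can use whichever processor is idle; libraries like this are how science apps put an otherwise unused GPU to work.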

GPUs tend to break down into three main groups: generic, gaming, and content creation.

Generic is... generic. No bells and whistles. Does neither gaming nor content creation great. Which is fine; it usually has the benefit of being cheaper.

Gaming offers more over generic. Yes, it runs games well, and unless you're a paid content creator, it does a very decent job at content creation too. It's a workhorse, IMO: works hard, plays hard.

Content creation: great for, well, content creation. It does not do gaming as well, but it's not supposed to, which is okay; you see these generally in purpose-built rigs. You don't buy one to get max FPS in a game.

The issue is that Apple charges gaming and/or content-creation prices, but we get, by and large, generic performance. Its only saving grace to some is that FCP runs well on it. What if you don't use FCP? Or what if, like me, you use FCP but have growing concerns about it?

I am of the opinion it's a matter of when, not if, FCP will be executed on stage at WWDC, since it's being treated exactly the same way Aperture was. Aperture was killed at a WWDC out of the blue: no warning, no rumor. It was on a one-year patch cycle, just enough to show Apple seemed to care a little. Then a .45 hollow point to the back of the head, just like that.



That, and you have to define pro user needs. I am all over the place: post-processing pics and video, programming as an educational hobby, and of late I am trying out something called Docker. Imagine virtualized computers without the overhead of making and running a virtual machine; that's Docker. I'm currently testing in a Linux VM in Parallels before I go more native on the Mac OS side of the MBP. The great geek moment of last night was being in three systems at one time: the host Mac OS, a Linux VM running Docker, running another Linux OS in a container. A machine in a machine in a machine... stuff that really gets geeks going.

Now, at around 23:00, to wind down from this stuff, I like to play a game. It'd be nice if $2,500 got me that, as best as possible. Not even the latest AAA title: I can get graphics stutter in Diablo 3 (a 2012 release). Run a higher-end Greater Rift, make a massive mob die, and watch the lock-up. It's not a lag spike; I know what that is (like many, I lost hardcore characters to it, lol). It's a GPU made in 2015, gimped by drivers to stay within thermal limits, saying "yep, give me a second... 2012 is kicking my ass, let's slow it down a bit."
 
I have no doubt that the new iMac 5K will have a great new AMD GPU; honestly, the only reason to go Nvidia at this point would be the new 980 (non-M), which I highly doubt cooling-wise.
But please, Apple, give us an updated display with FreeSync, and either Thunderbolt 3 with eGPU support or, better, give us back that sweet Target Display Mode that you took away from us so uncalled-for!

TOD
 
I have no doubt that the new iMac 5K will have a great new AMD GPU; honestly, the only reason to go Nvidia at this point would be the new 980 (non-M), which I highly doubt cooling-wise.
Nvidia is releasing Pascal (GP104/106) this summer at roughly the same time as the AMD Polaris 400 series. I expect that they will be comparable. I'm content to wait and see, and post again next year as Captain Hindsight.
 
Funny. I work at ESPN, all Macs in video production. But hey, they're not pros, apparently.

And you know what's funny? I've never heard one video editor ever mention anything about the Macs. Hell, for all they know their screens are plugged into toasters. You know why they don't care? They're busy getting content on air for over 30 channels and millions of viewers each and every day. You know, working.

Don't confuse your tech snobbery with professionalism.

Totally agree. I use a circa-2009 shitbox PC at work; in the last 3 years I've delivered around $30 million worth of enterprise-level projects in web and mobile. They are upgrading me to a 5K iMac next week, but honestly, I'm more annoyed that I have to move my entire workload over, and possibly deal with setup issues, than excited that I'm getting a much faster computer.
 
Wow, so much hate because the Mac Pro didn't get an update last year for 15% more performance.
In the MacBook update threads, a 15-20% boost is all about hate... but here, hey, we change our minds because it's a Mac Pro?
Ferraris aren't changed as quickly as Toyotas; it's hard to do an update that matters in just one year.
 
Ironic that Apple's testimony of a GPU's prowess is 1K gaming, when the iMac is making its bones at 5K. I'm not even going to address gaming. If Apple wants that market it should just flat out commit Pippin, I mean AppleTV, to it.

Isn't it AMD who does that testimony, not Apple?
 
Omg, please... no. No more AMD!! The fact that they advertise the new GPUs on a per-watt basis means that the real (absolute) performance improvement will be ****. Can't wait for Thunderbolt 3 and external GPUs. Never again, AMD!!!

EDIT: Lol, did not know that swearwords are automatically masked. :D
 
When building my Oculus Rift gaming beast, I avoided AMD graphics cards for two reasons: higher power requirements and temperatures at the same performance level. But seeing that these are basically budget GPUs/laptop chips, they're fine for what they are. I built an HTPC using a low-cost AMD chip with integrated graphics, and it does fine with 1080p video. But 4K, I don't know.
When I upgrade my Nvidia 970 this year, we will see who has the best next-gen video card: Nvidia or AMD.
You do understand that for VR you need the lowest possible latency, and the only GPUs that will bring that type of latency are AMD's GCN parts?

Also, because Maxwell GPUs were so wide on the front end (64 ROPs in a 2048-CUDA-core GPU), people were fooled by the nature of the API into thinking that a 4.6 TFLOPs GPU can be as fast as a 6 TFLOPs GPU while using much less power. The explicit APIs that came from Mantle (DirectX 12, Vulkan, LiquidVR) show that when you lift off that bottleneck, AMD GPUs jump ahead of Nvidia. So currently you compare the R9 390X to... the GTX 980 Ti. Both GPUs have a similar thermal envelope and similar compute power.

The same goes for the R9 380X vs the GTX 970 and GTX 980. AMD GPUs were bottlenecked by the serial nature of DX11, but with the newest drivers there has been a lot of change.
 