Why does Apple only care about money???
The Asus ZenBook Pro UX501 is $1000 cheaper but has better specs than the MacBook Pro 2015.
That's pretty sexy and a nice find.
So is there any reason that Apple did not go with Maxwell NVIDIA this year?
Because they went with AMD. I imagine the reasons were far too numerous to list here, and that only the Apple employees involved in that decision know what they are and how the pros and cons worked out.
Despite the NVIDIA love going on here (I don't game, so I couldn't care less about them), the latest Fury AMD cards are benching faster in just about everything. Their new architecture with embedded RAM seems to be a bit of a beast with massive bandwidth capabilities. Once these get to the mobile stage, we could have some wonderful graphics cards in the rMBPs and iMacs.
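For context on why that embedded RAM (HBM) is such a bandwidth beast: peak memory bandwidth is just the bus width times the per-pin data rate. A quick illustrative sketch using the published figures for the Fury X's HBM1 interface versus a 256-bit GDDR5 card like the GTX 980 (treat the numbers as the public spec-sheet values, not measurements):

```python
def peak_bandwidth_gbps(bus_width_bits, data_rate_gbps_per_pin):
    """Theoretical peak memory bandwidth in GB/s:
    (bus width in bits / 8 bits per byte) * data rate per pin."""
    return bus_width_bits / 8 * data_rate_gbps_per_pin

# Fury X: 4096-bit HBM1 interface at 1 Gbps per pin
hbm = peak_bandwidth_gbps(4096, 1.0)
# GTX 980: 256-bit GDDR5 interface at 7 Gbps per pin
gddr5 = peak_bandwidth_gbps(256, 7.0)

print(hbm, gddr5)  # 512.0 224.0
```

So even at a much lower clock per pin, the enormously wide HBM bus gives more than double the theoretical bandwidth, which is where the "massive bandwidth" claim comes from.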
I have an Asus notebook with an already outdated Core i5-2410M processor, 8 GB of RAM, and a GT 540M GPU. I used it from the very first day until the day I left home to come here for a higher degree. I'm gonna go ahead and provide some of my experiences and advice here:
Just use what you like. If you're into Windows, that's cool. If you prefer the cost effectiveness of a Windows notebook in terms of what you get, that's cool too. Macs aren't better than PCs by default, PCs aren't better than Macs by default. No need to defend purchases, build quality or other standpoints of what makes up a computer.
That Asus may last you a year, or it may last you 3 or even 4 - it all comes down to what you prefer, and what you prefer to do with your money. It's your computer after all - no need to bash or defend what someone else is considering or is happy with.
I, for one, in general with a few exceptions, believe the longevity of a computer is influenced mostly by how it is taken care of, not by whether it's Apple, Asus, Acer, or what have you. I am aware of the idea and reality of build quality, but I had an Asus notebook for 3 years that I paid $650 for, a Lenovo IdeaPad (also $650) that is still being used within my family 3 and a half years on, and a Hannspree (that's right, who here has ever heard of Hannspree?) netbook that I paid $250 for, which I used for a while and another family member is using now; that was purchased 3 years ago as well.
There is no default answer to this. Macs don't last X amount of years and Windows notebooks don't last Y amount of years - and, as always ... hardware specifications are not a sole reason to purchase a notebook. There are many other factors, just do your homework and find what you prefer.
Sad for your experience. I'm typing this on a cheapo Acer Windows 8.1 laptop. Within a year of purchasing it, the DVD/CDR drive failed; the number keys above the keyboard stopped working (three through six do not work, although the function keys do); and I have a long Ethernet cable snaking across my kitchen floor because the wireless network adapter stopped working. All within a year.
That's the thing: while I mostly agree about Apple's build quality, many people here who tout it typically buy new laptops every three years anyway. To me, I'd rather spend < $1,000 every 3 years than > $2,000 every 3 years.
As for Apple quality, my faith in them has been shaken by the issues with the dGPU. I still think Macs are great computers, and it's hard for me to imagine getting anything else, yet if I'm spending over $2,000 on a computer I'd like the peace of mind that the dGPU won't blow out in 3 years or that the anti-reflective coating won't start flaking off.
Fury has almost no overclocking headroom and is still ever so slightly slower than the equally big 980 Ti chip, which has quite serious overclocking headroom. Fury is much better than the 290X/390X, but it still needs more power than Maxwell despite the more power-efficient and faster memory. It is also water cooled.
Well, if we go back to 2013, you may note that in June at WWDC 2013 they announced the Mac Pro with dual AMD graphics. This was the start of Apple turning away from NVIDIA.
In October that year they released the MacBook Pro 15" with 750M graphics, but this was the same chip as the 650M: both Kepler, with the same pin-out on the chip. All Apple had to do was swap the chips over. Very little change.
Fast forward to the iMac Retina: AMD again. Now we have the 2015 rMBP, also AMD.
I think what we are seeing here isn't really that Apple is choosing AMD because it's better, but because they are cultivating a closer relationship with AMD when it comes to graphics. That close relationship has resulted in the D300, D500 and D700 for Apple's Mac Pro, which are still Apple exclusives to this day (although they are based on the HD 7970), and they have been first to get the M370X too.
NVIDIA seems to be much less flexible; they won't be dictated to. They would not, for example, give Apple GTX 980s or 780 Tis (back then) rebranded as Quadro D780s. This is what AMD did for Apple.
So again, I don't think it has anything to do with performance. It's about a relationship the two companies are cultivating, and it's pushing NVIDIA out of Apple's systems. I think that's detrimental, as NVIDIA currently has the fastest parts in every thermal envelope.
It turns out, you may not be right. http://forums.anandtech.com/showpost.php?p=37519644&postcount=201
They can close in on Nvidia on desktops, where water cooling works and total power consumption is not as important, but on notebooks power efficiency is everything. And Fury cannot beat Maxwell; the M370X is still much older tech than Fury.
It is not about loving Nvidia or AMD, but about objective results, in which Maxwell quite simply is the best around right now.
I thought the same thing, but if you actually look at Maxwell OpenCL benchmarks, all of NVIDIA's cards stomp AMD.
Really: look at the 970 or 980 or Titan or Titan X or 980 Ti. Way above all the AMD cards in OpenCL benchmarks. Same situation with the mobile versions, the 950M and 960M.
The older NVIDIA Kepler cards were terrible at OpenCL, absolutely rubbish: 70-90% slower than AMD. But the Maxwell cards are 10-50% faster than AMD in the same market segments.
Even I was surprised to find this out when I investigated benchmarks recently. Check this out: https://i.imgur.com/wiK5ZSG.png
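One thing worth keeping straight in these comparisons: "X% slower" and "Y% faster" are not symmetric, which is why the Kepler and Maxwell figures above can look inconsistent at first glance. A small sketch of the conversion (the scores below are made-up illustration values, not from any benchmark):

```python
def percent_faster(a_score, b_score):
    """How much faster A is than B, in percent (higher score = faster)."""
    return (a_score / b_score - 1) * 100

def percent_slower(a_score, b_score):
    """How much slower A is than B, in percent."""
    return (1 - a_score / b_score) * 100

# If card A scores 1500 and card B scores 1000 in the same OpenCL bench:
print(percent_faster(1500, 1000))   # -> 50.0 : A is 50% faster than B
print(percent_slower(1000, 1500))   # -> ~33.3: B is only ~33% slower than A
```

So a card that is "50% faster" than another is the same card that is "33% slower" seen from the other side; keeping the direction straight matters when comparing claims from different posts.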
Have you actually used AMD and Nvidia GPU for anything like Final Cut Pro?
They are not faster in REAL WORLD TASKS! Use a Maxwell GPU for Final Cut Pro and you will throw it away and go back to an AMD solution. I have tested the GTX 970 and R9 280X in Final Cut Pro, and the AMD GPU was 40% faster!
Sony Vegas Pro benchmarks show only a glimpse of the difference between AMD and Nvidia GPUs in transcoding, but it's on a pretty big scale.
No. Not at all.
http://scr3.golem.de/screenshots/15...uxmark-v3-(complex-scene,-gpu-only)-chart.png
http://s28.postimg.org/a8bdgk7pp/Capture.png
The second one is from the best review site right now: computerbase.de.
And the performance in Final Cut Pro is not due to optimization for AMD hardware. They are simply faster in OpenCL in real-world tasks. If you want to believe that FaceDetection is the key OpenCL benchmark, then go ahead: try transcoding a film using face detection.
Let's end it here.
I did not post that link because of Fury. You have Grenada/Hawaii GPUs and the GM204 there. Also, Grenada turns out to be faster in some tasks than the GTX Titan X.
Also, Luxmark v2.1 is an old benchmark. Luxmark v3 is the same but newer and more complex. It's like comparing scores from 3DMark Vantage with 3DMark Fire Strike: the second one is newer, more complex, and shows a bit more than the old one.
Third: I'm completely calm. I just don't buy everything that is in reviews. If you want to believe that Maxwell GPUs are better in compute, which they are not, that's up to you. I know that not everything that gets into reviews is true, and it quickly becomes outdated with every new driver version and technology that comes along.
You might want to take a closer look. The score jumped from 14K default to 17K, while Maxwell OC hits 19.4K (while being air cooled).
Check it out yourself. Before OC the GPU was at a 12K overall score and a 15.5K graphics score.
Just 100 MHz, and the GPU starts to FLY! The score is higher than anything on the market right now.
A 280X is, on average, a 20-30% slower card than a 970 in 3D work. OpenCL work and a stress test are not the same thing: in theory you could get such load scenarios, but in reality most CL benches don't come anywhere near it.
If you have a reference Nvidia GPU, then yes, it will use less power than an AMD GPU. However, a reference GTX 970 with the BIOS clock cap will be 3 times slower than an R9 280X. If you have a non-reference GTX 970, there is no clock cap in the BIOS, and it will be only 40% slower. The problem is: it will draw 270 W of power in OpenCL work.
http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-970-maxwell,3941-12.html Here you have proof.
The 960M is 60% faster than the M370X. I don't think it's overrated, but rather a card preferred over the lower-end AMD chip Apple chose for this refresh. The 950M is also a much better card. I do agree with the complaints: Apple is charging the same price as always, and while the new card is more powerful, it's a step down in overall class.
I would not go that far with power consumption. The GTX 960M should be within a 35-45 W power draw envelope. And looking at the notebookcheck review of the MBP, the power consumption looks pretty high, because it can exceed even the 85 W limit of the power supply by 6 W. So I would not say that it's more power hungry than the M370X.
Also, I would not say it's 60% faster than the M370X, because performance in Final Cut Pro will be much higher on the AMD chip than on any Maxwell GPU available within a 45 W power budget.
In games, the Nvidia GPU could possibly be even 60% faster. But games are not everything.