Could anyone enumerate some points why I should prefer an Iris Pro over an AMD or NVIDIA GPU?
Things like heat output and so on :)
 
Iris Pro is built into the CPU, whereas an AMD or NVIDIA dGPU is an additional component that generates more heat. Dedicated graphics cards are more than 2.5x faster than integrated graphics, though.

Maybe we'll see a Xeon CPU (no Iris Pro) paired with Polaris?
 

I read an article that compared the Iris Pro 580 to an NVIDIA 945M. Its performance is reportedly equal to an NVIDIA 945M!
 
Iris Pro has certainly improved significantly, but the chips are still no match for a modern dedicated GPU. Iris Pro 580 can run at ~1152 GFLOPS. The supposedly "leaked specs" for Polaris 11 put it somewhere at ~2500 GFLOPS.
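For anyone curious where that ~1152 GFLOPS figure comes from: peak throughput for a GPU is usually estimated as ALU count x 2 FLOPs (one fused multiply-add per cycle) x clock speed. A quick back-of-the-envelope sketch, using the commonly cited Iris Pro 580 figures (72 EUs with 8 ALUs each, ~1.0 GHz boost; approximate numbers, not an official spec):

```python
# Rough peak-FLOPS estimate: ALUs * 2 (one fused multiply-add per cycle) * clock in GHz
def peak_gflops(alus, clock_ghz):
    return alus * 2 * clock_ghz

# Iris Pro 580: 72 EUs x 8 ALUs per EU, ~1.0 GHz boost (approximate figures)
iris_pro_580 = peak_gflops(72 * 8, 1.0)
print(iris_pro_580)  # 1152.0
```

By the same formula, the rumored Polaris 11 number implies roughly twice the shader throughput at a comparable clock.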
 
I would be satisfied if the performance were truly equal to an NVIDIA 945M.
 
If there are no MacBooks, then OK, we expected it; if there actually are MacBooks, imagine the surprise!

I kinda took the same attitude with my tests back in grade school.
"OK, I screwed up completely, I'm never going to pass."
The bad part was, most of the time I was right...
:confused:

new rMBPs announced at WWDC confirmed?

Nah, Kaby Lake is more likely.
:D
 
Well, I guess as long as my 2011 11" MBA is somehow working out, I'm in no rush to upgrade it?

Would love to have some more RAM and CPU power for my virtual machines, though. Or just for a little round of League of Legends, etc.
Or to have a nice screen. Or a bigger screen. Or faster ports (USB 2 is slow as ****). Or WiFi ac.

Jesus, Apple. Just bring out the new devices. I won't buy that old-school 13" 2015 rMBP. There's no reason to buy such an outdated machine at such a ridiculous price.

If their new machines aren't released until the end of the year, with Skylake (!), I guess I'm going to buy some Windows machine.
 
Would the 945M be around the same as the Surface Book's GPU? If so, it's not bad, but not good either.

The heat would be equal to zero!? Have you ever played on anything above a 940M? Every notebook feels like lava. Especially if the rMBP gets the best Skylake CPU at this thinness! Can't wait to see this damn MacBook...
 
The heat wouldn't be equal to zero.
Anyway, I would rather have both: the iGPU for mobility when running on battery, and the dGPU when powering external displays (and hooked up to a charger).
For sure, if you compare it to a dGPU device. Battery life and thinness especially profit from the iGPU :).
 
Iris Pro has certainly improved significantly, but the chips are still no match for a modern dedicated GPU. Iris Pro 580 can run at ~1152 GFLOPS. The supposedly "leaked specs" for Polaris 11 put it somewhere at ~2500 GFLOPS.

I remember watching the Stevenote on YouTube where he introduced the PowerMac G4. He touted it as the world's first personal supercomputer because it was capable of running at 1 GFLOP. Computing sure has advanced in a lot of mind-boggling ways, hasn't it? Yeesh!
 
I wonder how long until there's no need for a dGPU because iGPUs are just as good.
The new PS4.5 (over 4 TFLOPS) comes in October, and the next Xbox (over 6 TFLOPS) will debut in 2017, so if you like to play AAA games with graphics on high, probably not anytime soon.
 
For sure if you compare it to a dGPU device. Especially the battery life/thinness profits from the iGPU :).
The iGPU is more like a parasite that Intel forces onto the CPU. Check out this die shot of a Skylake chip. The GPU part likely has lower perf/watt than a dedicated graphics card; it just runs at a lower wattage (and is almost 2.5x slower). Apple could include a low-wattage dGPU with better performance than the iGPU, but they might as well use the iGPU for "free," since it comes with the CPU anyway.

[Image: Skylake die shot (77a.jpg)]



A random quote from a random article from 2014:
"Nvidia claims to have an even bigger competitive advantage when it comes to power consumption—a critical issue when you’re talking about a notebook PC running on a battery. According to Nvidia, if you play the epic role-playing game Elder Scrolls at 1080p resolution with medium image quality on a notebook with an Intel Core i5-4210U, you’ll get a frame rate of just 10 frames per second, while the CPU and integrated GPU will consume 15 watts of power between the two components. Play the game at the same resolution and image quality on a notebook with the same CPU, but with Nvidia’s GeForce GTX 840M processor, and the company claims you’ll get 30 frames per second while the two components consume 17 watts—three times the frame rate at an added power cost of just two watts."
http://www.pcworld.com/article/2107...l-new-geforce-800m-line-of-notebook-gpus.html
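Running the numbers from that quote makes the gap even starker in frames-per-watt terms (my own back-of-the-envelope arithmetic, not from the article):

```python
# Frames per watt, using the figures Nvidia quoted in the PCWorld article
igpu_fps, igpu_watts = 10, 15      # Core i5-4210U with integrated graphics
dgpu_fps, dgpu_watts = 30, 17      # same CPU plus a GeForce GTX 840M

igpu_eff = igpu_fps / igpu_watts   # ~0.67 frames per watt
dgpu_eff = dgpu_fps / dgpu_watts   # ~1.76 frames per watt
print(round(dgpu_eff / igpu_eff, 2))  # 2.65 -> the dGPU setup is ~2.6x more efficient
```

So by Nvidia's own (obviously self-serving) numbers, the dGPU delivers not just more frames, but more frames per watt.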
I remember watching the Stevenote on YouTube where he introduced the PowerMac G4. He touted it as the world's first personal supercomputer because it was capable of running at 1 GFLOP. Computing sure has advanced in a lot of mind-boggling ways, hasn't it? Yeesh!
Wait until you see the new ~22 TFLOP nMP :D
 

However, according to this article, we give up 20 frames per second to save 2 watts of power?
The article was written in 2014, and Intel has improved efficiency since then. You can't compare an iGPU from 2014 with an iGPU from 2016.
 
Just as you can't compare a dGPU from 2014 with a dGPU from 2016. Especially considering dGPUs have gone from 28nm to 14/16nm, whereas iGPUs have gone from 14nm to 14nm.
Um, did I just find a bug in the forum? I think I accidentally posted your quote with nothing attached, and now it's glitched into thinking that you double posted :p
 
Just think of how much faster our notebook processors would be if Intel were to get rid of the iGPU (hypothetically, of course, because it will never happen). It looks like there would be a lot more room on the die, enough for about four more cores. Perhaps we would see actual CPU performance increases. I'm not sure how that would work out for heat and power consumption, though. Perhaps it wouldn't be feasible on a technical level.
 

You are right, twice :)

I edited my post; that's the reason for the double post :)
 
Guys... I give up. I don't think they'll announce it at WWDC. I'm sticking with my 2015 15-inch MBP.
 
As opposed to buying... another 2015 15" rMBP? Unless your laptop is dying because you threw it in a swimming pool or something, what difference does the release date make? I'll be disappointed if it doesn't come out, though, because I advised so many people in December to wait until WWDC for those ~amazing~ graphics improvements :(

Apple has no excuse not to release a new rMBP in June/July. All of the parts should be available, so I'll be very annoyed if they delay it simply because they're waiting for the new macOS release.
 