> new rMBPs announced at WWDC confirmed?

You mean new MBPs might get announced at WWDC?
> new rMBPs announced at WWDC confirmed?

No. Mark Gurman said WWDC will be software only, which is sad.
> new rMBPs announced at WWDC confirmed?

Finally a confirmation from a reliable source that MBPs are announced at WWDC. Yay!
Iris Pro is built into the CPU, whereas an AMD or NVIDIA dGPU is an additional component that generates more heat. Dedicated graphics cards are >2.5x faster than integrated ones.
Maybe we'll see a Xeon CPU (no Iris Pro) paired with Polaris?
> Iris Pro has certainly improved significantly, but the chips are still no match for a modern dedicated GPU. Iris Pro 580 can run at ~1152 GFLOPS. The supposedly "leaked specs" for Polaris 11 put it somewhere at ~2500 GFLOPS.

I read a newspaper article which compared the Iris Pro 580 to an NVIDIA 945M. The performance is equal to a 945M!
I would be satisfied if the performance is truly equal to an NVIDIA 945M.
If there are no MacBooks, then OK, we expected it; if there actually are MacBooks, then imagine the surprise!
Would a 945M be around the same as the Surface Book? If so, it's not bad, but also not good.
> The heat wouldn't be equal to zero.

For sure, if you compare it to a dGPU device. The battery life and thinness especially benefit from the iGPU.
Anyway, I would rather have both: the iGPU for mobility when running on battery, and the dGPU when powering external displays (and hooked up to a charger).
Iris Pro has certainly improved significantly, but the chips are still no match for a modern dedicated GPU. Iris Pro 580 can run at ~1152 GFLOPS. The supposedly "leaked specs" for Polaris 11 put it somewhere at ~2500 GFLOPS.
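Putting those two numbers side by side makes the gap concrete. A rough sketch in Python, taking the ~1152 and ~2500 GFLOPS figures above at face value (both are approximate or rumored, not measured benchmarks):

```python
# Rough FP32 throughput comparison using the figures quoted above.
# Both numbers are approximate/rumored, not measured benchmarks.
iris_pro_580_gflops = 1152.0  # Iris Pro 580 peak figure from the post above
polaris_11_gflops = 2500.0    # "leaked specs" figure from the post above

ratio = polaris_11_gflops / iris_pro_580_gflops
print(f"Polaris 11 would offer ~{ratio:.2f}x the raw throughput")  # ~2.17x
```

On paper that's a bit over 2x; real-world gaps depend on drivers, memory bandwidth, and thermals.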
> I wonder how long until there is no need for a dGPU because iGPUs are just as good.

The new PS4.5 (over 4 TFLOPS) comes in October, and the next Xbox (over 6 TFLOPS) will debut in 2017, so if you like to play AAA games with graphics on high, probably not anytime soon.
> For sure if you compare it to a dGPU device. Especially the battery life/thinness profits from the iGPU.

The iGPU is more like a parasite that's forced onto the CPU by Intel. Check out this die shot of a Skylake chip. The GPU part likely has less perf/watt than a dedicated graphics card, but it just runs at a lower wattage (and almost 2.5x slower). Apple could include a dGPU with low wattage and better performance than the iGPU, but they might as well use the iGPU for "free," as it comes with the CPU anyway.
> I remember watching the Stevenote on YouTube where he introduced the PowerMac G4. He touted it as the world's first personal supercomputer because it was capable of running at 1 GFLOP. Computing sure has advanced in a lot of mindboggling ways, hasn't it? Yeesh!

Wait until you see the new ~22 TFLOP nMP!
Random quote from random article from 2014
"Nvidia claims to have an even bigger competitive advantage when it comes to power consumption—a critical issue when you’re talking about a notebook PC running on a battery. According to Nvidia, if you play the epic role-playing game Elder Scrolls at 1080p resolution with medium image quality on a notebook with an Intel Core i5-4210U, you’ll get a frame rate of just 10 frames per second, while the CPU and integrated GPU will consume 15 watts of power between the two components. Play the game at the same resolution and image quality on a notebook with the same CPU, but with Nvidia’s GeForce GTX 840M processor, and the company claims you’ll get 30 frames per second while the two components consume 17 watts—three times the frame rate at an added power cost of just two watts."
http://www.pcworld.com/article/2107...l-new-geforce-800m-line-of-notebook-gpus.html
However, according to this article, we'd be giving up 20 frames per second just to save 2 watts of power?
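For what it's worth, the trade-off in that quoted paragraph is easier to see as frames per watt. A quick sketch using only the numbers Nvidia gives (10 fps at 15 W versus 30 fps at 17 W):

```python
# Frames-per-watt comparison based on the figures in the quoted article.
igpu_fps, igpu_watts = 10, 15  # Core i5-4210U with integrated graphics
dgpu_fps, dgpu_watts = 30, 17  # same CPU paired with a GeForce GTX 840M

igpu_eff = igpu_fps / igpu_watts  # ~0.67 fps per watt
dgpu_eff = dgpu_fps / dgpu_watts  # ~1.76 fps per watt

print(f"iGPU only: {igpu_eff:.2f} fps/W, with dGPU: {dgpu_eff:.2f} fps/W")
print(f"~{dgpu_eff / igpu_eff:.1f}x the frames per watt with the dGPU")  # ~2.6x
```

By Nvidia's own numbers, the dGPU setup is roughly 2.6x more efficient per watt in that scenario, not just faster in absolute terms.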
This article was written in 2014, and Intel has improved efficiency since then. You can't compare an iGPU from 2014 with an iGPU from 2016.
> iGPU is more like a parasite that's forced onto the CPU by Intel. Check out this die shot of a Skylake chip. The GPU part likely has less perf/watt than a dedicated graphics card, but it just runs at a lower wattage (and almost 2.5x slower). Apple could include a dGPU with low wattage and better performance than the iGPU, but they might as well use the iGPU for "free" as it comes with the CPU anyway.
>
> [Skylake die shot image]
>
> Random quote from random article from 2014
>
> "Nvidia claims to have an even bigger competitive advantage when it comes to power consumption—a critical issue when you’re talking about a notebook PC running on a battery. According to Nvidia, if you play the epic role-playing game Elder Scrolls at 1080p resolution with medium image quality on a notebook with an Intel Core i5-4210U, you’ll get a frame rate of just 10 frames per second, while the CPU and integrated GPU will consume 15 watts of power between the two components. Play the game at the same resolution and image quality on a notebook with the same CPU, but with Nvidia’s GeForce GTX 840M processor, and the company claims you’ll get 30 frames per second while the two components consume 17 watts—three times the frame rate at an added power cost of just two watts."
>
> http://www.pcworld.com/article/2107...l-new-geforce-800m-line-of-notebook-gpus.html
[doublepost=1465205396][/doublepost]
> Wait until you see the new ~22 TFLOP nMP!
Just as you can't compare a dGPU from 2014 with a dGPU from 2016, especially considering dGPUs have gone from 28nm to 14/16nm whereas iGPUs have gone from 14nm to 14nm.
[doublepost=1465207228][/doublepost]
Um, did I just find a bug in the forum? I think I accidentally posted your quote with nothing attached, and now it's glitched into thinking that you double posted!
You are right, twice!
> Guys... I gave up. I don't think they will announce it at WWDC. I'm sticking with my 2015 15-inch MBP.

As opposed to buying... another 2015 15" rMBP? Unless your laptop is dying because you threw it in a swimming pool or something, what difference does the release date make? I'll be disappointed if it doesn't come out, though, because I advised so many people in December to wait until WWDC for those ~amazing~ graphics improvements.