I disagree. Yeah, it takes time to adjust to. But so do most things that are new and unfamiliar. If you think the MB is the machine that fulfills your needs, then go for it. You WILL get used to it eventually, and maybe even love it.

Never said the rMB shouldn't be taken into consideration! I'm considering it myself ;)

I just said that given how different/strange the keyboard might feel, it's better to try it out extensively and, if it doesn't fit at all, discard it. Otherwise, go for it.
 
First look at the touch stripe ;)
microsoft-displaycover.jpg


I just hope the touch bar isn't going to be black and white like this, and that they use a color touch bar in the MBP... the one above just looks cheap.
 
I am in the "revive the 17-inch" camp. I edit video and music and would like to have a desktop-level machine for onsite work. Sure, it isn't as convenient for the normal user, but if you do anything from CAD to video editing, the extra screen space of some non-Apple machines can create envy.

A resolution (not "retina" scaling - the effective OS resolution) of at least 1920 would buy me some space for editing tools, especially in FCPX and Premiere. The extra workspace, not just fine resolution, is key. I also miss having a matte screen option. I do not like a glossy screen, although they've improved dramatically.

Also ideal: a 2TB SSD (I carry three drives and would like not to); an SD slot (please don't remove that!); "A/B" USB 3 and USB 3.1 Type-C connectors (two of each, maybe?); and I'd still like an HDMI port so I don't have to carry one more adapter. I still want an audio port too, though I already use USB headphones.

Give us the best video card possible for the size, even if only when powered. I can live with that.
 
Razer Core finally shipping. Consumer photos:
https://imgur.com/a/jpSyd

Unboxing video:

So very unbelievably jealous right now. I'm stuck on 8 GB of RAM and my elderly 330M can't even play Dota 2. Meanwhile these Razer d00ds are high-fiving themselves with their hotrod eGPUs, playin Crysis multiscreen 4K at 240 FPS on max settings.
 
Yes, and they provide a very good comparison of GPU performance: we have Haswell, Broadwell and Skylake.

View attachment 632924
Even the Iris 540 (not the 550!) is 2x better than the HD 6000, and 3x better than the HD 5000!

Source: http://arstechnica.co.uk/gadgets/2016/05/intel-nuc-quad-core-skull-canyon-review/

Just some bare numbers from GFXBench tests for comparison, covering all the rMBP graphics cards back to 2013.

Test: GFXBench 1080p T-Rex offscreen, OpenGL, Windows

Iris Pro 580 - 252 fps

Iris Pro 5200 - 146.1 fps
Iris 6100 - 105.8 fps
Iris 5100 - 84.9 fps
Intel HD 4000 - 45.4 fps

R9 M370X - 221 fps
GT 750M - 132.9 fps
GT 650M - 122.5 fps

GTX 960M - 303 fps
R9 M395 - 574.7 fps

(The iPad Pro in the same test, OpenGL under iOS, made 163.2 fps)



Test: GFXBench 1080p Manhattan offscreen, OpenGL, Windows

Iris Pro 580 - 108.5 fps

Iris Pro 5200 - 79.9 fps
Iris 6100 - 49.6 fps
Iris 5100 - 39.6 fps
Intel HD 4000 - -.- fps

R9 M370X - 68.6 fps
GT 750M - 82.1 fps
GT 650M - 71.8 fps

GTX 960M - 164.2 fps
R9 M395 - 122 fps
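As a quick sanity check on the T-Rex list above, the relative speedups can be computed directly. A throwaway Python snippet; the fps figures are copied straight from that list:

```python
# GFXBench 1080p T-Rex offscreen results (fps), copied from the list above.
trex = {
    "Iris Pro 580": 252.0,
    "Iris Pro 5200": 146.1,
    "Iris 6100": 105.8,
    "Iris 5100": 84.9,
    "Intel HD 4000": 45.4,
    "R9 M370X": 221.0,
    "GT 750M": 132.9,
    "GTX 960M": 303.0,
}

# Express everything relative to the Iris Pro 5200 (2013 15" rMBP iGPU).
baseline = trex["Iris Pro 5200"]
for gpu, fps in sorted(trex.items(), key=lambda kv: -kv[1]):
    print(f"{gpu:>14}: {fps:6.1f} fps = {fps / baseline:.2f}x Iris Pro 5200")
```

So the Iris Pro 580 comes out at roughly 1.7x the Iris Pro 5200 in this test, and sits between the GT 750M and the GTX 960M.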


Hey guys! I finally got my Skull Canyon NUC in the mail today, with the Iris Pro 580.

I have owned almost all of the iGPUs posted in these benchmarks, I have been following the benchmarks very closely, and now I'm running my own tests.

In most benchmarks and games the 580 is faster than the M370X currently in the rMBP.

But what is very confusing, in both tests I have seen online and my own, is that in many games and benchmarks the Iris 580 is actually slower than the last-generation GT4e, the Iris Pro 6200...

Can anyone explain this? I was expecting the 580 to be 1.5x in the best case and maybe 1.2-1.3x in the worst case.

But that seems not to be the case. Is it possible the Intel drivers are not up to par yet? Or are the Iris Pro 6200 benchmarks I am finding online all off?

I will be doing more tests, but let me know what you guys think.

Razer Core finally shipping. Consumer photos:
https://imgur.com/a/jpSyd
[...]

About damn time! I have been trying to get one of these since February.
 
Hey guys! I finally got my Skull Canyon NUC in the mail today, with the Iris Pro 580. [...]

Can anyone explain this? I was expecting the 580 to be 1.5x in the best case and 1.2-1.3x in the worst case. [...]



Isn't the Iris Pro 6200 a GT3e?
Anyway, this seems strange. Maybe it is due to thermal throttling in the 45W chips compared to the 65W ones? Overclocked graphics and eDRAM on the 5775C? :D
 
At release Intel's drivers are known to crash until the first updates arrive, but performance-wise there haven't been huge changes from updated drivers in the past.
 
Some say the Iris Pro 580 is better than the 960M...

Well, it turns out the Iris Pro 580 may not be as cool as they say after all...

According to forum users all over the internet, the reason the Iris Pro 580 in the i7-6770HQ is underperforming is the 45W power limit. It downclocks itself to 800 MHz all the time. It is more a problem with the power limit than the thermal limit, because a 65W, 1150 MHz GT4e SoC will not show this behavior.

It is still the best iGPU out there right now, so... the rumour that it is roughly on par with the 950M is true.

While I am not saying the 580 is faster than the 950M or 960M,

I am finding that in some benchmarks and games the 580 is faster than the 950M, and in some it is not.

And some benchmarks and games even show the 580 being faster than the 960M.

I am curious, keyoot, what forums are you reading?
 
So do all these Q4 rumours have some substance?

If so this is a massive loss for Apple; by Q4 Kaby Lake is going to be coming out and the new MacBooks will already be a generation old lol :confused: And while Kaby Lake won't bring much in terms of CPU speed, it is going to be leaps ahead in the graphics department, so anyone buying a MacBook without a discrete graphics card should really consider their choices.

MBP chipsets for Kaby Lake are a year away. The Skylake ones are JUST now out. Why do people not get this?
 
Isn't the Iris Pro 6200 a GT3e?
Anyway, this seems strange. Maybe it is due to thermal throttling in the 45W chips compared to the 65W ones? Overclocked graphics and eDRAM on the 5775C? :D

What do you mean by overclocked graphics? And what is different about the eDRAM on the 5775C compared to the Iris Pro?
 
So I think the speed of the SSDs Apple is using right now corresponds to the Samsung SM951. As Samsung has announced the SM961 with even better speeds and availability in 2H 2016, do you think Apple might include these in the next MacBook Pro 15"?

http://www.anandtech.com/show/10168/samsung-shows-off-sm961-and-pm961-ssds-oem-drives-get-a-boost

Faster is better, right :D

Also, with 1TB SSDs now available, this opens up the option of a 2TB upgrade - doesn't it? I'm just hoping the 15" comes standard with 512GB, otherwise I'll need to upgrade... even then I might upgrade for the hell of it ;)
 
I have gone to a few websites; some, like BGR, wrote "Kuo's note reads in part:" and in the quote 4Q16 is mentioned. If that actually was part of Kuo's own words, then I imagine that if he had meant fiscal he would have mentioned 4Q only, without attaching the 16. That's the reason I think he and many others report it as the calendar and not the fiscal 4Q. I believe it's calendar 4Q. Or has he done this before? I (well, we all) need clarification. But I'm done being an optimist concerning the release of that damn notebook. I'm now assuming it's in Sept-Oct.
He has done this before, and he is always referring to Apple's fiscal quarters. He is a securities analyst, and in the industry they always refer to a company's fiscal quarters.

Here is more proof that Ming-Chi Kuo is referring to 4Q16 as the fiscal fourth quarter and not the actual calendar 4Q:

https://www.macrumors.com/2016/04/24/apple-declining-iphone-shipments-2016/

On April 24th he said: "Given the fact that shipments fell YoY for the first time in 1Q16, we don't think large-screen replacement demand will contribute much to growth."

Apple's 2nd quarter earnings (first calendar quarter of 2016) hadn't been released until April 26th. So when he said 1Q16, the info he was using was from the time frame of September 27th, 2015 - December 26th, 2015.

Which means that 4Q16 to him is June 26th - September 25th, 2016. If true, then an unveiling at WWDC works perfectly with late June/early July MBP shipments.
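The fiscal-vs-calendar argument above boils down to a date-range lookup, which can be sketched in a few lines of Python. The FY2016 boundary dates for 1Q16 and 4Q16 are taken from the post; the interior quarter boundaries are interpolated assumptions and may be off by a day:

```python
from datetime import date

# Apple FY2016 fiscal quarters. 1Q16 and 4Q16 bounds come from the post
# above; 2Q16/3Q16 are interpolated 13-week quarters (assumption).
FISCAL_QUARTERS_2016 = {
    "1Q16": (date(2015, 9, 27), date(2015, 12, 26)),
    "2Q16": (date(2015, 12, 27), date(2016, 3, 26)),
    "3Q16": (date(2016, 3, 27), date(2016, 6, 25)),
    "4Q16": (date(2016, 6, 26), date(2016, 9, 25)),
}

def fiscal_quarter_of(d):
    """Return the FY2016 fiscal-quarter label containing date d, or None."""
    for label, (start, end) in FISCAL_QUARTERS_2016.items():
        if start <= d <= end:
            return label
    return None
```

Under this reading, a late June/early July 2016 ship date (e.g. `fiscal_quarter_of(date(2016, 7, 4))`) falls in "4Q16", consistent with a WWDC unveiling.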
 
Well, Apple really needs to bring it at WWDC, especially after India's latest snub. This is from the latest India article on the front page:

"Apple's products do not fall into the cutting-edge technology category"

Ouch.
 
Hey guys! I finally got my Skull Canyon NUC in the mail today, with the Iris Pro 580.

Can anyone explain this? I was expecting the 580 to be 1.5x in the best case and 1.2-1.3x in the worst case.

But that seems not to be the case. Is it possible the Intel drivers are not up to par yet? Or are the Iris Pro 6200 benchmarks I am finding online all off?

Integrated graphics share the same total TDP as the CPU. So a 45W chip can either give full power to the CPU (e.g. 40W CPU/5W iGPU) or full power to the iGPU (e.g. 25W CPU/20W iGPU). It will actually go over its 45W TDP for a time, but as soon as it hits 100°C it will start throttling and the power will pull back closer to its max TDP of around 45W.

For normal workloads (no CAD/gaming/3D graphics), this isn't a problem because the workload is generally intermittent, so the CPU/iGPU will hit its turbo speeds and you'll have a smooth, fast computer.

However, as soon as you run a game that requires constant high power input, the power has to be shared between your CPU and iGPU. Your 3.5GHz (turbo-boosted) CPU will underclock itself to keep the heat within limits and will run much lower than its specified frequency, around 1.5GHz - and the GPU is limited as well, since it operates under the same power and heat envelope. This is why iGPUs still suck for games.

Benchmarks will usually look okay as they are only run for a short time. But in real workloads, things run for much longer and the heat eventually slows them down.

Have a look at the Intel Power Gadget to see what I'm talking about in action: https://software.intel.com/en-us/articles/intel-power-gadget-20 - If you want to game on it, the external TB3 enclosure from Razer is a good option.
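The budget-sharing behaviour described above can be sketched as a toy model. Purely illustrative: the wattage figures and the proportional-scaling policy are assumptions for the sake of the example, not how Intel's power-management firmware actually allocates the budget:

```python
# Toy model: CPU and iGPU share one package power budget (here 45 W).
# When combined demand exceeds the budget, both are throttled
# proportionally (an assumption; real firmware policy is more complex).
PACKAGE_TDP = 45.0  # watts

def split_budget(cpu_demand_w, igpu_demand_w, tdp=PACKAGE_TDP):
    """Return (cpu_w, igpu_w) after fitting demand into the package budget."""
    total = cpu_demand_w + igpu_demand_w
    if total <= tdp:
        # Light/intermittent load: both get everything they ask for (turbo).
        return cpu_demand_w, igpu_demand_w
    # Sustained gaming load: scale both back to fit the shared envelope.
    scale = tdp / total
    return cpu_demand_w * scale, igpu_demand_w * scale

# Normal workload: fits in budget, no throttling.
print(split_budget(20, 5))   # -> (20, 5)
# Gaming workload: CPU wants 40 W, iGPU wants 30 W; both get cut back.
print(split_budget(40, 30))  # -> roughly (25.7, 19.3)
```

This is why a short benchmark run (before the budget clamps down) can look much better than an hour of sustained gaming on the same chip.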
 