How is this a software bug?

I've assumed, perhaps incorrectly it seems, that the OS recognizes the type of CPU and so on, and then details it in System Information. In other words, it's automatic, a basic software feature Apple has used practically forever. How would that change just because of a particular CPU, beyond simply reporting what it is?

Pulling up specs from 'About This Mac' seemed about as simple and dead-certain a computing process as possible. If one cannot even trust that . . .
 
Apple didn't make the CPUs in the 13-inch MBPs that lack PEG lanes, nor did they make the PCH that provides only 12 total PCIe 3.0 lanes; Intel did. Apple's choices were to offer only two Thunderbolt 3 ports, use a PCIe 2.0 x2 connection for the SSD and gimp that, use a USB connection for the Wi-Fi module, or do what they did and *still* provide more I/O bandwidth than any other laptop in history aside from the 15-inch MacBook Pro.

Those "gimped" Thunderbolt ports still provide 2x DP 1.2 links, full USB 3.1 Gen 2 performance, 10 Gbit/s network bridges, and an external PCIe 3.0 x2 interface.

Apple also was not responsible for the first stepping of the Alpine Ridge controllers, many of which were sold off at a discount to be used just as USB 3.1 xHCI controllers because they were garbage. Intel created that mess, and once again Apple gets blamed. The lack of compatibility with the early Thunderbolt 3 silicon is by design, and lack of support for the 5 or so devices that were released with that first round of chips is not really a great loss.

Edit: And just so I'm not unduly dishing on Intel, the incompatible Thunderbolt devices MacRumors reported on a few days ago were the result of a Texas Instruments USB Power Delivery and USB Type-C plug orientation detection chip. That's neither Apple's fault nor is it even strictly speaking a Thunderbolt issue.

It's arguable whether the more practical solution at this point would have been two full-speed Thunderbolt 3 USB-C ports supplemented, given the limitations with Intel, by ports that people's existing peripherals actually use, like USB-A. That makes more sense as a transition from old to new. So don't act like the choice was either four USB-C Thunderbolt 3 ports with differing speeds or none at all.
 
well, a minor bug (if any) not worth any buzz...

look at these screenshots of my 2015 rMBP:

but true, they could have adapted the overview to include both GPUs
(but honestly there are much more important tasks to be tackled, right?)

View attachment 673195 View attachment 673196

The overview in macOS Sierra includes both. However, you're on El Capitan, which doesn't include both.
 
So, you've never come across an insignificant OS bug with zero impact on use in a newly released product from another company, say, Microsoft? Or Google? And if you did, did you express similar outrage?

You have probably never developed commercial software.

Sorry, but you cannot use "they do it too" as an excuse for ANYTHING.

We're talking about Apple, and Apple f'd up. Period.

I like to hear both sides of why Apple f'd up, but we cannot deny that they did.

To do so is to be a Kool-Aid SALESMAN (like a few others here are).

When you are paying for something, whether it's a dollar or a million, you have the RIGHT to pick apart, gripe, complain, whine, or raise all-kinds-of hell towards ANYTHING you consider to be an issue in the product, from Internet forums to the Supreme Court of the United States (or the applicable International authority).
 
Sorry, but you cannot use "they do it too" as an excuse for ANYTHING.

We're talking about Apple, and Apple f'd up. Period.

I like to hear both sides of why Apple f'd up, but we cannot deny that they did.

To do so is to be a Kool-Aid SALESMAN (like a few others here are).

When you are paying for something, whether it's a dollar or a million, you have the RIGHT to pick apart, gripe, complain, whine, or raise all-kinds-of hell towards ANYTHING you consider to be an issue in the product, from Internet forums to the Supreme Court of the United States (or the applicable International authority).

I can without any difficulty find a flaw, annoyance or non perfection in ANY product there is.

But, since I consider humans not perfect, I just live my life accordingly and do not expect perfection from anybody ever, even if they have been perfect in the past.

No issue, pointing at myself and including me in that class.

You belong as well. Guaranteed!

BTW: What would you do with the information why Apple ****** up? Not saying it's okay or supporting Apple. Just trying to illustrate that it is not important, especially given that it doesn't impact anything.
They will fix it, life goes on and we all live happily to the next bad Apple news:)
(Some of which are pulled up by very short hair)
 
Sorry, but you cannot use "they do it too" as an excuse for ANYTHING.

We're talking about Apple, and Apple f'd up. Period.

I like to hear both sides of why Apple f'd up, but we cannot deny that they did.

To do so is to be a Kool-Aid SALESMAN (like a few others here are).

When you are paying for something, whether it's a dollar or a million, you have the RIGHT to pick apart, gripe, complain, whine, or raise all-kinds-of hell towards ANYTHING you consider to be an issue in the product, from Internet forums to the Supreme Court of the United States (or the applicable International authority).

Please show me any other complex system involving a huge team of software engineers writing many many thousands of lines of code where you have found perfection with zero insignificant errors.

And how strongly did you rail against those companies? Feel free to write about your personal experiences and how aggressively you've taken other companies to task. Have you ever developed large-scale commercial software? And was it 100% perfect? If not, why not? Or tell me about your mistake-free life with respect to whatever you do.

Meanwhile, on planet Earth, anyone in a technical profession such as engineering understands that all such software has bugs, especially insignificant ones, and certainly on newly released products.
 
I can without any difficulty find a flaw, annoyance or non perfection in ANY product there is.

The fact that you "can without any difficulty find a flaw, annoyance or non perfection in ANY product there is" supports my position that you have the right to complain about it. I was addressing the OP's and others' opposition to those who are calling Apple out on this, because it's Apple.

But, since I consider humans not perfect, I just live my life accordingly and do not expect perfection from anybody ever, even if they have been perfect in the past.

No issue, pointing at myself and including me in that class.

You belong as well. Guaranteed!

True, but Apple is a company, not a person. Thus, companies (as well as individuals, really) are to be held accountable, whether on Internet forums or the Supreme Court.

BTW: What would you do with the information why Apple F*d up? Not saying it's okay or supporting Apple. Just trying to illustrate that it is not important, especially given that it doesn't impact anything.
They will fix it, life goes on and we all live happily to the next bad Apple news:)
(Some of which are pulled up by very short hair)

Such information will impact both my willingness to purchase and/or recommend or speak against said product. When we're talking about a purchase, especially one in the thousands of dollars, for a device that will carry my family's memories, my life's work, my important legal documents, and a lot of the things that bring my life joy, EVERY SINGLE ISSUE is important and should be considered.
Please show me any other complex system involving a huge team of software engineers writing many many thousands of lines of code where you have found perfection with zero insignificant errors.

And how strongly did you rail against those companies? Feel free to write about your personal experiences and how aggressively you've taken other companies to task. Have you ever developed large-scale commercial software? And was it 100% perfect? If not, why not? Or tell me about your mistake-free life with respect to whatever you do.

Meanwhile, on planet Earth, anyone in a technical profession such as engineering understands that all such software has bugs, especially insignificant ones, and certainly on newly released products.

It's amazing how you completely missed the point of my post and provided the EXACT same argument.

Please re-read and try harder to understand. I could not have made it simpler.
 
It's arguable whether the more practical solution at this point would have been two full-speed Thunderbolt 3 USB-C ports supplemented, given the limitations with Intel, by ports that people's existing peripherals actually use, like USB-A. That makes more sense as a transition from old to new. So don't act like the choice was either four USB-C Thunderbolt 3 ports with differing speeds or none at all.
They are full-speed Thunderbolt 3 ports; they simply have a PCIe 3.0 x2 connection on the back end. The majority of PC interfaces have massively oversubscribed PCIe back ends. That wasn't mentioned in the news about this particular case, because if Apple did it, it must be evil. Here's the thing: the interconnect between the CPU and the PCH, which sits behind all 12 PCIe 3.0 lanes available on this platform, only has bandwidth equivalent to PCIe 3.0 x4. That can be saturated by the SSD alone in these machines. This is, once again, a non-issue raised by people who don't know what they're talking about and who will never be impacted by it anyway.

For actual professionals, two additional ports that can drive 4K/5K displays, act natively as USB 3.1 10 Gb/s ports, connect to any Thunderbolt device while still providing PCIe bandwidth equivalent to Thunderbolt 2, and act as 10 GbE ports are massively more useful than two legacy USB Type-A ports. And if the insane difference in capability between these two choices isn't enough, a Type-A port is thicker than the edge of these machines anyway.
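For anyone who wants to sanity-check the bandwidth figures in this post, here is a rough back-of-envelope sketch. It assumes the usual published numbers: PCIe 3.0 signals at 8 GT/s per lane with 128b/130b encoding, and the DMI 3.0 uplink between CPU and PCH is roughly equivalent to a PCIe 3.0 x4 link. These are approximations before protocol overhead, not measured throughput.

```python
# Back-of-envelope PCIe 3.0 bandwidth estimates (approximate, pre-overhead).
GT_PER_S = 8.0            # PCIe 3.0 raw signaling rate per lane, gigatransfers/s
ENCODING = 128 / 130      # 128b/130b line-encoding efficiency
LANE_GBPS = GT_PER_S * ENCODING / 8  # usable GB/s per lane, ~0.985

def link_bandwidth_gbs(lanes: int) -> float:
    """Approximate one-direction bandwidth of a PCIe 3.0 link in GB/s."""
    return lanes * LANE_GBPS

# The CPU<->PCH uplink (~PCIe 3.0 x4) vs. the x2 back end behind the TB3 ports.
print(f"DMI uplink (~x4):   ~{link_bandwidth_gbs(4):.2f} GB/s")
print(f"TB3 x2 back end:    ~{link_bandwidth_gbs(2):.2f} GB/s")
```

The point of the arithmetic: everything hanging off the PCH, all 12 lanes of it, funnels through roughly 3.9 GB/s anyway, so a fast NVMe SSD alone can come close to filling that uplink regardless of how the Thunderbolt controllers are wired.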
 
But it's a lot of small mistakes recently. From mysterious "accidental" encryption downgrade on local iTunes backups, disabled SIP on newly shipped Macbooks, mistakes in basic identifying information *on some models*... Weird things going on in Cupertino...

Not weird things. Apple is now just another big tech company trying to do too many things at once, and the results show.
 
The overview in macOS Sierra includes both. However, you're on El Capitan, which doesn't include both.
It only shows both if you've opened an application that turns on the dGPU. I have the 2015 15" rMBP with the M370X, and if I'm not using any applications that activate the dGPU, it only shows the Intel Iris Pro. The bottom screenshot is with Sketch open, as it turns on the dGPU.
dGPU_off.png
dGPU_on.png
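What macOS itself knows can be checked from the terminal with `system_profiler SPDisplaysDataType`, which lists every GPU regardless of which one is active. Below is a small sketch that parses that report's text format; the sample output is illustrative (hand-written here, not captured from a real 2016 machine), and on a real Mac you would capture it via `subprocess` instead.

```python
import re

# Illustrative sample of `system_profiler SPDisplaysDataType` text output.
SAMPLE = """\
Graphics/Displays:

    Intel HD Graphics 530:

      Chipset Model: Intel HD Graphics 530
      Type: GPU
      Bus: Built-In

    Radeon Pro 455:

      Chipset Model: Radeon Pro 455
      Type: GPU
      Bus: PCIe
"""

def list_gpus(report: str) -> list[str]:
    """Extract every 'Chipset Model' entry from the profiler text."""
    return re.findall(r"Chipset Model:\s*(.+)", report)

print(list_gpus(SAMPLE))  # -> ['Intel HD Graphics 530', 'Radeon Pro 455']
```

If the dGPU shows up here but not in the About This Mac overview, that points at the overview's display logic rather than at the hardware.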
 
What are the odds of the wrong chip being soldered in on some models?

Impossible and therefore 100% odds against.

The above post shows what Sierra is doing: it shows the video hardware in use. Who would've thunk an engineer could be so clever and save battery through selective use of the GPU.

Wow!

I like it.
 
Don't excuse poor programming. This is coming from the company that couldn't fix their daylight saving time bug for over two years. And yet they were thinking about controlling my car? Yeah... no thanks.

I remember one New Year (2011?) when almost no one turned up to work on time because the iPhone had a bug that meant their alarms didn't go off on 1st Jan.
 
Guess we'll see more rev. A issues in the coming weeks and months. In 2018 it will be a great machine (of course for some people it is or will be great already - no problem with that - they spent enough on it and hopefully have no problems).
 
Impossible and therefore 100% odds against.

The above post shows what Sierra is doing: it shows the video hardware in use. Who would've thunk an engineer could be so clever and save battery through selective use of the GPU.

Wow!

I like it.
The reporting-the-GPU-only-when-active behavior is separate from the issue of reporting an Iris Pro 580 integrated GPU. The latter would indicate that Apple may have intended to ship the higher-end CPUs with Iris Pro and eDRAM cache in the 15-inch models but didn't for some reason, and failed to update the reporting mechanism in macOS before shipping. Some are still hopeful that devices exist containing those higher-end chips, but that appears highly unlikely.
 
First generation product from Apple. Expect the worst.
Reminds me of the headaches of taking my first-gen 2012 15" rMBP into the Apple Store or shipping it out. I had to get the screen replaced twice (the really, really bad image-retention issue) and had the top case and logic board replaced because, during the first screen replacement, they stripped the screws that were part of the hinge/display, and the hinge got jacked up on that first replacement. They were so close to just replacing the entire machine; with all the part replacements, they basically did.

I'm not an apologist for the beta products Apple ships, but I've learned over the last few decades that first-generation launches have issues. Remember the first Intel MBPs, where people were taking them apart and redoing the CPU paste (forgot the name of it) because the way Apple applied it caused the CPU to get crazy hot? Those were not great to use on your lap due to the heat. According to Wikipedia's Skylake page, it's using the CPU with the HD 530 iGPU. I'm going to bet they (well, Jony and team) fitted it with the CPUs with Iris Pro 580s and probably ran into 1) battery issues, 2) heat issues, or 3) Jony wanting it as thin as possible.

On a side note, I was almost certain that the 128 MB of eDRAM from the Iris Pro was leveraged even on a dGPU model, thus giving it an extra bit of performance.

ADDED: Maybe this is a hiccup and they included the 580 because their next refresh's chip upgrade would finally include CPUs with the 580. I agree with those who have said this was probably a case of intending to use the CPUs with the Iris Pro 580, but the three things I mentioned above may have forced them to use a different CPU. Or perhaps Intel couldn't meet Apple's supply needs and they decided to march forward, again without updating the code. Long story short, expect the Iris Pro 580 to make a comeback in the next refresh.
 