Why? Intel invented it.

But this is a really good thing.

Actually it's more like "co-invented": Apple asked for a technology like this and offered the original direction, then Intel's engineers and Apple's engineers spent some time together and realized it. That's why Intel allowed Apple to announce it before Intel itself did, and allowed Apple exclusive access to the technology in the first year. Of course, looking back, that "exclusive access" meant little to Apple, as the eventual Thunderbolt uses a totally different connector from the earlier versions.
 
AMD's Ryzen-based Raven Ridge arrives this August with a Vega GPU built in. You need to do some research instead of just denouncing the idea. This announcement suggests Intel already knows it will lose its exclusive position with Apple.

AMD gets a lot of hate for some reason. At the end of the day, gaming plays the biggest role in CPU choice, and that's a market where Apple has no serious share. Regular users rarely push their CPUs to the limit enough to care.
 
Will moving the Thunderbolt controller to the CPU itself tax the processor during intensive transfers? I thought the separation of the controller from the CPU was a good thing. Or maybe there's something I'm not understanding.
 
This news just confirms that Apple was right to ditch all the other ports. In 2-3 years, anyone who bought an expensive computer with old ports will regret it.

Except Apple did it at the WRONG time; the correct time for a Thunderbolt-3-ports-only computer would be 4-5 years in the future. For now, it should keep at least one USB-A port and one Mini DisplayPort (Thunderbolt 2) port on the MacBook Pros.
 
Will moving the Thunderbolt controller to the CPU itself tax the processor during intensive transfers? I thought the separation of the controller from the CPU was a good thing. Or maybe there's something I'm not understanding.

Depends on the implementation, but not likely. As with other controllers integrated into the CPU die, it would be a stand-alone controller, just integrated into the overall package, so it doesn't really add CPU cycles. The benefits are a smaller footprint, fewer chips on the motherboard, and, since it's on-die and much closer, a much shorter and more efficient interconnect that allows for lower latencies.
 
Depends on the implementation, but not likely. As with other controllers integrated into the CPU die, it would be a stand-alone controller, just integrated into the overall package, so it doesn't really add CPU cycles. The benefits are a smaller footprint, fewer chips on the motherboard, and, since it's on-die and much closer, a much shorter and more efficient interconnect that allows for lower latencies.
Interesting, thanks for the clarification!
 
The guy who posted this information on Twitter and on the B&C site is also a member of the SemiAccurate forum; his nick there is Fottemberg. And he said in early 2016 that AMD had provided Apple with engineering samples of Raven Ridge APUs.

Sorry, you still have not provided any source saying that AMD provided engineering samples to Apple. Nowhere in those posts does it say that.

Traces of Raven Ridge were already apparent in one of the previous builds of macOS Sierra, which someone on the TonyMac forum claimed to have found.

That entire thread is discussing AMD GPUs, not CPUs. One random person on the internet, with just 5 posts on his profile, all of them in a single thread from December 2016, made a post saying 'I see references to Raven'. He didn't provide any proof or screenshots, and you believe him? He also made posts like these:

"My wild guess, Polaris 12 is for a new iPad Phablet 12" 4K."
"It was a wild guess, how about a 15" Phablet? The new iPhab! I also see Falcon along with Raven, does that ring any bells?"
"Maybe we get a Christmas card launch with HBM memory on a Polaris card? Dec. 13th will be interesting."

Should we believe all of this incorrect info too? LOL.


AMD Ryzen is definitely not happening this year, and it will not happen in the foreseeable future (2-3+ years). No proof, no evidence. Just speculation along the lines of "Oh, these processors would make a good fit; maybe Apple will use them." Sorry, but that's not a rumour; that is pure guessing.
 
Sorry, you still have not provided any source saying that AMD provided engineering samples to Apple. Nowhere in those posts does it say that.



That entire thread is discussing AMD GPUs, not CPUs. One random person on the internet, with just 5 posts on his profile, all of them in a single thread from December 2016, made a post saying 'I see references to Raven'. He didn't provide any proof or screenshots, and you believe him? He also made posts like these:

"My wild guess, Polaris 12 is for a new iPad Phablet 12" 4K."
"It was a wild guess, how about a 15" Phablet? The new iPhab! I also see Falcon along with Raven, does that ring any bells?"
"Maybe we get a Christmas card launch with HBM memory on a Polaris card? Dec. 13th will be interesting."

Should we believe this too? LOL.
Gigamaxx is the guy who found and posted the information, with a screenshot from a macOS Sierra build, on the TonyMac forum. He is not known on the Anandtech forum, but he is known on the TonyMac forum ;).

He evidently trolled the person he was talking to. I have seen that he was quite annoyed that nobody credited him as the source of the information around the internet. The thread was made on the Anandtech forum and then reposted by every site, including WCCFTech.
 
Ryzen is more suited for playing games than for day-to-day tasks

No reason for Apple to switch from Intel.
 
Ryzen is more suited for playing games than for day-to-day tasks

No reason for Apple to switch from Intel.
It's the other way around. Games are not yet optimized for the Ryzen platform overall. Pure compute software is, though still not everywhere.
 
It is about time.
It would be epic to see a Zen-based CPU in a Mac. They are really much better for creatives, with a focus on high core counts over high GHz (after all, Apple never lets you overclock, so what is the point of an overclockable Intel CPU?).

It could bring down the price of the top-end iMac, with Ryzen chips being much cheaper for the same performance, plus you get the benefit of ECC memory :) without paying mad Xeon-range prices.

Also, for a Mac Pro, the new AMD Epyc currently looks much better suited than Intel's offering (we don't know about prices yet, though).
Definitely!!!
 
Gigamaxx is the guy who found and posted the information, with a screenshot from a macOS Sierra build, on the TonyMac forum. He is not known on the Anandtech forum, but he is known on the TonyMac forum ;).

He evidently trolled the person he was talking to. I have seen that he was quite annoyed that nobody credited him as the source of the information around the internet. The thread was made on the Anandtech forum and then reposted by every site, including WCCFTech.

Provide sources to the original forums. And I am still waiting for the post where the Italian guy said Apple is being provided with AMD engineering samples.

It's time someone called out your blatant spread of misinformation ;)
 
Provide sources to the original forums. And I am still waiting for the post where the Italian guy said Apple is being provided with AMD engineering samples.

It's time someone called out your blatant spread of misinformation ;)
Misinformation about what?

The possibility that Apple will use Raven Ridge APUs?

https://twitter.com/bitsandchipseng/status/805456477221715970
https://twitter.com/BitsAndChipsEng/status/805515262124584960
According to our source, Apple will use Zen APU just in MacBook Pro and MacBook. :)
Here you go. It was TT from B&C.

And last link: http://www.bitsandchips.it/52-english-news/7622-rumor-two-versions-of-raven-ridge-under-development

Another link: https://semiaccurate.com/forums/showpost.php?p=284628&postcount=7858
 
This is huge. If every Intel device supports Thunderbolt in a year or two, demand goes way up. And if the licensing fees go down, so does the cost of making/buying Thunderbolt devices.

Bring on the sub-$100 Thunderbolt GPU docks please!
 
This news just confirms that Apple was right to ditch all the other ports. In 2-3 years, anyone who bought an expensive computer with old ports will regret it.

Why couldn't they have included the new connection while retaining the old ports?

My rMBP from 2013 has these unused Thunderbolt ports, which they could have replaced with USB-C/TB3, giving everyone the best of both worlds.
 
Sounds like Intel is trying to save Thunderbolt from its slack adoption rate and convince the market to give it a much-needed boost, saving their partner's a$$ (Apple) in the process. Everything else seems like a mix of wild guessing and wishful thinking (as of now).
 
Didn't realize this was Intel's call; I always assumed it was Apple who owned the rights.

Intel did most of the development. Apple gave the first version its connector, in the form of Mini DisplayPort. It started as an offshoot of Light Peak.
 
It's definitely good news, but what took them so long to come to this conclusion? Had they waited much longer, Thunderbolt would have either 1) faced death by irrelevance (the industry might have moved on to something cheaper, better, and less proprietary), or 2) faced death by Apple abandonment (something Apple does to its own technologies that fail to "take off" after a few years).
 
Why? Intel invented it.

But this is a really good thing.
Intel invented it, but in collaboration with Apple, which is why, I believe, the Mini DisplayPort connector was used. Few besides Apple would consider that port the most convenient one to choose. There were rumors of them using the USB connector originally, but I'd assume Apple swayed their decision.
 
This news just confirms that Apple was right to ditch all the other ports. In 2-3 years, anyone who bought an expensive computer with old ports will regret it.

Good luck with that in school and work environments, a.k.a. the environments that matter in the real world. The average Joe/Jane doesn't know what USB-C is and doesn't care about Thunderbolt. Ask the average Joe/Jane which port they want most and they'll say a standard USB-A port every freakin' time. Can we please stop acting like the world revolves around enthusiasts on a tech forum?

I actually like Thunderbolt and USB-C, but these sensational posts about the entire world going USB-C in a few years are getting ridiculous.
 