
dabotsonline

macrumors member
Apr 14, 2014
44
8
Given that Thunderbolt 3, scheduled for release alongside Skylake in autumn/fall 2015, only supports DisplayPort 1.2, we will probably have to wait for Thunderbolt 4, presumably in autumn/fall 2017, for DP 1.3 support, which will include the newly announced Display Stream Compression (DSC).
 

WhoDaKat

macrumors 6502
May 20, 2006
379
665
It's still expensive at 'under $99', but have you seen this? It's a shame it doesn't come with 2 USB 3.0 ports: http://www.kanexlive.com/article/thunderbolt-adapters

I agree, 2 USB3 ports would be better, but the eSATA connection would probably have come in handy a couple of times over the years. Very nice find by the way, I hadn't seen those. I'll be interested to see what the price on this will be.
 

winterspan

macrumors 65816
Jun 12, 2007
1,008
0
Apple is using slower technology in 2014 to transfer data between iPods/iPhones and computers than they were using in 2001 (USB 2 vs. FireWire 400). I hope some of these technologies actually come to fruition eventually. If you fill an iPhone with video, it still takes a couple of hours to offload it.


Totally agree, although I wonder how fast Apple's flash storage is in the iPhone/iPad. I would think it's faster than the ~30 MB/s real-world speeds of USB 2. Then again, it's a parallel eMMC interface with a tiny controller; this isn't your 16-channel, 5-watt SSD.

----------

Stop being lazy and watch the keynote. Wireless sync was announced BECAUSE nobody was doing wired sync even BEFORE there was an alternative. That's right, people would rather not sync at all than do wired sync and that was back in 2011. There is nobody left doing wired sync three years later except for the odd loudmouth.


Perhaps I just haven't had an iPhone for long enough, but how do I wirelessly have my PC pull the photos off the camera roll? Is this possible? Until now, I've been connecting the cable and accessing it like a media device through Windows Explorer. I don't use iTunes to "sync" because I don't want all of my photos from my PC going to my iPhone. And if you choose to sync a particular folder, it'll still cause my iPhone to end up with all of the pics in that folder, when I really want to move them off my iPhone and delete them.
 

LordVic

Cancelled
Sep 7, 2011
5,938
12,458
Stop being lazy and watch the keynote. Wireless sync was announced BECAUSE nobody was doing wired sync even BEFORE there was an alternative. That's right, people would rather not sync at all than do wired sync and that was back in 2011. There is nobody left doing wired sync three years later except for the odd loudmouth.

Here's a very, VERY important hint, which I'm surprised that more people don't seem to understand.

Just because our lord and saviour Steve Jobs and his merry band got up on a stage and claimed something to be fact does not actually mean it was a true statement.

Steve Jobs was often caught being factually incorrect in his keynotes. He didn't care. He just wanted to convince us, the consumers, that Apple was the company to buy from, and he'd be damned if facts got in the way.

E.g., Antennagate. Jobs got up on stage and proclaimed that the issues that plagued the iPhone's antenna and caused dropped calls were "universal!", and he even went on to name several other phones at the time that would also exhibit this behaviour.
Within a week, those competitors that he called out and slammed had white papers, scientific papers, and more than enough evidence out to completely debunk his claim.
Did Jobs apologize, or at least acknowledge that he was wrong and had lied on stage? No. He never said another word.
He did the damage to the competition that he wanted to. That was enough for him.
 

blackcrayon

macrumors 68020
Mar 10, 2003
2,256
1,824
Within a week, those competitors that he called out and slammed had white papers, scientific papers, and more than enough evidence out to completely debunk his claim.
Did Jobs apologize, or at least acknowledge that he was wrong and had lied on stage? No. He never said another word.
He did the damage to the competition that he wanted to. That was enough for him.

I think you're exaggerating quite a bit. Lots of people demonstrated you could attenuate the signal of other phones depending on how you held them. Some competitors even warned as much in the materials included with their phones. And then Apple went on to sell the same iPhone 4 (GSM anyway) for years and it wasn't a big deal.
 

LordVic

Cancelled
Sep 7, 2011
5,938
12,458
I think you're exaggerating quite a bit. Lots of people demonstrated you could attenuate the signal of other phones depending on how you held them. Some competitors even warned as much in the materials included with their phones. And then Apple went on to sell the same iPhone 4 (GSM anyway) for years and it wasn't a big deal.

The problem is that he still lied.

Yes, every phone, even today, can have some attenuation based on hand position.

But very few, especially the ones he called out on stage, would randomly drop calls from normal hand positioning. It was debunked.

He never apologised, and he never issued a statement to correct what he said.
 

toke lahti

macrumors 68040
Apr 23, 2007
3,270
502
Helsinki, Finland
No DisplayPort 1.3? *sigh*
If they don't put DP 1.3 in it, there will be a TB4 in 2015; the merry-go-round continues...
Apple puts USB3 ports on their machines for those that don't need the speed that Thunderbolt affords. Those that need it are professionals for whom the extra cost is easily recouped with the work they do.
They never should have mixed display and data. Now TB is too expensive for most people, and if you have high-end needs for both display and data, one chokes the other. If they had commercialized Light Peak for data and left the display protocols as they are, outside the "TB distortion field", LP might have taken off and become very affordable over the years. Now it would be very hard to change to optical in gen 5 with the installed base already in place. Also, TB will always be one generation behind the display standards. What a mess. Many people have waited for a TB-equipped MP much longer than they can actually use one (with a typical dual-screen high-end setup needing DP 1.3 or HDMI 2.0 next year). Sad for them, good for us poor, since there will be lots of second-hand MPs available and the resale value won't hold like it did last decade...
Who would have thought, when Apple released all those amazing displays back then, that some day Apple couldn't release a new flagship display (like 2880p) because you could connect the display to every other new workstation on the planet, but not to Apple's own...
It's still expensive at 'under $99', but have you seen this? It's a shame it doesn't come with 2 USB 3.0 ports: http://www.kanexlive.com/article/thunderbolt-adapters
No wonder TB was never advertised as "twice as expensive if you want that daisy chaining". Funny also that for most people TB3's most remarkable advancement will be that you can use it like USB 3.0. And maybe next to that port there will be a USB 3.1 port...
 

dabotsonline

macrumors member
Apr 14, 2014
44
8
They never should have mixed display and data. Now TB is too expensive for most people, and if you have high-end needs for both display and data, one chokes the other. If they had commercialized Light Peak for data and left the display protocols as they are, outside the "TB distortion field", LP might have taken off and become very affordable over the years. Now it would be very hard to change to optical in gen 5 with the installed base already in place. Also, TB will always be one generation behind the display standards. What a mess.
What was Intel's stated reason for combining display and data in the final Thunderbolt protocol? Was it nothing more than the elegance of a single cable?
 

toke lahti

macrumors 68040
Apr 23, 2007
3,270
502
Helsinki, Finland
What was Intel's stated reason for combining display and data in the final Thunderbolt protocol? Was it nothing more than the elegance of a single cable?
Maybe it was Apple's hatred of connectors, who knows?
Maybe Apple said to Intel that MacBooks' profits should be a bigger portion of the whole revenue and therefore TB can cost $10 per Mac, do it any way you want.

Considering how many articles have been written about TB, it really says something about the level of IT journalism if nobody has asked Intel about this.

Combining would have worked if there had been massive headroom, like there would have been with a real optical interconnect. A slowly evolving, cheap, and therefore widespread copper solution would have been a success without combining. Making TB cheap and available to the whole industry from day one, without too-tight restrictions, would have worked in either case. All three failed.

Now we know there's a need for TB4 in 2015 when DP 1.3 is released, and simultaneously we are waiting for TB1 products to finally reach the shelves, with price tags that assure they mostly stay on the shelf. Seen lots of product release announcements using USB 1.1 lately?
 

2499723

Cancelled
Dec 10, 2009
812
412
Stop being lazy and watch the keynote. Wireless sync was announced BECAUSE nobody was doing wired sync even BEFORE there was an alternative. That's right, people would rather not sync at all than do wired sync and that was back in 2011. There is nobody left doing wired sync three years later except for the odd loudmouth.

Wireless sync is hit and miss. Sometimes the phone is seen by iTunes, sometimes not. It's also not particularly helpful when you want to do a full backup/restore of the phone. It would be nice if the flash memory in the iPhone were a bit faster to take advantage of USB 3, but Apple would likely just jack up the price even more. Anyway, in my 'odd loudmouthed' opinion, a USB connection is still my go-to syncing solution given the sporadic operation of wireless syncing. iTunes seems to have a mind of its own when it determines which devices it wants to see at any given moment. As if a cable was all that difficult to work out in the first place. If we weren't meant to use it, why bother allowing it to transfer data at all? Apple could have saved loads on R&D for Lightning cables!
 

repoman27

macrumors 6502
May 13, 2011
485
167
If they don't put DP 1.3 in it, there will be a TB4 in 2015; the merry-go-round continues...

They never should have mixed display and data. Now TB is too expensive for most people, and if you have high-end needs for both display and data, one chokes the other. If they had commercialized Light Peak for data and left the display protocols as they are, outside the "TB distortion field", LP might have taken off and become very affordable over the years. Now it would be very hard to change to optical in gen 5 with the installed base already in place. Also, TB will always be one generation behind the display standards. What a mess. Many people have waited for a TB-equipped MP much longer than they can actually use one (with a typical dual-screen high-end setup needing DP 1.3 or HDMI 2.0 next year). Sad for them, good for us poor, since there will be lots of second-hand MPs available and the resale value won't hold like it did last decade...
Who would have thought, when Apple released all those amazing displays back then, that some day Apple couldn't release a new flagship display (like 2880p) because you could connect the display to every other new workstation on the planet, but not to Apple's own...

No wonder TB was never advertised as "twice as expensive if you want that daisy chaining". Funny also that for most people TB3's most remarkable advancement will be that you can use it like USB 3.0. And maybe next to that port there will be a USB 3.1 port...

If you look at Intel's slide, Alpine Ridge is set to release alongside Skylake, which is slated for a 2H 2015 ramp with products shipping in the Q4 2015 - Q1 2016 timeframe. In order to ensure sufficient time for validation, tapeout of Skylake is likely already completed, with the initial tapeout possibly having been done as early as last June. The DisplayPort 1.3 specification has not even been released yet. If it were finalized in the next couple of months, there's a slight chance Intel could make a Herculean effort to get DP 1.3 included in Alpine Ridge, but why on earth would they when they know full well none of the Skylake IGPs will support it? And would you really want to see just how badly a Skylake IGP would perform while attempting to drive an 8K display? You need to come to grips with the reality of the silicon development cycle. A specification is just a PDF file; producing a functional chip with over a billion transistors in shipping volume takes time.

It is slightly concerning to me that the slide lists USB 3.0 as an LSPCon mode, when Intel's release of xHCI 1.1 this past December seemed to bode well for USB 3.1 being included in the chipset for Skylake. If USB 3.1 doesn't become ubiquitous until 2017, I'm sure there will be some sad pandas out there.

The other problem with complaining about Thunderbolt's slow adoption of the latest DisplayPort revisions is that the overarching issue lies with the state of external display technology in general. The DisplayPort 1.2 specification was released in January of 2010, and by the end of that year there were AMD cards shipping with DP 1.2 source support, although they wouldn't receive final certification until December of 2011. Meanwhile, the first DP 1.2 sink devices, in the form of integrated and standalone MST hubs, didn't appear on the market until Q4 of 2012, and to this day there is not a single shipping external display with a 4-lane, HBR2 capable, DisplayPort TCON. It is absolutely baffling to me why AMD, LG, Parade, TI, or any of the companies you would expect to benefit from such silicon being available declined to step forward to fill the void. So maybe we should work on producing the first end-to-end DP 1.2 external display solution before worrying about DP 1.3 source support, which might not be truly relevant until 2020 at this rate.

Your argument that combining DisplayPort and PCIe protocol support somehow limits available bandwidth or increases the cost of Thunderbolt is deeply flawed. Regardless of what types of packets they're used for, Thunderbolt and Thunderbolt 2 are 10 Gbit/s per channel serial I/O interfaces, which means they need 10 Gbit/s transceivers at each end of a link. The transceivers weren't integrated into the controllers because Intel originally intended to use discrete optical VCSEL modules, and there are quite a few sound arguments for keeping them separate regardless. Putting the transceivers in the cable assemblies allows you to support a much wider range of media, reduces the cost and size requirements for both the host and devices, and reduces the total cost for end users because they only ever have to pay for the transceivers they require. Since Apple sells Thunderbolt cables for as little as $29, it's pretty safe to assume that the transceivers at each end cost less than $5. There is no other solution that comes even close to being that cheap for dual-channel, full-duplex 10 Gbit/s.

Despite your perpetual claims that DisplayPort traffic negatively impacts PCIe performance when both are transported over the same Thunderbolt link, I bet you can't provide a single example that isn't either intentional or avoidable. Think about it. OG Thunderbolt didn't support channel bonding or DP 1.2, so you couldn't use more than 10 Gbit/s of PCIe data per device, you couldn't use more than 8.64 Gbit/s per display, and the number of ports supported by each controller was matched to the number of displays you could drive. Thunderbolt 2 is limited to 16 Gbit/s of PCIe bandwidth by the PCIe 2.0 x4 back end, and while there are two DisplayPort 1.2 inputs that could theoretically deliver up to 17.28 Gbit/s each, there aren't any existing display configurations that can use more than 13.31 Gbit/s. So once again, a total non-issue when you have two ports. Even in situations where Apple has shipped Macs with 4C controllers but only a single Thunderbolt port capable of driving multiple displays, you can drive a pair of 2560 x 1440 displays and still get data throughput greater than that of a USB 3.0 port.
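To put rough numbers on that last claim, here is a back-of-the-envelope sketch. The ~20% blanking overhead and the 20 Gbit/s aggregate Thunderbolt 2 link are assumptions for illustration, not measured figures:

```python
# Rough check of the dual-QHD claim above: two 2560x1440 @ 60 Hz streams
# over a 20 Gbit/s Thunderbolt 2 link, with bandwidth left over for data.
# The ~20% blanking overhead is an assumption, not a measured value.

def stream_gbps(h, v, hz, bpp, blanking=1.20):
    """Approximate raster bandwidth of one display stream in Gbit/s."""
    return h * v * hz * bpp * blanking / 1e9

TB2_LINK = 20.0      # Gbit/s aggregate (2 x 10 Gbit/s bonded channels)
USB3_NOMINAL = 5.0   # Gbit/s signaling rate of USB 3.0

displays = 2 * stream_gbps(2560, 1440, 60, 24)  # two 24 bpp QHD streams
leftover = TB2_LINK - displays

print(round(displays, 1))       # ~12.7 Gbit/s consumed by the displays
print(leftover > USB3_NOMINAL)  # True: more than a USB 3.0 port remains
```

Under these assumptions the two displays leave roughly 7 Gbit/s on the link, which is indeed more than USB 3.0's nominal rate.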
 

repoman27

macrumors 6502
May 13, 2011
485
167
What was Intel's stated reason for combining display and data in the final Thunderbolt protocol? Was it nothing more than the elegance of a single cable?

The DisplayPort protocol adapters were part of Thunderbolt/Light Peak from day one. Other than Thunderbolt, the only external I/O interfaces to ever grace a consumer PC and provide more than 4.8 Gbit/s of bandwidth are the digital display interfaces: DisplayPort, High-Speed HDMI and Dual-Link DVI. Since DisplayPort and PCIe are both packet based, converting them to the Thunderbolt meta-protocol is a relatively straightforward process. Pretty much the only application most users could be expected to need that much bandwidth for was digital display output. Taking advantage of Thunderbolt's ability to flexibly transport other packet based protocols, the Thunderbolt Ethernet Bridge introduced in Mavericks has essentially given every Mac released since 2011 10GbE for free.

If you look at the back end of a 4-channel Thunderbolt controller, it provides connections equivalent to that of Intel's recent platform controller hubs. It's a way to connect anything that would normally hang off of the PCH outside of the box instead. This is extremely compelling, especially for a company like Apple that is so focused on making their PCs as small as possible.
 

toke lahti

macrumors 68040
Apr 23, 2007
3,270
502
Helsinki, Finland
You need to come to grips with the reality of the silicon development cycle. A specification is just a pdf file; producing a functional chip with over a billion transistors in shipping volume takes time.
Right on the spot, and that's why TB will never become a commercial success if there's another new version every year.
It is slightly concerning to me that the slide lists USB 3.0 as an LSPCon mode, when Intel's release of xHCI 1.1 this past December seemed to bode well for USB 3.1 being included in the chipset for Skylake. If USB 3.1 doesn't become ubiquitous until 2017, I'm sure there will be some sad pandas out there.
Looks like TB development is badly out of sync with both DP and USB development. To give TB time to get widespread, they should update TB only after a new generation of DP and USB. Now it seems that Airs will be using TB1 at the same time that the MBP and MP use TB3. TB1 products will never get affordable for Air users when MBP & MP users are asking for TB3 products and manufacturers are offering TB2 products.
So maybe we should work on producing the first end-to-end DP 1.2 external display solution before worrying about DP 1.3 source support, which might not be truly relevant until 2020 at this rate.
Maybe there would have been end-to-end DP 1.2 solutions available if TB had supported it 2-3 years earlier. It also won't help TB3's success if it doesn't support USB 3.1. Btw, aren't the Dell UP3214Q and ASUS PQ321Q end-to-end DP 1.2?
Putting the transceivers in the cable assemblies allows you to support a much wider range of media, reduces the cost and size requirements for both the host and devices, and reduces the total cost for end users because they only ever have to pay for the transceivers they require. Since Apple sells Thunderbolt cables for as little as $29, it's pretty safe to assume that the transceivers at each end cost less than $5. There is no other solution that comes even close to being that cheap for dual-channel, full-duplex 10 Gbit/s.
Putting transceivers into the cables makes devices cheaper, but you always need new cables. You need the same number of transceivers either way, so how does their placement change the overall cost? What wider range of media? If you could use the same cheap passive plastic optical cables with every gen of TB, why wouldn't that be cheaper?
Well, another "cheap" 10 Gbit product is 10GbE. It's another good example of the industry failing to make something popular. Feels like it has a lot in common with TB.
Despite your perpetual claims that DisplayPort traffic negatively impacts PCIe performance when both are transported over the same Thunderbolt link, I bet you can't provide a single example that isn't either intentional or avoidable. Think about it. OG Thunderbolt didn't support channel bonding or DP 1.2, so you couldn't use more than 10 Gbit/s of PCIe data per device, you couldn't use more than 8.64 Gbit/s per display, and the number of ports supported by each controller was matched to the number of displays you could drive.
When the first TB products shipped, there were tests where they got even CD-bitrate audio to stutter. Although these were artificial situations, they showed what can happen.
Let's take an average indie movie-editing situation in 2015: you have 2 monitors with 10-bit color at 4K@60Hz. That eats 32 Gbit/s. Your footage is from a cheap BM 4K camera. Let's say you need to play 3 streams simultaneously because of a beautiful crossfade. That needs 2.64 Gbit/s. There would be no problem handling the data stream with TB1/2, LP, or even USB 3, IF the display stream were handled by dedicated DP 1.2 pipes. But since display and data are both mixed into TB, even TB2 can't handle this. If the MBP had a separate DP 1.2 connector for displays and TB1/TB2/USB 3 for data, this would have been possible with Apple's 2011 lineup of Macs, had Apple stayed with miniDP and implemented a cheap third-party USB 3 controller. Pretty funny, btw, that when VESA included miniDP in DP 1.2 and ATI had implemented it in the Radeon 6000 (which Apple used), Apple ditched DP. This 32G for display & 2.6G for data can also be done with a MP3,1 from 2008 by adding a few cards; something that will never be possible with the present lineup (updating a 7-year-old Mac for today's needs)...
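As a sanity check on the numbers in that editing scenario, here is a rough sketch. The 533.25 MHz pixel clock (CVT reduced-blanking timing for 3840x2160 @ 60 Hz) and ~880 Mbit/s per camera stream are assumed figures, not specifications quoted in the thread:

```python
# Rough bandwidth check for the indie-editing scenario above.
# Assumed: CVT reduced-blanking timing (~533.25 MHz pixel clock) for
# 3840x2160 @ 60 Hz, and ~0.88 Gbit/s per 4K camera stream.

PIXEL_CLOCK_4K60 = 533.25e6  # Hz, CVT-R2 timing for 3840x2160 @ 60 Hz

def display_gbps(pixel_clock_hz, bits_per_pixel):
    """Uncompressed raster bandwidth of one display, in Gbit/s."""
    return pixel_clock_hz * bits_per_pixel / 1e9

two_displays = 2 * display_gbps(PIXEL_CLOCK_4K60, 30)  # 10 bits/channel
camera = 3 * 0.88                                      # three ~880 Mbit/s streams

print(round(two_displays, 1))      # 32.0 Gbit/s for the two displays
print(round(camera, 2))            # 2.64 Gbit/s for the footage
print(two_displays + camera > 20)  # True: exceeds a 20 Gbit/s TB2 link
```

So the scenario's 32 Gbit/s and 2.64 Gbit/s figures check out, and the combined load does indeed exceed a single Thunderbolt 2 link if everything shares one pipe.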
Thunderbolt 2 is limited to 16 Gbit/s of PCIe bandwidth by the PCIe 2.0 x4 back end, and while there are two DisplayPort 1.2 inputs that could theoretically deliver up to 17.28 Gbit/s each, there aren't any existing display configurations that can use more than 13.31 Gbit/s. So once again, a total non-issue when you have two ports. Even in situations where Apple has shipped Macs with 4C controllers but only a single Thunderbolt port capable of driving multiple displays, you can drive a pair of 2560 x 1440 displays and still get data throughput greater than that of a USB 3.0 port.
Having two ports does not make it a non-issue, if you consider that many TB products are not chainable and lots of people want to use DP monitors.
What exists now is a non-issue. These products should be future-proof. TB would be much more successful if those who can't replace their gear every year could buy powerful and expensive TB products with assurance that they will work fine with next-gen products.
 

toke lahti

macrumors 68040
Apr 23, 2007
3,270
502
Helsinki, Finland
The DisplayPort protocol adapters were part of Thunderbolt/Light Peak from day one.
But was it a wise decision? "One port to handle all" has led to "no port can do all".
Taking advantage of Thunderbolt's ability to flexibly transport other packet based protocols, the Thunderbolt Ethernet Bridge introduced in Mavericks has essentially given every Mac released since 2011 10GbE for free.
What's the use of this without switches and more than a few feet of cable? Yes, you can connect 2 Macs together if they are close to each other. Nothing more. Oh yes, a third-party TB cable is $330 for 10 meters. Not so free anymore...
This is extremely compelling, especially for a company like Apple that is so focused on making their PCs as small as possible.
Still, you can connect two 4K monitors to any laptop with DP 1.2, but not to a MacBook with TB2.
 

repoman27

macrumors 6502
May 13, 2011
485
167
Right on the spot, and that's why TB will never become a commercial success if there's another new version every year.

Looks like TB development is badly out of sync with both DP and USB development. To give TB time to get widespread, they should update TB only after a new generation of DP and USB. Now it seems that Airs will be using TB1 at the same time that the MBP and MP use TB3. TB1 products will never get affordable for Air users when MBP & MP users are asking for TB3 products and manufacturers are offering TB2 products.

You've certainly been following technology long enough to know the drill... All vendors refresh their platforms every 12 to 18 months. The point of Thunderbolt is to be the fastest I/O interface available on consumer PCs, not to be cheap or widespread, thus it's a technology that's not going to stand still. However, there's hardly been a new version annually. Thunderbolt 2 didn't ship to consumers until 32 months after OG Thunderbolt did, and Alpine Ridge should arrive 24 to 30 months after that. Considering Thunderbolt 2 is fully backwards compatible and uses the same cables, it doesn't seem like such a big ask to require a new cable with a smaller connector after 5 years to accommodate a doubling in transfer rates and a new signaling mode. And why should the Thunderbolt design cycle be in any way impacted by that of USB? If anything, it should be tied to new PCIe releases (and actually it seems to be, but sadly to the maximum PCIe revision supported by the PCH, not the CPU).

Maybe there would have been end-to-end DP 1.2 solutions available if TB had supported it 2-3 years earlier. It also won't help TB3's success if it doesn't support USB 3.1. Btw, aren't the Dell UP3214Q and ASUS PQ321Q end-to-end DP 1.2?

Intel had Thunderbolt controllers that supported DP 1.2 available at the same time as they released CPUs with IGPs that supported DP 1.2. You might think they were dragging their feet on moving to DP 1.2, but in reality, why support an interface with higher signaling rates when your GPU can't push that many pixels or display streams in the first place? Were you expecting Intel to do this as a favor for AMD or NVIDIA? Do you expect things to be any different with DP 1.3?

And no, those displays are both based on a Sharp panel with an 8-channel LVDS interface and driven as two separate tiles. A DP 1.2 MST signal is used to transport the display data for both tiles over a single cable, and then an embedded MST hub outputs the two streams which get converted to LVDS (and quite possibly to TMDS first) before being sent to the panel.

Putting transceivers into the cables makes devices cheaper, but you always need new cables. You need the same number of transceivers either way, so how does their placement change the overall cost? What wider range of media? If you could use the same cheap passive plastic optical cables with every gen of TB, why wouldn't that be cheaper?
Well, another "cheap" 10 Gbit product is 10GbE. It's another good example of the industry failing to make something popular. Feels like it has a lot in common with TB.

You might have to buy an adapter or two once every 5 years, but the old cables will in fact continue to work just as they always did. Those who don't use Thunderbolt actually never have to pay for any transceivers. They can just use the port like a normal mini-DP port. Every unused port on any 2-port Thunderbolt device would be another transceiver that didn't need to be purchased. Placing the transceivers in the connectors allows you to use any type of copper or fiber you like that can support the required signaling rate over the prescribed distance. If Intel had actually been able to come up with a less expensive transceiver that could manage 10 Gbit/s over cheap plastic fiber, I'm pretty sure Apple would have gone along with it, but as history has shown, that was not the case.

10GbE is actually massively popular, just not for consumer PCs. If you're thinking specifically of 10GBASE-T, well, it requires tons of power and new cables :eek: to jam 10 Gbit/s down any significant length of UTP. Most datacenters deploy 10GbE equipment based on SFP+ modules instead. If you think that 10GbE is expensive at the moment because the "industry failed to make it popular", then you're an idiot. Economies of scale can drive costs down, but there are limits. You can't make an F1 car today for $15,000 even if you make 100 million of them.

When the first TB products shipped, there were tests where they got even CD-bitrate audio to stutter. Although these were artificial situations, they showed what can happen.
Let's take an average indie movie-editing situation in 2015: you have 2 monitors with 10-bit color at 4K@60Hz. That eats 32 Gbit/s. Your footage is from a cheap BM 4K camera. Let's say you need to play 3 streams simultaneously because of a beautiful crossfade. That needs 2.64 Gbit/s. There would be no problem handling the data stream with TB1/2, LP, or even USB 3, IF the display stream were handled by dedicated DP 1.2 pipes. But since display and data are both mixed into TB, even TB2 can't handle this. If the MBP had a separate DP 1.2 connector for displays and TB1/TB2/USB 3 for data, this would have been possible with Apple's 2011 lineup of Macs, had Apple stayed with miniDP and implemented a cheap third-party USB 3 controller. Pretty funny, btw, that when VESA included miniDP in DP 1.2 and ATI had implemented it in the Radeon 6000 (which Apple used), Apple ditched DP. This 32G for display & 2.6G for data can also be done with a MP3,1 from 2008 by adding a few cards; something that will never be possible with the present lineup (updating a 7-year-old Mac for today's needs)...

Your contrived scenario would work just fine, actually. I'll accept your leap of faith that 10-bit display support under OS X will happen by the end of 2015, and I certainly hope that we'll have a 10 bpp 4K display that can be driven using DP 1.2 SST and thus only need 16 Gbit/s. So any Mac with at least two Thunderbolt 2 ports and a sufficiently beefy GPU could drive both of those displays and still read 11 Gbit/s (real world throughput) of data from a daisy chained Thunderbolt 2 storage device. So that's good for 12 streams actually. Of course you would only be able to write 2 streams back, unless you used two Thunderbolt drives in RAID 0 (one on each port) and then you might be able to manage 5. Or if you had a new Mac Pro, you could read 33 and write 29 streams while driving both displays and watching a 4K movie on an additional UHD TV. With OG Thunderbolt you could swing a single 4K display driven using two cables, or 2 4K displays at 30 Hz. But I'm pretty sure that even in 2015, if you can afford a pair of production quality 4K displays, you can also shell out $1299 for a MacBook Pro that has dual Thunderbolt 2 ports.
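The stream counting above reduces to simple division; a quick sketch, using the post's own assumed figures (11 Gbit/s real-world Thunderbolt 2 throughput, ~0.88 Gbit/s per camera stream):

```python
import math

# Stream-count arithmetic from the paragraph above. The 11 Gbit/s
# real-world Thunderbolt 2 throughput and ~0.88 Gbit/s per 4K camera
# stream are the figures assumed in the discussion, not measurements.

REAL_WORLD_TB2 = 11.0  # Gbit/s usable PCIe throughput over one TB2 chain
STREAM = 0.88          # Gbit/s per 4K camera stream

streams = math.floor(REAL_WORLD_TB2 / STREAM)
print(streams)  # 12 simultaneous read streams
```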

Apple never ditched DisplayPort, they just gave it additional functionality and changed the little icon next to it. Intel not bringing DP 1.2 to Thunderbolt before they brought it to their IGPs may have gimped the port a bit for Macs with dGPUs, but it also enabled display daisy chaining before MST did, and the 4K limitations have hardly affected anyone. Aside from the iMacs, which Thunderbolt 1 equipped Macs even had GPUs that could reasonably drive two 4K displays at 60 Hz?

Having two ports does not make it a non-issue, if you consider that many TB products are not chainable and lots of people want to use DP monitors.
What exists now is a non-issue. These products should be future-proof. TB would be much more successful if those who can't replace their gear every year could buy powerful and expensive TB products with assurance that they will work fine with next-gen products.

The bandwidth contention issue is entirely separate from the problem of only being able to support one chain ending device per port. The latter is true of any type of port on the planet.

But was it a wise decision? "One port to handle all" has led to "no port can do all".

Yes, it was the right decision. And the reasons why are really rather obvious.

What's the use of this without switches and more than a few feet of cable? Yes, you can connect 2 Macs together if they are close to each other. Nothing more. Oh yes, a 3rd party TB cable is $330 for 10 meters. Not so free anymore...

You can connect more than 2 Macs if you like, Windows PCs too, although bridging multiple machines will result in some seriously heavy CPU usage if you lean on it too hard. However, it is a game changer for deployment when you can create a 10 GbE bridge with a $39 cable and push an image to a machine as fast as its internal flash storage can handle. And don't forget that a $299 10 m Thunderbolt cable is still cheaper than a single 10GBASE-T NIC at $334, which is also useless without a second NIC, and no more capable than the Thunderbolt solution without an $880 switch and yet more NICs. So yeah, it's essentially free 10 GbE.

Still, you can connect 2 4K monitors to any laptop with DP 1.2, but not to a MacBook with TB2.

Please link to a laptop with 2 DisplayPort 1.2 ports, plus the GPU and drivers to light up 2 currently existing 4K displays at 24 bpp, 60 Hz. (Hint: with the MacBook Pro, the problem lies with the GPUs/drivers, not the Thunderbolt 2 controller.)
 

toke lahti

macrumors 68040
Apr 23, 2007
3,270
502
Helsinki, Finland
The point of Thunderbolt is to be the fastest I/O interface available on consumer PCs, not to be cheap or widespread, thus it's a technology that's not going to stand still. However, there's hardly been a new version annually. Thunderbolt 2 didn't ship to consumers until 32 months after OG Thunderbolt did, and Alpine Ridge should arrive 24 to 30 months after that. Considering Thunderbolt 2 is fully backwards compatible and uses the same cables, it doesn't seem like such a big ask to require a new cable with a smaller connector after 5 years to accommodate a doubling in transfer rates and a new signaling mode. And why should the Thunderbolt design cycle be in any way impacted by that of USB? If anything, it should be tied to new PCIe releases (and actually it seems to be, but sadly to the maximum PCIe revision supported by the PCH, not the CPU).
I disagree that TB isn't meant to be widespread. It is end-user tech, and all end-user tech must aim to be widespread, otherwise the cost will be too high for it to compete with the alternatives. DockPort might very well make TB obsolete if TB's developers don't play their cards well. How advanced the tech is does not matter. This is once again like VHS vs. Beta, or FireWire vs. USB, etc. If DP 1.3 is available some 30-40 months before a TB version ships with it, it will get very hard to advertise TB as the "most advanced" tech and justify the costs, at least if cheap DockPort with current DP has been available all along. The first DP 1.2 GPUs started selling in 2010; TB got DP 1.2 three whole years later. Luckily it didn't do much harm at the time, but now 4K displays are arriving fast and getting cheap.
If TB is updated only on Intel's schedule, then Apple should get rid of it. Apple does use discrete GPUs, especially in the Macs that power users need. Or Apple could do the sane thing: offer both. It wouldn't trouble anyone if Macs had a dedicated DP port in addition to one or more TB ports.

Why should the TB design cycle follow the DP & USB design cycles? Because if it always falls behind (by, say, 30 months), people won't want to pay the extra for it. If you bought some expensive TB1 gear last year, you won't like it when equally fast USB 3.1 gear becomes available next year. It doesn't matter how many gigabits are lying on some port or cable if you can't use that new display because chaining is impossible, since you have bought too many non-chainable TB products. People will get tired of changing their Macs every year. Macs used to be the computers that held their value the longest. Now they have soldered RAM and expensive standards which are falling behind. Macs won't hold their value any more. Not ecological, not economical, not fun. Apple should sell Macs and OS X to someone who'd care.
Intel had Thunderbolt controllers that supported DP 1.2 available at the same time as they released CPUs with IGPs that supported DP 1.2. You might think they were dragging their feet on moving to DP 1.2, but in reality, why support an interface with higher signaling rates when your GPU can't push that many pixels or display streams in the first place? Were you expecting Intel to do this as a favor for AMD or NVIDIA? Do you expect things to be any different with DP 1.3?
Again, AMD (still ATI at that time?) released GPUs that supported DP 1.2 in 2010. They will release DP 1.3 support in 2015. If TB3 is released 27 months after TB2 (2016Q1) and does not have DP 1.3, AND it again takes 27 months before TB4, then Apple's products start to support DP 1.3 somewhere around 2018Q1. You don't think that will be a problem? If I buy a Mac in 2017, I would want it to support the same displays the rest of the industry has already supported for 2 years. And I don't want to buy another Mac the next year if I need that support. Having support for the latest standards before you need them usually saves a lot of money, since you can skip a few generations of upgrades.
Things are different with DP 1.3 in that there will be a need for it when it starts shipping. People want to chain 4K displays. When DP 1.2 started shipping there was no similar need for chaining 2.5K displays. TB taught us that chaining is nice and dandy, DP 1.2 made it possible for those without TB, and now DP 1.3 makes it possible to chain 4K displays, but suddenly TB users drop off the wagon.
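For what it's worth, the chaining claim checks out on paper. A rough sketch, under my own assumption of 24 bpp streams with blanking ignored (so real margins are a bit thinner than shown):

```python
def dp_payload_gbps(lanes, gbps_per_lane):
    """Usable DisplayPort payload after 8b/10b line coding."""
    return lanes * gbps_per_lane * 8 / 10

uhd60 = 3840 * 2160 * 60 * 24 / 1e9   # one 4K@60 stream, ~11.9 Gbit/s

hbr2 = dp_payload_gbps(4, 5.4)        # DP 1.2: 17.28 Gbit/s
hbr3 = dp_payload_gbps(4, 8.1)        # DP 1.3: 25.92 Gbit/s

print(2 * uhd60 < hbr2)  # False: DP 1.2 can't carry two 4K@60 streams
print(2 * uhd60 < hbr3)  # True:  DP 1.3 can, hence 4K daisy chaining
```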
And no, those displays are both based on a Sharp panel with an 8-channel LVDS interface and driven as two separate tiles. A DP 1.2 MST signal is used to transport the display data for both tiles over a single cable, and then an embedded MST hub outputs the two streams which get converted to LVDS (and quite possibly to TMDS first) before being sent to the panel.
Oh well, MST or not, you can't use these displays with any Mac released before November 2013, but there are tons of "PCs" that have supported them for years.
You might have to buy an adapter or two once every 5 years, but the old cables will in fact continue to work just as they always did. Those who don't use Thunderbolt actually never have to pay for any transceivers.
Sad but true; TB was made to be cheap for the massive majority that will never need or use it.
10GbE is actually massively popular, just not for consumer PCs. If you're thinking specifically of 10GBASE-T, well, it requires tons of power and new cables :eek: to jam 10 Gbit/s down any significant length of UTP. Most datacenters deploy 10GbE equipment based on SFP+ modules instead. If you think that 10GbE is expensive at the moment because the "industry failed to make it popular", then you're an idiot. Economies of scale can drive costs down, but there are limits. You can't make an F1 car today for $15,000 even if you make 100 million of them.
I was thinking specifically of 10GBASE-T, and solely from the end user's point of view.
And unfortunately I'm not alone in thinking that 10GBASE-T was a huge failure. Like this: http://www.knxtraining.gr/uploads/8/2/1/2/8212652/davidebadiali_bicsigreece_oct2013_rev002.pdf
For any Mac user who'd been waiting for faster local networking (mostly people doing video and visual work?), it has certainly been strange that there has been no upgrade in local network speeds. This has also increased interest in TB. Datacenters and ISPs have of course used 10G for a long time, like anybody who pushes terabits of traffic daily and therefore doesn't care what the port costs. Now it looks like they even managed to orphan Cat 7 cables? It took too many silicon generations to get the power low enough. They really didn't think about the end-user business at all. If there had been a 2GbE 8-port switch for less than $300 a decade ago, nobody would have the current boxes any more. Or a 4GbE one 5 years ago.
People don't need F1 cars. They need Teslas, and those only become affordable through popularity.
With OG Thunderbolt you could swing a single 4K display driven using two cables, or 2 4K displays at 30 Hz. But I'm pretty sure that even in 2015, if you can afford a pair of production quality 4K displays, you can also shell out $1299 for a MacBook Pro that has dual Thunderbolt 2 ports.
Well, try to find that 4K monitor that is driven by 2 TB cables...
Again, I might be able to shell out for a new MBP every year, since TB is crippling the Mac's display output connections, but that's not the whole picture. What if you want to use monitors with DisplayPort inputs? Both ends of the chain are used up. And then you have a hard drive or other peripheral without TB passthrough. Once again this "fastest I/O on the planet" can't do the same thing a cheap PC could years ago with DP 1.2 & USB 3. I was also trying to show where the bandwidth need is: 32 Gbit/s to the displays and < 3 Gbit/s of data. The latter could easily be handled by USB 3. Then there's no additional benefit from TB other than being "last-gen DP".
(I know, the right answer to this is: buy a new Mac Pro! Totally worth the money. Maybe not...)
Apple never ditched DisplayPort, they just gave it additional functionality and changed the little icon next to it. Intel not bringing DP 1.2 to Thunderbolt before they brought it to their IGPs may have gimped the port a bit for Macs with dGPUs, but it also enabled display daisy chaining before MST did, and the 4K limitations have hardly affected anyone. Aside from the iMacs, which Thunderbolt 1 equipped Macs even had GPUs that could reasonably drive two 4K displays at 60 Hz?
AMD's GPUs have supported DP 1.2 since 2010. I'd guess that since they fulfill the spec, they have supported chaining ever since. So I'd guess that in 2011 Apple could also have chosen DP 1.2, which would have supported chaining monitors the same way TB did. The MBP used the AMD Radeon HD 6490M, 6750M, 6770M and the NVIDIA GeForce GT 650M (btw, the last one was used in both Retina and non-Retina models in 2012 and 2013). These could still drive 2x 4K@60Hz. Of course the Mac Pro has been able to do this all along, no thanks to Apple between 2009 and 2013... (the Radeon 5xxx did not support DP 1.2 and Apple never bothered to offer newer cards...)
You can connect more than 2 Macs if you like, Windows PCs too, although bridging multiple machines will result in some seriously heavy CPU usage if you lean on it too hard. However, it is a game changer for deployment when you can create a 10 GbE bridge with a $39 cable and push an image to a machine as fast as its internal flash storage can handle. And don't forget that a $299 10 m Thunderbolt cable is still cheaper than a single 10GBASE-T NIC at $334, which is also useless without a second NIC, and no more capable than the Thunderbolt solution without an $880 switch and yet more NICs. So yeah, it's essentially free 10 GbE.
TB is probably quite nice with 3 workstations and 1-2 NAS boxes, although I haven't heard of any small post-production facility using it; but if 10GBASE-T had been rolled out right, this wouldn't be the case. 10GBASE-T will cost half as much next year, and TB will cost the same. And the year after that? Neither is, or will be, free.
Please link to a laptop with 2 DisplayPort 1.2 ports, plus the GPU and drivers to light up 2 currently existing 4K displays at 24 bpp, 60 Hz. (Hint: with the MacBook Pro, the problem lies with the GPUs/drivers, not the Thunderbolt 2 controller.)
There are laptops with 2 DP 1.2 ports. Then there are laptops with one port and another on a docking station. And there are docking stations with 2 DP 1.2 ports. Sadly, manufacturers don't publish how many pixels these can drive, but I'd guess they fulfill the specs.
If a 2012 MBP with the NVIDIA GeForce GT 650M, launched originally in March 2012, could push over 15M pixels, why wouldn't GPUs two generations newer drive the same amount, considering the state of OS X's GPU drivers?
 
Last edited:

HurryKayne

macrumors 6502a
Jun 9, 2010
982
13
It's not gonna happen this year. The leaked charts themselves show TB3 with Skylake which is after Broadwell. Right now Intel's saying the manufacturing ramp of Skylake will begin in 2H2015 and that's if there aren't any delays. So I wouldn't expect Skylake, and by extension, TB3, until the end of 2015 or early 2016.

Ok, thanks :D, so more time for my wallet.
 

jdiamond

macrumors 6502a
Dec 17, 2008
699
535
Sadly, TB 3 has already missed the boat with DisplayPort 1.3...

Someone asked in this thread: why do we need faster Thunderbolt?

For me, there are 3 main reasons:

1) Support for 4K monitors, 5K monitors, 4K televisions (with HDCP), and 3D 4K televisions. I don't really care about 8K support...
2) PCIe over TB at a usable rate - it's currently lower than PCIe2 speeds. Powering an external graphics card is a big deal.
3) I don't want to have to buy a *new* $3,000 laptop just to do this...
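Point 2 can be put into rough numbers. A sketch using my own per-direction link-rate figures (protocol overhead in TB's PCIe tunneling shaves off even more in practice):

```python
# PCIe 2.0: 5 GT/s per lane with 8b/10b coding -> 4 Gbit/s per lane.
pcie2_lane_gbps = 5 * 8 / 10
pcie2_x16 = 16 * pcie2_lane_gbps   # 64 Gbit/s: a desktop GPU slot
pcie2_x4  = 4 * pcie2_lane_gbps    # 16 Gbit/s

tb1 = 10.0   # original Thunderbolt channel, Gbit/s
tb2 = 20.0   # Thunderbolt 2, channels bonded

print(f"PCIe 2.0 x16: {pcie2_x16:.0f} Gbit/s")
print(f"PCIe 2.0 x4:  {pcie2_x4:.0f} Gbit/s")
print(f"TB1 / TB2:    {tb1:.0f} / {tb2:.0f} Gbit/s")
```

Even TB2's full 20 Gbit/s, shared with any display traffic on the chain, is only the equivalent of a PCIe 2.0 x5 link, which is why external graphics is the poster child for wanting more.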

To be fair, DisplayPort 1.2a over TB 2 has been shown to be capable of running a single 4K display at 60 Hz, and the article claims HDMI 2.0 can be emulated over TB 3...

But DisplayPort 1.3 is SO much better, including native support for HDMI 2.0 and USB 3.0 SuperSpeed. It brings the underlying protocol bandwidth close to that of TB 3, so one TB 3 port could in theory handle these protocols.

Unfortunately, AFAIK, TB 3 arrived too early for DP 1.3, and it's built into Skylake, the CPU. So unless Apple is willing to go with a separate I/O chip, we're all locked out of DP 1.3 until 2017 or so... :(

Here's hoping TB 3 can be software upgraded to support DP 1.3.

Because TB is supposed to be the general purpose external connector like PCIe.

Obviously, for those people that don't hook their laptop up to anything, it's all moot. You guys win, because you can buy slightly used Macbook Pros at half price every time they change the ports. :)
 

Michael Goff

Suspended
Jul 5, 2012
13,329
7,421
Can't they just release one version and get it over with? All these revisions make it impossible to take this technology to mainstream adoption. The biggest advantage of USB 2.0 wasn't the technology; it was the fact that it stuck around for over 10 years.

So they should never improve it? It isn't like they're removing backwards compatibility or anything.
 