Of course, I meant that more lanes could be used in one cable. Not multiple sockets for one data path.

More lanes = more contacts and larger connectors, more conductors and larger, less-flexible cables, more problems with inter-lane skew, and more expense in the long run. The industry as a whole has shifted to high-speed serial interfaces rather than parallel busses for good reason. Nobody wants to see a return of the old parallel SCSI cables with 50-pin Centronics connectors.

Good selection of numbers you have.
I think the tb designers were pushing the envelope a bit too much when they made the hard decisions somewhere around 2010. Maybe they didn't realize that for a new standard to be useful, it has to be mass-adopted by consumers to become price-efficient.

We are very close to the time when "pro" tech will no longer be more advanced than consumer tech. Pushing the envelope is getting so expensive that you simply can't do it with any market smaller than the biggest one.

Apple has significant enough market share at the moment to create a robust ecosystem for both software developers and accessory manufacturers all by themselves. A standard does not need to be adopted by billions of consumers or even the majority in order to become useful or a good value proposition.

There will always be a minority segment of the market that demands more performance than is available from run-of-the-mill consumer electronics devices and are willing to pay a premium for it: the pros and enthusiasts. There is also a key group of purchasers to whom price is virtually no object for products that meet their specific requirements: enterprise and government. While the market may be far larger for 1080p HDTVs than 4K medical imaging monitors, the profits are way better for the latter. Trickle down will continue to happen as it always has. 10.3125 Gbps per channel is old news to the telecom industry, who will be deploying 25-28 Gbps this year and paying thousands of dollars per port to do so. In 2014 Thunderbolt will get a speed bump and consumers will have access to those same speeds for a couple hundred dollars per port.
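
A quick aside on where that oddly specific 10.3125 Gbps figure comes from: it's the line rate needed to carry exactly 10 Gbps of payload under 64b/66b coding, the scheme 10 Gigabit Ethernet uses (whether Thunderbolt's channels use the very same coding is my assumption here). A two-line sanity check:

```python
# Back-of-envelope: why 10.3125 GBaud shows up everywhere.
# 10GBASE-R (and, I assume, Thunderbolt's channels) uses 64b/66b line coding:
# every 64 payload bits travel as a 66-bit block, so the line rate is 66/64
# of the usable data rate.

data_rate_gbps = 10.0                      # usable payload per channel
line_rate_gbaud = data_rate_gbps * 66 / 64
print(f"line rate: {line_rate_gbaud:.4f} GBaud")   # -> 10.3125 GBaud
```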

Apple only has a little more than 5% of the global market and around 12% of the US market for PCs based on unit shipments, but they take in over 35% of the operating profit for the entire industry. Apple needs to differentiate itself from the competition in order to justify higher unit prices and maintain those higher profit margins. Thunderbolt is just one more way to reinforce the perception that their devices are more elite, capable and valuable.

Usb3 was released in 2008, 8 years after usb2.
If usb4 comes in 2016 with greater speed than tb has now, at a fraction of tb's price, and there's no tb2 by then, tb will slowly die away.
Then we can keep speculating that if tb had had 8 pairs of wires in the cable, and could therefore have used cheap passive cables, it would have become as popular as usb...

But USB 3.0 devices were not available until 2010, two years later. Do you think USB would be as popular today if it had been UPB (Universal Parallel Bus) instead? Also note that USB is massively popular, whereas SuperSpeed USB is just beginning to gain momentum. There are 6 billion USB devices in use, but even by the end of 2012, only 7% will be of the USB 3.0 variety.

I disagree fully.
Apple could have included usb3 in macs in 2010.
And more usb3 controllers were sold in 2010 than macs.
Now which one of these minorities is significant again, and why?
There's no technical or economic reason not to do this if they wanted to maintain a state-of-the-art image. Saving $5 in chip cost just isn't one. Maybe this is just one piece of legacy baggage that Apple is carrying. After fw lost to usb, they couldn't face their defeat and had to come up with something sexier than usb, and that became tb. Also, tb might be a whole lot less sexy if macs had had usb3 before tb.

According to In-Stat, only 14 million USB 3.0 controllers shipped in 2010. Apple reported sales of 14.43m Macs in 2010. Also, Apple sells primarily notebooks, and the USB-IF's numbers show only 2 million of those controllers ending up in notebook PCs, largely because the early silicon wasn't that great, especially regarding power consumption.

Why would Apple spend time developing and testing a driver for silicon that didn't meet their design criteria and was not available in sufficient quantity? Seriously, not even joking, the earliest Apple could have adopted USB 3.0 was mid 2011, and they would have had to lean hard on their suppliers to do so. Whether they should have done so or not at that point is entirely debatable, but saying that they could have added USB 3.0 in 2010 or even early 2011 is just not in line with reality.

The problem with standards is, well, you need to wait until things are standardized. Despite USB 3.0 being ratified in December of 2008, the first 4-port controller wasn't even certified until April of 2011.

Although you can't tell from controller sales numbers how much a certain port is actually used, or more importantly how much benefit it brought compared to the alternatives, I just don't get when something counts as widely enough accepted. Do you have to sell 40 million controllers a year before you can say a standard has been adopted? It would be very interesting to know how many tb devices have been sold that really benefit from tb. 4 digits, or even 5? Tb displays and fw & ethernet dongles should be excluded from the list, since all of those could be handled more price-efficiently with usb3.

The industry generally relies on a metric known as "attach rate" (a term which I now realize I have been misusing for some time). It's the number of complementary devices sold for each primary product sold. As far as I'm aware, these figures just haven't been made public for Thunderbolt, but without a doubt they are lower than for USB 3.0 at this point.

My guess is that you'd be solidly into 6-digit territory for the number of Thunderbolt devices sold. And discounting the Apple Thunderbolt Display would be asinine. As would not including Ethernet dongles, unless you can point to a currently available USB 3.0 GbE adapter. And replacing the functionality of FireWire gear you already own with USB 3.0 stuff is not only potentially impossible at this point, but also probably more expensive than buying a $29 adapter. However, the adapters aren't even available yet, so it makes no difference whether you include them or not.

I just don't get this excuse that Apple shouldn't have included usb3 before it reached a certain percentage of market adoption. How about using the same rule for tb? Is there some reason why macs should be some kind of "average PC" and therefore have only features that most computers have? Does retina need to be ubiquitous to be justified? Comparing the pc market to macs is just a plain stupid apples-to-oranges comparison. When most pc crap is way cheaper than macs, you just can't expect it to have any advanced tech. On the other hand, almost all pc hardware in the macs' price range has had usb3 (and bd) for the second or third year now.

I never argued that Apple needed to wait for a certain level of adoption before adding USB 3.0, or should only include mainstream technologies in their products. Just like everyone else, though, they can only purchase what their suppliers can produce. When Apple moves first, they can be aggressive and buy up all of a certain item. When it comes to things like USB 3.0 host controllers, or 4G LTE modem and baseband chips, there are a lot of other prospective customers, and any drawbacks due to early implementations can make waiting it out for a round or two seem like a better idea.

Just compare usb3 and tb 18 months after the first products were on the shelf. Usb3 had 100x more different products and a higher adoption rate than tb. Maybe 1% of mac users will ever use tb, while 99% of users would benefit from usb3. Apple's customers who bought a mac in 2010-2012 would also have benefited from usb3 for many years to come. Apple chose not to offer it, since so few knew to even ask for it, and now they can buy new macbooks sooner than ever before.

But to be more realistic, the attach rate numbers are probably at least 2% for Thunderbolt, and according to the USB-IF were only around 60% for USB 3.0 after the first 18 months. The number of certified devices was only 250 for USB 3.0 vs. 50 for Thunderbolt, so that would be 5x more different products and 30x the attach rate.

Also, there is still only a very limited range of device silicon available for USB 3.0. You have thumb drives, card readers, SATA bridges, 4-port hubs, cameras and a media player. In the first 18 months, how many USB 3.0 to 6Gb/s SAS/SATA bridges, GbE or 10GbE adapters, HD video interfaces, Fibre Channel adapters, PCIe or ExpressCard expansion chassis, or displays anything remotely like the ATD were available? Despite some initial overlap, USB 3.0 and Thunderbolt will end up being used for very different purposes. Basically if a job can be done just as well with USB 3.0, why would you pay extra to do it with Thunderbolt?

Well, we are on the verge of external retina displays. If Apple brings any other kind of external display to market, it will be a disappointment. (Hmm, the next model could also be just the same as now, but with usb3. Even if the display doesn't have intel's chipset that includes usb3... ;) ) And given their situation with tb versus retina, they have to either drop external monitors from their products altogether, or limit the number of external retina displays to one per mac, or upgrade the tb spec and deal with a mess of angry customers whose expensive hardware just turned obsolete sooner than anybody expected. None of the options are very nice for anybody.

Do you really think there will be crowds of angry customers with pitchforks when they discover they can only run a single external display with a resolution higher than 6MP? I’d love to see the performance of an MBA trying to drive more than that many pixels. According to your logic, consumers should only be allowed to have 1920x1080 displays anyway, because they are so much cheaper due to economies of scale.

So, maybe those cables should have stayed where they belong and the desktop alternatives should be designed with better price efficiency?

Yes it will.
Nobody needs anything that is too expensive, and everybody wants things as fast as they can reasonably justify affording. So there's no binary answer to this.

Ahh, but don't you see that many people already can afford and justify the additional expense of Thunderbolt? Having a choice is a good thing, and I'm definitely glad that we finally have the option of using either USB 3.0 or Thunderbolt on the same platform.

If macs had had usb3 for over 2 years now, maybe usb3 would have been adopted way faster. But yes, hdmi in the new mbp is quite an oddball. Maybe the average mac user is too stupid to know that a dp-hdmi dongle does the same thing, OR maybe Apple is preparing their customers for the idea that if you want 2 external displays, the retina one takes the whole tb port and the second, non-retina display can use the hdmi, OR that if you need all the tb bandwidth for data, you can use hdmi for the display, which then doesn't take any bandwidth away from tb.

Even driving 2 daisy-chained 2560x1440 ATDs, PCIe bandwidth over Thunderbolt is only reduced by about 16%, and just in the outbound direction and only for devices attached to that chain. You constantly exaggerate the impact of DP on Thunderbolt's PCIe performance, when the odds of it significantly affecting any real-world workflows are slim to none. How would a USB 3.0 controller fare trying to drive two 2560x1440 displays while writing more than 8.4 Gbps to an external RAID array? Oh right, it can't do either of those things anyway...
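
For a rough sense of scale on that last scenario, here's a back-of-envelope sketch (my arithmetic, not Intel's accounting: active pixels only, no blanking, no protocol framing, and a plain 8b/10b haircut for USB 3.0, so it won't reproduce the 16% figure exactly):

```python
# Rough feasibility check: drive two 2560x1440 @ 60 Hz displays while writing
# 8.4 Gbps to external storage, all of it outbound traffic.
# Assumptions (mine): 24 bpp, active pixels only, no blanking or framing overhead.

def stream_gbps(w, h, hz, bpp):
    return w * h * hz * bpp / 1e9

display_payload = 2 * stream_gbps(2560, 1440, 60, 24)   # about 10.6 Gbps
storage_write   = 8.4                                    # Gbps, figure from the post
total_outbound  = display_payload + storage_write        # about 19 Gbps

thunderbolt_out = 2 * 10.0        # two 10 Gbps Thunderbolt channels, each direction
usb3_effective  = 5.0 * 8 / 10    # 5 Gbps signaling minus 8b/10b coding, about 4 Gbps

print(f"needed outbound:  {total_outbound:.1f} Gbps")
print(f"Thunderbolt out:  {thunderbolt_out:.0f} Gbps")
print(f"USB 3.0 ceiling:  {usb3_effective:.1f} Gbps")
```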

Also, a very tiny fraction of computers, but still a few million mbp's, have tb but no usb3. Unfortunately there might never be an affordable tb-usb3 dongle, or a tb hub to make more use of these dongles...

But it is far more likely that an inexpensive Thunderbolt to USB 3.0 dongle based on the Port Ridge controller and leveraging Apple's drivers will see the light of day as soon as Mountain Lion and/or the 10.7.5 update become available to the general public.

Again, there were more usb3 controllers sold than macs. Are they both not significant then?
Again, that 20% of pc's in the macs' price group had usb3, while at the same time the macs did not. Is this a sign of bad design in the pc's, or in the macs?
Those controllers were still mostly finding their way onto enthusiast motherboards or drop-in PCIe expansion cards for desktop PCs, by a factor of 2:1 compared to notebook PC deployments. Once again, it is very clear that integrating third-party USB 3.0 controllers into Intel PCs with form-factors similar to Apple's product line did not happen in any significant way until mid-2011 or later. Until someone at Apple steps forward and recounts the tale of why the decision was made to wait until 2012, we can only speculate. I highly doubt that it was the result of the controllers being too expensive, or Apple's engineers not being up to the design task.

Good question!
Either way, chips inside the device or inside the cable, you need the same number of them, so it doesn't affect the price of the whole system.

The thing I have issue with is this:

In your own thought-experiment, take two thunderbolt devices connected by a thunderbolt cable. Now cut the ends of the cable and put the ends inside the tb devices.

Now you have two (differently designed) thunderbolt devices, communicating perfectly with each other as before, but via a cheap, passive cable.

The benefit being, you only buy the expensive electronics once per device, not twice per cable.

Why on earth didn't they implement it like that???

To ask the question a different way, why did Silicon Image not decide to put the transmit and receive electronics into the DVI (or HDMI) cable? They could have done so, but they thought that would be a daft idea.

If the logic were integrated into the T-Bolt controller itself (instead of active cables) you'd need zero additional chips.

Intel's Light Peak controller, which became the Light Ridge Thunderbolt controller, was designed to be connected to an on-board optical transceiver. The output from the optical engine was then routed via fiber to a hybrid Cu/optical port containing a specially designed lens which allowed the use of passive optical cables. This system had many drawbacks which made commercialization challenging or downright impractical. Sony was the only OEM to go this route and only for a laptop that started at $3000.

Using the controller as-is with the active copper cables we have today was a far more flexible, cost-effective and expedient solution than going back to the drawing board and taping out a 120mm^2 controller all over again just to integrate a new PHY. I realize there is a lot of resistance to the notion that putting the logic in the cable may have been the best all-around solution, but there are reasons why it is the industry norm at 10.3125 GBaud per lane.

Active circuitry also allows the use of very thin wire for the signaling pairs, as small as 40 AWG, at power levels as low as 0.7 W per lane. Compare that to Intel's latest X540 10GBASE-T controller with integrated MAC/PHY. The 2 port controller weighs in at 625mm^2 and uses 6.25 W max per port when paired with UTP cables using 22 AWG wire. This is advertised as one of the lowest power 10GBASE-T solutions on the market. Meanwhile SFP+ modules get the job done using less than 1 W per port and allow considerable flexibility regarding the type of media used.
 
More lanes = more contacts and larger connectors, more conductors and larger, less-flexible cables, more problems with inter-lane skew, and more expense in the long run. The industry as a whole has shifted to high-speed serial interfaces rather than parallel busses for good reason.

But T-Bolt is 4 parallel lanes, and 16 parallel lanes is the standard connector for most graphics cards.

Clearly skew is not a serious problem - when you have multiple packet-oriented serial interfaces running as a team, reassembling the packets in the correct order is a solved problem.

(And for people who don't realize it, T-Bolt takes 4 parallel signals, multiplexes it onto one serial signal, then de-muxes it back to 4 parallel lanes.)


According to In-Stat, only 14 million USB 3.0 controllers shipped in 2010. Apple reported sales of 14.43m Macs in 2010. Also, Apple sells primarily notebooks, and the USB-IF's numbers show only 2 million of those controllers ending up in notebook PCs, largely because the early silicon wasn't that great, especially regarding power consumption.

I'd love to see the statistics for USB 3.0 adoption on systems based on selling price.

Since there are no inexpensive Apples (although Apples seem to be getting "cheaper" on the quality front, but not the price front), I would bet that when you look at PCs comparably priced to Apples a much higher proportion of them would have USB 3.0 than the stats that you quote.

At a meeting today I looked around at the laptops. Eight out of ten of them had USB 3.0 (the "blue port" - very easy to spot). The two that didn't had a half-eaten apple on the lid.

Stop apologizing for Apple's failure to support USB 3.0 in favor of its higher-priced connection. (I almost said "alternative", but T-Bolt and USB 3.0 have clearly different goals - and for most people USB 3.0 aligns with their goals.)
 
But T-Bolt is 4 parallel lanes, and 16 parallel lanes is the standard connector for most graphics cards.

Clearly skew is not a serious problem - when you have multiple packet-oriented serial interfaces running as a team, reassembling the packets in the correct order is a solved problem.

(And for people who don't realize it, T-Bolt takes 4 parallel signals, multiplexes it onto one serial signal, then de-muxes it back to 4 parallel lanes.)
Did we ever really complain about Thunderbolt's lack of lanes or bandwidth? I know we would "love" 16 lanes but even at x4 you can push a near flagship video card. If you are that concerned about storage speeds on a notebook when you have 10 Gbps bidirectional...
 
But T-Bolt is 4 parallel lanes, and 16 parallel lanes is the standard connector for most graphics cards.

Clearly skew is not a serious problem - when you have multiple packet-oriented serial interfaces running as a team, reassembling the packets in the correct order is a solved problem.

(And for people who don't realize it, T-Bolt takes 4 parallel signals, multiplexes it onto one serial signal, then de-muxes it back to 4 parallel lanes.)

No, most Thunderbolt controllers have a PCIe 2.0 x4 back end, but Thunderbolt is in no way 4 parallel lanes. Furthermore, we were discussing Thunderbolt cables, which carry two, full-duplex channels. As far as I know, data is not generally striped over the two channels, they operate as independent serial links.

How is inter-lane skew a solved problem? It still needs to be within certain bounds for the system to work, and as the frequencies go up, the UIs get tiny and it's all too easy to fall outside those bounds. Granted I probably should have picked something more obviously problematic such as differential skew, clock jitter, or the increased crosstalk that comes along with higher degrees of parallelism.

(And for those of you who don't realize it, AidenShaw's explanation of Thunderbolt is oversimplified to the point of being completely inaccurate. And yes, I do realize that the Thunderbolt protocol adapters are essentially SerDes.)

I'd love to see the statistics for USB 3.0 adoption on systems based on selling price.

Since there are no inexpensive Apples (although Apples seem to be getting "cheaper" on the quality front, but not the price front), I would bet that when you look at PCs comparably priced to Apples a much higher proportion of them would have USB 3.0 than the stats that you quote.

At a meeting today I looked around at the laptops. Eight out of ten of them had USB 3.0 (the "blue port" - very easy to spot). The two that didn't had a half-eaten apple on the lid.

Stop apologizing for Apple's failure to support USB 3.0 in favor of its higher-priced connection. (I almost said "alternative", but T-Bolt and USB 3.0 have clearly different goals - and for most people USB 3.0 aligns with their goals.)

I'm going to try to navigate your response, littered with trolling comments, as deftly as possible here. My overly wordy and pointlessly statistic-laden response to toke lahti was an attempt to curb some of the hyperbolic rhetoric I often see on the forums. Namely, if you want to say that Apple held back on including USB 3.0 controllers in their PCs, there is only a certain window of time for which the argument is valid. By any reasonable interpretation of the available evidence, that window lies between May of 2011 and May of 2012.

Apple could not have shipped Macs with USB 3.0 in 2008 because the standard didn't exist.

Apple could not have shipped Macs with USB 3.0 in 2009 because no host controllers existed.

Apple had only one option for a USB 3.0 host controller in 2010, the NEC/Renesas µPD720200, which was not at all well suited for notebook or compact form factor PCs. It also wasn't available in quantities that would allow Apple to shift entirely to USB 3.0 across their entire product line. Arguing that Apple should have adopted at this point is pretty daft.

Apple actually could have shipped Macs with USB 3.0 in 2011 using discrete host controllers because certified controllers were now available from several vendors. In particular, Renesas introduced their 3rd gen controllers in March of 2011 and began ramping up production to levels sufficient for Apple shortly thereafter.

All of the Macs introduced thus far in 2012 do have USB 3.0.

So basically, the mid to late 2011 Macs could have had USB 3.0 if Apple opted to use discrete host controllers. Apple held out for one generation until an integrated solution was available from Intel.

At this point we have crossed the Rubicon and more PC's will ship with USB 3.0 than without. It's no longer a premium feature, it's the norm.

While USB 3.0 may align with the goals of most people, Apple doesn't necessarily try to sell to the majority. They target a wealthier more educated demographic. One that thinks differently. Even if it's a bunch of BS, it seems to be working for them.
 
No, most Thunderbolt controllers have a PCIe 2.0 x4 back end, but Thunderbolt is in no way 4 parallel lanes. Furthermore, we were discussing Thunderbolt cables, which carry two, full-duplex channels. As far as I know, data is not generally striped over the two channels, they operate as independent serial links.

As I said, the 4 parallel PCIe lanes are encapsulated onto T-Bolt, then split back out to 4 parallel PCIe lanes on the other side.


How is inter-lane skew a solved problem?

If PCIe x16 is successful, it must have been solved. I suspect having packetized data helps - since each packet could have a sequence number or timestamp so that it would be easy to de-skew. (edit: see attachments)


(... And yes, I do realize that the Thunderbolt protocol adapters are essentially SerDes.)

Right. Doesn't that mean that I'm essentially correct?


http://www.pcisig.com/developers/ma...c_id=d9967efa833bbf0f223276571d647482be183e18
 

Attachments: pcie.jpg, pcie2.jpg
Apple had only one option for a USB 3.0 host controller in 2010, the NEC/Renesas µPD720200, which was not at all well suited for notebook or compact form factor PCs. It also wasn't available in quantities that would allow Apple to shift entirely to USB 3.0 across their entire product line. Arguing that Apple should have adopted at this point is pretty daft.

You sound pretty daft. Apple COULD have and SHOULD have adopted USB 3.0 for iMacs, the Mac Mini and the Mac Pro at this stage, NONE of which have it as of this writing. Instead, they are STILL behind and that irritates the heck out of some of us. I've been waiting to replace my PowerMac Server with a Mac Mini server for some time now and the ONLY thing holding me up is the lack of USB 3.0.

Worse yet, that recent "update" to the Mac Pro in particular was PATHETIC. They would have been better off not doing anything at all than cheesing off all the professionals waiting for an update and then offering them crap. The VERY least they could have done was lower the price on the thing. Apple seems to think high iOS and notebook sales mean they can just ignore the rest of their lines, and sadly they are wrong. They are losing the professional market entirely. One might argue it's not worth it, but that's like saying car companies like Subaru shouldn't enter rally races since they aren't selling THAT car. Racing brings the entire name and respect level up, and those are steadily falling with Apple dropping professional features left and right. Apple seems to be aiming to become the Radio Shack of the 21st Century, and that's SAD given the amount of capital they have, which could ensure ALL those lines are up to date before anyone else's.

Apple actually could have shipped Macs with USB 3.0 in 2011 using discrete host controllers because certified controllers were now available from several vendors. In particular, Renesas introduced their 3rd gen controllers in March of 2011 and began ramping up production to levels sufficient for Apple shortly thereafter.

And they didn't do that either. :rolleyes:

While USB 3.0 may align with the goals of most people, Apple doesn't necessarily try to sell to the majority. They target a wealthier more educated demographic. One that thinks differently. Even if it's a bunch of BS, it seems to be working for them.

Bullcrap. They're selling iPhones and iPods and iPads to the lowest common denominator at this point (well, maybe the 2nd lowest if you count Samsung's devices as the lowest). The truth is they're more interested in form factor (thin thin thin thin thin) and style (unibody aluminum, glass, etc.) than UTILITY, and there is no obvious market for that given that no one has asked for an iPhone encased entirely in glass or even thinner Macbook Pros at the cost of ports and drive options. No, they seem to have gotten that from the late Mr. Jobs, who was OBSESSED with thin (to no obvious benefit). And frankly, I wish they'd stop. Looks are OK, but don't destroy functionality to shave an extra 1/8" of thickness off the thing. That's STUPID. Sadly, they've done just that lately (and a lot longer for things like graphics capability).
 
As I said, the 4 parallel PCIe lanes are encapsulated onto T-Bolt, then split back out to 4 parallel PCIe lanes on the other side.

Existing Thunderbolt controllers have connections for either 2 or 4 PCIe 2.0 lanes, 0 to 2 DisplayPort sources, sometimes a DisplayPort sink, and 1, 2 or 4 Thunderbolt channels.

Let's take the DSL3510L Cactus Ridge 4C controller as an example, since it has a little of everything and that's what is in the 2012 Macs.

It has connections for 4 PCIe 2.0 lanes, which lead to an on die 8-lane, 5-port PCIe 2.0 switch. There is no requirement for those PCIe lanes to be used in parallel or even connected to anything at all, as in the case of a hypothetical DisplayPort only Thunderbolt device. They can be configured as a single 4-lane link, 1 or 2 2-lane links, 1 to 4 single lanes, or one 2-lane link plus 1 or 2 single lanes. Not only do the links not have to utilize lanes in parallel, they can even operate at different speeds. You could connect a PCIe 2.0 x2 SATA controller and a PCIe 1.1 x1 GbE controller and each would operate at the highest rate possible.
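
For illustration, here's a toy enumeration of those bifurcation options (my own sketch, not anything from Intel's documentation; the point is just that the four lanes form independent links rather than one parallel bus):

```python
# Illustrative only: ways the 4-lane PCIe back end described above could be
# split into independent links. Each tuple lists the lane widths of the links
# formed (unused lanes simply stay idle). Link speeds may also differ per link,
# e.g. a Gen2 x2 link alongside a Gen1 x1 link.

valid_bifurcations = [
    (4,),           # a single x4 link
    (2, 2),         # two x2 links
    (2, 1),         # one x2 link plus one x1 link
    (2, 1, 1),      # one x2 link plus two x1 links
    (1,), (1, 1), (1, 1, 1), (1, 1, 1, 1),   # one to four independent x1 links
]

for cfg in valid_bifurcations:
    assert sum(cfg) <= 4, "cannot use more lanes than the 4 available"
    print(" + ".join(f"x{w}" for w in cfg))
```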

The PCIe switch in the Thunderbolt controller has to deserialize, decode and descramble the incoming symbols. Then it has to un-stripe any bytes from lanes that were operating in parallel, strip off the framing characters used by the physical layer, and hand the data off to the data link layer. The DLL then has to disassemble and sequence the link layer packets, perform error checking, process any DLLP's, and then hand the transaction layer packets up to the transaction layer. The packets can then be forwarded to the correct destination port via the switch, and the whole process is reversed.

The upstream port of the PCIe switch is linked to the PCIe to Thunderbolt protocol adapter, where the TLPs are encapsulated as Thunderbolt packets, which are then forwarded to the appropriate Thunderbolt channel by the Thunderbolt crossbar switch.

Meanwhile, the DisplayPort source signals enter the Thunderbolt controller and are demuxed, with one set of signals bypassing the Thunderbolt logic and being fed to the DisplayPort legacy PHY for each Thunderbolt port. The other set of signals go to the DisplayPort source to Thunderbolt protocol adapters. There the DP main link signals are deserialized, decoded, descrambled, deskewed, decrypted, demuxed and fed to the main stream and secondary data packet unpackers. After unstuffing, unpacking and clock recovery, the packets can be reframed as Thunderbolt packets and forwarded to the appropriate Thunderbolt channel by the Thunderbolt crossbar switch. Aux channel data appears to be relayed as well, so this too must be digested and packetized for transport over Thunderbolt. And of course there is a Thunderbolt to DisplayPort sink protocol adapter that does the same in reverse.

The Thunderbolt packets headed outbound are handed down to the data link layer which I presume is also responsible for the "novel time synchronization protocol" that Intel advertises. From there it is off to the Thunderbolt PHY, which encodes and serializes the data stream into four individual channels and sends two each to a pair of muxes which can switch between them and the legacy DP signals. The selected signals are then output from the controller and travel a very short distance to however many Thunderbolt ports the device offers.

That's one side of the Thunderbolt equation, and I pretty much just glossed over it, which is why I felt you might be oversimplifying things a wee bit too much.

If PCIe x16 is successful, it must have been solved. I suspect having packetized data helps - since each packet could have a sequence number or timestamp so that it would be easy to de-skew.

Well, the striping happens at the byte level, not the packet level though. And to be honest, PCIe 1.0 was designed to solve the inter-lane skew issues that PCI had at higher frequencies. It also wasn't much of an issue at the 5 GHz speeds of PCIe 2.0. At 8 GHz things start to get a little tight, and it is once again problematic at 10 GHz, which is why PCIe 3.0 didn't just double the frequency again.

I believe a bit of skew is intentionally introduced to prevent voltage fluctuations caused by all of the lanes signaling at the same exact time. That means that at high enough frequencies, additional skew from unequal trace lengths can push things to the point where all of the symbols may not arrive within the necessary window to correctly de-stripe the data coming off the lanes. If error correction can't fix things, then you have to retransmit the whole packet again. With traces on a motherboard (i.e. PCIe) this is generally quite manageable. With external cables, it's a whole other kettle of fish.
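
Here's a toy model of byte-level striping and why skew sets the limit (illustrative Python only, not real PCIe logic; the lane count and skew numbers are made up):

```python
# Toy model of byte-level striping across 4 lanes, and why skew matters: the
# receiver can only de-stripe a "column" of bytes once the slowest lane has
# delivered its byte for that column.

NUM_LANES = 4

def stripe(data: bytes, lanes: int = NUM_LANES):
    """Distribute bytes round-robin across lanes, as a multi-lane link does."""
    return [data[i::lanes] for i in range(lanes)]

def destripe(lane_streams):
    """Reassemble the original byte order by interleaving the lanes again."""
    out = bytearray()
    for column in zip(*lane_streams):
        out.extend(column)
    return bytes(out)

packet = bytes(range(16))
lanes = stripe(packet)

# Pretend each lane has a different flight time (arbitrary unit-interval values).
lane_skew = [0.0, 0.3, 0.1, 0.8]
print("each column is gated by the slowest lane, skew =", max(lane_skew), "UI")

# Lossless as long as the skew stays within the receiver's alignment window.
assert destripe(lanes) == packet
```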

You sound pretty daft. Apple COULD have and SHOULD have adopted USB 3.0 for iMacs, the Mac Mini and the Mac Pro at this stage, NONE of which have it as of this writing. Instead, they are STILL behind and that irritates the heck out of some of us. I've been waiting to replace my PowerMac Server with a Mac Mini server for some time now and the ONLY thing holding me up is the lack of USB 3.0...

I've felt like I was watching paint dry all this year waiting for Apple to roll out their new models, and the Mac Pro spec bump was aggravating as all heck. I am not saying that USB 3.0 isn't long overdue on Macs or that they could not have done it before now. I am not trying to defend all of their design decisions over the past 3 years.

All I was saying is that mid-2011 is about as early as they could have added USB 3.0 to Macs, which was still over a year ago. Claiming that they could have done it years before that is ridiculous. The product would have sucked if they had done that.

How many PCs have you seen from 2010 where all of the USB ports are SuperSpeed? There really weren't any, because people need at least a couple USB ports that actually work. The early silicon and drivers were flakey and had compatibility issues.
 
More lanes = more contacts and larger connectors, more conductors and larger, less-flexible cables, more problems with inter-lane skew, and more expense in the long run. The industry as a whole has shifted to high-speed serial interfaces rather than parallel busses for good reason. Nobody wants to see a return of the old parallel SCSI cables with 50-pin Centronics connectors.
Care to tell us what your golden number of pins is, then?
Nobody's suggested 50 pins for tb.
But those who design with price efficiency in mind can add some lanes to the soup. Was dual-link DVI some kind of problem?
Did HDMI make a terrible mistake by adding dual link (the type B connector) to their 1.3 spec?
Tb has 20 pins now. Would 24 or 28 pins have made it a big disaster?
If tb v1 had been specced at 3 channels and 6 Gbit/s per channel, nobody would have noticed any more bottlenecks than they do now.
Or with 4 channels and 6 Gbit/s it would be faster than it is now.
And both of these could have used passive cables, and all the tb stuff would have been adopted at greater scale and become affordable faster.
It would also be more future-proof, with both more channels and more bitrate per channel to grow into in the future.
There is an optimum in mass production between cost and rocket science. Clearly Apple still hasn't got this.
Apple only has a little more than 5% of the global market and around 12% of the US market for PCs based on unit shipments, but they take in over 35% of the operating profit for the entire industry. Apple needs to differentiate itself from the competition in order to justify higher unit prices and maintain those higher profit margins. Thunderbolt is just one more way to reinforce the perception that their devices are more elite, capable and valuable.
[...]
But USB 3.0 devices were not available until 2010, two years later. Do you think USB would be as popular today if it had been UPB (Universal Parallel Bus) instead? Also note that USB is massively popular, whereas SuperSpeed USB is just beginning to gain momentum. There are 6 billion USB devices in use, but even by the end of 2012, only 7% will be of the USB 3.0 variety.
Huh, only 420 MILLION usb3 devices by the end of 2012!
How many would there have to be before you recognise usb3 as the biggest thing in interconnecting devices at the moment?
According to In-Stat, only 14 million USB 3.0 controllers shipped in 2010. Apple reported sales of 14.43m Macs in 2010. Also, Apple sells primarily notebooks, and the USB-IF's numbers show only 2 million of those controllers ending up in notebook PCs, largely because the early silicon wasn't that great, especially regarding power consumption.
Can you also tell us the numbers from 2011?
Do they show better which one is more significant for the IT world, macs or usb3?
Why would Apple spend time developing and testing a driver for silicon that didn't meet their design criteria and was not available in sufficient quantity? Seriously, not even joking, the earliest Apple could have adopted USB 3.0 was mid 2011, and they would have had to lean hard on their suppliers to do so. Whether they should have done so or not at that point is entirely debatable, but saying that they could have added USB 3.0 in 2010 or even early 2011 is just not in line with reality.

The problem with standards is, well, you need to wait until things are standardized. Despite USB 3.0 being ratified in December of 2008, the first 4-port controller wasn't even certified until April of 2011.
C'mon, we all know how much Apple can do about things they care about, with their 35% of the industry's profits. Too bad they have a monopoly on OS X machines and can choose not to do what customers want. E.g., for years I was able to buy a full-HD laptop with a matte screen from Apple. No matter what screen size, I can't do that anymore. Hopefully Apple has grown so big that anti-trust law will force it to split OS X from the gadgets, and others can start making machines for it too. On the other hand, the worst thing that could happen to OS X is that Apple decides to kill it: they think they don't need the halo from the pro segment anymore, then they don't need the halo from the macs to iOS, and iOS makes the best profits, so goodbye OS X... Maybe there will be some sort of iOS-macs in the future, but...
If Apple had cared about what's most beneficial for most mac users, they could have used a tiny fraction of the resources they spent on tb to push usb3 out sooner and better.
Can you come up with any other reason why Apple was totally passive about usb3, other than that they had to put tb in macbooks first, because otherwise tb would have been considered almost useless on price-performance compared to usb3?
My guess is that you'd be solidly into 6-digit territory for the number of Thunderbolt devices sold. And discounting the Apple Thunderbolt Display would be asinine. As would not including Ethernet dongles, unless you can point to a currently available USB 3.0 GbE adapter. And replacing the functionality of FireWire gear you already own with USB 3.0 stuff is not only potentially impossible at this point, but also probably more expensive than buying a $29 adapter. However, the adapters aren't even available yet, so it makes no difference whether you include them or not.
The ATD does not need tb for the connections it has on the back. Those could all be handled with a single usb3 connection. If there were an Apple Usb3 Display $100 cheaper than the ATD, the ATD would sell about as well as the 17" MBP does compared to other MBPs.

The only extra thing the ATD does is allow daisy-chaining a second external display, and that one feature is taken away if/when an external retina display is introduced.

Btw,
now that the new Airs have usb3, why is Apple not offering a usb3-GbE dongle?
Even when this is available:
http://electronicdesign.com/article/digital/USB-to-GbE-Controller-Sits-On-One-Chip-
?
But to be more realistic, the attach rate numbers are probably at least 2% for Thunderbolt, and according to the USB-IF were only around 60% for USB 3.0 after the first 18 months. The number of certified devices was only 250 for USB 3.0 vs. 50 for Thunderbolt, so that would be 5x more different products and 30x the attach rate.
So usb3 has only 5x the variety in products and 30x the attach rate!
That makes tb as successful as what? RDRAM?
Also, there is still only a very limited range of device silicon available for USB 3.0. You have thumb drives, card readers, SATA bridges, 4-port hubs, cameras and a media player. In the first 18 months, how many USB 3.0 to 6Gb/s SAS/SATA bridges, GbE or 10GbE adapters, HD video interfaces, Fibre Channel adapters, PCIe or ExpressCard expansion chassis, or displays anything remotely like the ATD were available? Despite some initial overlap, USB 3.0 and Thunderbolt will end up being used for very different purposes. Basically if a job can be done just as well with USB 3.0, why would you pay extra to do it with Thunderbolt?
Exactly!
If a job can be done just as well with USB 3.0, why would you pay extra to do it with Thunderbolt?
That's why other manufacturers have been using usb3 and not tb.
Manufacturers other than Apple don't think an external display is the best available docking station. If they make a docking station, they let the user decide what kind of display suits their needs.
Most things that use tb would be pretty much as snappy with usb3, but a whole lot cheaper.
Do you really think there will be crowds of angry customers with pitchforks when they discover they can only run a single external display with a resolution higher than 6MP? I’d love to see the performance of an MBA trying to drive more than that many pixels. According to your logic, consumers should only be allowed to have 1920x1080 displays anyway, because they are so much cheaper due to economies of scale.
The angry one here would be Apple. They think it confuses their customers too much if you can attach only one display in certain conditions and two displays in others. And of course Apple would have to scrap the current displays the very second they announce new external retinas, so that users who'd like to use 2 external displays can't do it any more ("We have this great new FCP X, which is buggy and has no 3rd-party support whatsoever, but no, you can't buy an FCS3 upgrade anymore, because we know you don't want it anymore...").
Even driving 2 daisy-chained 2560x1440 ATDs, PCIe bandwidth over Thunderbolt is only reduced by about 16%, and just in the outbound direction and only for devices attached to that chain. You constantly exaggerate the impact of DP on Thunderbolt's PCIe performance, when the odds of it significantly affecting any real-world workflows are slim to none. How would a USB 3.0 controller fare trying to drive two 2560x1440 displays while writing more than 8.4 Gbps to an external RAID array? Oh right, it can't do either of those things anyway...
Again, use your imagination and look to the future.
Care to count how much one 4k 10-bit 3D display would take out of tb?
Usb3 wouldn't have to drive that, because there's a dp connection next to it, which is dedicated to handling displays.
But it is far more likely that an inexpensive Thunderbolt to USB 3.0 dongle based on the Port Ridge controller and leveraging Apple's drivers will see the light of day as soon as Mountain Lion and/or the 10.7.5 update become available to the general public.
Wishful thinking of the day!
Apple is trying to sell the current MBP models, and usb3 is the one thing the older models don't have. Why would they give a new model's feature to the old ones? They would lose some sales, so they would never do that.
Did they offer an update to my 2009 MBP when the next model could switch GPUs on the fly?
Did they offer new EFI to old MP's when new GPU's needed it?
Will they give my MP1,1's EFI an update when ML ships?
All I was saying is that mid-2011 is about as early as they could have added USB 3.0 to Macs, which was still over a year ago. Claiming that they could have done it years before that is ridiculous. The product would have sucked if they had done that.

How many PCs have you seen from 2010 where all of the USB ports are SuperSpeed? There really weren't any, because people need at least a couple USB ports that actually work. The early silicon and drivers were flakey and had compatibility issues.
Again, if Apple had had any interest in adopting usb3 sooner than they absolutely had to, without losing face, they could have done a lot to help usb3 mature faster. Instead they did a whole lot bigger, harder and more complex thing with tb, in secrecy with intel. And that secrecy of course postponed 3rd-party tb devices by a few years. Why didn't Apple want more 3rd parties from the start? Maybe even they realized that they took too expensive and too long a step from the beginning. There is a window in time, money and tech for doing things with the biggest impact, and I do think tb got it wrong on all 3: too expensive, too early for the chosen tech (+ usb3 too late), and a future upgrade roadmap that is blocked in some ways (retina over tb).

Saying that silicon fabs couldn't have produced 10 million additional usb3 chips for Apple in 2010 if Apple had ordered them is as ridiculous as saying that Apple couldn't have written non-flakey usb drivers over the last decade.

Do you have some intelligent explanation for why a $5k MP has a usb2 port with half the speed of the one on a crappy $500 windoze ultrabook?

 

Toke, just give up.

Repoman just won't acknowledge that T-Bolt is a technology doomed to near irrelevance by its high cost and marginal value for the 99% (compared to USB 3.0).

And I don't think that I've seen him agree that tying T-Bolt and DisplayPort together on the same connector/cable was a monumental folly.

Maybe T-Bolt v2.0 will drop DisplayPort, include real optical support, and survive. T-Bolt v1.0 is looking very much like a DOA technology.
 

20 pins seems like a good number. Adding more materials to the BOM rarely makes something cheaper, yet Moore's law shows us that the silicon used in last year's Thunderbolt cables will approximately halve in cost every couple of years. (Hey, just like the original article implies.) Dual-link DVI is one of the few video connections that isn't easily converted to other common formats without the use of a ~$100 adapter... because it used 6 lanes. And you'll note that nobody makes Type B HDMI gear. We'll see if anyone decides to use that connector before they bump the single-link speed again. I think you're missing the point that Intel had created an ASIC that was based on four 10 Gbps channels, and Apple had already developed the mini-DP connector. They are both good designs. Considering their capabilities, $30 to the OEM for the controller, and $49 to the consumer for the cable are absolutely reasonable for a first generation I/O technology like this. The thing about Thunderbolt is it has no baggage. It is at the same time the most advanced and the most future-proof I/O interface you can find on a PC these days.

USB 3.0 is not the biggest thing in interconnects at the moment. USB 2.0 is, and massively so. See how several hundred million is still an order of magnitude less than several billion? USB 3.0 is the 3rd generation of the most popular I/O interface in history. USB is cheap and common and nothing to get excited about really.

About 77m USB 3.0 controllers and 17.8m Macs shipped in 2011. USB 3.0 missed their projections for the year, while Apple exceeded theirs. If you're in IT, you should probably be paying attention to Apple, since pretty much everyone else is following their lead these days. USB is a standard designed by consensus. If you like watching C-Span, the USB-IF's keynotes might be right up your alley.

To be honest, I think Steve Jobs wanted to see Thunderbolt in the world before he departed it. He knew USB 3.0 would be in every PC once AMD and Intel integrated it into their chipsets. It didn't matter to him. Thunderbolt was something he really wanted to see become reality, though. It's the type of thing that's pretty much insanely ahead of its time, hence it's expensive and most people don't get it. That's the type of stuff Apple loves. Everyone else in the industry will reach for the lowest common denominator; USB 3.0 needed no support from Apple to become ubiquitous.

The ATD needs Thunderbolt for the GbE and FireWire ports. USB 3.0 to GbE still hasn't arrived yet, although I'm not sure exactly why, and USB to FireWire generally isn't possible due to the differences in architecture.

Thunderbolt may possibly be the most successful 1st generation PC I/O port in history, due solely to Apple's commitment to it. I'd be curious if anyone can point to statistics that show otherwise.

Other manufacturers were not able to ship PCs or motherboards with Thunderbolt until a little over a month ago due to Apple's exclusivity agreement. Sony did use what was essentially Light Peak for their Vaio Z, because USB 3.0 alone was not sufficient for their Power Media Dock.

I find it odd that you have so much faith in what is currently $10,000 display technology suddenly becoming affordable in the next two years, but you have issues with Apple taking what is currently $500-$1000 per port I/O technology and bringing it to the desktop for about $100 per port. As for driving a 4K display over Thunderbolt using both DP 1.1a streams... Generally one goes for color accuracy or 3D / high refresh rates, but at 4096x2160, 24 bpp, 48 Hz (that would be the 3D refresh rate at 4K) you'd still have more than twice the bandwidth of USB 3.0 left over for PCIe. At 4096x2160, 30 bpp, 60 Hz you'd still have almost 3 Gbps outbound and of course your full 10 Gbps inbound. Oh, and the 2012 Macs have a USB 3.0 port right next to the Thunderbolt port in case you need that too.
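
To put rough numbers on those two 4K cases (my own arithmetic: two 10 Gbps channels outbound per cable, active pixels plus a guessed ~10% blanking overhead, and USB 3.0 taken at its roughly 4 Gbps post-8b/10b ceiling):

```python
# Rough arithmetic for the 4K-over-Thunderbolt scenarios above.
# Assumptions (mine): 2 x 10 Gbps outbound per cable, active pixels plus an
# assumed 10% blanking overhead, USB 3.0 at ~4 Gbps after 8b/10b coding.

def video_gbps(w, h, hz, bpp, blanking=0.10):
    return w * h * hz * bpp * (1 + blanking) / 1e9

outbound = 2 * 10.0
usb3_eff = 5.0 * 8 / 10

cases = {
    "4096x2160, 24 bpp, 48 Hz (3D at cinema rates)": video_gbps(4096, 2160, 48, 24),
    "4096x2160, 30 bpp, 60 Hz":                      video_gbps(4096, 2160, 60, 30),
}

for name, v in cases.items():
    left = outbound - v
    print(f"{name}: video ~{v:.1f} Gbps, ~{left:.1f} Gbps left for PCIe "
          f"({left / usb3_eff:.1f}x a USB 3.0 port)")
```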

Apple most likely won't make a Thunderbolt to USB 3.0 dongle, but that will not stop someone else from doing it in a heartbeat once all the heavy lifting has already been done.

I was almost going to suggest earlier that the real reason Apple hasn't given us Macs with USB 3.0 before now is because they refused to release their driver until it could at least match the performance of the Windows ones.

Toke, just give up.

Repoman just won't acknowledge that T-Bolt is a technology doomed to near irrelevance by its high cost and marginal value for the 99% (compared to USB 3.0).

And I don't think that I've seen him agree that tying T-Bolt and DisplayPort together on the same connector/cable was a monumental folly.

Maybe T-Bolt v2.0 will drop DisplayPort, include real optical support, and survive. T-Bolt v1.0 is looking very much like a DOA technology.

Maybe the 1% look at things differently. I'll take better over cheaper any time I can.

Not even 18 months out and over 20 million ports shipped, 50 devices on the market, currently more than 2 new products coming to market every week...

And by the way, the only thing the 99% generally use a more than 6 Gbps external I/O connection for is to drive digital displays. HDMI and DP are the only common interfaces with that kind of bandwidth. That's why it makes sense to combine them. Thunderbolt would actually be useless to most people if it wasn't also their video out port.

Since you've never actually used Thunderbolt, how can you be so sure it isn't a good technology?
 
I just saw my first Thunderbolt hard drive product at a Best Buy (BTW, 3TB drives have gone UP since I bought my last two; I paid probably $115 on average for the last two and this one was $150 for the exact same drive/brand WD USB3). Anyway, the Thunderbolt drive was huge (basically 2x the size of the 3TB WD drive which was already 2 drives thick since they're basically two 1.5TB drives raided together) and it was 4TB in size (I'm guessing four 1TB drives raided for speed) and cost $576 if I recall correctly. That's pretty pricey for 4TB, but if it really is raided for speed, it should be pretty quick. But even so, they're not SSD drives and if they're just RAID 0, they're not protected against errors. I didn't really look that closely since I don't have Thunderbolt and wow that's pricey. I've got 9TB for $380 now. I'm using them for backup and off-site backup respectively, so I guess I'd have to compare 3TB for $380, but even if that array is a higher RAID setup (doubt it since it'd be 4.5TB if they were 4 1.5TB drives RAIDED), it still doesn't solve an off-site backup solution (they have "LIVE" drives for a "CLOUD" off-site option these days on top of the local storage, but that would take awhile to transfer that kind of data).
 
I just saw my first Thunderbolt hard drive product at a Best Buy (BTW, 3TB drives have gone UP since I bought my last two; I paid probably $115 on average for the last two and this one was $150 for the exact same drive/brand WD USB3). Anyway, the Thunderbolt drive was huge (basically 2x the size of the 3TB WD drive which was already 2 drives thick since they're basically two 1.5TB drives raided together) and it was 4TB in size (I'm guessing four 1TB drives raided for speed) and cost $576 if I recall correctly. That's pretty pricey for 4TB, but if it really is raided for speed, it should be pretty quick. But even so, they're not SSD drives and if they're just RAID 0, they're not protected against errors. I didn't really look that closely since I don't have Thunderbolt and wow that's pricey. I've got 9TB for $380 now. I'm using them for backup and off-site backup respectively, so I guess I'd have to compare 3TB for $380, but even if that array is a higher RAID setup (doubt it since it'd be 4.5TB if they were 4 1.5TB drives RAIDED), it still doesn't solve an off-site backup solution (they have "LIVE" drives for a "CLOUD" off-site option these days on top of the local storage, but that would take awhile to transfer that kind of data).

Was it this one? (it's 2 2TB drives or 2 3TB drives, with RAID-0 or RAID-1)

http://www.wdc.com/en/products/products.aspx?id=630
http://www.newegg.com/Product/Product.aspx?Item=N82E16822236217

If so, the USB/1394/eSATA version is $200 cheaper than T-Bolt.
 
Was it this one? (it's 2 2TB drives or 2 3TB drives, with RAID-0 or RAID-1)

http://www.wdc.com/en/products/products.aspx?id=630
http://www.newegg.com/Product/Product.aspx?Item=N82E16822236217

If so, the USB/1394/eSATA version is $200 cheaper than T-Bolt.

Yeah, I think that's it. I guess that means my 3TB models are just one drive since that one is definitely twice as thick. As for the price, I always figured TB would cost $150-200 more since that's what Firewire drives tended to cost above just USB 2.x or even eSata + USB 2.0. I paid quite a bit for a 500GB FW800 drive a few years ago for my Macbook Pro. I think the internal 7200 RPM 500GB drive I put in cost me like $80 compared to like $300 for the external micro FW800 drive, but I did buy the former over a year after the external since that is when I was setting the MBP up for Logic Pro (also upgraded memory to 4GB).

Now lucky me I've got a noisy fan on the left side. It started a few months after the upgrade. I thought maybe I screwed something up when I put it back together, but I've read it's extremely common on these things. Mine is apparently far less noisy than most. I plan to take it apart and try to lube it. I didn't want to attempt it until my music project was done (took 2.5 years) just in case I screwed something up somehow. It hasn't really gotten any worse since then, though. Kind of a "wub wub" sound that's just loud enough to be annoying and not much else but definitely only on the left side (apparently the most common to go bad since most of the heat is over there; maybe it dries up the grease or something). The only good thing is my 8600M GT GPU has never gone bad, but then I keep the fan higher than normal and rarely play games on it. Certainly, I was very happy with Logic on it. Now my FW stuff will be a PITA on newer models that dump all that stuff in favor of dongles.

I've been debating whether to grab a 17" MBP while I can still get one. It's only missing USB 3.0, and I don't necessarily need that on a notebook since it has FW800 on it and you could technically add USB 3.0 with a TB adapter at some point, and the 17" size would be nice for portable music project work. But then, if this album I'm about to release doesn't sell, I probably won't bother with a second project.

Logic, though, is wonderful. I couldn't have asked for better results for sound quality, and the effects were good enough that I didn't need external guitar boxes, etc. (they could be improved upon, though; a lot of sounds need tweaking). It'd be a darn shame if Apple drops the ball on it. It's already been a good long time since the last update. Why bother with professional interfaces like TB if you don't properly maintain your Pro software lines? They cheesed a lot of people off with Final Cut X. I hope any Logic Pro X doesn't screw things up. It's already fantastic, but I'm sure many Pros still have features they'd like to see. I'd rather see improved default sound banks for the soft-synths and whatnot. I was pretty happy with editing for the most part, although cutting/pasting bits for smooth extensions or loops could have been a bit easier, but then I don't know how to use every single feature, it's so extensive. I managed to do everything I set out to do, though, so it wasn't too bad even so.
 
Dual-link DVI is one of the few video connections that isn't easily converted to other common formats without the use of a ~$100 adapter... because it used 6 lanes. And you'll note that nobody makes Type B HDMI gear. We'll see if anyone decides to use that connector before they bump the single-link speed again. I think you're missing the point that Intel had created an ASIC that was based on four 10 Gbps channels, and Apple had already developed the mini-DP connector. They are both good designs.
I think you are missing the point: how well can you convert tb to other formats? I've used dl-dvi for years without any need to convert it to anything.
Maybe you can tell us why dp supports only sl-dvi without any conversion?
USB 3.0 is not the biggest thing in interconnects at the moment. USB 2.0 is, and massively so. See how several hundred million is still an order of magnitude less than several billion? USB 3.0 is the 3rd generation of the most popular I/O interface in history. USB is cheap and common and nothing to get excited about really.
You could also say that Windows XP is the biggest thing in OSes at the moment, but I think you know what I meant. Usb3 is the biggest new, fast interconnect. And maybe you are not excited about the best speed-per-buck ratio, but many of us are.
The ATD needs Thunderbolt for the GbE and FireWire ports. USB 3.0 to GbE still hasn't arrived yet, although I'm not sure exactly why, and USB to FireWire generally isn't possible due to the differences in architecture.
Usb3 could easily handle GbE and fw.
Also, if you checked the link I gave, usb3-to-GbE is already in silicon, so I bet it will be on the shelf in a few months. Usb3 has everything necessary for usb3-to-fw. Check out the specs. I guess the reason they still don't exist is low demand for them.
I find it odd that you have so much faith in what is currently $10,000 display technology suddenly becoming affordable in the next two years, but you have issues with Apple taking what is currently $500-$1000 per port I/O technology and bringing it to the desktop for about $100 per port. As for driving a 4K display over Thunderbolt using both DP 1.1a streams... Generally one goes for color accuracy or 3D / high refresh rates, but at 4096x2160, 24 bpp, 48 Hz (that would be the 3D refresh rate at 4K) you'd still have more than twice the bandwidth of USB 3.0 left over for PCIe. At 4096x2160, 30 bpp, 60 Hz you'd still have almost 3 Gbps outbound and of course your full 10 Gbps inbound. Oh, and the 2012 Macs have a USB 3.0 port right next to the Thunderbolt port in case you need that too.
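If anyone wants to check that arithmetic, here's a back-of-the-envelope sketch in Python. It only counts raw pixel payload and ignores DisplayPort blanking and protocol overhead, so it comes out a little more generous than the figures above, and the 4 Gbps USB 3.0 payload rate is just my assumption for effective throughput after 8b/10b encoding.
Code:
# Back-of-the-envelope Thunderbolt bandwidth budget. Raw pixel payload only;
# DisplayPort blanking and protocol overhead are ignored, so the real figures
# are a bit tighter than what this prints.

TB_TOTAL_GBPS = 20.0   # Thunderbolt v1: 2 channels x 10 Gbps in each direction
USB3_GBPS = 4.0        # assumed USB 3.0 payload rate after 8b/10b encoding

def pixel_rate_gbps(width, height, bpp, hz):
    """Raw pixel data rate in Gbps for an uncompressed video stream."""
    return width * height * bpp * hz / 1e9

for bpp, hz, label in [(24, 48, "4K 3D, 24 bpp, 48 Hz"),
                       (30, 60, "4K, 30 bpp, 60 Hz")]:
    video = pixel_rate_gbps(4096, 2160, bpp, hz)
    leftover = TB_TOTAL_GBPS - video
    print(f"{label}: video {video:.1f} Gbps, ~{leftover:.1f} Gbps left outbound "
          f"({leftover / USB3_GBPS:.1f}x USB 3.0 payload), 10 Gbps inbound")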
Apple has introduced high pixel density screens now in phones, tablets and laptops. How logical would it be to stop here?

4k is the standard in DCI and there are already even consumer 4k video cameras. Still photography also benefits from HiDPI. 4k televisions are coming. How much has HiDPI increased the prices of Apple's devices so far? Economies of scale just work here. It's nothing new that when mass production adopts new tech, one or two zeros drop from the price.
When 4k computer screens go mainstream, nobody will accept 24 Hz for 3D. The same display has to be good for sports TV, video games and movies. So it will have to be at least 4096x2160, 30 bpp, 60 Hz, and you can double that for 3D. All of a sudden a usb3 port is faster for data when one of these 4k 3D displays is connected over tb.
So tb has to have a new revision very soon now. This will be very problematic for Apple, since they are obsessed with over-simplifying, and two versions of tb this close together in time, before the first version even gets widely adopted, will be very bad. Mac customers will be pissed off when they notice that their 1/2/3 year old flagship is all of a sudden obsolete because of its "most advanced and most future-proof I/O". There's your baggage.
Apple most likely won't make a Thunderbolt to USB 3.0 dongle, but that will not stop someone else from doing it in a heartbeat once all the heavy lifting has already been done.
And that dongle will cost the customer 10 times more than including a usb3 port in a mac would have cost Apple.
I was almost going to suggest earlier that the real reason Apple hasn't given us Macs with USB 3.0 before now is because they refused to release their driver until it could at least match the performance of the Windows ones.
Why do Windows machines perform so well?
Can't Apple write drivers for their own OS?
Or did they just start writing them that much later?
Maybe the 1% look at things differently. I'll take better over cheaper any time I can.
That 1% wasn't enough for the 17" MBP. Apple will ditch tb as soon as it has a more profitable way to do the same things.
Not even 18 months out and over 20 million ports shipped, 50 devices on the market, currently more than 2 new products coming to market every week...
Still, usb3 is doing 5x/30x better, like you calculated...
And by the way, the only thing the 99% generally use a more than 6 Gbps external I/O connection for is to drive digital displays. HDMI and DP are the only common interfaces with that kind of bandwidth. That's why it makes sense to combine them. Thunderbolt would actually be useless to most people if it wasn't also their video out port.
Combining two high-speed ports is slower than providing two separate ports. But you are right; it seems that Apple's biggest fear is an expensive port that has no use. Now that macs are more widely getting hdmi ports, tb might just end up like that...
Since you've never actually used Thunderbolt, how can you be so sure it isn't a good technology?
Just lookin' at price tag is enough!
 
Originally Posted by repoman27
Since you've never actually used Thunderbolt, how can you be so sure it isn't a good technology?

Just lookin' at price tag is enough!

Didn't the late turtlenecked overlord teach you that it was good to pay $500 for a $300 disk drive because it used Apple-only technology - even though the real world performance of the $300 drive was the same?

:rolleyes:
 
I think you are missing the point: how well can you convert tb to other formats? I've used dl-dvi for years without any need to convert it to anything.
Maybe you can tell us why dp supports only sl-dvi without any conversion?

If you have a display that requires DL-DVI and lacks DP or HDMI, you need an expensive adapter. This is true for both DP and HDMI outputs because they reduced the pin count to 20 and 19 pins respectively from DVI's 25 in order to make more compact and mobile device friendly connectors. DL-DVI uses 6 signaling pairs, whereas DP only has 4 and HDMI only has 3. Then again, the only displays that present this issue cost over a grand at the time they were originally sold, so paying 10% of that for an adapter can be rationalized I guess.

You could also say that Windows XP is the biggest thing in OSes at the moment, but I think you know what I meant. Usb3 is the biggest new, fast interconnect. And maybe you are not excited about the best speed-per-buck ratio, but many of us are.

I had to restrain myself yesterday from impulse buying 2 new USB 3.0 drive enclosures that I came across because they were less than $100 combined. I tend to have to deal with a lot of legacy equipment in the field, so I'm not as excited now as I will be when the majority of that gear finally has a USB 3.0 port on it. (I also still have to deal with a fair number of PC's running XP.) I recommended a USB 3.0 backup device that came bundled with a free USB 3.0 PCIe adapter to a client a few months ago, and they decided that they could save about $50 by just going with a USB 2.0 only version because they didn't need it to be fast. Total face-palm.

Also, if you checked the link I gave, usb3-to-GbE is already in silicon, so I bet it will be on the shelf in a few months. Usb3 has everything necessary for usb3-to-fw. Check out the specs. I guess the reason they still don't exist is low demand for them.

Actually, I'm not sure why the USB 3.0 GbE adapters aren't shipping already. I wonder if they're having difficulty getting them to fit in the power envelope of a standard USB device.

There is no device silicon yet to bridge FireWire to USB 3.0, and I wonder if there would ever be enough demand to make a dongle that could actually do so. USB not allowing DMA makes FireWire to USB a bit tricky though. I'm not sure it would ever be able to offer all of the same functionality.

Apple has introduced high pixel density screens now in phones, tablets and laptops. How logical would it be to stop here?

4k is the standard in DCI and there are already even consumer 4k video cameras. Still photography also benefits from HiDPI. 4k televisions are coming. How much has HiDPI increased the prices of Apple's devices so far? Economies of scale just work here. It's nothing new that when mass production adopts new tech, one or two zeros drop from the price.
When 4k computer screens go mainstream, nobody will accept 24 Hz for 3D. The same display has to be good for sports TV, video games and movies. So it will have to be at least 4096x2160, 30 bpp, 60 Hz, and you can double that for 3D. All of a sudden a usb3 port is faster for data when one of these 4k 3D displays is connected over tb.
So tb has to have a new revision very soon now. This will be very problematic for Apple, since they are obsessed with over-simplifying, and two versions of tb this close together in time, before the first version even gets widely adopted, will be very bad. Mac customers will be pissed off when they notice that their 1/2/3 year old flagship is all of a sudden obsolete because of its "most advanced and most future-proof I/O". There's your baggage.

I realize 4K displays are coming, and I'm all for them. I am really excited about the "retina" trend. I'll be very happy to see the day when it is common for displays less than 30" to be 300-600 ppi. The problem is that there are significant barriers to pushing this type of technology into the mainstream right now. For instance, trying to broadcast sports at 4096x2160, 30 bpp, 60 Hz presents significant problems for the content providers. If there is no adequate means of content delivery, there will be no content for the end user, and thus no demand for the higher resolution displays. That's why digital cinema projectors are hitting those resolutions, but it hasn't really made it to the home yet. We can't all have the studios mail us a hard drive with a movie on it, and the intertubes aren't ready for Netflix to start offering 4K streaming to the masses.

In the PC/tablet space, screen resolution is currently bound by the graphics capabilities of the device. The MBPR has a half decent GPU, and it has just enough horsepower to maintain fluidity driving a single panel at 2880x1800, 24 bpp, 60 Hz. 4096x2160, 30 bpp, 120 Hz would require 35 Gbps of pixel data! That's more than two DisplayPort 1.2 or four HDMI 1.4a connectors could drive. What GPU could keep up with that? Even the AMD Radeon 7970 and NVIDIA GeForce GTX 690 cannot push that many pixels.
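Here's the same kind of rough sketch for that claim. It counts raw pixel payload only, so it lands a few Gbps under the ~35 Gbps on-the-wire figure once blanking is added, and the per-link rates for DP 1.2 and HDMI 1.4a are approximate effective figures after 8b/10b encoding.
Code:
# Rough check of the 4096x2160, 30 bpp, 120 Hz figure. Raw pixel payload only;
# blanking intervals would add a few more Gbps on the wire.

raw_gbps = 4096 * 2160 * 30 * 120 / 1e9
print(f"Raw pixel data: {raw_gbps:.1f} Gbps")              # ~31.9 Gbps

# Approximate effective video bandwidth per link (after 8b/10b encoding):
DP12_GBPS = 17.28    # DisplayPort 1.2: 4 lanes x 5.4 Gbps x 0.8
HDMI14_GBPS = 8.16   # HDMI 1.4a: 3 channels x 3.4 Gbps x 0.8

print(f"Two DP 1.2 links:     {2 * DP12_GBPS:.2f} Gbps")   # 34.56 Gbps
print(f"Four HDMI 1.4a links: {4 * HDMI14_GBPS:.2f} Gbps") # 32.64 Gbps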

So for the foreseeable future, Thunderbolt is looking pretty good. Intel has said that the first speed increase will likely occur in 2014, and it would stand to reason that it will jump to 25 Gbps per channel at that time. Plenty to deal with the increased burden of large, hi-dpi digital displays when they do arrive.

And that dongle will cost the customer 10 times more than including a usb3 port in a mac would have cost Apple.

Yep.

Didn't the late turtlenecked overlord teach you that it was good to pay $500 for a $300 disk drive because it used Apple-only technology - even though the real world performance of the $300 drive was the same?

:rolleyes:

Well, it's more like you're paying $389 (plus $49 for a cable) for a $169 disk drive (which is even more of a delta) because it uses Intel-only technology to deliver about the same performance.

However, if you're looking at multiple disks or SSD's, or non-storage based applications, Thunderbolt offers far better real-world performance than USB 3.0 for your extra $250. Pretty much anything that falls under the 275 MB/s threshold would be more economically achieved via USB 3.0 though.
 
If you have a display that requires DL-DVI and lacks DP or HDMI, you need an expensive adapter. This is true for both DP and HDMI outputs because they reduced the pin count to 20 and 19 pins respectively from DVI's 25 in order to make more compact and mobile device friendly connectors. DL-DVI uses 6 signaling pairs, whereas DP only has 4 and HDMI only has 3. Then again, the only displays that present this issue cost over a grand at the time they were originally sold, so paying 10% of that for an adapter can be rationalized I guess.
You are missing the point. Have there ever been any problems for dl-dvi users from dl-dvi using 6 signal pairs? Cables too thick, too short, too expensive, too heavy?
I never had any problems with the size of the full dvi connector on powerbooks, and there's also mini-dvi available. If the only problem is how to convert dl-dvi to other (newer) formats, then aren't those other (newer) formats the problem?

So once again, if the dp & tb designers had left room for a few additional pins, all the cables and adapters would be affordable. I don't believe the BOM cost of a few pins plays any significant role here. Designers just design these things for present-day needs, without looking to the future.
Actually, I'm not sure why the USB 3.0 GbE adapters aren't shipping already. I wonder if they're having difficulty getting them to fit in the power envelope of a standard USB device.
I guess there are so few computers with usb3 but no rj45 that there just hasn't been much demand for these in the past.

Usb3 increased the power budget from 500 mA to 900 mA. It would be pretty surprising if GbE used more than double what 100MbE uses.
There is no device silicon yet to bridge FireWire to USB 3.0, and I wonder if there would ever be enough demand to make a dongle that could actually do so. USB not allowing DMA makes FireWire to USB a bit tricky though. I'm not sure it would ever be able to offer all of the same functionality.
There are so many advanced techniques used in usb3 compared to usb2 that I don't think DMA access is so important any more. Computers and devices also have a lot more processing power across multiple cores to spend. And DMA has security problems.
I realize 4K displays are coming, and I'm all for them. I am really excited about the "retina" trend. I'll be very happy to see the day when it is common for displays less than 30" to be 300-600 ppi. The problem is that there are significant barriers to pushing this type of technology into the mainstream right now. For instance, trying to broadcast sports at 4096x2160, 30 bpp, 60 Hz presents significant problems for the content providers. If there is no adequate means of content delivery, there will be no content for the end user, and thus no demand for the higher resolution displays. That's why digital cinema projectors are hitting those resolutions, but it hasn't really made it to the home yet. We can't all have the studios mail us a hard drive with a movie on it, and the intertubes aren't ready for Netflix to start offering 4K streaming to the masses.
But we can have play.com or amazon mail us a blu-ray. I guess that within 2 years we will have a new bd version that is 4k.
4k will essentially be for motion pictures for the first decade, but other content will follow.
3D broadcasts will be mainstream in a few years and 4k is next.
Anyway, broadcast standards didn't stop Apple from selling the rMBP. The display industry is moving to higher pixel densities and there's nothing to stop it. Television and computing are mixing more and more all the time, so pretty soon broadcast standards won't matter much when buying displays.
In the PC/tablet space, screen resolution is currently bound by the graphics capabilities of the device. The MBPR has a half decent GPU, and it has just enough horsepower to maintain fluidity driving a single panel at 2880x1800, 24 bpp, 60 Hz. 4096x2160, 30 bpp, 120 Hz would require 35 Gbps of pixel data! That's more than two DisplayPort 1.2 or four HDMI 1.4a connectors could drive. What GPU could keep up with that? Even the AMD Radeon 7970 and NVIDIA GeForce GTX 690 cannot push that many pixels.
Half decent isn't enough then, so this may be a big problem for Apple. They'd have to upgrade their GPU offerings and drivers to fully decent!

There's already dual-dp in use. The next upgrades to the dp & hdmi specs will come soon; doubling the speed with every revision is business as usual. GPU power will keep increasing like it has for decades. You sound just like the people who said you simply can't put high pixel densities in phones, tablets or laptops. A big screen on a desktop computer is the easiest option here, but once again, it seems to be a bit outside Apple's focus.

So for the foreseeable future, Thunderbolt is looking pretty good. Intel has said that the first speed increase will likely occur in 2014, and it would stand to reason that it will jump to 25 Gbps per channel at that time. Plenty to deal with the increased burden of large, hi-dpi digital displays when they do arrive.
You keep repeating that tb is looking good. Is it really?

So far it seems that the first year when wide usage is even possible is 2013. And for big retina displays, they need a new version by 2014 at the latest, maybe even next year. Meaning that tb v1 could be widely used for under 2 years, maybe even less than a year. That kind of upgrade path will not lead to a happy ecosystem.

How much do you think those 25 Gbps tb devices and cables will cost in 2014? Twice today's price? Quadruple? Add a zero to the price? None of these will work. Tb's main objective should be getting the price down ASAP, and increasing the channel speed will do just the opposite.
Well, it's more like you're paying $389 (plus $49 for a cable) for a $169 disk drive (which is even more of a delta) because it uses Intel-only technology to deliver about the same performance.

However, if you're looking at multiple disks or SSD's, or non-storage based applications, Thunderbolt offers far better real-world performance than USB 3.0 for your extra $250. Pretty much anything that falls under the 275 MB/s threshold would be more economically achieved via USB 3.0 though.
And at the same time we have to keep ignoring the most reliable (S.M.A.R.T.-capable) $1 solution for connecting storage: the native connection, eSATA(p).

When 99% of high-bandwidth needs are either display or storage, I just can't stop thinking how much easier and cheaper it would have been if Apple had just used dp & esata.
 
When 99% of high-bandwidth needs are either display or storage, I just can't stop thinking how much easier and cheaper it would have been if Apple had just used dp & esata.

I can give you an example.

I recently added 12 TB as a 9 TB RAID-5 array to my home PC for backup storage.

Code:
$100  Sans Digital TR4M 4-drive hot-swap eSATA external cabinet, includes free cable (newegg: http://www.newegg.com/Product/Product.aspx?Item=N82E16816111177)
$600  WD WD30EZRX 3 TB IntelliPower 64 MiB cache, x4 (newegg: http://www.newegg.com/Product/Product.aspx?Item=N82E16822136874)
--------
$700  Total (before sales tax)

Apple sells the 12 TB Pegasus for $2499 (without cable) - and while it's more performant than my setup on paper, for backups it would be more than 3 times the cost for no added value.
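To put rough numbers on that, using only the prices quoted above (raw capacity, before sales tax, and leaving out the $49 Thunderbolt cable):
Code:
# Cost per raw TB for the two setups quoted above.
esata_cost, esata_tb = 700, 12        # TR4M cabinet + 4 x 3 TB WD drives
pegasus_cost, pegasus_tb = 2499, 12   # Pegasus 12 TB as sold by Apple

print(f"eSATA setup: ${esata_cost / esata_tb:.0f} per raw TB")      # ~$58/TB
print(f"Pegasus:     ${pegasus_cost / pegasus_tb:.0f} per raw TB")  # ~$208/TB
print(f"Price ratio: {pegasus_cost / esata_cost:.1f}x")             # ~3.6x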
 
The Pegasus is also upgradable to 24 TB.

It's human nature to defend our choices. I chose TB for my DAS. I chose USB3 for my portable drive. In my mind, TB to USB isn't a straightforward comparison.
 
The Pegasus is also upgradable to 24 TB.

It's human nature to defend our choices. I chose TB for my DAS. I chose USB3 for my portable drive. In my mind, TB to USB isn't a straightforward comparison.

From what I read, they aren’t meant to. The only thing that they have in common is that they have an interface used to transmit data. They are very different implementations and different applications. USB3 is meant to supplement/replace USB2. TB isn’t meant to do that. TB isn’t meant to replace or “Kill” USB at all. If anything, it’s meant to kill and expand on Firewire.
 
The Pegasus is also upgradable to 24 TB.

It's human nature to defend our choices. I chose TB for my DAS. I chose USB3 for my portable drive. In my mind, TB to USB isn't a straightforward comparison.

From what I read, they aren’t meant to. The only thing that they have in common is that they have an interface used to transmit data. They are very different implementations and different applications. USB3 is meant to supplement/replace USB2. TB isn’t meant to do that. TB isn’t meant to replace or “Kill” USB at all. If anything, it’s meant to kill and expand on Firewire.

The topic I was addressing was eSATA vs T-Bolt, not USB.

...and I can add a second $100 cabinet to the spare eSATA port, upgrade to eight 4 TB drives, and have 32 TB for still a lot less than the 12 TB Pegasus. (Although I wouldn't do that yet; the 4 TB drives are still having teething pains - lots of DOA units and a lot of infant mortality.)

...and, like I said, the Pegasus is more performant - but it's wasteful overkill for my purpose. If I needed high performance DAS and cost weren't important, T-Bolt is a good solution.

Since the bottleneck for my purpose is Cat6 GbE Ethernet, T-Bolt is a waste of money.
 