Of course, I meant that more lanes could be used in one cable, not multiple sockets for one data path.
More lanes = more contacts and larger connectors, more conductors and larger, less-flexible cables, more problems with inter-lane skew, and more expense in the long run. The industry as a whole has shifted to high-speed serial interfaces rather than parallel busses for good reason. Nobody wants to see a return of the old parallel SCSI cables with 50-pin Centronics connectors.
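To put a rough number on the skew issue, here's a back-of-the-envelope sketch assuming a Thunderbolt-class lane rate; the individual skew contributions are purely illustrative, not from any spec:

```python
# Rough sketch: why inter-lane skew gets painful as lane counts and rates grow.
# Assumes a 10.3125 Gbaud lane rate; the skew figures below are illustrative only.

LANE_RATE_GBAUD = 10.3125                 # Thunderbolt-class lane rate
UNIT_INTERVAL_PS = 1e3 / LANE_RATE_GBAUD  # one bit period in picoseconds (~97 ps)

# Hypothetical per-lane skew contributions in a wide parallel cable (ps)
skew_sources_ps = {
    "connector pin length mismatch": 10,
    "conductor length mismatch":     40,
    "dielectric variation":          25,
}

total_skew_ps = sum(skew_sources_ps.values())
print(f"Unit interval: {UNIT_INTERVAL_PS:.1f} ps")
print(f"Example accumulated skew: {total_skew_ps} ps "
      f"({total_skew_ps / UNIT_INTERVAL_PS:.1f} UI)")
# With more lanes there are more independent skew contributions to bound and
# deskew, which is a big part of why the industry moved to a few fast serial
# lanes with per-lane clock recovery instead of wide parallel busses.
```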
Nice selection of numbers you have there.
I think the tb designers were pushing the envelope a bit too much when they made the hard decisions somewhere around 2010. Maybe they didn't realize that for a new standard to be useful, it has to be mass-adopted by consumers to become price efficient.
We are very close to the time when "pro" tech will not be more advanced than consumer tech. Pushing the envelope is getting so expensive that you simply can't do it with any market smaller than the biggest.
Apple has significant enough market share at the moment to create a robust ecosystem for both software developers and accessory manufacturers all by themselves. A standard does not need to be adopted by billions of consumers or even the majority in order to become useful or a good value proposition.
There will always be a minority segment of the market that demands more performance than is available from run-of-the-mill consumer electronics devices and are willing to pay a premium for it: the pros and enthusiasts. There is also a key group of purchasers to whom price is virtually no object for products that meet their specific requirements: enterprise and government. While the market may be far larger for 1080p HDTVs than 4K medical imaging monitors, the profits are way better for the latter. Trickle down will continue to happen as it always has. 10.3125 Gbps per channel is old news to the telecom industry, who will be deploying 25-28 Gbps this year and paying thousands of dollars per port to do so. In 2014 Thunderbolt will get a speed bump and consumers will have access to those same speeds for a couple hundred dollars per port.
Apple only has a little more than 5% of the global market and around 12% of the US market for PCs based on unit shipments, but they take in over 35% of the operating profit for the entire industry. Apple needs to differentiate itself from the competition in order to justify higher unit prices and maintain those higher profit margins. Thunderbolt is just one more way to reinforce the perception that their devices are more elite, capable and valuable.
Usb3 was released in 2008, 8 years after usb2.
If usb4 comes in 2016 with greater speed than tb has now, at a fraction of the price of tb, and there's no tb2 by then, tb will slowly die away.
Then we can keep speculating that if tb had had 8 pairs of wires in the cable and therefore used cheap passive cables, it would have become as popular as usb...
But USB 3.0 devices were not available until 2010, two years later. Do you think USB would be as popular today if it had been UPB (Universal Parallel Bus) instead? Also note that USB is massively popular, whereas SuperSpeed USB is just beginning to gain momentum. There are 6 billion USB devices in use, but even by the end of 2012, only 7% will be of the USB 3.0 variety.
I disagree fully.
Apple could have included usb3 in macs in 2010.
And more usb3 controllers were sold in 2010 than macs.
Now, which one of these minorities is significant again, and why?
There's no technical or economic reason not to do this, if they wanted to maintain a state-of-the-art image. Saving $5 in chip cost just isn't that. Maybe this is just one piece of legacy baggage that Apple is carrying. After fw lost to usb, they couldn't face their defeat and had to come up with something sexier than usb, and that became tb. Also, tb might be a whole lot less sexy if macs had had usb3 before tb.
According to In-Stat, only 14 million USB 3.0 controllers shipped in 2010. Apple reported sales of 14.43m Macs in 2010. Also, Apple sells primarily notebooks, and the USB-IF's numbers show only 2 million of those controllers ending up in notebook PCs, largely because the early silicon wasn't that great, especially regarding power consumption.
Why would Apple spend time developing and testing a driver for silicon that didn't meet their design criteria and was not available in sufficient quantity? Seriously, not even joking, the earliest Apple could have adopted USB 3.0 was mid 2011, and they would have had to lean hard on their suppliers to do so. Whether they should have done so or not at that point is entirely debatable, but saying that they could have added USB 3.0 in 2010 or even early 2011 is just not in line with reality.
The problem with standards is, well, you need to wait until things are standardized. Despite USB 3.0 being ratified in December of 2008, the first 4-port controller wasn't even certified until April of 2011.
Although you can't tell from controller sales numbers how much a certain port is actually used, or more importantly how much benefit there was in using that port compared to the alternatives, I just don't get when something counts as widely enough accepted. Do you have to sell 40 million controllers a year before you can say that a standard has been adopted? It would be very interesting to know how many tb devices have been sold that really benefit from tb. 4 digits, or even 5? Tb displays and fw & ethernet dongles should be excluded from the list, since all of these could be handled more price-efficiently with usb3.
The industry generally relies on a metric known as "attach rate" (a term which I now realize I have been misusing for some time). It's the number of complementary devices sold for each primary product sold. As far as I'm aware, these figures just haven't been made public for Thunderbolt, but without a doubt they are lower than for USB 3.0 at this point.
My guess is that you'd be solidly into 6-digit territory for the number of Thunderbolt devices sold. And discounting the Apple Thunderbolt Display would be asinine. As would not including Ethernet dongles, unless you can point to a currently available USB 3.0 GbE adapter. And replacing the functionality of FireWire gear you already own with USB 3.0 stuff is not only potentially impossible at this point, but also probably more expensive than buying a $29 adapter. However, the adapters aren't even available yet, so it makes no difference whether you include them or not.
I just don't get this excuse that Apple shouldn't have included usb3 before it had a certain percentage of market adoption. How about using the same rule for tb? Is there some reason why macs should be some kind of "average PC" and therefore have only features that most computers have? Does retina need to be ubiquitous to be justified? Comparing the pc market to macs is just a plain stupid apples-to-oranges comparison. When most pc crap is way cheaper than macs, you just can't expect it to have any advanced tech. On the other hand, almost all pc hardware in the macs' price group has had usb3 (and bd) for the second or third year now.
I never argued that Apple needed to wait for a certain level of adoption before adding USB 3.0, or should only include mainstream technologies in their products. Just like everyone else, though, they can only purchase what their suppliers can produce. When Apple moves first, they can be aggressive and buy up all of a certain item. When it comes to things like USB 3.0 host controllers, or 4G LTE modem and baseband chips, there are a lot of other prospective customers, and any drawbacks due to early implementations can make waiting it out for a round or two seem like a better idea.
Just compare usb3 and tb 18 months after the first products were on the shelves. Usb3 had 100x more different products and 100x the adoption rate of tb. Maybe 1% of mac users will ever use tb, while 99% of users would benefit from usb3. Apple's customers who bought a mac in 2010-2012 would also have benefited from usb3 for many years to come. Apple chose not to offer it, since so few knew to even ask for it, and now they get to buy new macbooks sooner than ever before.
But to be more realistic, the attach rate numbers are probably at least 2% for Thunderbolt, and according to the USB-IF were only around 60% for USB 3.0 after the first 18 months. The number of certified devices was only 250 for USB 3.0 vs. 50 for Thunderbolt, so that would be 5x more different products and 30x the attach rate.
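To spell out that arithmetic (a quick sketch; the 2% and 60% attach rates are the estimates above, not published figures):

```python
# Quick sanity check on the ratios quoted above. The 2% and 60% attach rates
# are estimates from this thread, not official published figures.

usb3_attach_rate = 0.60   # estimated USB 3.0 attach rate after first 18 months
tb_attach_rate   = 0.02   # estimated Thunderbolt attach rate after first 18 months

usb3_certified_devices = 250
tb_certified_devices   = 50

print(f"Product variety ratio: {usb3_certified_devices / tb_certified_devices:.0f}x")  # 5x
print(f"Attach rate ratio:     {usb3_attach_rate / tb_attach_rate:.0f}x")              # 30x
```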
Also, there is still only a very limited range of device silicon available for USB 3.0. You have thumb drives, card readers, SATA bridges, 4-port hubs, cameras and a media player. In the first 18 months, how many USB 3.0 to 6Gb/s SAS/SATA bridges, GbE or 10GbE adapters, HD video interfaces, Fibre Channel adapters, PCIe or ExpressCard expansion chassis, or displays anything remotely like the ATD were available? Despite some initial overlap, USB 3.0 and Thunderbolt will end up being used for very different purposes. Basically if a job can be done just as well with USB 3.0, why would you pay extra to do it with Thunderbolt?
Well, we are on the verge of external retina displays. If Apple brings any other kind of external display to market, it will be a disappointment. (Hmm, the next model could also be just the same as now, but with usb3, even if the display doesn't have intel's chipset that includes usb3...) And given their situation with tb versus retina, they have to either drop external monitors from their products altogether, or limit the number of external retina displays to one per mac, or upgrade the tb spec and deal with a mess of angry customers whose expensive hardware just turned obsolete before anybody expected. None of the options are very nice for anybody.
Do you really think there will be crowds of angry customers with pitchforks when they discover they can only run a single external display with a resolution higher than 6MP? I’d love to see the performance of an MBA trying to drive more than that many pixels. According to your logic, consumers should only be allowed to have 1920x1080 displays anyway, because they are so much cheaper due to economies of scale.
So, maybe those cables should have stayed where they belong and the desktop alternatives should be designed with better price efficiency?
Yes it will.
Nobody needs anything that is too expensive, and everybody wants the fastest they can reasonably justify affording. So there's no binary solution to this.
Ahh, but don't you see that many people already can afford and justify the additional expense of Thunderbolt? Having a choice is a good thing, and I'm definitely glad that we finally have the option of using either USB 3.0 or Thunderbolt on the same platform.
If macs had had usb3 for over 2 years now, maybe usb3 would have been adopted way faster. But yes, hdmi in the new mbp is quite an oddball. Maybe the average mac user is stupid enough not to know that a dp-hdmi dongle does the same thing, OR maybe Apple is preparing their customers for the idea that if you want 2 external displays, the retina one takes the whole tb link and the second external, non-retina one can use the hdmi, OR that if you need all the tb bandwidth for data, you can use hdmi for the display, which then doesn't take any bandwidth away from tb.
Even driving 2 daisy-chained 2560x1440 ATDs, PCIe bandwidth over Thunderbolt is only reduced by about 16%, and just in the outbound direction and only for devices attached to that chain. You constantly exaggerate the impact of DP on Thunderbolt's PCIe performance, when the odds of it significantly affecting any real-world workflows are slim to none. How would a USB 3.0 controller fare trying to drive two 2560x1440 displays while writing more than 8.4 Gbps to an external RAID array? Oh right, it can't do either of those things anyway...
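For anyone wondering where the ~16% and 8.4 Gbps figures come from, here's a rough back-of-the-envelope sketch. The CVT-RB pixel clock, 24 bpp, and the ~10 Gbps practical PCIe ceiling are my working assumptions, not published Intel numbers:

```python
# Back-of-the-envelope for the ~16% figure above. Assumptions (mine, not
# Intel's published numbers): 24 bpp, CVT reduced-blanking timing for
# 2560x1440@60, a ~10 Gbps practical ceiling on PCIe traffic over Thunderbolt,
# and 2x10 Gbps of usable outbound channel bandwidth per Thunderbolt 1 port.

PIXEL_CLOCK_MHZ = 241.5          # 2560x1440 @ 60 Hz, CVT reduced blanking
BITS_PER_PIXEL  = 24

dp_stream_gbps   = PIXEL_CLOCK_MHZ * 1e6 * BITS_PER_PIXEL / 1e9   # ~5.8 Gbps per ATD
two_displays     = 2 * dp_stream_gbps                              # ~11.6 Gbps
tb_outbound_gbps = 2 * 10.0                                        # two 10 Gbps channels
pcie_ceiling     = 10.0                                            # practical PCIe limit

pcie_left = tb_outbound_gbps - two_displays                        # ~8.4 Gbps
reduction = (pcie_ceiling - pcie_left) / pcie_ceiling

print(f"DP for two ATDs:   {two_displays:.1f} Gbps")
print(f"PCIe headroom:     {pcie_left:.1f} Gbps outbound")
print(f"Reduction vs ~10G: {reduction:.0%}")                       # ~16%
```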
Also, a very tiny fraction of computers, but still a few million mbp's, have tb but no usb3. Unfortunately, there might never be an affordable tb-usb3 dongle, or a tb hub to use more of these dongles...
But it is far more likely that an inexpensive Thunderbolt to USB 3.0 dongle based on the Port Ridge controller and leveraging Apple's drivers will see the light of day as soon as Mountain Lion and/or the 10.7.5 update become available to the general public.
Again, there were more usb3 controllers sold than macs. Are they both not significant then?
Again, some 20% of pc's were in the macs' price group and had usb3, while at the same time the macs did not. Is this a sign of bad design in the pc's or in the macs?
Those controllers were still mostly finding their way onto enthusiast motherboards or drop-in PCIe expansion cards for desktop PCs, by a factor of 2:1 compared to notebook PC deployments. Once again, it is very clear that integrating third-party USB 3.0 controllers into Intel PCs with form-factors similar to Apple's product line did not happen in any significant way until mid-2011 or later. Until someone at Apple steps forward and recounts the tale of why the decision was made to wait until 2012, we can only speculate. I highly doubt that it was the result of the controllers being too expensive, or Apple's engineers not being up to the design task.
Good question!
Either way, chips inside the device or inside the cable, you need the same amount of them, so it doesn't affect the price of the whole system.
The thing I have issue with is this:
In your own thought-experiment, take two thunderbolt devices connected by a thunderbolt cable. Now cut the ends of the cable and put the ends inside the tb devices.
Now you have two (differently designed) thunderbolt devices, communicating perfectly with each other as before, but via a cheap, passive cable.
The benefit being, you only buy the expensive electronics once per device, not twice per cable.
Why on earth didn't they implement it like that???
To ask the question a different way, why did Silicon Image not decide to put the transmit and receive electronics into the DVI (or HDMI) cable? They could have done so, but they thought that would be a daft idea.
If the logic were integrated into the T-Bolt controller itself (instead of active cables) you'd need zero additional chips.
Intel's Light Peak controller, which became the Light Ridge Thunderbolt controller, was designed to be connected to an on-board optical transceiver. The output from the optical engine was then routed via fiber to a hybrid Cu/optical port containing a specially designed lens which allowed the use of passive optical cables. This system had many drawbacks which made commercialization challenging or downright impractical. Sony was the only OEM to go this route and only for a laptop that started at $3000.
Using the controller as-is with the active copper cables we have today was a far more flexible, cost-effective and expedient solution than going back to the drawing board and taping out a 120mm^2 controller all over again just to integrate a new PHY. I realize there is a lot of resistance to the notion that putting the logic in the cable may have been the best all-around solution, but there are reasons why it is the industry norm at 10.3125 GBaud per lane.
Active circuitry also allows the use of very thin wire for the signaling pairs, as small as 40 AWG, at power levels as low as 0.7 W per lane. Compare that to Intel's latest X540 10GBASE-T controller with integrated MAC/PHY. The 2 port controller weighs in at 625mm^2 and uses 6.25 W max per port when paired with UTP cables using 22 AWG wire. This is advertised as one of the lowest power 10GBASE-T solutions on the market. Meanwhile SFP+ modules get the job done using less than 1 W per port and allow considerable flexibility regarding the type of media used.
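Lined up side by side, using the per-port power figures quoted above (a rough tabulation, nothing more; the SFP+ number is an upper bound):

```python
# Rough comparison of the ~10 Gbps PHY options mentioned above, using the
# per-port power figures quoted in this post.

phy_options = {
    "Thunderbolt active Cu cable end": 0.7,   # W per ~10 Gbps lane
    "X540 10GBASE-T (UTP, 22 AWG)":    6.25,  # W max per port
    "SFP+ module":                     1.0,   # W per port (upper bound)
}

for name, watts in sorted(phy_options.items(), key=lambda kv: kv[1]):
    print(f"{name:34s} ~{watts:.2f} W per 10 Gbps port")
```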