
View Full Version : Prices of Thunderbolt Cables Likely to Drop in 2013




MacRumors
Jul 6, 2012, 03:10 PM


One of the main criticisms of the new Thunderbolt connectivity standard embraced by Apple has been its cost, which adds a considerable premium to the prices of compatible peripherals. Even Thunderbolt cables are expensive, with Apple's 2-meter cable (http://store.apple.com/us/product/MC913ZM/A) priced at $49, a price on par with offerings from the few other companies selling Thunderbolt cables so far.

In a report published earlier this week, Ars Technica took a look (http://arstechnica.com/apple/2012/07/why-thunderbolt-cables-will-be-expensive-until-2013/) at why the cables are so expensive and investigated some of the upcoming advances that could help bring prices down beginning late this year or early next year.

http://images.macrumors.com/article-new/2011/06/QBZvGuXR2nRD64NM.medium-500x250.jpg


Inside an Apple Thunderbolt Cable connector (Source: iFixit (http://www.ifixit.com/blog/blog/2011/06/29/what-makes-the-thunderbolt-cable-lightning-fast/))
As revealed in iFixit's teardown (http://www.macrumors.com/2011/06/29/thunderbolt-cable-teardown-reveals-electronics-and-firmware/) last year, Apple's Thunderbolt cable is expensive because it contains a significant number of chips and other circuitry, starting with the transceiver, as noted by Ars Technica:

The chip is built using silicon germanium, "an expensive semiconductor process typically used for telecom applications," [Intersil marketing manager John] Mitchell told Ars. [...]

In addition to the transceiver, the current reference design also requires a separate microcontroller, as well as power management and voltage regulation chips to deliver the 3V data signals and 15V optional power supply for bus-powered devices. Essentially, there are four integrated circuits (ICs) at either end of a Thunderbolt cable.

But Intersil appears set to simplify the design for Thunderbolt circuitry later this year with its own products that will reduce the number of chips and allow for cheaper cables to be used:

What Intersil calls an "Active Cable IC Solution for Thunderbolt Technology" appears to be the only complete turnkey solution we could find among manufacturers selling ICs for Thunderbolt. It combines the microcontroller and transceiver into a single signal processing chip, and combines power management and voltage regulators into a single power management chip. This cuts the number of required ICs from four to two.

With the new chips being manufactured using a 40-nanometer process, yield and cost efficiency are improved and heat generation is decreased, leading to further cost savings on the cable design. Combined with other improvements, Intersil's solution will bring substantial improvements in component costs, size, and power usage, which together should yield significant cost savings for consumers.
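The chip-count arithmetic above is simple enough to sketch in a few lines. A minimal illustration: the four-ICs-per-end (current reference design) and two-ICs-per-end (Intersil's combined chips) figures come from the article, while `ics_per_cable` is just an illustrative helper, not any real API.

```python
# Back-of-the-envelope sketch of the IC-count reduction described above.
# The 4 -> 2 ICs-per-end figures come from the Ars Technica report; this
# helper function is purely illustrative.

ENDS_PER_CABLE = 2  # an active Thunderbolt cable has circuitry at both ends

def ics_per_cable(ics_per_end: int) -> int:
    """Total integrated circuits in one active Thunderbolt cable."""
    return ics_per_end * ENDS_PER_CABLE

# Current reference design: transceiver, microcontroller,
# power management, and voltage regulation chips at each end.
current_design = ics_per_cable(4)

# Intersil design: one combined signal chip + one combined power chip per end.
intersil_design = ics_per_cable(2)

print(current_design, intersil_design)       # 8 vs 4 chips per cable
print(1 - intersil_design / current_design)  # 0.5 -> half the ICs eliminated
```

This matches the "eight chips inside every cable" complaint raised later in the thread: four ICs at either end of the current design.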

Article Link: Prices of Thunderbolt Cables Likely to Drop in 2013 (http://www.macrumors.com/2012/07/06/prices-of-thunderbolt-cables-likely-to-drop-in-2013/)



shurcooL
Jul 6, 2012, 03:11 PM
It's getting there.

samcolson4
Jul 6, 2012, 03:12 PM
Now all we need is some less expensive stuff to plug it in to...

trims
Jul 6, 2012, 03:13 PM
It's getting there.

Nope. Still a dinosaur.

Hands up how many of you own a Thunderbolt peripheral >2 years since launch?

Dr McKay
Jul 6, 2012, 03:15 PM
PC manufacturers have been very slow to adopt, but perhaps it will gain traction. Although it's superior, I can see it taking FireWire's place as second to USB 3.

dagamer34
Jul 6, 2012, 03:17 PM
Nope. Still a dinosaur.

Hands up how many of you own a Thunderbolt peripheral >2 years since launch?

Launch was February 24, 2011 with the Early 2011 MacBook Pro refresh.

Yvan256
Jul 6, 2012, 03:17 PM
Nope. Still a dinosaur.

Hands up how many of you own a Thunderbolt peripheral >2 years since launch?

My 2010 Mac mini doesn't have a Thunderbolt port, so I have zero incentive to buy any.

In fact, even the FireWire 800 port is unused, although it's very nice to have it as an option. However, the only FireWire hardware I had was my 3rd-generation iPod, and its hard drive died a few months ago.

dagamer34
Jul 6, 2012, 03:19 PM
PC manufacturers have been very slow to adopt, but perhaps it will gain traction. Although it's superior, I can see it taking FireWire's place as second to USB 3.

PC manufacturers have only had access to it since this year. And it only gets its true benefits on laptops, which have no internal expansion slots. Problem is that unless you're dealing with professional equipment, the thing I think most people would want to use Thunderbolt with is a dock that has USB 3.0, eSATA, a card reader, etc., so that you only need to plug two cables into your laptop: power and Thunderbolt.

usptact
Jul 6, 2012, 03:19 PM
Not long ago we would have laughed out loud if offered a really expensive cable. Now they're trying to sell the cable as a product in itself. Where is this world going? :rolleyes:

bungiefan89
Jul 6, 2012, 03:20 PM
PC manufacturers have been very slow to adopt, but perhaps it will gain traction. Although it's superior, I can see it taking FireWire's place as second to USB 3.

Actually, given how strong Thunderbolt seems to be, I think it could easily be an industry standard if they could drop the price.
Picture this: in the year 2015, the ONLY ports on most computers are USB 3.0 and Thunderbolt. You no longer have a need for VGA, DVI, HDMI, FireWire, or even Ethernet. All of those could be run through Thunderbolt and suddenly it's much easier to connect devices to computers. Yes, it would certainly take a long time to adopt the technology like that, but it sounds like a convenient world once fully adopted, doesn't it?

WindWaker
Jul 6, 2012, 03:20 PM
The actual hardware is so expensive that this won't really make any difference in the popularity of such products...

SDAVE
Jul 6, 2012, 03:21 PM
TB is going to become super cheap soon, once there are affordable peripherals out there.

You can get above SATA III speeds with it and soon enough enclosures will be here so we can pop in SSDs and get full speed.

Places like Monoprice will start to get TB cables soon enough.

USB3.0 is a nice upgrade to USB2.0, but still not as fast as TB.

FW800 is dead. I am still using it, though; glad that TB can work with a multitude of ports.

kingtj
Jul 6, 2012, 03:22 PM
The thing with Thunderbolt is, it's a high-end standard, suitable for power user/pro user needs more than anything else.

I could easily see this getting utilized by video editors, for example, who might want to pump uncompressed 1080p resolution video through it in large quantities. For them, the projects they're working on, not to mention the cameras they'd attach such a cable to, make the price of the cable rather irrelevant.

Problem is ... we don't even have a Mac Pro tower with the connector on it yet!


PC manufacturers have been very slow to adopt, but perhaps it will gain traction. Although it's superior, I can see it taking FireWire's place as second to USB 3.

jclardy
Jul 6, 2012, 03:25 PM
Nope. Still a dinosaur.

Hands up how many of you own a Thunderbolt peripheral >2 years since launch?

I will be owning a TB->Firewire adapter once it is released.

Anyways, Thunderbolt is not meant to be a mainstream consumer port.

USB is cheaper for manufacturers to implement, is supported in more machines (other than just Macs at the moment), and has sufficient data speeds for most consumer applications.

Basically, there is no point in using a super fast and expensive connection when you aren't going to use the bandwidth, which is why there are only RAID drives available for TB and not any $90 2TB 7200 RPM disks.

When SSD prices drop to a reasonable level, normal consumers will finally be able to use all the available bandwidth that thunderbolt provides.

Dan--
Jul 6, 2012, 03:25 PM
Prices of cables have to come down by at least a factor of 2.

Peripherals need to drop by over $100.

ghostface147
Jul 6, 2012, 03:25 PM
The cables do get rather toasty when transferring a high amount of data. I bought one of those GoFlex Thunderbolt drives for my Time Machine backup and didn't notice a tremendous speed increase compared to FireWire 800. Of course, the biggest issue is the speed of mechanical hard drives. Make it an SSD and it would fly, but a terabyte one would be rather expensive.

LimeiBook86
Jul 6, 2012, 03:28 PM
Very good news, now bring on some more price-friendly Thunderbolt adapters and hubs! :D I'd love to add a USB 3.0 port to my 2011 iMac via Thunderbolt.

radiogoober
Jul 6, 2012, 03:37 PM
Nope. Still a dinosaur.

Hands up how many of you own a Thunderbolt peripheral >2 years since launch?

I own over a thousand dollars worth of TB peripherals.

----------

Actually, given how strong Thunderbolt seems to be, I think it could easily be an industry standard if they could drop the price.
Picture this: in the year 2015, the ONLY ports on most computers are USB 3.0 and Thunderbolt. You no longer have a need for VGA, DVI, HDMI, FireWire, or even Ethernet. All of those could be run through Thunderbolt and suddenly it's much easier to connect devices to computers. Yes, it would certainly take a long time to adopt the technology like that, but it sounds like a convenient world once fully adopted, doesn't it?

Whole-heartedly agree, except just have multiple Thunderbolt ports. Screw USB :)

baryon
Jul 6, 2012, 03:38 PM
What about when Thunderbolt goes optical? How will that work, will we need new Macs again? Isn't an optical cable basically just a piece of translucent plastic? Wouldn't it then be tons cheaper? Would that finally get rid of the freaking supercomputer built into the cable?

I miss the days when a cable was literally a piece of wire, and cost the same as a piece of wire. I'll switch to Thunderbolt if and when the price of a cable will be the same as the price of a USB cable. Until then, I have no problems with USB whatsoever.

AidenShaw
Jul 6, 2012, 03:38 PM
Problem is that unless you're dealing with professional equipment, the thing I think most people would want to use Thunderbolt with is a dock that has USB 3.0, eSATA, a card reader, etc., so that you only need to plug two cables into your laptop: power and Thunderbolt.

Why would you need more than one cable?

Why not zero cables, like most OEM laptop docks?

Think differently, for gord's sake. (as in "differently from Apple", since other vendors have had very nice solutions for at least a decade)

kniemann
Jul 6, 2012, 03:39 PM
When is Monster Cable going to release a Thunderbolt cable?

ps45
Jul 6, 2012, 03:40 PM
I own over a thousand dollars worth of TB peripherals.

Are there many TB peripherals under a thousand dollars? :)

Fortimir
Jul 6, 2012, 03:41 PM
Thunderbolt is amazing and, in the near future, will live wonderfully side by side with USB. There's really nothing to dislike about Thunderbolt aside from the price.


* Ultra-high multi-path bandwidth anywhere from 10-20Gbps with copper and 100Gbps with fiber.
* Forward-compatible with optical transceivers.
* 3.3V to 18V capability on the line.
* Small ports, because the transceivers are on the cable.
* Daisy-chainable.
* For Mac users, DisplayPort and Thunderbolt play nicely.

As it supports data, video, audio, and power, you can use a single Thunderbolt port and single cable to connect everything. With adapters you can use virtually any protocol with it. It's essentially a huge pipe and you can throw anything you want at it.
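As a rough illustration of what those peak numbers mean, here is a back-of-the-envelope sketch. The 10 Gbit/s (Thunderbolt copper, per channel) and 5 Gbit/s (USB 3.0) figures are the theoretical signaling rates discussed in this thread; real-world throughput is lower due to protocol overhead and drive speed, so treat this purely as an upper-bound comparison.

```python
# Rough transfer-time comparison at theoretical peak link rates.
# Real throughput is lower: protocol overhead, drive speed, etc.

def seconds_to_transfer(gigabytes: float, gbit_per_s: float) -> float:
    """Idealized time to move `gigabytes` of data over a `gbit_per_s` link."""
    gigabits = gigabytes * 8  # gigabytes -> gigabits
    return gigabits / gbit_per_s

file_gb = 100  # e.g. a large uncompressed video project

print(seconds_to_transfer(file_gb, 10))  # Thunderbolt copper channel: 80.0 s
print(seconds_to_transfer(file_gb, 5))   # USB 3.0: 160.0 s
```

The gap only matters when the device on the other end can actually saturate the link, which is the point made elsewhere in the thread about SSDs versus mechanical drives.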

AidenShaw
Jul 6, 2012, 03:41 PM
When is Monster Cable going to release a Thunderbolt cable?

As soon as a $200 cable is marketable.

chirpie
Jul 6, 2012, 03:45 PM
"Prices of Thunderbolt Cables Likely to Drop in 2013"

I don't mean to be a bit negative but...

"Prices of Anything Related to Technology Likely to Drop in 2013"

would most likely also work.

Fortimir
Jul 6, 2012, 03:45 PM
When is Monster Cable going to release a Thunderbolt cable?

In case you aren't trolling, you do realize Monster has never made a cable that was better than another cable for less money, right?

If you are, then that's a good one!

UnfetteredMind
Jul 6, 2012, 03:45 PM
I currently have no machines with Thunderbolt, but I'm considering refreshing my 2006 MacPro in the next year (since it won't run ML). I have multiple eSATA devices (RAID, standalone drive). They do have FW800 connections so it's possible I could connect them that way as well, assuming the new Mac I go with still has FW800.

Any lowering of TB costs is welcome! Hopefully it will lower the costs of actual consumer products using TB.

caligomez
Jul 6, 2012, 03:45 PM
So TB cable prices are "likely" to drop a couple of bucks by "2013"... Great.. That'll be a big relief when I buy a $400 TB Hub or $600 external drive a year from now..:rolleyes:

bungiefan89
Jul 6, 2012, 03:46 PM
Whole-heartedly agree, except just have multiple Thunderbolt ports. Screw USB :)

A single port for everything does sound even better but... USB has been SO outrageously successful, I can't imagine it being completely replaced, even with something as powerful as Thunderbolt. Just think of all the hundreds of USB devices available today... are you going to use a USB to Thunderbolt dongle for ALL of those?

Dr McKay
Jul 6, 2012, 03:47 PM
I own over a thousand dollars worth of TB

Wow, you own a whole external drive! ;)

spazzcat
Jul 6, 2012, 03:47 PM
Nope. Still a dinosaur.

Hands up how many of you own a Thunderbolt peripheral >2 years since launch?

I have an rMBP, so I have a network adapter and a DisplayPort adapter, so I own two...

WestonHarvey1
Jul 6, 2012, 03:47 PM
What about when Thunderbolt goes optical? How will that work, will we need new Macs again? Isn't an optical cable basically just a piece of translucent plastic? Wouldn't it then be tons cheaper? Would that finally get rid of the freaking supercomputer built into the cable?

When fiber versions become available, the optical transducers can be included inside the cable connectors. That will make them compatible with current TB Macs.

spazzcat
Jul 6, 2012, 03:48 PM
A single port for everything does sound even better but... USB has been SO outrageously successful, I can't imagine it being completely replaced, even with something as powerful as Thunderbolt. Just think of all the hundreds of USB devices available today... are you going to use a USB to Thunderbolt dongle for ALL of those?

You could use a USB hub that plugs in to a TB port.

bungiefan89
Jul 6, 2012, 03:48 PM
Thunderbolt port and single cable to connect everything. With adapters you can use virtually any protocol with it. It's essentially a huge pipe and you can throw anything you want at it.

Then why can't I use my 2011 iMac as a screen for my Xbox 360? :mad:

AidenShaw
Jul 6, 2012, 03:50 PM
Ultra-high multi-path bandwidth anywhere from 10-20Gbps with copper and 100Gbps with fiber

There are no optical T-Bolt cables at any speed. You are quoting roadmap and theoretical numbers. And "multi-path" is questionable - a T-Bolt chain is single-path.

Fail.

Forward-compatible with optical transceivers.

Are you talking about T-Bolt 2.0 or T-Bolt 3.0?

T-Bolt 1.0 does not support optical cables - but there are certain vaporware noises about connectors with Cu-Optical transceivers in the cable for longer cable runs at copper speeds - but each end is a copper connection.


Daisy-chainable.

This is a liability, not an asset. Daisy chains are pure suck, since you often have to shut the system down to remove or insert a device in the middle.

3PO
Jul 6, 2012, 03:53 PM
High price = low adoption rate = FireWire all over again.

spazzcat
Jul 6, 2012, 03:55 PM
Actually, given how strong Thunderbolt seems to be, I think it could easily be an industry standard if they could drop the price.
Picture this: in the year 2015, the ONLY ports on most computers are USB 3.0 and Thunderbolt. You no longer have a need for VGA, DVI, HDMI, FireWire, or even Ethernet. All of those could be run through Thunderbolt and suddenly it's much easier to connect devices to computers. Yes, it would certainly take a long time to adopt the technology like that, but it sounds like a convenient world once fully adopted, doesn't it?

Don't need to wait until 2015; my rMBP already has no network port, FireWire, DVI, or VGA. It does have HDMI, but really Apple could have just included a third TB port.

----------

High price = low adoption rate = FireWire all over again.

The price will come down; look at Blu-ray when it first came out. All tech is pricey when it first comes out...

Fortimir
Jul 6, 2012, 03:57 PM
You are quoting roadmap and theoretical numbers.
Yep. Never said I wasn't.

And I should have specified that by multi-channel I was referring to the ability of Thunderbolt to not suck performance out of the entire bus like USB. Thunderbolt is designed to handle multiple devices of varying levels of performance without affecting the channel itself.

This is a liability, not an asset. Daisy chains are pure suck - since you often have to shut the system down to remove or insert a device in the middle
Most things I'd want to daisy chain aren't being hotplugged, so that's a near-non-issue. And now that everything is going the way of SSDs, a full system restart takes 15 seconds. I can deal. The big point for me is that I still see daisy chains as a plus since devices can be smaller and I'm faced with less cable clutter.

----------

High price = low adoption rate = FireWire all over again.

Thunderbolt is quite a bit more capable than Firewire. I think it will have a much better future... not that Firewire really had a bad life. It's still around, just living in the shadows. Thunderbolt is definitely a MUCH larger threat to Firewire than anything else.

3PO
Jul 6, 2012, 03:58 PM
Don't need to wait until 2015; my rMBP already has no network port, FireWire, DVI, or VGA. It does have HDMI, but really Apple could have just included a third TB port.

----------



The price will come down, look at Bluray when it first came out. All tech is pricey when it first comes out...

Of course it will come down, the point is it will be too late.

----------


Thunderbolt is quite a bit more capable than Firewire. I think it will have a much better future... not that Firewire really had a bad life. It's still around, just living in the shadows. Thunderbolt is definitely a MUCH larger threat to Firewire than anything else.

So was FireWire compared to USB.

rmwebs
Jul 6, 2012, 04:00 PM
Surely it would have made more sense to put that tiny microcontroller and voltage regulator on the actual thunderbolt port and devices, thus reducing the cable cost...you know...how every single other cable works.

deeddawg
Jul 6, 2012, 04:03 PM
Thunderbolt is a really cool idea. It'd be great if/when it becomes commonplace with many inexpensive accessories and cables.

But what I don't understand is: Who on earth thought eight chips inside every cable was a good idea? Anything requiring $50 cables is going to be adopted about as fast as concrete soccer balls.

Fortimir
Jul 6, 2012, 04:03 PM
Surely it would have made more sense to put that tiny microcontroller and voltage regulator on the actual thunderbolt port and devices, thus reducing the cable cost...you know...how every single other cable works.

You're just moving the cost and sacrificing expandability. Think about how many limits are lifted by making the port "open" and forcing the cables and peripherals to utilize it. Sure, it costs a hair more because you'll have more cables... but this is a pro tech, not a consumer one at the moment.

tomovo
Jul 6, 2012, 04:05 PM
If the iPhone is going to get a new connector, it should be Thunderbolt.

ArtOfWarfare
Jul 6, 2012, 04:10 PM
Surely it would have made more sense to put that tiny microcontroller and voltage regulator on the actual thunderbolt port and devices, thus reducing the cable cost...you know...how every single other cable works.

I wouldn't consider myself a hardware pro (I'm more of a software guy), but isn't the fact that those chips are in the cable instead of the port the reason optical TB cables will work with current Macs?

Going this route with USB years ago would have meant USB 3 cables would be allowed to plug into USB 2 ports, wouldn't it?

This route allows ports to be forward compatible and support devices not yet released rather than backwards compatible and only support prior devices.

Put another way, getting the latest and greatest means replacing a few cables that add up to a few hundred dollars, rather than your computer setup, which for pro users is generally more than a few thousand dollars.

(If I'm wrong, someone please let me know - I could be grossly misinformed.)

tech4all
Jul 6, 2012, 04:16 PM
Nope. Still a dinosaur.

Hands up how many of you own a Thunderbolt peripheral >2 years since launch?

Being a "dinosaur" indicates something is very old, not new.

hobo.hopkins
Jul 6, 2012, 04:20 PM
I believe that everyone would welcome a price decrease. It's always an unfortunate hindrance to the adoption of new technology.

baryon
Jul 6, 2012, 04:23 PM
When fiber versions become available, the optical transducers can be included inside the cable connectors. That will make them compatible with current TB Macs.

So, the optical signal (the light pulses) won't be coming from the computer, but rather generated by the cable? So there will not only be 2 freaking microcomputers in each cable (one at each end), but also a whole light generating/receiving thingy?

Wouldn't it be easier to just put an LED in the computer (cheap as hell) that simply blinks the 1s and the 0s and a transparent cable relays them? I mean, I'm not an engineer, but this is how optical audio works today, it costs nothing and it's a working way to relay lots of information. Why can't Thunderbolt do this?

Conflagrare
Jul 6, 2012, 04:29 PM
Surely it would have made more sense to put that tiny microcontroller and voltage regulator on the actual thunderbolt port and devices, thus reducing the cable cost...you know...how every single other cable works.

Originally Posted by WestonHarvey1
When fiber versions become available, the optical transducers can be included inside the cable connectors. That will make them compatible with current TB Macs.

It's so you can take the new optical cables, which are not yet designed/made/sold, and plug them into your existing Thunderbolt port.

"So, the optical signal (the light pulses) won't be coming from the computer, but rather generated by the cable? So there will not only be 2 freaking microcomputers in each cable (one at each end), but also a whole light generating/receiving thingy?

Wouldn't it be easier to just put an LED in the computer (cheap as hell) that simply blinks the 1s and the 0s and a transparent cable relays them? I mean, I'm not an engineer, but this is how optical audio works today, it costs nothing and it's a working way to relay lots of information. Why can't Thunderbolt do this?"

And no, optical transmitters are NOT cheap. Low-frequency crap LEDs, yes. But not for high speed. Here's one for a 1Gb network at $100:

http://www.amazon.com/Cisco-GLC-SX-MM-1000BASE-SX-Transceiver-Module/dp/B0000ANEX9

I guarantee that Thunderbolt runs faster than Ethernet... a lot faster, and hence more expensive.

NewbieCanada
Jul 6, 2012, 04:29 PM
When is Monster Cable going to release a Thunderbolt cable?

When the ones from everyone else cost five bucks.

faroZ06
Jul 6, 2012, 04:34 PM
It'll still be expensive because Apple charges ripoff prices for cables.

----------

It's annoying how Apple is sacrificing FireWire 400 for FW800 and/or Thunderbolt. So many devices use FW400. It was cool when Apple just gave us FW800 and FW400, but now you need adapters. I'm gonna stick with 2008 for a while.

----------

If the iPhone is going to get a new connector, it should be Thunderbolt.

I doubt the iPhone's internal memory is even fast enough to utilize Thunderbolt effectively, so it would end up just being overpriced. FireWire would be good.

Tombs
Jul 6, 2012, 04:35 PM
TB, I think, will take hold in the market as more PC manufacturers include it in their designs; Samsung and MSI already offer laptops with a TB option, and there are desktop PCs that include a TB port or two. It just seems so slow because Apple decided to jump on board so early, and as we all know, Apple, despite having made major inroads into the PC market with their products, does not have the majority percentage of owned PCs/laptops. TB will take time to take hold; this isn't a professional product only for professional people, it's a solution for all, but it needs time to filter into the PC/laptop market.

faroZ06
Jul 6, 2012, 04:38 PM
In case you aren't trolling, you do realize Monster has never made a cable that was better than another cable for less money, right?

If you are, then that's a good one!

Seriously, I can get an HDMI cable for $2.50 that's just as good as those stupid $70 things.

Swift
Jul 6, 2012, 04:40 PM
The actual "way things are" is changing. The new Intel chip has embedded support for Thunderbolt, so Windows computers will be able to have these ports without spending so much on Thunderbolt. The new chips will help on the price of the connectors, and market volume will lower the prices as well. Apple chose the right horse.

And Ivy Bridge also has support onboard for USB 3.0, which can work over Thunderbolt along with the screen and solid state storage. And FireWire 800 drives.

Once more and more Ivy Bridge machines are out there, the device manufacturers will start pumping out affordable things along with more expensive gee-whiz stuff.

Imagine a computer in pieces, connected by some future Thunderbolt. The end of the tower. Completely configurable.

NMF
Jul 6, 2012, 04:43 PM
Imagine a computer in pieces, connected by some future Thunderbolt. The end of the tower. Completely configurable.

The possibility of a MacBook Air with an external Thunderbolt GPU makes me weak in the knees...

doctor-don
Jul 6, 2012, 04:44 PM
So Apple's 2-meter TB cable is now $49?

When the price is lowered, will it be $45?

Certainly, with more demand the price per cable should drop. Just don't skimp by using 30-gauge wires as some HDMI companies have done.

UnfetteredMind
Jul 6, 2012, 04:46 PM
Surely it would have made more sense to put that tiny microcontroller and voltage regulator on the actual thunderbolt port and devices, thus reducing the cable cost...you know...how every single other cable works.

The cables are active and essentially tuned for the length of cable of which they are a part. See prior MR article linked below:

http://www.macrumors.com/2011/06/29/thunderbolt-cable-teardown-reveals-electronics-and-firmware/

mazz0
Jul 6, 2012, 04:58 PM
I've never seen a TB cable in the flesh, but in the pictures the connectors look huge. That strikes me as a potential problem - the further the hard section of the connector extends from the device the more easily knocked it is (and possibly damaged), the more that movement of the cable will result in movement of connected devices, and the bigger the gap you have to leave around devices. This, I imagine, will only get worse when the fibre is added. Anything that makes the connector smaller, therefore, must be a bonus, I'd have thought.

If the iPhone is going to get a new connector, it should be Thunderbolt.

So everyone who buys a new iPhone has to buy a new Mac?!

SilianRail
Jul 6, 2012, 04:58 PM
$49 is pretty reasonable, especially since a storage device is $1,000+. Just be happy we aren't in the analog days where quality cables were very expensive.

Dangerous Theory
Jul 6, 2012, 05:01 PM
You're just moving the cost and sacrificing expandability. Think about how many limits are lifted by making the port "open" and forcing the cables and peripherals to utilize it. Sure, it costs a hair more because you'll have more cables... but this is a pro tech, not a consumer one at the moment.

You're only moving the cost if you plan to only use one TB peripheral. Having all the expensive tech consolidated into one port allows as many cables as you like to connect, therefore lowering the cost.

SmoMo
Jul 6, 2012, 05:24 PM
Nope. Still a dinosaur.

Hands up how many of you own a Thunderbolt peripheral >2 years since launch?

I've got a rMBP and so far:
* A Thunderbolt GB Ethernet adapter
* An external 120GB SSD HD, I might upgrade the SSD size soon
* A 27" iMac which I often use as a 2nd screen to the MacBook

I put the HD together from a GoFlex Thunderbolt adapter, a regular GoFlex drive case, and a regular SATA III SSD. I didn't really need the drive case, but it makes the whole thing neater (and less wobbly).

The drive is really fast. I have a partition which I use as the main OS for my iMac; it's about three times as fast as the internal (non-SSD) drive, and access time is even better.

I've had no problems booting off the same partition from a TB-equipped MacBook. The genius in the Apple Store said it was a bad idea, but I've been doing that for a few weeks and haven't found any issues yet.

Not sure what other Thunderbolt devices I'd like now, Network/Storage/Display are the 3 big devices that require bandwidth. I'd be interested in TB iPad and iPhone, but then again I haven't plugged either into my computer for ages now that iCloud syncing and OTA updates have been around.

tigres
Jul 6, 2012, 05:26 PM
I own over a thousand dollars worth of TB peripherals. :)

Which ONE do you own?

rmwebs
Jul 6, 2012, 05:30 PM
The cables are active and essentially tuned for the length of cable of which they are a part. See prior MR article linked below:

http://www.macrumors.com/2011/06/29/thunderbolt-cable-teardown-reveals-electronics-and-firmware/

Ahh fair enough. Makes sense I guess.

----------

Why do Thunderbolt threads bring out the ****ing idiots on these boards?

Were you questioning yourself out loud there? :rolleyes:

radiogoober
Jul 6, 2012, 05:32 PM
Which ONE do you own?

haha, too true!

SmoMo
Jul 6, 2012, 05:42 PM
Imagine a computer in pieces, connected by some future Thunderbolt. The end of the tower. Completely configurable.

I was discussing something similar recently with a techy friend. It seemed to be a recurring conversation that started with a lament about how sad it was that you could no longer replace batteries, etc., in modern devices, and how this was the death of the computer as separate components.

Then on the way home on the train I realised I had:
* a wireless hires screen ( iPad3 )
* tethered wirelessly to my mobile internet device ( iPhone 4S in bag )
* a bluetooth wireless keyboard
* I was recharging the phone with a Mophie battery pack (10,000 mAh, and it can charge any USB device)

So wow, this is actually the golden age of the component computer, and it's all wireless to boot!

I've even seen advertised little hard drives that present themselves as little WiFi hotspots that you can connect to and stream movies or files from. That would make the whole system completely modular, with most of the equipment in your bag and just a screen and keyboard on the table.

I certainly don't miss the days of everything having to be crammed into a big bulky tower, needing a screwdriver to change components, the anti-static wrist bands, incompatible cards, running out of slots, dropping a screw under the motherboard, etc., etc.

This time round it's more like Lego.

Xian Zhu Xuande
Jul 6, 2012, 05:45 PM
As soon as a $200 cable is marketable.
Hasn't stopped them from selling $200 HDMI cables.

Gomff
Jul 6, 2012, 05:57 PM
I'd like a Thunderbolt to USB 3 converter, so that I can get some actual use out of my Thunderbolt port without spending a fortune.

Apple, Intel, and all the TB peripheral manufacturers fumbled the ball with Thunderbolt by making it too expensive. Even if they had to take a hit on their profits and recoup their R&D over a longer time, they should have made it cheaper just to get the ball rolling. What's more, they should have anticipated this given that they're competing with such an established standard as USB.

heisetax
Jul 6, 2012, 06:14 PM
Nope. Still a dinosaur.

Hands up how many of you own a Thunderbolt peripheral >2 years since launch?

If it's not on a Mac Pro, it must not be any good, or we don't really need it. That means that 6 Gb/s SATA III is not really needed, along with Thunderbolt. For these new Macs that only have USB & Thunderbolt, all one needs is $300 or so, and another 3-month or so wait, for the product that allows easy access to our previous products.

Price & available products are still really limited. I was just glad to see that I could connect my old displays with my old Mini DisplayPort to DVI or HDMI adapters. Even the DisplayPort to Mini DisplayPort cable to my 30" HP display worked without a problem.

I have a small excuse, as I've only had my 17" Mac laptops with Thunderbolt for about 2 weeks now. My first step will be a multi-purpose adapter like the one Belkin plans to ship sometime in September or so. That will give my 2 new old laptops USB3 & eSATA. Plus I'll still have the card slot open for other changes. That means it may be next year or so before I need anything more than a Thunderbolt cable. And this article says that I will have to pay a higher price if I want one before Christmas.

Frozzie
Jul 6, 2012, 06:17 PM
As much as I want Thunderbolt to succeed, i.e. become used by mainstream consumers and usable in public places, it is more or less apparent that Thunderbolt will be a technology of the minority, those who use it for professional purposes.

I don't think I can carry a Thunderbolt SSD around thinking "I shouldn't have trouble connecting this to my friends' computers, because the Thunderbolt port is standard"; rather, that is USB 3.0.

Which is a shame, because the performance is simply great. And also because I only have a Thunderbolt port, and am extremely unlucky to own a 2011 MacBook Pro, in which Apple should have included USB 3.0 ports, not 2.0.

Lennholm
Jul 6, 2012, 06:32 PM
I still think it's stupid that the cable needs electronics in the first place. A cable should just be a set of passive wires linking two interfaces together, not be part of the interface itself. The fact that not all necessary electronics are in the devices themselves means to me that it wasn't implemented properly.

nuckinfutz
Jul 6, 2012, 06:40 PM
I say you would be hard pressed to find any thread here that didn't. ;)

True my friend. I think there's always a bit of resistance to expensive stuff. Everyone wants to keep more of their money in their bank.


I still think it's stupid that the cable needs electronics in the first place. A cable should just be a set of passive wires linking two interfaces together, not be part of the interface itself. The fact that not all necessary electronics are in the devices themselves means to me that it wasn't implemented properly.

Necessary evil. At these speeds, I'm guessing that you either need to really beef up the transceivers to account for synchronization and other issues, or you take part of that logic and put it in the cable itself.

twoodcc
Jul 6, 2012, 06:41 PM
This is a very good thing. Hopefully the devices will get cheaper also.

Mechanic
Jul 6, 2012, 06:45 PM
Nope. Still a dinosaur.

Hands up how many of you own a Thunderbolt peripheral >2 years since launch?

I have three, and they were worth every penny. While I do agree the prices are high right now, you can get good gear for reasonable prices if you look.

jontech
Jul 6, 2012, 06:47 PM
Now all we need is some less expensive stuff to plug it in to...

Installed two Promise 12TB arrays and one WD 4TB for a customer and couldn't believe the prices. However, it's a good value for those that need it.

Mechanic
Jul 6, 2012, 06:56 PM
I still think it's stupid that the cable needs electronics in the first place. A cable should just be a set of passive wires linking two interfaces together, not be part of the interface itself. The fact that not all necessary electronics are in the devices themselves means to me that it wasn't implemented properly.

It's not stupid; telecoms have been using active cables for years. In order for Thunderbolt to achieve its 10 Gb/s speed, it needs those controllers to tune the cable to get the 10 Gb/s throughput, and also to multiplex the data and video streams on the fly into one stream through the cable.
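
As a toy illustration of the kind of on-the-fly multiplexing described above (this is not Thunderbolt's actual framing, just the general idea of interleaving two packet streams onto one wire):

```python
from itertools import chain, zip_longest

def multiplex(data_packets, video_packets):
    """Interleave two packet streams into one, round-robin style."""
    interleaved = zip_longest(data_packets, video_packets)
    # Flatten the pairs and drop the padding once one stream runs out.
    return [p for p in chain.from_iterable(interleaved) if p is not None]

stream = multiplex(["d0", "d1", "d2"], ["v0", "v1"])
print(stream)  # ['d0', 'v0', 'd1', 'v1', 'd2']
```

A real controller does this continuously in silicon, but the principle of merging two streams into one serialized stream is the same.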

----------

----------

Which ONE do you own?

Just picked up the LaCie Thunderbolt to dual-eSATA hub; it's very fast and can RAID two drives on separate eSATA channels at the same time. I use it for my media server Mini. I can run 4 ATV 3s at the same time and it has no slowdown whatsoever.

Stridder44
Jul 6, 2012, 07:12 PM
Actually, given how strong Thunderbolt seems to be, I think it could easily be an industry standard if they could drop the price.
Picture this: in the year 2015, the ONLY ports on most computers are USB 3.0 and Thunderbolt. You no longer have a need for VGA, DVI, HDMI, FireWire, or even Ethernet. All of those could be run through Thunderbolt and suddenly it's much easier to connect devices to computers. Yes, it would certainly take a long time to adopt the technology like that, but it sounds like a convenient world once fully adopted, doesn't it?


You left out 3.5 mm audio jack(s), but yes, this seems to be the idea at Apple. And I like it. Just USB 3 and TB ports = a fantastic future.

On a somewhat unrelated note, I guess this would mean the death (or at least non-use) of Firewire 1600/3200? I remember seeing rumors of them in development, but it seems pointless what with USB3/TB.

KdParker
Jul 6, 2012, 07:22 PM
Now all we need is some less expensive stuff to plug it in to...

right

BiggAW
Jul 6, 2012, 08:10 PM
I like my $2 USB cables, thank you very much.

mentaluproar
Jul 6, 2012, 08:18 PM
I don't know how widely this will ever be used. FireWire was originally intended for high-speed data transmission, and USB as the standard one-connector-for-most-things. We all know how that went. USB got faster and cheaper, and even though FireWire 800 was faster, USB 2 was the go-to connector.

Unless Thunderbolt turns into a one-cable-to-rule-them-all (and aside from power, it looks like it will get there), it's not going to take off the way USB did. It needs to be cheaper and easier to use. Thunderbolt only has one of those points covered.

SDAVE
Jul 6, 2012, 08:21 PM
I don't know how widely this will ever be used. FireWire was originally intended for high-speed data transmission, and USB as the standard one-connector-for-most-things. We all know how that went. USB got faster and cheaper, and even though FireWire 800 was faster, USB 2 was the go-to connector.

Unless Thunderbolt turns into a one-cable-to-rule-them-all (and aside from power, it looks like it will get there), it's not going to take off the way USB did. It needs to be cheaper and easier to use. Thunderbolt only has one of those points covered.

FireWire was popular because of digital video (DV) devices like camcorders... This is the reason FW800 never took off, even though it's faster than USB 2.0. Nowadays, digital video camcorders either record to SD cards or internal hard drives, and the data is copied to the computer via USB 2.0.

FW800 is dying, unfortunately. I'm glad some of my external FW800 drives also have a USB 3.0 port :D

AidenShaw
Jul 6, 2012, 08:58 PM
The new Intel chip has embedded support for Thunderbolt, so Windows computers will be able to have these ports without spending so much on Thunderbolt.

The Ivy Bridge chipsets do not support T-Bolt - but they do support USB 3.0. Discrete T-Bolt controllers are still needed.


On a somewhat unrelated note, I guess this would mean the death (or at least non-use) of Firewire 1600/3200? I remember seeing rumors of them in development, but it seems pointless what with USB3/TB.

Wouldn't 1394 at 1600/3200 have to have been born in order to die?

It was a vapor spec that never materialized, and now never will.

bedifferent
Jul 6, 2012, 09:15 PM
I know this is a silly point of contention, but I dislike the name "Thunderbolt". "Light Peak" seemed perfect, especially marketing; following "FireWire", "LightPeak" or "LP" devices.

Thank God Jobs never got his wish in naming the "iPhone" the "Magic Phone".

MagnusVonMagnum
Jul 6, 2012, 09:17 PM
Actually, given how strong Thunderbolt seems to be, I think it could easily be an industry standard if they could drop the price.
Picture this: in the year 2015, the ONLY ports on most computers are USB 3.0 and Thunderbolt. You no longer have a need for VGA, DVI, HDMI, FireWire, or even Ethernet. All of those could be run through Thunderbolt and suddenly it's much easier to connect devices to computers. Yes, it would certainly take a long time to adopt the technology like that, but it sounds like a convenient world once fully adopted, doesn't it?

Thunderbolt, like FW, is daisy-chained. All I can say is that daisy-chaining sucks big time for removable devices, especially if the "monitor" must be the last device on the chain. Want to unplug that hard drive? Time to play musical chairs with all your peripherals! People don't realize how MUCH it sucks until they're actually in a situation where they have to do it regularly. And just TRY to find a reasonably priced multi-port hub for Thunderbolt. Try to find a reasonably priced ANYTHING for Thunderbolt. The whole system has been a major FAIL since inception for these very reasons, and the sad thing is that it was SO EASY to see its failings from Day 1, because it's FireWire all over again. It was plain to see that USB3 would be dirt cheap and used by default, given its backward compatibility with USB 2 and 1. In short, you'll NEED USB3 regardless, so of course it'll get used. It's there. It's cheap. It's everywhere. The only way to dethrone an entrenched format like that is to be not only massively better (TB is 2x faster, but that's neither here nor there when it's a BUS format and video sucks up that bandwidth very quickly), but also more convenient (daisy-chaining kills that) and cheaper (NOTHING, and I mean NOTHING, is cheap about TB; you're paying $200+ premiums for the same hard drive just to use a TB connector, and then $50 for a cable on top of that, so a 3TB drive costs you $350-450 versus $100-150 for USB3, at the same speeds, since the hard drive is the limiting factor). Good luck finding one at a brick-and-mortar store, for that matter.

People keep making a big deal about this idea of using "fewer" connectors in the future, but it's not really true, except for a docking connector. All those devices STILL have to be connected to the machine somehow (i.e. unless your hard drives are built into your monitor, you'll need a separate cable for them, and for the network and everything else). The only connection that needs just one cable is the dock connector, and there I'd agree it would take a few seconds less to dock your notebook than with 4 cables, but is it really worth that much more money to save a few seconds? The current all-in-one hubs cost a fortune, and Apple's own built-in hub in their monitor is already outdated, since it's only USB2 (and for no good reason either, since USB3 was available when it came out and the cost would have been negligible; but that would have meant OSX would have had to support it while Jobs was still alive, and that wasn't going to happen). Steve was intelligent and creative on a lot of levels, but practicality wasn't always one of them.

repoman27
Jul 6, 2012, 09:26 PM
There are no optical T-Bolt cables at any speed. You are quoting roadmap and theoretical numbers. And "multi-path" is questionable - a T-Bolt chain is single-path.

Fail.

Are you talking about T-Bolt 2.0 or T-Bolt 3.0?

T-Bolt 1.0 does not support optical cables - but there are certain vaporware noises about connectors with Cu-Optical transceivers in the cable for longer cable runs at copper speeds - but each end is a copper connection.

This is a liability, not an asset. Daisy chains are pure suck - since you often have to shut the system down to remove or insert a device in the middle

As usual, you're wrong about everything, and have the audacity to proclaim "fail" about the statements of others.

Sumitomo makes optical Thunderbolt cables. They've been available for a while now. Sumitomo also manufactures several of the other Thunderbolt cables that have recently come to market (Elgato, Belkin), but they don't sell direct in the US. Yes the optical transceiver is housed in the cable connector, right where it should be for a consumer device. It keeps the price of everything lower for those that don't need the range of optical, it keeps a potentially dangerous laser device out of harm's way, and doesn't require any expensive, easily damaged or contaminated lenses in the connector.

The speed of Thunderbolt is entirely determined by the controller. Thus far all controllers on the market are 10 Gbps, full-duplex, per channel. All cables are dual channel, and thus multi-path.

The copper implementation of Thunderbolt is actually twice the bandwidth of the optical versions that were demoed as Light Peak. Those only offered a single 10 Gbps channel, rather than the two that are standard with Thunderbolt. Short of silicon photonics, there is always an electrical/optical boundary, and your semantical arguments about the significance of where this boundary lies are pointless. For now, economics and practicality will determine where they are.

Also, the requirement of Thunderbolt devices to be hot-pluggable means that daisy chaining isn't much of a liability, unless you are silly enough to boot off of a device that isn't at or near the beginning of the chain. But then again, you're welcome to voice your opinion that daisy chains suck. At least that isn't patently false, like most of the other troll-bait claims you make.
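
The channel arithmetic in the post above can be tallied with a quick back-of-the-envelope sketch (the per-channel rate and channel counts are the figures quoted above; the function name is mine):

```python
# Back-of-the-envelope Thunderbolt vs. Light Peak bandwidth,
# using the figures quoted in the post above.

def aggregate_gbps(channels, per_channel_gbps, full_duplex=True):
    """Peak aggregate throughput: channels x rate, doubled if full-duplex."""
    directions = 2 if full_duplex else 1
    return channels * per_channel_gbps * directions

# Copper Thunderbolt: two full-duplex 10 Gbps channels per cable.
thunderbolt = aggregate_gbps(channels=2, per_channel_gbps=10)   # 40 Gbps

# Light Peak as demoed: a single full-duplex 10 Gbps channel.
light_peak = aggregate_gbps(channels=1, per_channel_gbps=10)    # 20 Gbps

print(thunderbolt, light_peak, thunderbolt // light_peak)
```

This matches the claim that the copper implementation carries twice the bandwidth of the optical Light Peak demos.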

Lancer
Jul 6, 2012, 10:49 PM
Now all we need is some less expensive stuff to plug it in to...

1+

Why would the average Mac or PC user, now looking at USB3, want to pay more for TB for storage?

OT - if I plug a USB2 HDD into a USB3 computer, will it be faster? I'm not talking about USB3 speeds, but will there be less of a bottleneck with the faster connection if you have a number of HDDs running off the same bus or hub?

kiljoy616
Jul 6, 2012, 10:54 PM
Nope. Still a dinosaur.

Hands up how many of you own a Thunderbolt peripheral >2 years since launch?

Hands up: how many of you own a Ferrari? After all, they've only been out 2-plus years. :rolleyes:

----------

As usual, you're wrong about everything, and have the audacity to proclaim "fail" about the statements of others.

Sumitomo makes optical Thunderbolt cables.

So well written, and yet no one has given you a thumbs up. :D

bigwig
Jul 6, 2012, 11:30 PM
PC manufacturers have been very slow to adopt it; however, perhaps it will gain traction. Although it's superior, I can see it taking FireWire's place as 2nd to USB3.
Certainly, seeing as FireWire is on its deathbed. Big electronics retailers like Best Buy and Fry's have for the most part ceased stocking FW external drives in their stores, and only feature USB3 drives.

As for TB, every delay and grossly-overpriced product on the TB side just further entrenches USB3. I predict TB won't get close to FW in market penetration.

Schmitty11
Jul 6, 2012, 11:33 PM
Thunderbolt cables June 2013:

Amazon: $0.01
Apple $48.95

yg17
Jul 6, 2012, 11:59 PM
I've never seen a TB cable in the flesh, but in the pictures the connectors look huge. That strikes me as a potential problem - the further the hard section of the connector extends from the device the more easily knocked it is (and possibly damaged), the more that movement of the cable will result in movement of connected devices, and the bigger the gap you have to leave around devices. This, I imagine, will only get worse when the fibre is added. Anything that makes the connector smaller, therefore, must be a bonus, I'd have thought.



The pictures are very deceiving. Here's a pic of mine for scale, next to a USB connector, MagSafe 2 connector, headphone, and what for some reason is an unnecessarily large MDP->DVI cable. It's maybe about a centimeter longer than the USB connector, but not long enough to be a problem I don't think.

http://i.imgur.com/7ZbMD.jpg

btbeme
Jul 7, 2012, 01:08 AM
This isn't news. Simply stated, the prices HAVE to come down, because they could not be any more friggin' expensive to begin with.

MH01
Jul 7, 2012, 02:59 AM
I did chuckle when the heading said "likely" to drop by 2013 :)

mrbyu
Jul 7, 2012, 03:26 AM
I have a 2011 MBA 13", and actually the only thing I really envy from the new Airs is the USB 3.0... I really don't think we're gonna see considerably cheaper Thunderbolt devices in the near future (1-2 years)... :(

USB 3.0 is "only" 5 Gbit/s, but it's still way faster than FireWire; it's a whole other category.

Josephiah
Jul 7, 2012, 07:24 AM
I miss the days when a cable was literally a piece of wire, and cost the same as a piece of wire. I'll switch to Thunderbolt if and when the price of a cable will be the same as the price of a USB cable. Until then, I have no problems with USB whatsoever.

Is there a good reason for all those ICs to be built into the cable?* Surely it would have been more sensible to put all the processing hardware into the computer/peripheral at the socket, leaving the cable as just that - a cable.

(*unless, the conspiracy theorist might add, you're trying to make your laptop ultra-thin and create a nice money-making scheme into the bargain...)

It's not stupid; telecoms have been using active cables for years. In order for Thunderbolt to achieve its 10 Gb/s speed, it needs those controllers to tune the cable to get the 10 Gb/s throughput, and also to multiplex the data and video streams on the fly into one stream through the cable.

Apologies if I've missed something obvious, but I still don't see the (technical) reason why all that tuning/multiplexing hardware should be built into the cable, rather than into the ports at either end...?

KnightWRX
Jul 7, 2012, 07:42 AM
Apple's biggest marketing blunder with Thunderbolt : making everyone think it was aimed at replacing USB. Whether intentional or not on their part, that is now what most in the Apple community think.

Thunderbolt is not a USB replacement. Never will be. It'll be a niche host based connectivity option for higher end machines and peripherals, mostly used in the prosumer world.

Enterprise is going network based connectivity (FC or IP SANs for storage, network based peripherals over GBE or 10GE), consumers are sticking with lower mass market priced peripherals (USB3).

BiggAW
Jul 7, 2012, 08:01 AM
The pictures are very deceiving. Here's a pic of mine for scale, next to a USB connector, MagSafe 2 connector, headphone, and what for some reason is an unnecessarily large MDP->DVI cable. It's maybe about a centimeter longer than the USB connector, but not long enough to be a problem I don't think.

http://i.imgur.com/7ZbMD.jpg

What on earth is that Monoprice cable? Is that the adapter in a new Retina Pro? I have Monoprice adapters, and none are as hideously large as that thing...

yg17
Jul 7, 2012, 09:17 AM
What on earth is that Monoprice cable? Is that the adapter in a new Retina Pro? I have Monoprice adapters, and none are as hideously large as that thing...

Yeah, it's a Monoprice Mini DisplayPort to DVI cable. I have no idea why it's so freakin' huge either. And yes, it's a retina MBP.

AidenShaw
Jul 7, 2012, 11:20 AM
As usual, you're wrong about everything, and have the audacity to proclaim "fail" about the statements of others.

Sumitomo makes optical Thunderbolt cables.

Thank you for the polite reply.

Any "optical" T-Bolt cable today is a hybrid Cu-optical cable - it has copper connectors, with the copper protocols (and speeds) bridged across an optical segment.

True optical does not exist.

Fortimir
Jul 7, 2012, 11:33 AM
Yeah, it's a Monoprice Mini DisplayPort to DVI cable. I have no idea why it's so freakin' huge either. And yes, it's a retina MBP.

Haha... I have one of them Monoprice big boys, but fortunately it connects a second display to my iMac, so it stays hidden from view.

dysamoria
Jul 7, 2012, 11:42 AM
Actually, given how strong Thunderbolt seems to be, I think it could easily be an industry standard if they could drop the price.
Picture this: in the year 2015, the ONLY ports on most computers are USB 3.0 and Thunderbolt. You no longer have a need for VGA, DVI, HDMI, FireWire, or even Ethernet. All of those could be run through Thunderbolt and suddenly it's much easier to connect devices to computers. Yes, it would certainly take a long time to adopt the technology like that, but it sounds like a convenient world once fully adopted, doesn't it?

I remember when USB was announced and described nearly exactly the same way... One of these days I'd like to see the hype fulfilled on some magical new technology.

WestonHarvey1
Jul 7, 2012, 12:56 PM
So, the optical signal (the light pulses) won't be coming from the computer, but rather generated by the cable? So there will not only be 2 freaking microcomputers in each cable (one at each end), but also a whole light generating/receiving thingy?

Wouldn't it be easier to just put an LED in the computer (cheap as hell) that simply blinks the 1s and the 0s and a transparent cable relays them? I mean, I'm not an engineer, but this is how optical audio works today, it costs nothing and it's a working way to relay lots of information. Why can't Thunderbolt do this?

When optical TB becomes more prevalent, they'll probably move the hardware back into the port controller. Then there will be cheaper cables, but it will establish a fragmented legacy cable market. Perhaps we might also see a transducer dongle adapter.

All this putting chips in cables does sound perverse, but we live in an age where things have become so small and so cheap there's really no reason not to put intelligence inside the most mundane things.

Eidorian
Jul 7, 2012, 01:21 PM
I remember when USB was announced and described nearly exactly the same way... One of these days I'd like to see the hype fulfilled on some magical new technology.

Hot-swapping without a restart, and a whopping 12 Mbit/s, in the age of floppies and serial cables. I had a few serial mice... You really needed USB 2.0 once flash drives over 256 MB rolled out.

Get back to me when I can get a Thunderbolt cable in a cereal box. Which you might be able to do with a USB cable, today.

repoman27
Jul 7, 2012, 02:18 PM
Is there a good reason for all those ICs to be built into the cable?* Surely it would have been more sensible to put all the processing hardware into the computer/peripheral at the socket, leaving the cable as just that - a cable.

(*unless, the conspiracy theorist might add, you're trying to make your laptop ultra-thin and create a nice money-making scheme into the bargain...)

Apologies if I've missed something obvious, but I still don't see the (technical) reason why all that tuning/multiplexing hardware should be built into the cable, rather than into the ports at either end...?

I pondered the active vs. passive decision for some time, and wasn't quite sure what to make of Apple's decision. 10 Gbps x2 is tricky territory for a consumer cable, but passive twinax cables that support 10 and even 14 Gbps x4 exist—although none of them cost less than $49 either. And strangely, active Thunderbolt cables are limited to around 3m in length, which is the same as passive 14 Gbps QSFP+ cables.

At first I reckoned that including a low speed signaling pair and bus power that can provide either 3.3V or 18V and up to 10W to devices might be complicating the issue. But now I think it all has to do with the connector; pushing a peak aggregate throughput of 40 Gbps through a Mini DisplayPort connector is a big ask.

If we look at the history here, up until 6 months or less before the first Thunderbolt equipped Macs rolled off the line, Intel demonstrated Light Peak exclusively using optical media. Intel's Light Peak controller looked almost exactly like the 1st gen Thunderbolt controllers did, but it was connected by traces that were only an inch or two long to an optical transceiver module. The controllers appeared to be 4-channel (and were marked "LR A2", for code name "Light Ridge" perhaps?), while the optical modules were only 2-channel. This meant that a full 4-channel setup required two optical modules, each of which occupied more board space than the already sizable controller chip.

So right off the bat we have some issues. Light Peak is clearly a killer I/O interface for mobile devices, yet we have a solution that requires several rather large components that consume valuable board real estate as well as a significant amount of power. The optical components also added another $5-$10 to the BOM cost of what was already an outlandishly expensive technology for the PC market. Most OEMs were not terribly interested. The real deal-breaker, though, was that despite appearing very involved in testing, Intel didn't actually make any part of the optical hardware for Light Peak. That was all sourced from a consortium including SAE Magnetics, Avago, Oclaro, Enablence, IPtronics, Ensphere, Foxconn, FOCI and Corning.

Intel talked a big game about the optical transition being the chance for unification on the I/O front. They also used modified USB 3.0 connectors for most of their Light Peak demonstrations. I think they may have been hoping to go down the path of standardization, and possibly even have Light Peak rolled into the USB specification at some point. However, they had two eager customers that wanted the technology sooner rather than later and weren't concerned in the slightest about standards: Apple and Sony. Apple was willing to commit in a major way, but they also wanted an exclusive. There was no way to make that kind of deal with nearly a dozen players involved, so instead Apple and Intel cut everybody else out. Apple would bring their own trademark, Thunderbolt, and their own PHY based on the Mini DisplayPort connector which they had developed and thus held the license to. (Did anyone ever believe that Intel came up with the "Thunderbolt" moniker? If they had named it, they would have opted for something far more romantic, like "Intel CV82524EF/L Converged High-Speed Packet Based Controller (Formerly Light Ridge)".)

Not ones to miss an opportunity, Intel still sold their controllers to Sony, who used them in their Vaio Z notebooks and media dock via a "proprietary" optical implementation that was essentially identical to the Light Peak demonstration hardware. Through some careful couching of words by Sony and Intel, they could claim that this wasn't really Thunderbolt or USB 3.0, and therefore didn't infringe on any licensing agreements that might already be in place.

Now the issue that Apple faced was that they had an exclusive deal on a super fast controller that was only designed to push a signal down about 2 inches of copper. Repurposing the pins on the mDP connector and locating it close enough to the controller wasn't too hard, but the only way to propagate that signal down any reasonable length of cable required additional circuitry. With only six months to solve this problem, Apple adapted some more or less off-the-shelf components from the telecom industry and created the active Thunderbolt cable. This was probably the only solution that would also allow for backwards and forwards compatibility with future controllers or different media.

At some point down the road, Intel may well make the Thunderbolt PHY much more integrated and robust. But for now, using a tiny, consumer oriented, friction fit, 20-pin connector for 2 channels of bidirectional, 10.3125 Gbps signaling is going to require active cabling. The good news is that multiple vendors are developing silicon specifically targeting this problem, and thus prices should indeed come down—which was the topic of the original article. In addition to Intersil, I also noticed that TI seems to have a whole range of Thunderbolt products ready for market: http://www.ti.com/ww/en/analog/tps22985_thunderbolt/index.shtml?DCMP=hpa_int_thunderbolt&HQS=thunderbolt-bt1
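
One detail worth unpacking from the 10.3125 Gbps figure above: assuming Thunderbolt uses the same 64b/66b line coding as other 10.3125 Gbaud serial links such as 10GbE (an assumption on my part, not something stated in the thread), the usable payload works out to an even 10 Gbps per channel per direction:

```python
# Line rate vs. usable payload for a 10.3125 Gbps serial channel,
# assuming 64b/66b encoding (an assumption; see the lead-in above).

LINE_RATE_GBPS = 10.3125
ENCODING_EFFICIENCY = 64 / 66   # 64 payload bits per 66 bits on the wire

payload_gbps = LINE_RATE_GBPS * ENCODING_EFFICIENCY
print(payload_gbps)  # 10.0

# Two bidirectional channels through the 20-pin mDP connector:
peak_aggregate_gbps = payload_gbps * 2 * 2   # channels x directions
print(peak_aggregate_gbps)  # 40.0, the peak aggregate figure quoted earlier
```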

BiggAW
Jul 7, 2012, 02:53 PM
Yeah, it's a Monoprice Mini DisplayPort to DVI cable. I have no idea why it's so freakin' huge either. And yes, it's a retina MBP.

Yikes. None of mine have been that big. It took me a minute to figure out what they might be, as that's where the ethernet port is on my MBP. :)

repoman27
Jul 7, 2012, 03:22 PM
Thank you for the polite reply.

Any "optical" T-Bolt cable today is a hybrid Cu-optical cable - it has copper connectors, with the copper protocols (and speeds) bridged across an optical segment.

True optical does not exist.

Apologies for the tone of my post. I might have been a bit fired-up when I wrote that.

As I said in my previous post, there is always going to be an electrical/optical boundary, and whether it is on one side of the physical connector or the other is immaterial. Everything to do with bit rates and protocols is handled by the upper layers and is the domain of the Thunderbolt controller. All the optical transceiver does is convert the voltage levels carried by the differential signaling pairs coming out of the controller into light impulses.

http://techon.nikkeibp.co.jp/english/NEWS_EN/20111018/199430/thumb_230_1.jpg

Whether this bit resides inside the PC or the cable connector does not make one cable necessarily better or "more optical" than another. You can argue that locating the VCSELs and photodiodes in the connector makes the cable no longer purely optical, but for most consumer applications, a purely optical cable has more drawbacks than benefits. It also involves creating 4 pairs of tiny lenses that can withstand hundreds if not thousands of mating cycles, plus exposure to abrasion and any number of environmental contaminants. Meanwhile, electrical is proven technology that is cheap to manufacture, and you need to use copper anyway if you want to provide bus power.

If you're looking for Truth, I suggest talking to the folks over at the LHC. ;)

morechicken
Jul 7, 2012, 04:34 PM
We need the lower prices now.

AidenShaw
Jul 7, 2012, 05:36 PM
Apologies for the tone of my post. I might have been a bit fired-up when I wrote that.

As I said in my previous post, there is always going to be an electrical/optical boundary, and whether it is on one side of the physical connector or the other is immaterial.

My post was responding to a post saying that optical could do 100 Gbps - and it's not immaterial for the current T-Bolt V1.0 technology to point out that "true optical" does not exist, and that 100 Gbps optical is a future roadmap as I said.


If you're looking for Truth, I suggest talking to the folks over at the LHC. ;)

I do - I worked on LHC for five of the years that I lived in Switzerland - I still keep in touch with a handful of people. And they're really excited that Higgs has virtually been found.

mazz0
Jul 7, 2012, 08:01 PM
The pictures are very deceiving. Here's a pic of mine for scale, next to a USB connector, MagSafe 2 connector, headphone, and what for some reason is an unnecessarily large MDP->DVI cable. It's maybe about a centimeter longer than the USB connector, but not long enough to be a problem I don't think.

http://i.imgur.com/7ZbMD.jpg

Oh, that's not as bad as I'd thought. Still looks to be coming out at a bit of a non-perpendicular angle though, is that caused by that behemoth next to it?

Sol
Jul 7, 2012, 09:54 PM
For now, Thunderbolt has its pro uses, like ultra-fast external RAID and video capture, but for general consumers it is a more expensive alternative to USB 3. Thunderbolt needs lower prices and, more importantly, peripherals only possible on it, like external graphics cards that would boost performance on something like a MacBook Air, and longer cables.

BiggAW
Jul 7, 2012, 10:50 PM
Oh, that's not as bad as I'd thought. Still looks to be coming out at a bit of a non-perpendicular angle though, is that caused by that behemoth next to it?

It can't be, as they are all bent upwards. Probably just cable strain.

For now Thunderbolt has its pro uses, like ultra-fast external RAID and video capture, but for general consumers it is a more expensive alternative to USB 3. Thunderbolt needs lower prices and, more importantly, peripherals only possible on it, like external graphics cards that would boost performance on something like a MacBook Air, and long cables.

I agree with the first part, and in reality, most consumers, even power users like me, are fine with USB 2. TB is great for pros now. I don't think it will ever be relevant for consumers. Consumers don't even want USB 3, but it will just become the standard, and no one will care one way or the other. If people happen to plug a USB 3 thing into a USB 3 port, it will have the extra bandwidth available; if not, they probably won't care that it's running at USB 2 speed. Eventually everything will be USB 3 by default, and that will be that.

repoman27
Jul 8, 2012, 01:34 AM
My post was responding to a post saying that optical could do 100 Gbps - and for the current T-Bolt v1.0 technology it's not immaterial to point out that "true optical" does not exist, and that 100 Gbps optical is on a future roadmap, as I said.

So are you saying that Sumitomo's optical Thunderbolt cables are "false optical", or that they don't actually exist?

I do wholeheartedly agree that 100 Gbps is not in the cards for the current generation of Thunderbolt.

I think there's a general failure by many people to understand that although fiber can provide tremendous bandwidth, it is still limited by the silicon that is driving the signal going to it. The way that 40 and 100 Gbps links, both copper and optical, are achieved today is through the aggregation of multiple 10 Gbps channels. You can buy 10 Gbps x12 cables that bundle 24 optical fibers or 48 copper conductors into a 120 Gbps link, but the connectors are fairly large and the cables are pretty unwieldy. Increasing the number of channels in a Thunderbolt cable is one way to add bandwidth, but the result doesn't align well with the design goals of the interface. In order to make Thunderbolt faster, Intel needs to raise the single channel data rate of the controller.

So let's look at some common high speed serial interfaces and their per lane signaling rates:

USB 3.0 - 5.0 GBaud
DisplayPort 1.2 - 5.4 GBaud
SATA 6Gb/s - 6.0 GBaud
PCIe 3.0 - 8.0 GBaud
10GbE / Thunderbolt - 10.3125 GBaud
12Gb/s SAS - 12.0 GBaud
FDR InfiniBand / 16GFC Fibre Channel - 14.0625 GBaud
That's it. There is nothing faster on the market, period.

So if adding more lanes to the cable isn't desirable, and the highest performance silicon in production is only 36% faster than the original Thunderbolt implementation, well that's not really enough to provide what could be considered a generational advance. That will only happen when single lane throughput of 25 Gbps can be achieved. The race to this milestone was previously being contested by Mellanox and Qlogic, until January of this year when, lo and behold, Intel acquired Qlogic's InfiniBand business. Thus the next speed bump to Thunderbolt will presumably follow the EDR InfiniBand rollout, most likely sometime in 2014. Maybe then fake optical cables will become standard.
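Those per-lane symbol rates also overstate usable throughput, since each interface spends some symbols on line coding. Here's a quick back-of-the-envelope sketch; the line codes listed are the standard ones for these interfaces, but Thunderbolt's internal coding isn't publicly documented, so treat its row as an assumption:

```python
def effective_gbps(gbaud: float, payload_bits: int, total_bits: int) -> float:
    """Usable data rate after line-coding overhead."""
    return gbaud * payload_bits / total_bits

# (interface, per-lane symbol rate in GBaud, payload bits, total bits per code group)
examples = [
    ("USB 3.0 (8b/10b)",            5.0,      8,  10),
    ("SATA 6Gb/s (8b/10b)",         6.0,      8,  10),
    ("PCIe 3.0 (128b/130b)",        8.0,    128, 130),
    ("10GbE/Thunderbolt (64b/66b)", 10.3125, 64,  66),
]

for name, gbaud, payload, total in examples:
    print(f"{name}: {gbaud} GBaud -> {effective_gbps(gbaud, payload, total):.2f} Gbps")
```

This is why 5 GBaud USB 3.0 delivers roughly 4 Gbps of payload, while 10.3125 GBaud with 64b/66b lands at an even 10 Gbps.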

Lancer
Jul 8, 2012, 04:00 AM
The pictures are very deceiving. Here's a pic of mine for scale, next to a USB connector, MagSafe 2 connector, headphone, and what for some reason is an unnecessarily large MDP->DVI cable. It's maybe about a centimeter longer than the USB connector, but not long enough to be a problem I don't think.

http://i.imgur.com/7ZbMD.jpg

So much for going wireless :lol:

Macman45
Jul 8, 2012, 04:07 AM
My one (and at the moment only) Thunderbolt cable was a freebie with my iMac...I had no use for it until I bought my Pegasus R4, which it's using at the moment....Read up a bit about the Apple cables not being as good as they should be, folks wrapping them up in foil due to interference etc. Either I got a good one, or a lot of it is just BS....I've put my phone and iPad next to it when it's in action, and neither device disconnects or exhibits odd behaviour...THEN I priced them up....Here in the UK, they are £30 from Apple, a little cheaper on Amazon....Some of the new ones, which promise zero interference etc., are a lot more expensive. As Thunderbolt devices become more mainstream, we should see the price for cables drop....Glad I didn't have to buy one though...it's the kind of thing that really annoys me...You don't get one supplied with the Promise or Thunderbolt ACDs either.

hchung
Jul 8, 2012, 05:17 AM
Is there a good reason for all those ICs to be built into the cable?* Surely it would have been more sensible to put all the processing hardware into the computer/peripheral at the socket, leaving the cable as just that - a cable.

(*unless, the conspiracy theorist might add, you're trying to make your laptop ultra-thin and create a nice money-making scheme into the bargain...)

Apologies if I've missed something obvious, but I still don't see the (technical) reason why all that tuning/multiplexing hardware should be built into the cable, rather than into the ports at either end...?

When a piece of cable is cut from a spool, it'll have slightly different characteristics from the next piece cut to the same length, even from the same spool. Little things matter that can't be easily controlled in manufacturing, including things that people normally don't think about, like the formulation and thickness of the sheath around the actual wire, since that becomes the dielectric for a capacitor formed by the length of a wire and the adjacent wires. Or the particular thickness of the wire. Or the soldering job on the end of the wire. And so on...

In order to push multi-gigabit/sec of data through a cable, the transceivers need to be tuned to the cable's characteristics in order to increase the likelihood of the data arriving uncorrupted on the other side.

It's kind of like trying to mass produce tuning forks by cutting them all to the same shape. You still have to fine-tune each one individually because they'll still be slightly different.

There are two ways to go about providing calibration data:
1) include calibration equipment on the device using the cable. (your idea)
2) run the characterization tests after the cable has been assembled with the transceivers, and write the calibration data to the transceivers on the cable itself. (what Apple did)

Why is #2 better than #1? Because the hardware necessary to do the calibration costs more than your laptop. I don't really know the price of ADCs in the 20 gigasample/second range, but digikey.com says that some in the 3 gigasample/second range (not good enough) already cost $700 apiece. So I'm guessing ones that can actually do the job are well over $1k.

So, it's technically possible to put calibration equipment onto the device and then use cheap cables between devices. But I don't think you really want that to happen. :)
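As a toy illustration of option #2, here's a sketch of that factory flow: measure each cable once with the expensive gear, derive per-cable equalizer settings, and store the cheap result in the cable's own transceiver. Every name and formula below is invented for illustration; real Thunderbolt calibration data and registers are not public:

```python
from dataclasses import dataclass

@dataclass
class CableMeasurement:
    """One-time factory characterization of a finished cable (hypothetical)."""
    attenuation_db: float   # measured loss at the signaling frequency
    skew_ps: float          # lane-to-lane skew in picoseconds

def derive_eq_settings(m: CableMeasurement) -> dict:
    """Map measured characteristics to transmit pre-emphasis and receive
    equalization register values (completely fictional formulas, clamped
    to fictional register widths)."""
    return {
        "tx_preemphasis": min(15, round(m.attenuation_db / 1.5)),
        "rx_eq_gain": min(31, round(m.attenuation_db * 2)),
        "deskew_taps": round(m.skew_ps / 10),
    }

# Factory flow: measure once with the expensive ADC rig,
# then burn the cheap result into the cable's microcontroller.
measured = CableMeasurement(attenuation_db=9.0, skew_ps=35.0)
print(derive_eq_settings(measured))
```

The point of the sketch: the costly step (measurement) happens once per cable at the factory, so neither end device ever needs calibration hardware.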

KnightWRX
Jul 8, 2012, 06:00 AM
So are you saying that Sumitomo's optical Thunderbolt cables are "false optical", or that they don't actually exist?


No, he's saying they're not actually Thunderbolt cables. They're a pair of bridges that happen to re-encapsulate Thunderbolt as it currently works over copper to a fiber medium.

Thunderbolt's specification does not support fiber currently as a transport medium. It will in the future, but we're not there yet.

toke lahti
Jul 8, 2012, 06:13 AM
Picture this: in the year 2015, the ONLY ports on most computers are USB 3.0 and Thunderbolt. You no longer have a need for VGA, DVI, HDMI, FireWire, or even Ethernet. All of those could be run through Thunderbolt and suddenly it's much easier to connect devices to computers. Yes, it would certainly take a long time to adopt the technology like that, but it sounds like a convenient world once fully adopted, doesn't it?
Wishing for the disappearance of legacy ports is pretty old fashioned. It will always take a few decades longer than estimated.
DVI came in 13 years ago and everybody was shouting about how VGA would die, at least once all screens were flat (=digital).
Fast forward 13 years and guess what the most common connector on laptops is: VGA. Even though there are no analog displays anymore. Nobody with any knowledge of ICT would have predicted this a decade ago.
The actual "way things are" is changing. The new Intel chip has embedded support for Thunderbolt, so Windows computers will be able to have these ports without spending so much on Thunderbolt. The new chips will help on the price of the connectors, and market volume will lower the prices as well. Apple chose the right horse.
And Ivy Bridge also has support onboard for USB 3.0. Which can work on Thunderbolt along with the screen and solid state storage. And FireWire 800 drives.
Once more and more Ivy Bridge is out there, the device manufacturers will start pumping out affordable things along with more expensive gee-whiz stuff.
Imagine a computer in pieces, connected by some future Thunderbolt. The end of the tower. Completely configurable.
Also, complete baloney.
Ivy Bridge does not have tb integrated; maybe Haswell will next year.
IB also does not have fw integrated, and no chipset ever has.
Somehow Apple has always found room for discrete controllers for fw and tb, yet has always run out of space when it should have put a usb3 controller (which has always been the smallest and cheapest) on the motherboard.

Things are really changing, but Apple chose the wrong horse.
Things have changed in that so much money has been invested, in both the corporate world and the private world, in existing devices and infrastructure that today's connections won't just die away. Just like electricity sockets or light bulb sockets won't die away.
You know there are still PS/2 sockets on motherboards?
The industry has stopped trying to replace RJ45 and is instead trying to figure out how to make better use of it.
Every year more stuff is sold, which adds to the legacy weight and therefore slows down "the new generation". This is why 3rd party manufacturers are not intrigued when just-one-more-port is invented. They won't use it (on a large scale) if there's doubt that most of their customers will need it for at least a decade. There won't be any revolutions in this area anymore, just slow evolution, where new things should be designed with a sharp and broad vision of both the past and the future.

As for tb, Apple hasn't played their cards well for the future. All tb stuff will remain expensive. The technology is just so much more complicated that it can't get affordable without being sold to almost every computer user.
Usb3 will win this without any doubt. Very few will pay 10x for 2x speed, so tb will not become mainstream.
Tb is also getting old before it is even adopted.
It will not replace the workstation for those who need bandwidth, and for those who don't, usb3 is enough.
People are already asking for big retina displays. These will saturate the current tb in no time. After that, you'll have a faster connection to your storage via usb3 than via tb.
If they want to keep dp integrated into tb, they need a new version of tb, which will be even more expensive than the current one, and all current tb stuff will become obsolete. This is just a dead end.
Maybe they'll just have to keep tb on the back burner like fw and see if it succeeds after several years. Maybe they have learnt something from Intel's mistake with Rambus' RDRAM.
Consumers don't know what hd-sdi is, but it is a vital connection for big industry, and that's how it should be. There's no reason to market it to consumers as a new way to hook up displays and in the process make things more expensive.
I already wrote about the tower and saturation issue here:
http://forums.macrumors.com/showpost.php?p=15143816&postcount=528
I don't know how widely this will ever be used. Firewire was originally intended for high speed data transmission, and USB for standard one-connector-for-most-things. We all know how that went. USB got faster and cheaper, and eve though firewire 800 was faster, USB 2 was the go-to connector.

Unless thunderbolt turns into a one-cable-to-rule-them all (and aside from power it looks like it will get there) it's not going to take off the way USB did. It needs to be cheaper and easier to use. Thunderbolt only has one of those points covered.
Light Peak was originally designed to be an extension for other sockets.
What we have now is the result of trying to change it into something else.
Maybe the problem really was that Intel's researchers first developed the interconnect and only afterwards started thinking about what it could be used for.
Apple's biggest marketing blunder with Thunderbolt : making everyone think it was aimed at replacing USB. Whether intentional or not on their part, that is now what most in the Apple community think.

Thunderbolt is not a USB replacement. Never will be. It'll be a niche host based connectivity option for higher end machines and peripherals, mostly used in the prosumer world.

Enterprise is going network based connectivity (FC or IP SANs for storage, network based peripherals over GBE or 10GE), consumers are sticking with lower mass market priced peripherals (USB3).
Exactly!
And the problem will be that it is sold to 100% of mac users, when only 1% really benefit from it. It should be handled more like ExpressCard: only those who need it need to buy it.
My bet is that tb will cause more problems for the average mac user than regular dp would, after big retina displays arrive on the market.
(Another gen of displays from Apple, that have certain limitations with very recent macs.)
Intel talked a big game about the optical transition being the chance for unification on the I/O front. They also used modified USB 3.0 connectors for most of their Light Peak demonstrations. I think they may have been hoping to go down the path of standardization, and possibly even have Light Peak rolled into the USB specification at some point. However, they had two eager customers that wanted the technology sooner rather than later and weren't concerned in the slightest about standards: Apple and Sony. Apple was willing to commit in a major way, but they also wanted an exclusive. There was no way to make that kind of deal with nearly a dozen players involved, so instead Apple and Intel cut everybody else out. Apple would bring their own trademark, Thunderbolt, and their own PHY based on the Mini DisplayPort connector which they had developed and thus held the license to. (Did anyone ever believe that Intel came up with the "Thunderbolt" moniker? If they had named it, they would have opted for something far more romantic, like "Intel CV82524EF/L Converged High-Speed Packet Based Controller (Formerly Light Ridge)".)

Not ones to miss an opportunity, Intel still sold their controllers to Sony, who used them in their Vaio Z notebooks and media dock via a "proprietary" optical implementation that was essentially identical to the Light Peak demonstration hardware. Through some careful couching of words by Sony and Intel, they could claim that this wasn't really Thunderbolt or USB 3.0, and therefore didn't infringe on any licensing agreements that might already be in place.

Now the issue that Apple faced was that they had an exclusive deal on a super fast controller that was only designed to push a signal down about 2 inches of copper. Repurposing the pins on the mDP connector and locating it close enough to the controller wasn't too hard, but the only way to propagate that signal down any reasonable length of cable required additional circuitry. With only six months to solve this problem, Apple adapted some more or less off-the-shelf components from the telecom industry and created the active Thunderbolt cable. This was probably the only solution that would also allow for backwards and forwards compatibility with future controllers or different media.

At some point down the road, Intel may well make the Thunderbolt PHY much more integrated and robust. But for now, using a tiny, consumer oriented, friction fit, 20-pin connector for 2 channels of bidirectional, 10.3125 Gbps signaling is going to require active cabling. The good news is that multiple vendors are developing silicon specifically targeting this problem, and thus prices should indeed come down—which was the topic of the original article. In addition to Intersil, I also noticed that TI seems to have a whole range of Thunderbolt products ready for market: http://www.ti.com/ww/en/analog/tps22985_thunderbolt/index.shtml?DCMP=hpa_int_thunderbolt&HQS=thunderbolt-bt1
Apple chose the technically wrong way by coupling dp to tb, which resulted in a hindered dp, and now they will face the bandwidth problem.
Sony chose the technically right way by keeping dp free of additional standard versions, but failed to negotiate with the usb consortium.
All four (intel, apple, sony, the usb consortium) failed to create one solution that everyone would use and that, through volume, would become reasonably priced.
I just can't see how, in the current market situation, more chips would make tb devices much cheaper. Volumes are simply so small that it wouldn't matter much if the chips were free.
I guess money wasn't on the minds of those who designed tb. To fix the bad usability of the chain topology, you need hubs. E.g. a 4-port hub would need 4 controllers? In a situation where many tb devices have no second port for daisy-chaining, because those ports are just too expensive?
So if adding more lanes to the cable isn't desirable, and the highest performance silicon in production is only 36% faster than the original Thunderbolt implementation, well that's not really enough to provide what could be considered a generational advance. That will only happen when single lane throughput of 25 Gbps can be achieved. The race to this milestone was previously being contested by Mellanox and Qlogic, until January of this year when, lo and behold, Intel acquired Qlogic's InfiniBand business. Thus the next speed bump to Thunderbolt will presumably follow the EDR InfiniBand rollout, most likely sometime in 2014. Maybe then fake optical cables will become standard.
Does it really matter moneywise where the media conversion happens in a chain topology? Every connection needs 2 ports, 1 cable and 2 media conversions either way.

After all, back to the topic: if the price of a cable is just a few percent of the cost of the tb ecosystem, what does it matter if the cable gets cheaper? It lowers the whole bill by 1%? Or even 2%.
Gosh, I'll need to run out and buy a new mac with a horrible glossy screen to get my share of these amazing cost savings!

Sackvillenb
Jul 8, 2012, 07:47 AM
I seriously hope this happens. High pricing is the biggest detriment to Thunderbolt right now, as most people are aware. And having such great technology (imperfect as it may be) at our fingertips and yet just out of reach due to pricing is silly... I think things will improve once Intel gets Thunderbolt onto Windows devices, which they will....

Neodym
Jul 8, 2012, 08:18 AM
You really needed USB 2.0 once flash drives over 256 MB rolled out.
Would be nice to have flash drives with a Thunderbolt connector (ideally together with a USB connector on the other end for compatibility)...

G51989
Jul 8, 2012, 08:22 AM
Neat.

Until these cables start costing 5 dollars, I can't see this interface becoming the standard.

Standard for higher end devices, sure. But thats it. Now if they can ever get the cables down to the sub 10 dollar range, then hell yeah

AidenShaw
Jul 8, 2012, 10:16 AM
So are you saying that Sumitomo's optical Thunderbolt cables are "false optical", or that they don't actually exist?

My previous post was:

Any "optical" T-Bolt cable today is a hybrid Cu-optical cable - it has copper connectors, with the copper protocols (and speeds) bridged across an optical segment.

True optical does not exist.

I'm saying that they're hybrid cables, not true optical. They have no advantages over copper cables except longer length and perhaps electrical isolation.


You can buy 10 Gbps x12 cables that bundle 24 optical fibers or 48 copper conductors into a 120 Gbps link, but the connectors are fairly large and the cables are pretty unwieldy.

They could use wavelength-division multiplexing (http://en.wikipedia.org/wiki/Wavelength-division_multiplexing) without adding more fibres.

repoman27
Jul 8, 2012, 10:32 AM
No, he's saying they're not actually Thunderbolt cables. They're a pair of bridges that happen to re-encapsulate Thunderbolt as it currently works over copper to a fiber medium.

Thunderbolt's specification does not support fiber currently as a transport medium. It will in the future, but we're not there yet.

What? Thunderbolt, nee Light Peak, was designed to use optical media from inception. Omitting the optical transceivers was a decision made based on cost and practicality in order to bring the technology to market more rapidly.

The reason I included the image of a Light Peak transceiver in my previous post was to illustrate that there is no significant logic in those devices. They are very simple; essentially all they do is convert electrons into photons. There is no "encapsulation" being performed whatsoever, and the signal is the same as the one being generated by the Thunderbolt controller regardless of the media being used. We're talking about a threshold here, not a bridge.

Have you never seen a technology that allowed for multiple interchangeable PHY's before? 10/40/100GbE, InfiniBand and Fibre Channel all use pluggable transceiver modules based on multi-source agreements so that the equipment can utilize whatever media is best suited to the deployment scenario. Sometimes the modules are separate from the cable itself, but for short runs, often the transceiver and cable are bonded to create a pluggable cable assembly. Would you argue that these technologies do not currently support fiber as a transport medium because not one of them outputs an optical signal directly from the switch or controller? Or would you say that they do because the transceiver module fits into a recess in the device and is thus not just part of the cable?

We are there. We just seem to be having trouble understanding that fiber optics doesn't magically make an I/O interface go faster; all it does is use light instead of electricity to carry the signal.

http://fcp.co/images/stories/2012_04/NAB_2012/sumitomo_thunderbolt_cable_2.jpg

SilianRail
Jul 8, 2012, 11:12 AM
PCI Express will have an external 32 Gbps cable that will obliterate thunderbolt in 2013.

KnightWRX
Jul 8, 2012, 11:16 AM
What? Thunderbolt, nee Light Peak, was designed to use optical media from inception. Omitting the optical transceivers was a decision made based on cost and practicality in order to bring the technology to market more rapidly.


Exactly what Aiden and I keep saying. What is it you don't understand about our posts and clarification, exactly? The current spec does not have optical as a transport medium. Any cable made out of fiber optics would need to take a copper connection and do a conversion to a light pulse, not necessarily following any Thunderbolt specification, since the other end will translate it back to a Thunderbolt copper signal.

----------


Have you never seen a technology that allowed for multiple interchangeable PHY's before?

Yes, Ethernet. Usually, following the ISO layer model, the physical layer is separated from the logical link layer in a way that permits protocol encapsulation from any type of logical link onto any physical medium.

Lance-AR
Jul 8, 2012, 11:19 AM
PCI Express will have an external 32 Gbps cable that will obliterate thunderbolt in 2013.

This could be the delay on the Pro. How does the roadmap for internal PCI bandwidth compare?

repoman27
Jul 8, 2012, 12:03 PM
I'm saying that they're hybrid cables, not true optical. They have no advantages over copper cables except longer length and perhaps electrical isolation.

Well, that's because the only advantages optical cables currently offer over copper are lower path loss, better isolation from interference, thinner and lighter cables, better resistance to certain types of corrosion, etc.

What advantage would be gained by placing the optical transceiver on the motherboard side of the connector, as was common with the Light Peak demonstration hardware and as was implemented by Sony in the Vaio Z? Note that the lens design that was available only allowed for single-channel cables, so it was half the bandwidth of the current implementation and did not provide multiple pathways to the devices.

They could use wavelength-division multiplexing (http://en.wikipedia.org/wiki/Wavelength-division_multiplexing) without adding more fibres.

Very true. Now how does that benefit the end user? How does it make the Thunderbolt controller faster? The per channel speed stays the same, but now we can fit more channels down a single pipe. What does that do to the complexity of the cross-bar switch in the controller? Say we go for a 5x increase in bandwidth, now your switch has gone from 8 ports to 24. How do we feed that from the back end? Add more protocol adapters, bump the DP adapters to DisplayPort 1.2 and increase the PCIe connection to PCIe 3.0 x16. Oops, now we need a 40-port Thunderbolt switch and a 32-lane PCIe 3.0 switch. We've got a massive, expensive, power hungry, 800-pin behemoth of a controller on our hands. Sounds perfect for a mobile device. Now everyone can pay an extra $480 for anything that includes a Thunderbolt port, and we still need to retain the copper in the connector to provide bus power and not break compatibility with DisplayPort/Thunderbolt 1.0.

The silicon is the limiting factor, not the medium. There is no "true" optical cable that can make the silicon driving it any faster, and simply increasing the parallelism of the system at this point doesn't make any sense.

BiggAW
Jul 8, 2012, 12:24 PM
The whole concept of optical on short-range cables is stupid. Fiber is a great technology for delivering high bandwidth over distance, but for anything under 100 meters, copper is cheaper, easier, and has fewer things to break. They're already doing 10 Gbps over copper Ethernet, so I don't see the need for fiber at the even shorter desktop level. Copper can easily do the job for a tiny fraction of the price.

Optical digital audio seems to be the one exception, because somehow it became a standard, and it's dirt cheap now. Other than that, fiber should end at the outside wall of the home and be copper thereafter, like FiOS. Now if I could just get fiber to the outside wall of my house... :D

I'm sticking with USB 2. When it just happens that I have USB 3 devices, and USB 3 cables came with them, I'll switch. Until then, USB 2 is just fine.

repoman27
Jul 8, 2012, 12:26 PM
Exactly what Aiden and I keep saying. What is it you don't understand about our posts and clarification, exactly? The current spec does not have optical as a transport medium. Any cable made out of fiber optics would need to take a copper connection and do a conversion to a light pulse, not necessarily following any Thunderbolt specification, since the other end will translate it back to a Thunderbolt copper signal.

What I don't understand, exactly, is why you think the Thunderbolt spec does not define the use of optical media, when clearly I posted an image of an Intel-licensed, made-to-specification, emblazoned-with-the-Thunderbolt-logo, honest-to-goodness optical Thunderbolt cable. The specification includes the use of active copper or optical cables, with the optical cables containing the transceivers within the connectors. Optical transport cables are also part of the DisplayPort 1.1 specification, with which Thunderbolt ports are backwards compatible.

Yes, Ethernet. Usually, following the ISO layer model, the physical layer is separated from the logical link layer in a way that permits protocol encapsulation from any type of logical link onto any physical medium.

So why do you understand the concept in the context of Ethernet, but not Thunderbolt?

LostSoul80
Jul 8, 2012, 12:27 PM
I don't think the problem is with the cables.

AidenShaw
Jul 8, 2012, 12:52 PM
Very true. Now how does that benefit the end user? How does it make the Thunderbolt controller faster? The per channel speed stays the same, but now we can fit more channels down a single pipe.

Ever heard of "teaming", as in "Ethernet teaming"?

PCIe packets can be transmitted in parallel on multiple channels, multiplying bandwidth for the end user - so the "per channel" speed for the extended PCIe bus can be the sum of the actual fibre channels used.

For example, instead of only PCIe x4, WDM with 4 channels at the current data rate would allow PCIe x16 devices to be supported!
__________

But, enough of this tangent - T-Bolt 1.0 will never go faster than it is now, regardless of whether copper or hybrid Cu-optical cables are used.

We'll have to wait for T-Bolt V2.0 (or "PCI Express External") for faster external connections.
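The teaming idea above can be sketched in a few lines. This toy round-robin distributor (all names made up) ignores the reordering, framing, and flow control that make real multi-lane links hard:

```python
def stripe(packets: list, n_channels: int) -> list:
    """Assign packets to channels round-robin, so aggregate throughput
    scales with the number of channels (in this idealized model)."""
    channels = [[] for _ in range(n_channels)]
    for i, pkt in enumerate(packets):
        channels[i % n_channels].append(pkt)
    return channels

# Eight hypothetical PCIe transaction-layer packets over 4 channels:
packets = [f"TLP{i}" for i in range(8)]
for ch, pkts in enumerate(stripe(packets, 4)):
    print(f"channel {ch}: {pkts}")
```

With 4 channels at the current per-channel rate, the aggregate is 4x the single-channel bandwidth, but the receiver must reassemble the packets in order, which is exactly the complexity repoman27 points to.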

theelysium
Jul 8, 2012, 12:59 PM
Yeah, how about shorter while you're at it! Who the hell always needs a 6ft cable? I'd pay $10 for a 1ft!

baryon
Jul 8, 2012, 01:39 PM
When optical TB becomes more prevalent, they'll probably move the hardware back into the port controller. Then there will be cheaper cables, but it will establish a fragmented legacy cable market. Perhaps we'll also see a transducer dongle adapter.

All this putting chips in cables does sound perverse, but we live in an age where things have become so small and so cheap there's really no reason not to put intelligence inside the most mundane things.

Yeah I guess Thunderbolt will be perfect once it goes optical… Until then it's just a limited solution, and until then I would say it's not that much better than USB 3.0, for most uses, when considering the price.

And I was surprised to find chips in ink cartridges that keep track of ink levels without having a clue about actual ink levels… Another way to make people pay more!

BiggAW
Jul 8, 2012, 05:15 PM
Yeah I guess Thunderbolt will be perfect once it goes optical… Until then it's just a limited solution, and until then I would say it's not that much better than USB 3.0, for most uses, when considering the price.

And I was surprised to find chips in ink cartridges that keep track of ink levels without having a clue about actual ink levels… Another way to make people pay more!

Optical is not any more perfect than copper. There is no benefit at short distances.

KnightWRX
Jul 8, 2012, 07:48 PM
Optical is not any more perfect than copper. There is no benefit at short distances.

Sure there is: no cross-talk or interference from electrical and magnetic fields, resulting in higher throughput because less error correction is required. On the other hand, careful how you bend that wire.

repoman27
Jul 8, 2012, 09:56 PM
Ever heard of "teaming", as in "Ethernet teaming"?

PCIe packets can be transmitted in parallel on multiple channels, multiplying bandwidth for the end user - so the "per channel" speed for the extended PCIe bus can be the sum of the actual fibre channels used.

For example, instead of only PCIe x4, WDM with 4 channels at the current data rate would allow PCIe x16 devices to be supported!
__________

But, enough of this tangent - T-Bolt 1.0 will never go faster than it is now, regardless of whether copper or hybrid Cu-optical cables are used.

We'll have to wait for T-Bolt V2.0 (or "PCI Express External") for faster external connections.

I am familiar with teaming, but as I said, introducing higher orders of parallelism to Thunderbolt doesn't bring anything desirable to the table. The first generation of Thunderbolt controllers included 2-channel and 4-channel designs ranging in price from roughly $20-$30. The 2nd generation brought us a single-channel controller and the first Thunderbolt accessory to retail for under $30. Everyone is clamoring for cheaper Thunderbolt gear. Adding more lanes to a serial interface tends to scale up the costs associated with it in a fairly linear fashion. I don't see a lot of forum posts where people are saying, "Heck, I'd happily pay twice as much for Thunderbolt if only it had more bandwidth, but 10 Gbps x2 just won't cut it for my workflow."

And while we do often see serial interfaces such as PCIe where several lanes are aggregated into a single link, generational advances are almost universally achieved by increasing the symbol rate, not by further increasing lane count. In this way the physical interface requires little to no modification, which in turn allows for backwards and forwards compatibility, and there is usually no significant increase in cost during the transition. Each new generation generally strives to double the symbol rate of the one that preceded it. E.g. PCIe 1.0 @ 2.5 GBaud -> PCIe 2.0 @ 5.0 GBaud -> PCIe 3.0 @ 8.0 GBaud (but with significantly more efficient encoding so the bit rate was effectively doubled), SATA 1.5 -> 3.0 -> 6.0 Gbps, DisplayPort 1.1 @ 2.7 GBaud -> DisplayPort 1.2 @ 5.4 GBaud, etc.
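A quick sketch of that doubling pattern with the line codings applied (the rates are the public per-lane figures quoted above; nothing here is specific to any one implementation):

```python
# Payload bit rate for a given symbol rate and line code (e.g. 8b/10b).
def effective_gbps(symbol_rate_gbaud, coding=(8, 10)):
    payload_bits, total_bits = coding
    return symbol_rate_gbaud * payload_bits / total_bits

generations = [
    ("PCIe 1.0", 2.5, (8, 10)),
    ("PCIe 2.0", 5.0, (8, 10)),
    ("PCIe 3.0", 8.0, (128, 130)),  # denser coding ~doubles throughput again
    ("SATA 3Gb/s", 3.0, (8, 10)),
    ("SATA 6Gb/s", 6.0, (8, 10)),
]

for name, gbaud, coding in generations:
    print(f"{name}: {gbaud} GBaud -> {effective_gbps(gbaud, coding):.2f} Gbps per lane")
```

Note how PCIe 3.0 gets its effective doubling from 128b/130b encoding rather than from another doubling of the symbol rate.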

SATA 12Gb/s could be available right now if SATA-IO decided to double the number of signaling pairs in the cable, but it would be very tricky to make equipment from different generations work together if they did. Thunderbolt is a dual-channel architecture at this point, and those channels are already operating just about as fast as they can.

And yes, I'll try to stop with the tangents now.

Sure there is, no cross-talk or interference by electrical and magnetic fields, resulting in higher throughput due to less error correction being required. On the other hand, careful how you bend that wire.

Fiber is bound by Shannon's channel capacity curve just as much as any other medium. And I'd be careful how I bent any cable carrying 10+Gbps channels, especially in light of how much they tend to cost.

Yeah I guess Thunderbolt will be perfect once it goes optical… Until then it's just a limited solution, and until then I would say it's not that much better than USB 3.0, for most uses, when considering the price.

And I was surprised to find chips in ink cartridges that keep track of ink levels without having a clue about actual ink levels… Another way to make people pay more!

A 4-channel Thunderbolt controller is capable of pumping 40 Gbps vs. a USB 3.0 controller which can only manage 4 Gbps. That's a full order of magnitude more bandwidth. Even if you're looking at a single channel, Thunderbolt is still 2.5 times faster than USB 3.0. These are not insignificant differences.
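The arithmetic behind those figures, for anyone following along (nominal signaling rates as quoted above, not measured throughput):

```python
# USB 3.0 SuperSpeed is 5 Gbps on the wire, less 8b/10b overhead;
# Thunderbolt moves 10 Gbps per channel, up to 4 channels per controller.
usb3_gbps = 5.0 * 8 / 10           # 4.0 Gbps payload
tb_channel_gbps = 10.0
tb_4ch_gbps = 4 * tb_channel_gbps  # 40.0 Gbps

print(tb_4ch_gbps / usb3_gbps)      # 10.0 -> a full order of magnitude
print(tb_channel_gbps / usb3_gbps)  # 2.5  -> even one channel is 2.5x faster
```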

Thunderbolt devices and cables are also, on average, an order of magnitude more expensive than their USB 3.0 counterparts. And since most people don't generally require more than the 4 Gbps that USB 3.0 offers, it is understandable why many folks were annoyed that Apple chose to offer Macs with Thunderbolt ports for 16 months prior to their inclusion of USB 3.0.

I would not say that Thunderbolt is a limited solution, but rather the opposite. It merely remains underexploited at this juncture. That being said, there's no reason to pay 10 times more for a Thunderbolt device if you've got another option available that can get the job done just as well.

And you do realize that the purchase prices of printers these days are entirely subsidized by the cost of the consumables? That's why a free printer is generally the most expensive printer you can own.

AidenShaw
Jul 8, 2012, 10:03 PM
Sure there is, no cross-talk or interference by electrical and magnetic fields, resulting in higher throughput due to less error correction being required. On the other hand, careful how you bend that wire.

Do you worry about cross-talk or interference on your USB cables?

Do you worry about cross-talk or interference on your GbE cables?

Do you worry about cross-talk or interference on your 1394 cables?

Do you worry about cross-talk or interference on your eSATA cables?

Since I use all of these, and the answer is "NO" for all, why shouldn't I conclude that this is one more piece of evidence that T-Bolt is a half-baked concept, rolled out before reasonable real-world testing?

If cross-talk and interference are issues for T-Bolt 1.0 - then T-Bolt 1.0 has serious flaws. We should look forward to PCI Express External to kill T-Bolt outright.

repoman27
Jul 8, 2012, 11:13 PM
Do you worry about cross-talk or interference on your USB cables?

Do you worry about cross-talk or interference on your GbE cables?

Do you worry about cross-talk or interference on your 1394 cables?

Do you worry about cross-talk or interference on your eSATA cables?

Since I use all of these, and the answer is "NO" for all, why shouldn't I conclude that this is one more piece of evidence that T-Bolt is a half-baked concept, rolled out before reasonable real-world testing?

You need a little logic review. Your conclusion is a total non sequitur.

Did the people who designed your USB cables worry about cross-talk or interference?

Did the people who designed your GbE cables worry about cross-talk or interference?

Did the people who designed your 1394 cables worry about cross-talk or interference?

Did the people who designed your eSATA cables worry about cross-talk or interference?

Since the answer to all of these questions is hopefully "YES", and they all seem to work just fine as a result, you should conclude that you would also have a similar experience if you used a Thunderbolt cable, since the engineers who designed it clearly worried about cross-talk and interference as well, and took steps to compensate for them specifically so that you wouldn't have any issues.

And once again, Thunderbolt is operating at 10.3125 GBaud. That's 29% faster than PCIe 3.0. Would you be concerned about crosstalk and interference if someone asked you to create a 2m external PCIe 3.0 x2 cable?

If cross-talk and interference are issues for T-Bolt 1.0 - then T-Bolt 1.0 has serious flaws. We should look forward to PCI Express External to kill T-Bolt outright.

http://image.ec21.com/image/molexkorea/oimg_GC01586823_CA01586876/External_PCI_Express_PCIe_.jpg

Yeah, baby! One of those is gonna look dead sexy hanging off of your Dell Precision Mobile Workstation!

I wish I could find an image that gave a better sense of scale so you could witness just how huge those suckers are. But you're right, they're gonna kill Thunderbolt outright in the Ultrabook segment. I can't wait for these products that Molex brought to market back in December of 2008 to suddenly go mainstream and prove to everyone just how much of a total marketing fail Thunderbolt is.

I'm not sure why I continue to be a victim of your trolling.

KnightWRX
Jul 9, 2012, 04:04 AM
Do you worry about cross-talk or interference on your GbE cables?


Yes.

toke lahti
Jul 9, 2012, 06:56 AM
I am familiar with teaming, but as I said, introducing higher orders of parallelism to Thunderbolt doesn't bring anything desirable to the table. The first generation of Thunderbolt controllers included 2-channel and 4-channel designs ranging in price from roughly $20-$30. The 2nd generation brought us a single-channel controller and the first Thunderbolt accessory to retail for under $30. Everyone is clamoring for cheaper Thunderbolt gear. Adding more lanes to a serial interface tends to scale up the costs associated with it in a fairly linear fashion. I don't see a lot of forum posts where people are saying, "Heck, I'd happily pay twice as much for Thunderbolt if only it had more bandwidth, but 10 Gbps x2 just won't cut it for my workflow."
If they want to keep dp coupled with tb, there is a problem and something has to be done. Current tb does not have enough bandwidth for future retina displays. Combined with Apple's obsession with an insanely limited number of simultaneous models, I can't believe that they would sell both a non-retina model (for people who want to use multiple external displays with one computer) and a retina model (for people who want to use only one external display with one computer) at the same time.

The per channel speed stays the same, but now we can fit more channels down a single pipe. What does that do to the complexity of the cross-bar switch in the controller? Say we go for a 5x increase in bandwidth, now your switch has gone from 8 ports to 24. How do we feed that from the back end? Add more protocol adapters, bump the DP adapters to DisplayPort 1.2 and increase the PCIe connection to PCIe 3.0 x16. Oops, now we need a 40-port Thunderbolt switch and a 32-lane PCIe 3.0 switch. We've got a massive, expensive, power hungry, 800-pin behemoth of a controller on our hands. Sounds perfect for a mobile device. Now everyone can pay an extra $480 for anything that includes a Thunderbolt port, and we still need to retain the copper in the connector to provide bus power and not break compatibility with DisplayPort/Thunderbolt 1.0.

The silicon is the limiting factor, not the medium. There is no "true" optical cable that can make the silicon driving it any faster, and simply increasing the parallelism of the system at this point doesn't make any sense.
I'd guess that doubling the channel speed is just too difficult, i.e. too expensive. Then the only option is more channels.

Going 5x is of course very complex, but going 2x might be pretty reasonable.
Then there'd be a need for a tb2 connector. Cables could still be active copper (like now) or passive optical. I don't think there would be a big difference in overall price: with optical, the sockets would be expensive but the cables cheaper. With optical, use wavelength-division if it's cheaper, or double the fibers; their thickness is not significant.

Legacy interoperability will need dongles. Nothing new here; we're already using dp dongles, fw dongles and ethernet dongles.

A good thing about passive optical cables could be that they could also be used with future revisions. If they double the wire count for tb2, they could add wavelength-division in tb3. With current cable prices, it would be pretty nice if one cable could last for 2 generations and you wouldn't need a new cable for every gadget.

Then, the final option: go back to the drawing board, think again about what light peak was designed for, and make the logical decision: separate dp and tb again.

And voilà, there's no need to upgrade anything!
Current dp1.2 is good enough for multiple retina displays, and current tb's 40Gbit/s bandwidth is enough for everything else.

Only problem here is that this would make it look like Apple was wrong and Sony was right, and Apple could never accept that. So we're down to Apple's PR department, where they need to come up with a believable story for how to market this mistake as an insanely amazing innovation.

repoman27
Jul 9, 2012, 08:56 AM
If they want to keep dp coupled with tb, there is a problem and something has to be done. Current tb does not have enough bandwidth for future retina displays. Combined with Apple's obsession with an insanely limited number of simultaneous models, I can't believe that they would sell both a non-retina model (for people who want to use multiple external displays with one computer) and a retina model (for people who want to use only one external display with one computer) at the same time.
...

Then, the final option: go back to the drawing board, think again about what light peak was designed for, and make the logical decision: separate dp and tb again.
...


Although you do say "future retina displays", it is worth noting that the MBPR's 2880x1800 screen is pretty much the highest resolution you can still drive via DP 1.1a. That means that a single Thunderbolt port can still drive two external displays at that resolution. Since the internal panels on the iMacs would be driven by DP 1.2 directly from the GPU, they can go "retina" if Apple so chooses. And I don't think too many people would be disappointed to see a not-necessarily-retina, 30-inch, 2880x1800, Apple Thunderbolt Display with USB 3.0 ports...

Many of the early Light Peak demonstrations involved transporting uncompressed HD display data at the same time as other data. I don't believe DP was a last minute addition in any way. If anything, I think Intel considered including an even wider array of protocol adapters within the controllers.

baryon
Jul 9, 2012, 12:45 PM
Optical is not any more perfect than copper. There is no benefit at short distances.

Yeah but I'm speaking about the technical aspect: it's just a transparent plastic wire, and nothing more, while the electric version today is not simply a cable but two microcomputers and eight or so wires. It's far more complex, thus prone to failure and higher prices.

Optical should be far simpler, and shouldn't require a pair of microcomputers in each freaking cable. Therefore I suspect that optical should be many times cheaper, even though it isn't going to be for pointless marketing reasons.

Josephiah
Jul 9, 2012, 01:45 PM
When a piece of cable is cut from a spool...
[snip]
...So, it's technically possible to put calibration equipment onto the device and then use cheap cables between devices. But I don't think you really want that to happen. :)

Thanks hchung for a perfectly pitched, well-mannered and well explained answer. :)

coolspot18
Jul 9, 2012, 02:33 PM
PC manufacturers have been very slow to adopt it; however, perhaps it will gain traction. Although it's superior, I can see it taking FireWire's place as 2nd to USB3.

A very very distant second place.

It's dead for the mainstream consumer market - USB 3.0 is the future.

----------


Picture this: in the year 2015, the ONLY ports on most computers are USB 3.0 and Thunderbolt. You no longer have a need for VGA, DVI, HDMI, FireWire, or even Ethernet. All of those could be run through Thunderbolt and suddenly it's much easier to connect devices to computers.

Won't happen because people don't want to buy several dongles for ethernet, vga, dvi, etc. People are happy with physical ports.

Problem with Thunderbolt is that it does something that not many people want for too expensive of a price. For most people, USB 2.0/3.0 are good enough - and much cheaper.

----------

I own over a thousand dollars' worth of TB peripherals.

So one RAID array? heh.

BiggAW
Jul 9, 2012, 07:37 PM
Sure there is, no cross-talk or interference by electrical and magnetic fields, resulting in higher throughput due to less error correction being required. On the other hand, careful how you bend that wire.

In theory. In practice, it doesn't really affect anything.

Yeah but I'm speaking about the technical aspect: it's just a transparent plastic wire, and nothing more, while the electric version today is not simply a cable but two microcomputers and eight or so wires. It's far more complex, thus prone to failure and higher prices.

Optical should be far simpler, and shouldn't require a pair of microcomputers in each freaking cable. Therefore I suspect that optical should be many times cheaper, even though it isn't going to be for pointless marketing reasons.

You still have to have optical transceivers on either end. Even if they're in the device.

KnightWRX
Jul 9, 2012, 07:52 PM
In theory. In practice, it doesn't really affect anything.

You've never dealt with a shoddy Cat5e installation before it seems. Of course, I've also dealt with fiber optics squeezed between floor tiles and bent at 90 degrees (good thing the SAN is multipathed).

AidenShaw
Jul 9, 2012, 08:44 PM
You've never dealt with a shoddy Cat5e installation before it seems.

First mistake - running GbE on Cat5e instead of Cat6.


I've also dealt with fiber optics squeezed between floor tiles and bent at 90 degrees (good thing the SAN is multipathed).

I bend fibre 90° and more all of the time.

The problem isn't the degrees, it's the radius.

toke lahti
Jul 10, 2012, 09:23 AM
Although you do say "future retina displays", it is worth noting that the MBPR's 2880x1800 screen is pretty much the highest resolution you can still drive via DP 1.1a. That means that a single Thunderbolt port can still drive two external displays at that resolution. Since the internal panels on the iMacs would be driven by DP 1.2 directly from the GPU, they can go "retina" if Apple so chooses. And I don't think too many people would be disappointed to see a not-necessarily-retina, 30-inch, 2880x1800, Apple Thunderbolt Display with USB 3.0 ports...
Everybody's expecting the new display to be retina. They already went retina with the iphone, ipad and mbp. Non-retina would be a disappointment, like the last mp "update" was. (Funny that on MSI's new mb there's a vga port sitting next to a tb port quite happily, while at the same time Apple can't put tb in the MP, even as it kills as many legacy ports as possible on all its other products... ;) )
I'd be interested in any high quality display from Apple with 10-bit color and a matte surface, retina or not, but MATTE.
Many of the early Light Peak demonstrations involved transporting uncompressed HD display data at the same time as other data. I don't believe DP was a last minute addition in any way. If anything, I think Intel considered including an even wider array of protocol adapters within the controllers.
If I remember correctly, intel wanted to include light peak in the usb socket, but the usb consortium didn't approve of their intentions.

And Apple should have known that it would be pushing megapixel boundaries with its displays, yet it chose a connection that is really a meta-connection that will always lag one generation behind.

Uncompressed HD is so last decade. Today you have to be able to quadruple that. Then again, making current macs "future-non-proof" could also be just a marketing decision from Apple.

KnightWRX
Jul 10, 2012, 09:28 AM
The problem isn't the degrees, it's the radius.

Bent 90 degrees in my head means over a distance of about 2 mm. My floor tiles, when they squeeze a wire, don't tend to make their radius over a distance of a few meters.

g4cube
Jul 10, 2012, 10:28 AM
If I remember correctly, intel wanted to include light peak in the usb socket, but the usb consortium didn't approve of their intentions.



I've enjoyed reading the give and take, and just barely touching some of the true issues in providing economical, high-speed cables.

First, let's take a look at Intel's original idea of an optical connection piggy-backing onto a standard USB connector. The most difficult implementation detail of this solution is the transition from fiber to metal. It is really a difficult thing to do: you need to balance alignment, power levels, and transmission loss across the boundary. It's not so easy to get a reliable connection. It's not just a matter of putting the end of the optical fiber "close" to a laser transceiver.

I think Intel simply ran out of time trying to get a reliable solution to market. It's one thing to make a few prototypes to demonstrate feasibility; it's another to make a reliable solution that can be manufactured consistently at low cost.

As for the copper cables that exist now: as we all know, it has taken quite a long time to get even to this point, with cable costs slowly declining. As has been discussed extensively, these active cables are labor intensive due to the need to calibrate each end of the cable to deliver consistent performance.

At 10Gbps speeds, each transition is a potential bottleneck. It is important to maintain a constant impedance across every transition: cable to transceiver chip; transceiver chip to connector plug; then plug to receptacle within the computer or peripheral. Within the computer or peripheral, the PCB traces must conform to transmission line characteristics, with special PCB material and controlled-radius routing of traces to the controller chips. The overall goal is to provide a controlled impedance path between the controller in the computer and the controller in a device. Every transition has the potential of preventing reliable signal transmission.
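As a concrete illustration of "controlled impedance," here's the classic IPC-2141 microstrip approximation for a single-ended PCB trace; the stackup numbers below are illustrative guesses, not from any actual Thunderbolt design:

```python
import math

# IPC-2141 microstrip approximation: Z0 = 87/sqrt(er+1.41) * ln(5.98h / (0.8w + t))
# er: dielectric constant, h: dielectric height, w: trace width, t: trace thickness.
def microstrip_z0(er, h_mm, w_mm, t_mm):
    """Approximate single-ended microstrip impedance in ohms."""
    return (87.0 / math.sqrt(er + 1.41)) * math.log(5.98 * h_mm / (0.8 * w_mm + t_mm))

# FR-4-ish stackup: er=4.3, 0.2 mm dielectric, 0.35 mm trace, 35 um copper.
print(round(microstrip_z0(4.3, 0.2, 0.35, 0.035), 1))  # ~48.6 ohms, near a 50-ohm target
```

Small changes in any of these dimensions shift the impedance, which is why every via, connector, and trace-width change along the path is a potential discontinuity.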

I mention all this because it is important to contrast this with USB 3.0. Here we have a 5Gbps signal path where the expectation is that we have low-cost connectors and no active cable. Build-quality of the cable will have a significant contribution to reliable connections. A quick perusal of support forums for any computer manufacturer with USB 3.0 support will reveal a lot of discussion about unreliable USB 3.0 connections and performance.

Important factors again are a reliable connection between controller in the computer and controller in the device. Every transition "should" try to preserve the transmission line characteristics of a controlled impedance connection from end-to-end. It's not just a wire. In the real world, every transition has the potential of preventing reliable performance.

Whether USB or Thunderbolt, reliable connections are possible. It is just a matter of quality cable construction and device or computer design. Alas, this may require an investment in new, expensive equipment to verify the initial design, and to audit the manufacturing process to assure consistency.

With volume and experience, the methods will improve, and hopefully also drive the costs down.

repoman27
Jul 10, 2012, 10:52 AM
First mistake - running GbE on Cat5e instead of Cat6.

If you're not worried about cross-talk or interference on your GbE cables, why on earth would you waste your money on Cat 6? In what way does your GbE network benefit from Cat 6 over Cat 5e?

If you're gonna bother switching to 22 AWG you might as well pony up for Cat 6a so at least you have the headroom for 10GbE out to 100m. Cat 6 always seemed to me to be nothing more than a great way to get everyone to buy all new cable, connectors and jacks while we're sitting around waiting for 10GBASE-T to come down in price/power consumption—and then they get to do it to us all over again with Cat 6a. Seriously, if you aren't bound to do so by contractual obligation, or using your UTP for something other than GbE that is bandwidth intensive like video, why would you actually pay more for Cat 6?

Cat 6 is just Monster Cat 5e.

BiggAW
Jul 11, 2012, 08:40 PM
You've never dealt with a shoddy Cat5e installation before it seems. Of course, I've also dealt with fiber optics squeezed between floor tiles and bent at 90 degrees (good thing the SAN is multipathed).

You can run anything shoddily. Wired, wireless, copper, or fiber. So that really has nothing to do with the discussion.

toke lahti
Jul 12, 2012, 08:02 AM
As we all know, it has taken quite a long time to get even to this point, with cable costs slowly declining.

At 10Gbps speeds, each transition is a potential bottleneck.

With volume and experience, the methods will improve, and hopefully also drive the costs down.
How much has the cost of the one and only tb cable on the market declined in the past 2 years?

You do know that they have sold passive HDMI cables for 9 years, which are cheaper and longer than tb cables and have a certified throughput of 5 Gbit/s?
Why is this not possible with usb3?

You also know that they are now selling 15-meter-long high-speed passive HDMI cables that are certified for 10 Gbit/s and cost less than the famous one and only tb cable?

g4cube
Jul 12, 2012, 12:17 PM
toke,

all good points.

TB has 2 bi-directional 10Gbps pipes. HDMI has 1 unidirectional pipe.

It's not impossible to make a good, reliable cable. Just hasn't happened in volume yet for TB, or for USB 3. As I mentioned it is the entire chain, not just the cable.

TB implementation for shorter cables does permit lower cost. Just look at the TB-Ethernet dongle that Apple is selling for $29.

Also, for OEMs with alternate suppliers like Sumitomo, costs are reduced, allowing some new products to include the cable; see the new drive from Buffalo and the earlier product from El Gato. Seagate also has a product configuration for the Mac that bundles a TB cable.

2 years? Not quite. Apple introduced the first TB computers in Feb 2011, and LaCie and Promise did not start shipping products until a few months later.

While there are plenty of passive HDMI cables (good and bad), there are also longer active HDMI cables. The vendors must have some reason to provide them.

I'm not in the camp saying that TB will replace USB 3.0; nor vice-versa. Both technologies serve a particular market with capabilities users want.

Both USB 3.0 and TB are still immature, but still improving, too. USB-IF is still revising the USB 3.0 spec. There still are certain classes of USB devices that can't get USB-IF certification because the specs are still evolving. TB being a non-public standard, it is more difficult to determine what is going on behind the scenes.

We'll see what happens in the coming months as PC manufacturers actually start shipping alternatives to the Mac computers.

I do agree that TB serves a niche; not the mass consumer market. The latter will still be dominated by USB peripherals. We'll just need to get to some stage of maturity.

AidenShaw
Jul 12, 2012, 03:24 PM
Originally Posted by AidenShaw
Do you worry about cross-talk or interference on your GbE cables?

Yes.

I don't... ;)

BiggAW
Jul 12, 2012, 05:35 PM
It's not impossible to make a good, reliable cable. Just hasn't happened in volume yet for TB, or for USB 3. As I mentioned it is the entire chain, not just the cable.

Clearly no one is making cheap USB 3.0 cables. Case in point:

http://www.monoprice.com/products/product.asp?c_id=103&cp_id=10303&cs_id=1030309&p_id=6504&seq=1&format=2

toke lahti
Jul 13, 2012, 08:27 AM
TB has 2 bi-directional 10Gbps pipes. HDMI has 1 unidirectional pipe.
Yes, I know. I just gave examples to show that a fast connection can be made with passive cables, and if it's more affordable, then it's more reasonable.
Funny that at present, to carry tb's bandwidth, it would be more affordable to use multiple hdmi connections, or any other connection, than tb itself. So far, most "not the best bang for the buck" solutions in the consumer market have failed. This does not predict a very bright future for tb. Remember rdram?
Maybe tb is still too complex, a bit like dp was at the beginning, and maybe tb will survive, but Apple's priorities in offering new technologies are just so far from what I would consider the best value for customers.
Usb3 has been mature enough to be offered for years. Nevertheless, they postponed it as long as possible without losing face and touted tb, which also (in a way) crippled dp.
I guess I'm not the only mac user who could have been enjoying the benefits of usb3 on my mac for years now.
It's not impossible to make a good, reliable cable. Just hasn't happened in volume yet for TB, or for USB 3. As I mentioned it is the entire chain, not just the cable.
Tb and usb3 are pretty much opposites in cable tech and quality control. One is the most expensive and rarest thing you can find, and the other is the most produced and cheapest cable on earth. Since it doesn't carry lethal voltage, there are no regulations it officially has to meet. Cable factories in China try to save every last penny they can, and of course there are a lot of quality problems.

I've purchased Seagate enclosures where the cables were packed in so tightly that they were almost broken from the beginning. No-name vendors will have even worse quality.

On the other hand, there are high quality usb cables at very affordable prices, and tb will never meet those prices.
TB implementation for shorter cables does permit lower cost. Just look at the TB-Ethernet dongle that Apple is selling for $29.
Cable length is not necessarily the defining factor here. That dongle might not have active cable circuits at both ends. It might convert tb to ethernet on the tb side, and the rest of the cable can be just passive cat wiring and a socket.
2 years? Not quite. Apple introduced the first TB computers in Feb 2011, and LaCie and Promise did not start shipping products until a few months later.
Okay, I'll correct the time to 1.5 years. What's the answer then?
While there are plenty of passive HDMI cables (good and bad), there are also longer active HDMI cables. The vendors must have some reason to provide them.
Yes, if you need longer than the specs allow, you need an active cable. Big rooms need this. I've worked in all kinds of presentations, so there's nothing new here. But the question remains: is tb really a technologically good and cost-effective choice when it needs very expensive cables even at the shortest lengths, while at the same time older standards do better?
I do agree that TB serves a niche; not the mass consumer market. The latter will still be dominated by USB peripherals. We'll just need to get to some stage of maturity.
Maybe I'm too negative or criticize Apple's decisions too harshly, but I've been really disappointed in new Macs for years now. Maybe I'll have to switch away. Every new release brings something amazing to the package, but takes away more features that are important to me.

They just gave us a Retina mac, but took away the 17" AND the matte screen AND upgradeable ram and storage. Usb3 came about 2 years too late. Hdmi came to macbooks about 5 years too late. Now they can't support external retinas because of the crippled dp in tb. The Macpro has been a big (and expensive) joke for years and will remain one for at least another year. And where are the new imacs? Will they once again take away features without making the product any better? Do we get a mac mini with usb3 before 2014?

repoman27
Jul 13, 2012, 07:23 PM
Yes, I know. I just gave examples to show that a fast connection can be made with passive cables, and if it's more affordable, then it's more reasonable.
Funny that at present, to carry tb's bandwidth, it would be more affordable to use multiple hdmi connections, or any other connection, than tb itself. So far, most "not the best bang for the buck" solutions in the consumer market have failed. This does not predict a very bright future for tb. Remember rdram?
Maybe tb is still too complex, a bit like dp was at the beginning, and maybe tb will survive, but Apple's priorities in offering new technologies are just so far from what I would consider the best value for customers.
Usb3 has been mature enough to be offered for years. Nevertheless, they postponed it as long as possible without losing face and touted tb, which also (in a way) crippled dp.
I guess I'm not the only mac user who could have been enjoying the benefits of usb3 on my mac for years now.

While it may be more affordable to use multiple HDMI cables, it would be utterly impractical. It would require four HDMI ports and cables to approximate a single Thunderbolt port and cable.

HDMI uses 3 TMDS channels operating at a maximum rate of 3.4 Gbps to create a single, half-duplex link capable of carrying 10.2 Gbps, or 8.16 Gbps after accounting for 8b/10b overhead.

DisplayPort 1.2 uses 4 lanes at up to 5.4 Gbps to create a single, half-duplex, main link of 21.6 Gbps, or 17.28 Gbps less 8b/10b overhead. There are currently zero DisplayPort 1.2 HBR panels or MST hubs on the market though, despite the first sink device silicon receiving certification on August 10, 2011. So all existing DisplayPort gear operates at only 2.7 Gbps per lane, 10.8 Gbps per link, or 8.64 Gbps without encoding overhead.

USB 3.0 SuperSpeed mode utilizes 2 dedicated signaling pairs to create a single, full-duplex 5 Gbps link, which provides 4 Gbps without 8b/10b overhead.

Thunderbolt combines 4 signaling pairs operating at 10.3125 Gbps to create 2 full-duplex channels capable of transporting 10 Gbps. That's a symbol rate 3.8x DisplayPort, 3.0x HDMI and more than 2x USB 3.0. Reliably achieving these kinds of speeds is currently a non-trivial engineering problem at power levels reasonable for a mobile device. This is the first time an interface anything like this has been included standard on a consumer PC. If your work flow can actually benefit from a 10 Gbps I/O port, Thunderbolt has the potential to be a massively less expensive solution than the other currently available options.
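Pulling those link-rate numbers into one place (a sketch; the 8b/10b codings are as stated above, and Thunderbolt's 64b/66b-style coding is my assumption for how 10.3125 GBaud yields 10 Gbps):

```python
# Aggregate raw rate and payload after line coding for each interface
# mentioned above. "lanes" = signaling pairs aggregated into one link.
interfaces = [
    # (name, lanes, per-lane GBaud, (payload_bits, total_bits))
    ("HDMI (3x TMDS @ 3.4)",      3, 3.4,     (8, 10)),
    ("DisplayPort 1.2 (4x HBR2)", 4, 5.4,     (8, 10)),
    ("DisplayPort 1.1a (4x 2.7)", 4, 2.7,     (8, 10)),
    ("USB 3.0 SuperSpeed",        1, 5.0,     (8, 10)),
    ("Thunderbolt (one channel)", 1, 10.3125, (64, 66)),  # coding assumed
]

for name, lanes, gbaud, (p, t) in interfaces:
    raw = lanes * gbaud
    payload = raw * p / t
    print(f"{name}: {raw:.4g} GBaud aggregate -> {payload:.2f} Gbps payload")
```

The printed payload figures reproduce the ones quoted in the post: 8.16, 17.28, 8.64, 4.00 and 10.00 Gbps respectively.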

Apple did not postpone inclusion of USB 3.0 for years. As soon as Intel included it, they did. 2012 will mark the first year that USB 3.0 will even achieve a 50% attach rate for new PCs and motherboards. USB 3.0 has only been available for 2.5 years from any manufacturer. Quarterly production of USB 3.0 controllers did not even exceed 10m until Q3 2011. To say that USB 3.0 was ubiquitous before this year is a gross overstatement of the situation, with only 6.5% of PCs in the wild having at least one port by the end of 2011.

And to say that Thunderbolt crippled DisplayPort is just not accurate either. Current Thunderbolt controllers are capable of carrying up to two DP 1.1a streams. Intel's HD Graphics 3000 was not even capable of outputting DP 1.2. The 2012 Macs with Cactus Ridge controllers may actually be capable of outputting DisplayPort 1.2 signals over their Thunderbolt ports when used in DP mode, but there's not much way of knowing since there are precisely zero DisplayPort 1.2 panels or MST hubs available on the market to test this with. Until there exists a display that cannot be driven by Thunderbolt due to it only supporting DP 1.1a, this simply does not affect anyone. It is of absolutely no consequence in the real world at this juncture.
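As a back-of-envelope check (my own illustrative figures, not from the article): even the largest common desktop panel of the day, 2560x1600 at 60 Hz and 24 bits per pixel, needs only about 5.9 Gbps of raw pixel data, which fits comfortably inside DP 1.1a's 8.64 Gbps effective main link even before optimizing blanking:

```python
# Rough check that DP 1.1a's effective bandwidth covers existing displays.
# Numbers are illustrative; blanking intervals are ignored for simplicity.
def pixel_rate_gbps(width, height, refresh_hz, bpp=24):
    """Raw pixel data rate in Gbps (no blanking overhead)."""
    return width * height * refresh_hz * bpp / 1e9

DP_11A_EFFECTIVE_GBPS = 4 * 2.7 * 8 / 10  # 4 lanes x 2.7 Gbps, 8b/10b -> 8.64

need = pixel_rate_gbps(2560, 1600, 60)    # e.g. a 30-inch 2560x1600 panel
print(f"{need:.2f} Gbps needed vs {DP_11A_EFFECTIVE_GBPS:.2f} Gbps available")
```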

Tb and usb3 are pretty much opposites in cable tech and quality control. One is the most expensive and rarest cable you can find, and the other is the most mass-produced and cheapest cable on earth. Since USB doesn't carry lethal voltage, there are no regulations it officially has to meet. Cable factories in China try to save every last penny they can, and of course there are a lot of quality problems.

I've purchased Seagate enclosures where the cables were packed in so tightly that they were almost broken from the beginning. No-name vendors will have even worse quality.

On the other hand, there are high-quality usb cables at very affordable prices, and tb will never match those prices.

Even being active and made by Apple, Thunderbolt cables are actually just about the cheapest cables available that can handle symbol rates of 10.3125 GBaud. And they're not terribly hard to find. Granted you can't pick one up at the local pharmacy, but then again they've only been on the market 18 months, versus over 14 years for USB. Also bear in mind that the cheapest cable Apple sells goes for $19. Thunderbolt cables aren't a bad value at $49.

Cable length is not necessarily the defining factor here. That dongle might not have active cable circuitry at both ends. It might convert tb to ethernet on the tb side, with the rest of the cable being just passive cat wiring and a socket.

The Ethernet controller is located on the Ethernet end of the dongle, along with the Port Ridge Thunderbolt controller. However, the cable length is very short, and it only needs to carry a single Thunderbolt channel, so it may well not require the same active circuitry as the 2m cable.

Okay, I'll correct the time to 1.5 years. What's the answer then?

As was the topic of the original article, Thunderbolt cables from other vendors have just entered the market this month, but as yet, none of them are selling for any less than Apple's offering. Clearly Apple has been aggressive with the price of their cable, even at $49, in order to promote adoption of the technology. Cables that operate at 10 Gbps per channel are generally found in data centers and HPC environments, not on the desktop. This is not the same as the far slower and more mature technologies you keep comparing it to, and thus it just can't be made as inexpensively.

If you don't think Thunderbolt is a good value proposition already, then you don't need 10 Gbps I/O, period. Making it cheaper won't change that.

Yes, if you need lengths longer than the specs allow, you need an active cable. Big rooms need this. I've worked in all kinds of presentations, so there's nothing new here. But the question remains: is tb really a technologically sound and cost-effective choice when it needs very expensive cables even at the shortest lengths, while older standards do better?

Maybe I'm too negative or criticize Apple's decisions too harshly, but I've been really disappointed in new Macs for years now. Maybe I'll have to switch away. Every new release brings something amazing to the package, but takes away more of the features important to me.

They just gave us a Retina mac, but took away the 17" AND the matte screen AND upgradeable ram and storage. Usb3 came about 2 years too late. Hdmi came to macbooks about 5 years too late. Now they can't support external retinas because of the crippled dp in tb. The Macpro has been a big (and expensive) joke for years and will remain one for at least about a year. And where are the new imacs? Will they once again take away features in ways that don't make the product any better? Do we get a mac mini with usb3 before 2014?

2 years ago, the same number of PCs had USB 3.0 as had Thunderbolt one year ago—i.e. a tiny minority. Mini DP to HDMI adapters are less than $10 and have been available for years. I find it kind of odd that Apple even chose to include an HDMI port on the MBPR. Point to a single display in existence that cannot be driven by the DP implementation used by Thunderbolt. And of course we will have new iMacs and minis before the year is out.

AidenShaw
Jul 13, 2012, 08:49 PM
2 years ago, the same number of PCs had USB 3.0 as had Thunderbolt one year ago—i.e. a tiny minority.

...and today a fair percentage of systems have USB 3.0, and still only a tiny minority have T-Bolt.

Your point?

Eidorian
Jul 13, 2012, 10:33 PM
...and today a fair percentage of systems have USB 3.0, and still only a tiny minority have T-Bolt.

Your point?

And now those new Thunderbolt systems also have USB 3.0...

thekev
Jul 13, 2012, 11:28 PM
I don't... ;)

Oh man that looks like a fire hazard waiting to happen. So many people on here are drinking too much thunderbolt kool-aid. Apple got a couple peripherals out the door with it, and beyond that it's unlikely that they care. The people who expected fast adoption and immediate cheap peripherals were all out of their minds given the required driver and hardware development combined with the chip cost for a product that serves a very limited market (high bandwidth requirements and stuck to a macbook pro).

Eidorian
Jul 13, 2012, 11:32 PM
Oh man that looks like a fire hazard waiting to happen. So many people on here are drinking too much thunderbolt kool-aid. Apple got a couple peripherals out the door with it, and beyond that it's unlikely that they care. The people who expected fast adoption and immediate cheap peripherals were all out of their minds given the required driver and hardware development combined with the chip cost for a product that serves a very limited market (high bandwidth requirements and stuck to a macbook pro).

Hardware is the major issue. Once you have the controllers pushing data down the cable, it appears just like a PCIe bus to the hardware at opposite ends.

thekev
Jul 13, 2012, 11:49 PM
Hardware is the major issue. Once you have the controllers pushing data down the cable, it appears just like a PCIe bus to the hardware at opposite ends.

I thought it still required a costly chip at the end of the chain. Intel's part numbers suggested that daisy chainable and non chainable devices also required different parts. What you mention would most likely facilitate development of mac pro type peripherals for the macbook pro. I know Black Magic Design made thunderbolt versions of their products, but that's still a highly specific market.

repoman27
Jul 14, 2012, 01:17 AM
...and today a fair percentage of systems have USB 3.0, and still only a tiny minority have T-Bolt.

Your point?

Well, precisely, that was my point. Two years ago, USB 3.0 had a lower attach rate than Thunderbolt does today. The six-fold increase in USB 3.0 adoption in 2012 is almost entirely due to Intel's including it in the Panther Point chipsets. Saying that Apple was holding back on USB 3.0 support in 2010 is ridiculous. The whole industry was waiting until the controllers and drivers were ready for prime time.

AidenShaw
Jul 14, 2012, 07:58 AM
Hardware is the major issue. Once you have the controllers pushing data down the cable, it appears just like a PCIe bus to the hardware at opposite ends.

To be more clear, "it appears just like a volatile (the bus comes and goes) hot-pluggable (the devices come and go) dynamic (the bus topology changes) PCIe bus".

The driver for an internal PCIe device probably needs some modifications to handle these events.

pankajdobariya
Jul 14, 2012, 08:03 AM
It is nice to see that the price of this marvellous device will come down and will be more budget friendly.

Eidorian
Jul 14, 2012, 08:13 AM
I thought it still required a costly chip at the end of the chain. Intel's part numbers suggested that daisy chainable and non chainable devices also required different parts. What you mention would most likely facilitate development of mac pro type peripherals for the macbook pro. I know Black Magic Design made thunderbolt versions of their products, but that's still a highly specific market.

It does require that costly chip at the end. Once you have that support, whatever is at each end appears as if it were on a PCIe bus. Now OS X and drivers? We know that joke.

Well, precisely, that was my point. Two years ago, USB 3.0 had a lower attach rate than Thunderbolt does today. The six-fold increase in USB 3.0 adoption in 2012 is almost entirely due to Intel's including it in the Panther Point chipsets. Saying that Apple was holding back on USB 3.0 support in 2010 is ridiculous. The whole industry was waiting until the controllers and drivers were ready for prime time.

2011 was a big year too. While not native, only bargain-basement entry-level boards were limited to USB 2.0. Vendors were happy enough to slap in an additional USB 3.0 controller for a few dollars and wire it. DMI 2.0 with all that bandwidth was the biggest factor, not the price of controllers.

To be more clear, "it appears just like a volatile (the bus comes and goes) hot-pluggable (the devices come and go) dynamic (the bus topology changes) PCIe bus".

The driver for an internal PCIe device probably needs some modifications to handle these events.

I would not try hot swapping cards from internal PCIe slots either. ;)

repoman27
Jul 14, 2012, 10:17 AM
2011 was a big year too. While not native, only bargain basement entry level boards were limited to USB 2.0. Vendors were happy enough to slap in an additional USB 3.0 controller for a few dollars and wire it. DMI 2.0 with all that bandwidth was the biggest factor and not the price of controllers.

Only 77m USB 3.0 controllers shipped in 2011 vs. 352.8m PCs worldwide. So best case would be 20% of new PCs having USB 3.0 in 2011. And of course AMD did beat Intel to the punch and manage to achieve some chipset integration in 2011. Nonetheless, there seems to be this notion that most PCs on the market last year included USB 3.0, when in actuality it was only 1 in 5.
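The "1 in 5" figure is just the ratio of those two shipment numbers, treated as an upper bound (each controller enables at most one new PC):

```python
# Upper bound on the 2011 USB 3.0 attach rate: controllers shipped divided
# by PCs shipped caps the share of new PCs that could have had a port.
usb3_controllers_2011 = 77.0e6
pcs_shipped_2011 = 352.8e6

attach_rate_cap = usb3_controllers_2011 / pcs_shipped_2011
print(f"best case: {attach_rate_cap:.1%} of new PCs")  # roughly 1 in 5
```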

And while folks are busy complaining that things they don't have any use for are too expensive, here's some more fuel for the fire: http://www.attotech.com/products/family.php?id=15

See these are the products I envisioned when Thunderbolt was introduced. Although I did imagine them to be a bit smaller and not quite as fugly...

Eidorian
Jul 14, 2012, 10:34 AM
And while folks are busy complaining that things they don't have any use for are too expensive, here's some more fuel for the fire: http://www.attotech.com/products/family.php?id=15

See these are the products I envisioned when Thunderbolt was introduced. Although I did imagine them to be a bit smaller and not quite as fugly...

And I am still waiting for a decent external Thunderbolt GPU box for a MacBook Air...

thekev
Jul 14, 2012, 04:54 PM
I would not try hot swapping cards from internal PCIe slots either. ;)

That was extremely funny.

Nonetheless, there seems to be this notion that most PCs on the market last year included USB 3.0, when in actuality it was only 1 in 5.

And while folks are busy complaining that things they don't have any use for are too expensive, here's some more fuel for the fire: http://www.attotech.com/products/family.php?id=15

See these are the products I envisioned when Thunderbolt was introduced. Although I did imagine them to be a bit smaller and not quite as fugly...

They cost about as much as I anticipated when reading the word "Atto" in the link. I wouldn't necessarily look at the overall PC market on usb3. Many of the more expensive non workstation PCs did adopt it as a selling point. Beyond that, many people on here have the false notion that Apple should always be first with these things. They're actually quite conservative with most of their additions.

And I am still waiting for a decent external Thunderbolt GPU box for a MacBook Air...

Well it would need to work with bootcamp or that would eliminate much of their potential market. It is important to note that the macbook air chips have lower total bandwidth allocation than the others. I'm not sure whether this presents an issue of greater significance. Do you think many will appear though? It seems like a somewhat limited market.

Eidorian
Jul 14, 2012, 05:18 PM
Well it would need to work with bootcamp or that would eliminate much of their potential market. It is important to note that the macbook air chips have lower total bandwidth allocation than the others. I'm not sure whether this presents an issue of greater significance. Do you think many will appear though? It seems like a somewhat limited market.

Well, as I mentioned before, the video card will be detected as if it were behind a PCIe bridge in Windows. Hot swapping might be out of the question, but rebooting tends to solve Thunderbolt oddities in Windows unless you are dealing with firmware bugs.

Sonnet Technology has some rather expensive Thunderbolt to PCIe slot solutions, and MSI keeps showcasing their GUS II without ever delivering. Beyond that I recall at least one other vendor, but a link escapes me.

I would be better off getting a MacBook Pro and using the GT 650M. The MacBook Air is limited to a single bidirectional 10 Gbps channel.

AidenShaw
Jul 14, 2012, 05:23 PM
I would not try hot swapping cards from internal PCIe slots either. ;)

That was extremely funny.

Hot-plug memory and controller cards are a feature on mid-range servers:

HP ProLiant DL380 G4 server vs. IBM eServer xSeries 346

• The DL380 G4 has optional hot plug PCI slots. The xSeries 346
has non-hot plug PCI slots only.

• The DL580 G3 offers online spare memory, hot plug mirrored
memory, and hot plug RAID memory. The xSeries 366 does not
offer hot plug RAID memory.

http://hp.vecmar.com/pdf/HP%20Proliant%20Servers%20over%20IBM.pdf

Eidorian
Jul 14, 2012, 05:26 PM
Hot plug memory and controller cards is a feature on mid-range servers:

I am not surprised. Fancy controllers, power circuitry, and the drivers to keep your OS of choice from throwing a fit when you just have to swap out a mission critical card sounds well...par for the course.

Thunderbolt? Eh...yeah I hope I can hot swap my external storage. Video card? HAH!

thekev
Jul 14, 2012, 05:59 PM
Hot plug memory and controller cards is a feature on mid-range servers:

That represents a market where downtime is planned whenever possible. Were you suggesting this as a feature that could eventually be leveraged?

Well as I mentioned before, the video card will be detected as if it was over a PCIe bridge to Windows. Hot swapping might be out of the question but rebooting tends to solve Thunderbolt oddities in Windows unless you are dealing with firmware bugs.

Sonnet Technology has some rather expensive Thunderbolt to PCIe slot solutions, and MSI keeps showcasing their GUS II without ever delivering. Beyond that I recall at least one other vendor, but a link escapes me.

I would be better off getting a MacBook Pro and using the GT 650M. The MacBook Air is limited to a single bidirectional 10 Gbps channel.

Sonnet tends to get things out quickly, but you pay for them. This has always been the case. I remember them using a somewhat limited power supply too. I didn't realize the GT 650m was that big of an improvement over the 6770 in games. I don't typically play them.

AidenShaw
Jul 14, 2012, 06:26 PM
I am not surprised. Fancy controllers, power circuitry, and the drivers to keep your OS of choice from throwing a fit when you just have to swap out a mission critical card sounds well...par for the course.

Thunderbolt? Eh...yeah I hope I can hot swap my external storage. Video card? HAH!

You could, if it's the last storage device on the chain. If not, you'd probably need to reboot.


That represents a market where downtime is planned whenever possible. Were you suggesting this as a feature that could eventually be leveraged?

T-Bolt devices are hot-plug PCIe devices - that has to be addressed unless you want to reboot every time you connect or disconnect a T-Bolt cable.

And, by the way, an ExpressCard is a hot-plug PCIe device as well - so a simplified version of the problem is already addressed. (The device is hot-plug, but the PCIe bus is static.)

repoman27
Jul 14, 2012, 06:35 PM
Well it would need to work with bootcamp or that would eliminate much of their potential market. It is important to note that the macbook air chips have lower total bandwidth allocation than the others. I'm not sure whether this presents an issue of greater significance. Do you think many will appear though? It seems like a somewhat limited market.

The MacBook Air is limited to a single bidirectional 10 Gbps channel.

The 2011 MacBook Airs used the DSL2310 Eagle Ridge controller which only allows for a single Thunderbolt port, but it has 2x 10 Gbps, full-duplex channels just like any other Thunderbolt port. It also has the same PCIe 2.0 x4 back-end as the 4-channel controllers. It is limited to a single DP sink protocol adapter and lacks a DP source protocol adapter, but from a PCIe standpoint it is not limited in any way.

The 2012 MacBook Air uses the DSL3510L Cactus Ridge 4-channel controller, which is the same one used in the new MacBook Pros.
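One way to see why the Eagle Ridge part is "not limited in any way" from a PCIe standpoint: its PCIe 2.0 x4 back-end tops out at the same rate as the 4-channel controllers'. A sketch of that arithmetic (PCIe 2.0 runs 5 GT/s per lane with 8b/10b encoding):

```python
# PCIe 2.0 back-end bandwidth shared by the 2- and 4-channel TB controllers.
def pcie2_gbps(lanes):
    """PCIe 2.0 data rate: 5 GT/s per lane, 8b/10b encoded."""
    return lanes * 5.0 * 8 / 10

backend = pcie2_gbps(4)   # x4 back-end -> 16 Gbps per direction (~2 GB/s)
tb_channel = 10.0         # one full-duplex Thunderbolt channel, in Gbps

# The PCIe back-end, not the Thunderbolt channel count, caps PCIe data
# throughput: 16 Gbps of PCIe feeds up to 2 x 10 Gbps of TB channels.
print(backend, backend > tb_channel)
```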

thekev
Jul 14, 2012, 07:30 PM
T-Bolt devices are hot-plug PCIe devices - it has to be addressed unless you want to reboot every time you connect or disconnect a T-Bolt cable.

And, by the way, an ExpressCard is a hot-plug PCIe device as well - so a simplified version of the problem is already addressed. (The device is hot-plug, but the PCIe bus is static.)

I forgot about Express cards. That is less amusing than imagining someone hot plugging a graphics card in a mac pro:p.

Chippy99
Jul 15, 2012, 04:10 AM
The thing I have issue with is this:

In your own thought-experiment, take two thunderbolt devices connected by a thunderbolt cable. Now cut the ends of the cable and put the ends inside the tb devices.

Now you have two (differently designed) thunderbolt devices, communicating perfectly with each other as before, but via a cheap, passive cable.

The benefit being, you only buy the expensive electronics once per device, not twice per cable.

Why on earth didn't they implement it like that???

To ask the question a different way, why did Silicon Image not decide to put the transmit and receive electronics into the DVI cable (or HDMI)? They could have done so, but they thought that would be a daft idea.

toke lahti
Jul 15, 2012, 10:27 AM
While it may be more affordable to use multiple HDMI cables, it would be utterly impractical. It would require four HDMI ports and cables to approximate a single Thunderbolt port and cable.
Of course, I meant that more lanes could be used in one cable. Not multiple sockets for one datapath.

HDMI uses 3 TMDS channels operating at a maximum rate of 3.4 Gbps to create a single, half-duplex link capable of carrying 10.2 Gbps, or 8.16 Gbps after accounting for 8b/10b overhead.

DisplayPort 1.2 uses 4 lanes at up to 5.4 Gbps to create a single, half-duplex, main link of 21.6 Gbps, or 17.28 Gbps less 8b/10b overhead. There are currently zero DisplayPort 1.2 HBR panels or MST hubs on the market though, despite the first sink device silicon receiving certification on August 10, 2011. So all existing DisplayPort gear operates at only 2.7 Gbps per lane, 10.8 Gbps per link, or 8.64 Gbps without encoding overhead.

USB 3.0 SuperSpeed mode utilizes 2 dedicated signaling pairs to create a single, full-duplex 5 Gbps link, which provides 4 Gbps without 8b/10b overhead.

Thunderbolt combines 4 signaling pairs operating at 10.3125 Gbps to create 2 full-duplex channels capable of transporting 10 Gbps. That's a symbol rate 3.8x DisplayPort, 3.0x HDMI and more than 2x USB 3.0. Reliably achieving these kinds of speeds is currently a non-trivial engineering problem at power levels reasonable for a mobile device. This is the first time an interface anything like this has been included standard on a consumer PC. If your work flow can actually benefit from a 10 Gbps I/O port, Thunderbolt has the potential to be a massively less expensive solution than the other currently available options.

That's a good selection of numbers.
I think the tb designers were pushing the envelope a bit too much when they made the hard decisions somewhere in 2010. Maybe they didn't realize that for a new standard to be useful, it has to be mass-adopted by consumers to become price efficient.

We are very close to the time when "pro" tech will not be more advanced than consumer tech. Pushing the envelope is getting so expensive that you simply can't do it with any market smaller than the biggest.

It seems that every generation of designers and engineers has to try to make the new thing that will "rule them all", only to see the next such thing done 5 years later.
Right now we still can't be sure whether tb will even survive that many years.

Usb3 was released in 2008, 8 years after usb2.
If usb4 comes in 2016 with greater speed than tb has now, at a fraction of tb's price, and there's no tb2 by then, tb will slowly die away.
Then we can keep speculating that if tb had had 8 pairs of wires in the cable and therefore used cheap passive cables, it would have become as popular as usb...

Apple did not postpone inclusion of USB 3.0 for years. As soon as Intel included it, they did. 2012 will mark the first year that USB 3.0 will even achieve a 50% attach rate for new PCs and motherboards. USB 3.0 has only been available for 2.5 years from any manufacturer. Quarterly production of USB 3.0 controllers did not even exceed 10m until Q3 2011. To say that USB 3.0 was ubiquitous before this year is a gross overstatement of the situation, with only 6.5% of PCs in the wild having at least one port by the end of 2011.

I fully disagree.
Apple could have included usb3 in macs in 2010.
More usb3 controllers were sold in 2010 than macs.
Now which of these minorities is significant again, and why?
There was no technical or economic reason not to do this if they wanted to maintain a state-of-the-art image; saving $5 in chip cost just isn't one. Maybe this is just one piece of legacy baggage that Apple is carrying. After fw lost to usb, they couldn't face their defeat and had to come up with something sexier than usb, and that became tb. Also, tb might look a whole lot less sexy if macs had had usb3 before tb.

Although you can't tell from controller sales numbers how much a certain port is actually used, or more importantly how much benefit it gave over the alternatives, I just don't get when something counts as widely enough accepted. Do you have to sell 40 million controllers a year before you can say a standard is adopted? It would be very interesting to know how many tb devices have been sold that really benefit from tb. 4 digits, or even 5? Tb displays and fw & ethernet dongles should be excluded from the list, since all of these would be handled more price-efficiently with usb3.

I just don't get this excuse that Apple shouldn't have included usb3 before it had a certain percentage of market adoption. How about applying the same rule to tb? Is there some reason why macs should be some kind of "average PC" and therefore have only the features most computers have? Does retina need to be ubiquitous to be justified? Comparing the pc market to macs is just a plain old apples-to-oranges comparison. When most pc crap is way cheaper than macs, you can't expect it to have any advanced tech. On the other hand, almost all pc hardware in the macs' price group has had usb3 (and bd) for the second or third year now.

The state of the Macs is getting very sad. They can't offer 17" OR matte (they were making a profit on the additional $50 charge for the matte coating, but that seems not to be enough; also, maybe offering a matte screen would imply there's something wrong with the glossy ones...) OR usb3 at the same time! Oh well, maybe some properly working usb3 expresscard drivers will appear for my matte 17"...
They can't put usb3 OR tb into the MP, while at the same time MSI is selling a motherboard with both included to the windows market...

Just compare usb3 and tb 18 months after the first products were on the shelf. Usb3 had 100x more products and a far higher adoption rate than tb. Maybe 1% of mac users will ever use tb, while 99% of users would benefit from usb3. Apple's customers who bought a mac in 2010-2012 would also have benefited from usb3 for many more years. Apple chose not to offer it, since so few even understood to ask for it, and now those customers get to buy new macbooks sooner than ever before.

And to say that Thunderbolt crippled DisplayPort is just not accurate either. Current Thunderbolt controllers are capable of carrying up to two DP 1.1a streams. Intel's HD Graphics 3000 was not even capable of outputting DP 1.2. The 2012 Macs with Cactus Ridge controllers may actually be capable of outputting DisplayPort 1.2 signals over their Thunderbolt ports when used in DP mode, but there's not much way of knowing since there are precisely zero DisplayPort 1.2 panels or MST hubs available on the market to test this with. Until there exists a display that cannot be driven by Thunderbolt due to it only supporting DP 1.1a, this simply does not affect anyone. It is of absolutely no consequence in the real world at this juncture.

Well, we are on the verge of external retina displays. If Apple brings any other kind of external display to market, it will be a disappointment. (Hmm, the next model could also be just the same as now, but with usb3. Even if the display doesn't have intel's chipset that includes usb3... ;) ) And given their situation with tb versus retina, they have to either drop external monitors from their products altogether, limit external retina displays to one per mac, or upgrade the tb specs and face a mess of angry customers whose expensive hardware turned obsolete sooner than anybody expected. None of these options is very nice for anybody.
As was the topic of the original article, Thunderbolt cables from other vendors have just entered the market this month, but as yet, none of them are selling for any less than Apple's offering. Clearly Apple has been aggressive with the price of their cable, even at $49, in order to promote adoption of the technology. Cables that operate at 10 Gbps per channel are generally found in data centers and HPC environments, not on the desktop. This is not the same as the far slower and more mature technologies you keep comparing it to, and thus it just can't be made as inexpensively.
So, maybe those cables should have stayed where they belong, and the desktop alternative should have been designed with better price efficiency?

If you don't think Thunderbolt is a good value proposition already, then you don't need need 10 Gbps I/O period. Making it cheaper won't change that.

Yes it will.
Nobody needs anything that is too expensive, and everybody wants things as fast as they can reasonably justify affording. So there's no binary answer here.

Maybe with better design and more forward thinking, all customers who bought a mac from 2010 onwards would have enjoyed usb3 AND multiple future retinas AND tb 10x cheaper than it is now. Instead, none of this happened, and it won't happen for a few years yet.

2 years ago, the same number of PCs had USB 3.0 as had Thunderbolt one year ago—i.e. a tiny minority. Mini DP to HDMI adapters are less than $10 and have been available for years. I find it kind of odd that Apple even chose to include an HDMI port on the MBPR. Point to a single display in existence that cannot be driven by the DP implementation used by Thunderbolt. And of course we will have new iMacs and minis before the year is out.
If macs had had usb3 for over 2 years now, maybe usb3 would have been adopted way faster. But yes, hdmi in the new mbp is quite an oddball. Maybe the average mac user doesn't know that a dp-hdmi dongle does the same thing, OR maybe Apple is preparing their customers for a future where, if you want 2 external displays, the retina one takes the whole tb port and the second, non-retina one can use the hdmi, OR where, if you need all the tb bandwidth for data, you can use hdmi for the display, which then takes no bandwidth away from tb.
And now those new Thunderbolt systems also have USB 3.0...
Also, a very tiny fraction of computers, but still a few million mbp's, have tb but no usb3. Unfortunately there might never be an affordable tb-usb3 dongle, or a tb hub to attach more of these dongles...
Only 77m USB 3.0 controllers shipped in 2011 vs. 352.8m PCs worldwide. So best case would be 20% of new PCs having USB 3.0 in 2011. And of course AMD did beat Intel to the punch and manage to achieve some chipset integration in 2011. Nonetheless, there seems to be this notion that most PCs on the market last year included USB 3.0, when in actuality it was only 1 in 5.
Again, there were more usb3 controllers sold than macs. Are they both insignificant then?
Again, that 20% of pc's were in the macs' price group and had usb3. At the same time the macs did not. Is that a sign of bad design in the pc's, or in the macs?

----------

To ask the question a different way, why did Silicon Image not decide to put the transmit and receive electronics into the DVI cable (or HDMI). They could have done so, but they thought that would be a daft idea.
Good question!
Either way, chips inside the device or inside the cable, you need the same number of them, so it doesn't affect the price of the whole system.

AidenShaw
Jul 15, 2012, 11:27 AM
Either way, chips inside the device or inside the cable, you need the same number of them, so it doesn't affect the price of the whole system.

If the logic were integrated into the T-Bolt controller itself (instead of active cables) you'd need zero additional chips.

Eidorian
Jul 15, 2012, 12:15 PM
If the logic were integrated into the T-Bolt controller itself (instead of active cables) you'd need zero additional chips.

Those watts would be killer. Though that is the case right now...

toke lahti
Jul 16, 2012, 08:11 AM
If the logic were integrated into the T-Bolt controller itself (instead of active cables) you'd need zero additional chips.
Good point.
I don't think those cables get very hot, so why don't they do this?
To use the same chips when optical cables come to market?

Bubba Satori
Jul 16, 2012, 09:00 AM
Clearly no one is making cheap USB 3.0 cables. Case in point:

http://www.monoprice.com/products/product.asp?c_id=103&cp_id=10303&cs_id=1030309&p_id=6504&seq=1&format=2

Don't buy any of that cheap USB3 stuff.
It must be cheap.
And that's just wrong.

repoman27
Jul 16, 2012, 12:50 PM
Of course, I meant that more lanes could be used in one cable. Not multiple sockets for one data path.

More lanes = more contacts and larger connectors, more conductors and larger, less-flexible cables, more problems with inter-lane skew, and more expense in the long run. The industry as a whole has shifted to high-speed serial interfaces rather than parallel busses for good reason. Nobody wants to see a return of the old parallel SCSI cables with 50-pin Centronics connectors.

That's a good selection of numbers.
I think the tb designers were pushing the envelope a bit too much when they made the hard decisions somewhere in 2010. Maybe they didn't realize that for a new standard to be useful, it has to be mass-adopted by consumers to become price efficient.

We are very close to the time when "pro" tech will not be more advanced than consumer tech. Pushing the envelope is getting so expensive that you simply can't do it with any market smaller than the biggest.

Apple has significant enough market share at the moment to create a robust ecosystem for both software developers and accessory manufacturers all by themselves. A standard does not need to be adopted by billions of consumers or even the majority in order to become useful or a good value proposition.

There will always be a minority segment of the market that demands more performance than is available from run-of-the-mill consumer electronics devices and are willing to pay a premium for it: the pros and enthusiasts. There is also a key group of purchasers to whom price is virtually no object for products that meet their specific requirements: enterprise and government. While the market may be far larger for 1080p HDTVs than 4K medical imaging monitors, the profits are way better for the latter. Trickle down will continue to happen as it always has. 10.3125 Gbps per channel is old news to the telecom industry, who will be deploying 25-28 Gbps this year and paying thousands of dollars per port to do so. In 2014 Thunderbolt will get a speed bump and consumers will have access to those same speeds for a couple hundred dollars per port.

Apple only has a little more than 5% of the global market and around 12% of the US market for PCs based on unit shipments, but they take in over 35% of the operating profit for the entire industry. Apple needs to differentiate itself from the competition in order to justify higher unit prices and maintain those higher profit margins. Thunderbolt is just one more way to reinforce the perception that their devices are more elite, capable and valuable.

USB 3 was released in 2008, 8 years after USB 2.
If USB 4 comes in 2016 with greater speed than TB has now, at a fraction of TB's price, and there's no TB 2 by then, TB will slowly die away.
Then we can keep speculating that if TB had had 8 pairs of wires in the cable, and therefore used cheap passive cables, it would have become as popular as USB...

But USB 3.0 devices were not available until 2010, two years later. Do you think USB would be as popular today if it had been UPB (Universal Parallel Bus) instead? Also note that USB is massively popular, whereas SuperSpeed USB is just beginning to gain momentum. There are 6 billion USB devices in use, but even by the end of 2012, only 7% will be of the USB 3.0 variety.

I disagree fully.
Apple could have included USB 3 in Macs in 2010.
And more USB 3 controllers were sold in 2010 than Macs.
Now which one of these minorities is significant again, and why?
There's no technical or economic reason not to have done this, if they wanted to maintain a state-of-the-art image. Saving a $5 chip in cost just isn't one. Maybe this is the one piece of legacy baggage that Apple is carrying: after FW lost to USB, they couldn't face their defeat and had to come up with something sexier than USB, and that became TB. Also, TB might look a whole lot less sexy if Macs had had USB 3 before TB.

According to In-Stat, only 14 million USB 3.0 controllers shipped in 2010. Apple reported sales of 14.43m Macs in 2010. Also, Apple sells primarily notebooks, and the USB-IF's numbers show only 2 million of those controllers ending up in notebook PCs, largely because the early silicon wasn't that great, especially regarding power consumption.

Why would Apple spend time developing and testing a driver for silicon that didn't meet their design criteria and was not available in sufficient quantity? Seriously, not even joking, the earliest Apple could have adopted USB 3.0 was mid 2011, and they would have had to lean hard on their suppliers to do so. Whether they should have done so or not at that point is entirely debatable, but saying that they could have added USB 3.0 in 2010 or even early 2011 is just not in line with reality.

The problem with standards is, well, you need to wait until things are standardized. Despite USB 3.0 being ratified in December of 2008, the first 4-port controller wasn't even certified until April of 2011.

Although you can't tell from controller sales numbers how much a certain port actually gets used, or more importantly how much benefit it offers compared to the alternatives, I just don't get when something counts as widely enough accepted. Do you have to sell 40 million controllers a year before you can say a standard is adopted? It would be very interesting to know how many TB devices have been sold that really benefit from TB. 4 digits, or even 5? TB displays and FW and Ethernet dongles should be excluded from the list, since all of those could be handled more price-efficiently with USB 3.

The industry generally relies on a metric known as "attach rate" (a term which I now realize I have been misusing for some time). It's the number of complementary devices sold for each primary product sold. As far as I'm aware, these figures just haven't been made public for Thunderbolt, but without a doubt they are lower than for USB 3.0 at this point.
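The attach-rate metric described above is just a ratio, which a trivial sketch can make concrete. The figures below are made-up placeholders for illustration, not real market data:

```python
# Illustrative attach-rate calculation. All numbers here are
# hypothetical placeholders, not actual Thunderbolt or USB figures.
def attach_rate(devices_sold: int, hosts_sold: int) -> float:
    """Complementary devices sold per primary (host) product sold."""
    return devices_sold / hosts_sold

# Example: 600,000 peripherals sold against 1,000,000 host machines
# gives an attach rate of 0.6, i.e. 60%.
print(attach_rate(600_000, 1_000_000))  # 0.6
```

So a claimed "60% attach rate after 18 months" simply means roughly six peripherals sold for every ten hosts.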

My guess is that you'd be solidly into 6-digit territory for the number of Thunderbolt devices sold. And discounting the Apple Thunderbolt Display would be asinine. As would not including Ethernet dongles, unless you can point to a currently available USB 3.0 GbE adapter. And replacing the functionality of FireWire gear you already own with USB 3.0 stuff is not only potentially impossible at this point, but also probably more expensive than buying a $29 adapter. However, the adapters aren't even available yet, so it makes no difference whether you include them or not.

I just don't get this excuse that Apple shouldn't have included USB 3 before it reached a certain percentage of market adoption. How about applying the same rule to TB? Is there some reason why Macs should be some kind of "average PC" and therefore have only the features that most computers have? Does Retina need to be ubiquitous to be justified? Comparing the PC market to Macs is just the same stupid old apples-to-oranges comparison. When most PC crap is way cheaper than Macs, you can't expect it to have any advanced tech. On the other hand, almost all PC hardware in the Macs' price bracket has had USB 3 (and BD) for the second or third year now.

I never argued that Apple needed to wait for a certain level of adoption before adding USB 3.0, or should only include mainstream technologies in their products. Just like everyone else, though, they can only purchase what their suppliers can produce. When Apple moves first, they can be aggressive and buy up all of a certain item. When it comes to things like USB 3.0 host controllers, or 4G LTE modem and baseband chips, there are a lot of other prospective customers, and any drawbacks due to early implementations can make waiting it out for a round or two seem like a better idea.

Just compare USB 3 and TB 18 months after the first products hit the shelves. USB 3 had 100x more different products and a far higher adoption rate than TB. Maybe 1% of Mac users will ever use TB, while 99% of users would benefit from USB 3. Apple's customers who bought Macs in 2010-2012 would also have benefited from USB 3 for many years to come. Apple chose not to offer it, since so few understood to even ask for it, and now they get to buy new MacBooks sooner than ever before.

But to be more realistic, the attach rate numbers are probably at least 2% for Thunderbolt, and according to the USB-IF were only around 60% for USB 3.0 after the first 18 months. The number of certified devices was only 250 for USB 3.0 vs. 50 for Thunderbolt, so that would be 5x more different products and 30x the attach rate.

Also, there is still only a very limited range of device silicon available for USB 3.0. You have thumb drives, card readers, SATA bridges, 4-port hubs, cameras and a media player. In the first 18 months, how many USB 3.0 to 6Gb/s SAS/SATA bridges, GbE or 10GbE adapters, HD video interfaces, Fibre Channel adapters, PCIe or ExpressCard expansion chassis, or displays even remotely like the ATD were available? Despite some initial overlap, USB 3.0 and Thunderbolt will end up being used for very different purposes. Basically, if a job can be done just as well with USB 3.0, why would you pay extra to do it with Thunderbolt?

Well, we are on the verge of external Retina displays. If Apple brings any other kind of external display to market, it will be a disappointment. (Hmm, the next model could also be the same as now, but with USB 3. Even if the display doesn't have Intel's chipset that includes USB 3... ;) ) And given their situation with TB versus Retina, they have to either drop external monitors from their lineup altogether, or limit external Retina displays to one per Mac, or upgrade the TB specs and deal with a mess of angry customers whose expensive hardware just turned obsolete before anybody expected. None of the options is very nice for anybody.

Do you really think there will be crowds of angry customers with pitchforks when they discover they can only run a single external display with a resolution higher than 6MP? I’d love to see the performance of an MBA trying to drive more than that many pixels. According to your logic, consumers should only be allowed to have 1920x1080 displays anyway, because they are so much cheaper due to economies of scale.

So, maybe those cables should have stayed where they belong and the desktop alternatives should be designed with better price efficiency?

Yes it will.
Nobody needs anything that is too expensive, and everybody wants things as fast as they can reasonably justify affording. So there's no binary answer to this.

Ahh, but don't you see that many people already can afford and justify the additional expense of Thunderbolt? Having a choice is a good thing, and I'm definitely glad that we finally have the option of using either USB 3.0 or Thunderbolt on the same platform.

If Macs had had USB 3 for over 2 years now, maybe USB 3 would have been adopted way faster. But yes, HDMI in the new MBP is quite an oddball. Maybe the average Mac user doesn't know that a DP-HDMI dongle does the same thing, OR maybe Apple is preparing their customers for the idea that if you want 2 external displays, the Retina one takes the whole TB port and the second, non-Retina one can use the HDMI, OR that if you need all the TB bandwidth for data, you can use HDMI for the display, which then doesn't take any bandwidth away from TB.

Even driving 2 daisy-chained 2560x1440 ATDs, PCIe bandwidth over Thunderbolt is only reduced by about 16%, and just in the outbound direction and only for devices attached to that chain. You constantly exaggerate the impact of DP on Thunderbolt's PCIe performance, when the odds of it significantly affecting any real-world workflows are slim to none. How would a USB 3.0 controller fare trying to drive two 2560x1440 displays while writing more than 8.4 Gbps to an external RAID array? Oh right, it can't do either of those things anyway...

Also, a very tiny fraction of computers, but still a few million MBPs, have TB but no USB 3. Unfortunately there might never be an affordable TB-USB 3 dongle, or a TB hub to attach more of these dongles...

But it is far more likely that an inexpensive Thunderbolt to USB 3.0 dongle based on the Port Ridge controller and leveraging Apple's drivers will see the light of day as soon as Mountain Lion and/or the 10.7.5 update become available to the general public.

Again, there were more USB 3 controllers sold than Macs. Are they both not significant then?
Again, 20% of PCs were in the Macs' price group and had USB 3, while at the same time the Macs did not. Is this a sign of bad design in the PCs, or in the Macs?
Those controllers were still mostly finding their way onto enthusiast motherboards or drop-in PCIe expansion cards for desktop PCs, by a factor of 2:1 compared to notebook PC deployments. Once again, it is very clear that integrating third-party USB 3.0 controllers into Intel PCs with form-factors similar to Apple's product line did not happen in any significant way until mid-2011 or later. Until someone at Apple steps forward and recounts the tale of why the decision was made to wait until 2012, we can only speculate. I highly doubt that it was the result of the controllers being too expensive, or Apple's engineers not being up to the design task.

Good question!
Either way, whether the chips sit inside the device or inside the cable, you need the same number of them, so it doesn't affect the price of the whole system.

The thing I have issue with is this:

In your own thought-experiment, take two thunderbolt devices connected by a thunderbolt cable. Now cut the ends of the cable and put the ends inside the tb devices.

Now you have two (differently designed) thunderbolt devices, communicating perfectly with each other as before, but via a cheap, passive cable.

The benefit being, you only buy the expensive electronics once per device, not twice per cable.

Why on earth didn't they implement it like that???

To ask the question a different way, why did Silicon Image not decide to put the transmit and receive electronics into the DVI cable (or HDMI). They could have done so, but they thought that would be a daft idea.

If the logic were integrated into the T-Bolt controller itself (instead of active cables) you'd need zero additional chips.

Intel's Light Peak controller, which became the Light Ridge Thunderbolt controller, was designed to be connected to an on-board optical transceiver. The output from the optical engine was then routed via fiber to a hybrid Cu/optical port containing a specially designed lens which allowed the use of passive optical cables. This system had many drawbacks which made commercialization challenging or downright impractical. Sony was the only OEM to go this route and only for a laptop that started at $3000.

Using the controller as-is with the active copper cables we have today was a far more flexible, cost-effective and expedient solution than going back to the drawing board and taping out a 120mm^2 controller all over again just to integrate a new PHY. I realize there is a lot of resistance to the notion that putting the logic in the cable may have been the best all-around solution, but there are reasons why it is the industry norm at 10.3125 GBaud per lane.

Active circuitry also allows the use of very thin wire for the signaling pairs, as small as 40 AWG, at power levels as low as 0.7 W per lane. Compare that to Intel's latest X540 10GBASE-T controller with integrated MAC/PHY. The 2 port controller weighs in at 625mm^2 and uses 6.25 W max per port when paired with UTP cables using 22 AWG wire. This is advertised as one of the lowest power 10GBASE-T solutions on the market. Meanwhile SFP+ modules get the job done using less than 1 W per port and allow considerable flexibility regarding the type of media used.

AidenShaw
Jul 16, 2012, 07:40 PM
More lanes = more contacts and larger connectors, more conductors and larger, less-flexible cables, more problems with inter-lane skew, and more expense in the long run. The industry as a whole has shifted to high-speed serial interfaces rather than parallel busses for good reason.

But T-Bolt is 4 parallel lanes, and 16 parallel lanes is the standard connector for most graphics cards.

Clearly skew is not a serious problem - when you have multiple packet-oriented serial interfaces running as a team, reassembling the packets in the correct order is a solved problem.

(And for people who don't realize it, T-Bolt takes 4 parallel signals, multiplexes them onto one serial signal, then de-muxes them back to 4 parallel lanes.)


According to In-Stat, only 14 million USB 3.0 controllers shipped in 2010. Apple reported sales of 14.43m Macs in 2010. Also, Apple sells primarily notebooks, and the USB-IF's numbers show only 2 million of those controllers ending up in notebook PCs, largely because the early silicon wasn't that great, especially regarding power consumption.

I'd love to see the statistics for USB 3.0 adoption on systems based on selling price.

Since there are no inexpensive Apples (although Apples seem to be getting "cheaper" on the quality front, but not the price front), I would bet that when you look at PCs comparably priced to Apples a much higher proportion of them would have USB 3.0 than the stats that you quote.

At a meeting today I looked around at the laptops. Eight out of ten of them had USB 3.0 (the "blue port" - very easy to spot). The two that didn't had a half-eaten apple on the lid.

Stop apologizing for Apple's failure to support USB 3.0 in favor of its higher-priced connection. (I almost said "alternative", but T-Bolt and USB 3.0 have clearly different goals - and for most people USB 3.0 aligns with their goals.)

Eidorian
Jul 16, 2012, 08:14 PM
But T-Bolt is 4 parallel lanes, and 16 parallel lanes is the standard connector for most graphics cards.

Clearly skew is not a serious problem - when you have multiple packet-oriented serial interfaces running as a team, reassembling the packets in the correct order is a solved problem.

(And for people who don't realize it, T-Bolt takes 4 parallel signals, multiplexes them onto one serial signal, then de-muxes them back to 4 parallel lanes.)
Did we ever really complain about Thunderbolt's lack of lanes or bandwidth? I know we would "love" 16 lanes, but even at x4 you can push a near-flagship video card. If you are that concerned about storage speeds on a notebook when you have 10 Gbps bidirectional...

repoman27
Jul 17, 2012, 01:46 AM
But T-Bolt is 4 parallel lanes, and 16 parallel lanes is the standard connector for most graphics cards.

Clearly skew is not a serious problem - when you have multiple packet-oriented serial interfaces running as a team, reassembling the packets in the correct order is a solved problem.

(And for people who don't realize it, T-Bolt takes 4 parallel signals, multiplexes them onto one serial signal, then de-muxes them back to 4 parallel lanes.)

No, most Thunderbolt controllers have a PCIe 2.0 x4 back end, but Thunderbolt is in no way 4 parallel lanes. Furthermore, we were discussing Thunderbolt cables, which carry two, full-duplex channels. As far as I know, data is not generally striped over the two channels, they operate as independent serial links.

How is inter-lane skew a solved problem? It still needs to be within certain bounds for the system to work, and as the frequencies go up, the UIs get tiny and it's all too easy to fall outside those bounds. Granted I probably should have picked something more obviously problematic such as differential skew, clock jitter, or the increased crosstalk that comes along with higher degrees of parallelism.
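To put the "tiny UIs" remark above in perspective, here is a quick back-of-the-envelope calculation of the unit interval (the time one bit occupies on the wire) at the 10.3125 GBaud line rate mentioned earlier in the thread. This is simple arithmetic for illustration, not output from any real signal-integrity tool:

```python
# Unit interval (UI) arithmetic: at multi-gigabaud line rates each
# bit occupies well under a nanosecond, so skew, jitter and crosstalk
# consume a large fraction of the available timing budget.
def unit_interval_ps(baud: float) -> float:
    """Duration of one symbol in picoseconds at the given baud rate."""
    return 1e12 / baud

# At 10.3125 GBaud (the per-channel rate discussed above), one bit
# lasts less than 100 picoseconds on the wire.
print(round(unit_interval_ps(10.3125e9), 1))  # 97.0
```

A skew of even a few tens of picoseconds between lanes would therefore eat a substantial slice of a single bit time, which is the crux of the argument above.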

(And for those of you who don't realize it, AidenShaw's explanation of Thunderbolt is oversimplified to the point of being completely inaccurate. And yes, I do realize that the Thunderbolt protocol adapters are essentially SerDes.)

I'd love to see the statistics for USB 3.0 adoption on systems based on selling price.

Since there are no inexpensive Apples (although Apples seem to be getting "cheaper" on the quality front, but not the price front), I would bet that when you look at PCs comparably priced to Apples a much higher proportion of them would have USB 3.0 than the stats that you quote.

At a meeting today I looked around at the laptops. Eight out of ten of them had USB 3.0 (the "blue port" - very easy to spot). The two that didn't had a half-eaten apple on the lid.

Stop apologizing for Apple's failure to support USB 3.0 in favor of its higher-priced connection. (I almost said "alternative", but T-Bolt and USB 3.0 have clearly different goals - and for most people USB 3.0 aligns with their goals.)

I'm going to try to navigate your response, littered as it is with trolling comments, as deftly as possible here. My overly wordy and pointlessly statistic-laden response to toke lahti was an attempt to curb some of the hyperbolic rhetoric I often see on the forums. Namely, if you want to say that Apple held back on including USB 3.0 controllers in their PCs, there is only a certain window of time for which the argument is valid. By any reasonable interpretation of the available evidence, that window lies between May of 2011 and May of 2012.

Apple could not have shipped Macs with USB 3.0 in 2008 because the standard didn't exist.

Apple could not have shipped Macs with USB 3.0 in 2009 because no host controllers existed.

Apple had only one option for a USB 3.0 host controller in 2010, the NEC/Renesas µPD720200, which was not at all well suited for notebook or compact form factor PCs. It also wasn't available in quantities that would allow Apple to shift entirely to USB 3.0 across their entire product line. Arguing that Apple should have adopted at this point is pretty daft.

Apple actually could have shipped Macs with USB 3.0 in 2011 using discrete host controllers because certified controllers were now available from several vendors. In particular, Renesas introduced their 3rd gen controllers in March of 2011 and began ramping up production to levels sufficient for Apple shortly thereafter.

All of the Macs introduced thus far in 2012 do have USB 3.0.

So basically, the mid to late 2011 Macs could have had USB 3.0 if Apple opted to use discrete host controllers. Apple held out for one generation until an integrated solution was available from Intel.

At this point we have crossed the Rubicon and more PCs will ship with USB 3.0 than without. It's no longer a premium feature, it's the norm.

While USB 3.0 may align with the goals of most people, Apple doesn't necessarily try to sell to the majority. They target a wealthier more educated demographic. One that thinks differently. Even if it's a bunch of BS, it seems to be working for them.

AidenShaw
Jul 17, 2012, 09:45 AM
No, most Thunderbolt controllers have a PCIe 2.0 x4 back end, but Thunderbolt is in no way 4 parallel lanes. Furthermore, we were discussing Thunderbolt cables, which carry two, full-duplex channels. As far as I know, data is not generally striped over the two channels, they operate as independent serial links.

As I said, the 4 parallel PCIe lanes are encapsulated onto T-Bolt, then split back out to 4 parallel PCIe lanes on the other side.


How is inter-lane skew a solved problem?

If PCIe x16 is successful, it must have been solved. I suspect having packetized data helps - since each packet could have a sequence number or timestamp so that it would be easy to de-skew. (edit: see attachments)


(... And yes, I do realize that the Thunderbolt protocol adapters are essentially SerDes.)

Right. Doesn't that mean that I'm essentially correct?


http://www.pcisig.com/developers/main/training_materials/get_document?doc_id=d9967efa833bbf0f223276571d647482be183e18

MagnusVonMagnum
Jul 17, 2012, 03:16 PM
Apple had only one option for a USB 3.0 host controller in 2010, the NEC/Renesas µPD720200, which was not at all well suited for notebook or compact form factor PCs. It also wasn't available in quantities that would allow Apple to shift entirely to USB 3.0 across their entire product line. Arguing that Apple should have adopted at this point is pretty daft.


You sound pretty daft. Apple COULD have and SHOULD have adopted USB 3.0 for iMacs, the Mac Mini and the Mac Pro at this stage, NONE of which have it as of this writing. Instead, they are STILL behind and that irritates the heck out of some of us. I've been waiting to replace my PowerMac Server with a Mac Mini server for some time now and the ONLY thing holding me up is the lack of USB 3.0.

Worse yet, that recent "update" to the Mac Pro in particular was PATHETIC. They would have been better off not doing anything at all than cheesing off all the professionals waiting for an update and then offering them crap. The VERY least they could have done was lower the price on the thing. Apple seems to think high iOS and notebook sales mean they can just ignore the rest of their lines, and sadly they are wrong. They are losing the professional market entirely. One might argue it's not worth it, but that's like saying car companies like Subaru shouldn't enter rally races since they aren't selling THAT car. It brings the entire name and respect level up, both of which are steadily falling as Apple drops professional features left and right. Apple seems to be aiming to become the Radio Shack of the 21st Century, and that's SAD given the amount of capital they have, which could ensure ALL those lines are up to date before anyone else.


Apple actually could have shipped Macs with USB 3.0 in 2011 using discrete host controllers because certified controllers were now available from several vendors. In particular, Renesas introduced their 3rd gen controllers in March of 2011 and began ramping up production to levels sufficient for Apple shortly thereafter.


And they didn't do that either. :rolleyes:


While USB 3.0 may align with the goals of most people, Apple doesn't necessarily try to sell to the majority. They target a wealthier more educated demographic. One that thinks differently. Even if it's a bunch of BS, it seems to be working for them.

Bullcrap. They're selling iPhones and iPods and iPads to the lowest common denominator at this point (well, maybe the 2nd lowest if you count Samsung's devices as the lowest). The truth is they're more interested in form factor (thin thin thin thin thin) and style (unibody aluminum, glass, etc.) than UTILITY, and there is no obvious market for that given that no one has asked for an iPhone entirely encased in glass, or even thinner MacBook Pros at the cost of ports and drive options. No, they seem to have gotten that from the late Mr. Jobs, who was OBSESSED with thin (to no obvious benefit). And frankly, I wish they'd stop. Looks are OK, but don't destroy functionality to shave an extra 1/8" off the thing. That's STUPID. Sadly, they've done just that lately (and for a lot longer with things like graphics capability).

repoman27
Jul 17, 2012, 10:06 PM
As I said, the 4 parallel lanes PCIe lanes are encapsulated onto T-Bolt, then split back out to 4 parallel PCIe lanes on the other side.

Existing Thunderbolt controllers have connections for either 2 or 4 PCIe 2.0 lanes, 0 to 2 DisplayPort sources, sometimes a DisplayPort sink, and 1, 2 or 4 Thunderbolt channels.

Let's take the DSL3510L Cactus Ridge 4C controller as an example, since it has a little of everything and that's what is in the 2012 Macs.

It has connections for 4 PCIe 2.0 lanes, which lead to an on die 8-lane, 5-port PCIe 2.0 switch. There is no requirement for those PCIe lanes to be used in parallel or even connected to anything at all, as in the case of a hypothetical DisplayPort only Thunderbolt device. They can be configured as a single 4-lane link, 1 or 2 2-lane links, 1 to 4 single lanes, or one 2-lane link plus 1 or 2 single lanes. Not only do the links not have to utilize lanes in parallel, they can even operate at different speeds. You could connect a PCIe 2.0 x2 SATA controller and a PCIe 1.1 x1 GbE controller and each would operate at the highest rate possible.

The PCIe switch in the Thunderbolt controller has to deserialize, decode and descramble the incoming symbols. Then it has to un-stripe any bytes from lanes that were operating in parallel, strip off the framing characters used by the physical layer, and hand the data off to the data link layer. The DLL then has to disassemble and sequence the link layer packets, perform error checking, process any DLLP's, and then hand the transaction layer packets up to the transaction layer. The packets can then be forwarded to the correct destination port via the switch, and the whole process is reversed.

The upstream port of the PCIe switch is linked to the PCIe to Thunderbolt protocol adapter, where the TLPs are encapsulated as Thunderbolt packets, which are then forwarded to the appropriate Thunderbolt channel by the Thunderbolt crossbar switch.

Meanwhile, the DisplayPort source signals enter the Thunderbolt controller and are demuxed, with one set of signals bypassing the Thunderbolt logic and being fed to the DisplayPort legacy PHY for each Thunderbolt port. The other set of signals go to the DisplayPort source to Thunderbolt protocol adapters. There the DP main link signals are deserialized, decoded, descrambled, deskewed, decrypted, demuxed and fed to the main stream and secondary data packet unpackers. After unstuffing, unpacking and clock recovery, the packets can be reframed as Thunderbolt packets and forwarded to the appropriate Thunderbolt channel by the Thunderbolt crossbar switch. Aux channel data appears to be relayed as well, so this too must be digested and packetized for transport over Thunderbolt. And of course there is a Thunderbolt to DisplayPort sink protocol adapter that does the same in reverse.

The Thunderbolt packets headed outbound are handed down to the data link layer which I presume is also responsible for the "novel time synchronization protocol" that Intel advertises. From there it is off to the Thunderbolt PHY, which encodes and serializes the data stream into four individual channels and sends two each to a pair of muxes which can switch between them and the legacy DP signals. The selected signals are then output from the controller and travel a very short distance to however many Thunderbolt ports the device offers.

That's one side of the Thunderbolt equation, and I pretty much just glossed over it, which is why I felt you might be oversimplifying things a wee bit too much.

If PCIe x16 is successful, it must have been solved. I suspect having packetized data helps - since each packet could have a sequence number or timestamp so that it would be easy to de-skew.

Well, the striping happens at the byte level, not the packet level though. And to be honest, PCIe 1.0 was designed to solve the inter-lane skew issues that PCI had at higher frequencies. It also wasn't much of an issue at the 5 GHz speeds of PCIe 2.0. At 8 GHz things start to get a little tight, and it is once again problematic at 10 GHz, which is why PCIe 3.0 didn't just double the frequency again.

I believe a bit of skew is intentionally introduced to prevent voltage fluctuations caused by all of the lanes signaling at the same exact time. That means that at high enough frequencies, additional skew from unequal trace lengths can push things to the point where all of the symbols may not arrive within the necessary window to correctly de-stripe the data coming off the lanes. If error correction can't fix things, then you have to retransmit the whole packet again. With traces on a motherboard (i.e. PCIe) this is generally quite manageable. With external cables, it's a whole other kettle of fish.
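The frequency argument above boils down to simple arithmetic on the bit time: each PCIe generation shrinks the per-bit window, tightening the skew budget. A quick sketch (transfer rates per the published PCIe generations; PCIe 3.0 went to 8 GT/s with more efficient coding rather than doubling to 10 GT/s, as the post notes):

```python
# Bit time (unit interval) per PCIe generation, in picoseconds.
# Smaller UIs leave less room for inter-lane skew, differential
# skew and jitter before de-striping fails.
rates_gtps = {"PCIe 1.0": 2.5, "PCIe 2.0": 5.0, "PCIe 3.0": 8.0}
for gen, rate in rates_gtps.items():
    ui_ps = 1000.0 / rate  # 1 / (GT/s) expressed in picoseconds
    print(f"{gen}: {ui_ps:.0f} ps per UI")
# PCIe 1.0: 400 ps, PCIe 2.0: 200 ps, PCIe 3.0: 125 ps
```

Going from 400 ps down to 125 ps per bit explains why trace-length matching that was comfortable for PCIe 1.0 becomes progressively harder, and why external cables are, as the post puts it, a whole other kettle of fish.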

You sound pretty daft. Apple COULD have and SHOULD have adopted USB 3.0 for iMacs, the Mac Mini and the Mac Pro at this stage, NONE of which have it as of this writing. Instead, they are STILL behind and that irritates the heck out of some of us. I've been waiting to replace my PowerMac Server with a Mac Mini server for some time now and the ONLY thing holding me up is the lack of USB 3.0...

I've felt like I was watching paint dry all this year waiting for Apple to roll out their new models, and the Mac Pro spec bump was aggravating as all heck. I am not saying that USB 3.0 isn't long overdue on Macs or that they could not have done it before now. I am not trying to defend all of their design decisions over the past 3 years.

All I was saying is that mid-2011 is about as early as they could have added USB 3.0 to Macs, which was still over a year ago. Claiming that they could have done it years before that is ridiculous. The product would have sucked if they had done that.

How many PCs have you seen from 2010 where all of the USB ports are SuperSpeed? There really weren't any, because people need at least a couple USB ports that actually work. The early silicon and drivers were flakey and had compatibility issues.

toke lahti
Jul 18, 2012, 07:35 PM
More lanes = more contacts and larger connectors, more conductors and larger, less-flexible cables, more problems with inter-lane skew, and more expense in the long run. The industry as a whole has shifted to high-speed serial interfaces rather than parallel busses for good reason. Nobody wants to see a return of the old parallel SCSI cables with 50-pin Centronics connectors.
Care to tell us what your golden number of pins is, then?
Nobody's suggested 50 pins for tb.
But those who design with price efficiency in mind can add some lanes to the soup. Was dual-link DVI some kind of problem?
Did HDMI make terrible mistake by adding dual link (type B connector) to their 1.3 spec?
Tb has 20 pins now. Would 24 or 28 pins have made it a big disaster?
If tb v1 had been specced at 3 channels and 6 Gbit/s per channel, nobody would have noticed any more bottlenecks than they do now.
Or with 4 channels at 6 Gbit/s it would be faster than it is now.
And both of these could have been used with passive cables, and all tb stuff would have been adopted at greater scale and become more affordable faster.
Also it would be more future-proof if it had both more channels and more bitrate per channel to grow in the future.
There is an optimum in mass production between cost and rocket science. Clearly Apple still hasn't got this.
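For comparison, the raw aggregate bandwidth of these hypothetical lane configurations versus the actual Thunderbolt v1 design works out as follows (a back-of-envelope sketch, per direction, ignoring protocol overhead):

```python
# Aggregate bandwidth of the actual Thunderbolt v1 design versus the
# hypothetical wider-but-slower configurations proposed above.
configs = {
    "TB v1 (actual)": (2, 10),  # 2 channels x 10 Gb/s each
    "3 x 6 Gb/s":     (3, 6),
    "4 x 6 Gb/s":     (4, 6),
}
for name, (lanes, rate) in configs.items():
    print(f"{name}: {lanes * rate} Gb/s aggregate")
```

So 3 x 6 Gb/s (18 Gb/s) would come in slightly under the shipping 2 x 10 Gb/s design, and 4 x 6 Gb/s (24 Gb/s) slightly over it; whether passive cabling would actually work at 6 Gb/s over the required lengths is the open question this trade-off hinges on.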
Apple only has a little more than 5% of the global market and around 12% of the US market for PCs based on unit shipments, but they take in over 35% of the operating profit for the entire industry. Apple needs to differentiate itself from the competition in order to justify higher unit prices and maintain those higher profit margins. Thunderbolt is just one more way to reinforce the perception that their devices are more elite, capable and valuable.
[...]
But USB 3.0 devices were not available until 2010, two years later. Do you think USB would be as popular today if it had been UPB (Universal Parallel Bus) instead? Also note that USB is massively popular, whereas SuperSpeed USB is just beginning to gain momentum. There are 6 billion USB devices in use, but even by the end of 2012, only 7% will be of the USB 3.0 variety.
Huh, only 420 MILLION usb3 devices by the end of 2012!
How many would there have to be before you recognised usb3 as the biggest thing at the moment in interconnecting devices?
According to In-Stat, only 14 million USB 3.0 controllers shipped in 2010. Apple reported sales of 14.43m Macs in 2010. Also, Apple sells primarily notebooks, and the USB-IF's numbers show only 2 million of those controllers ending up in notebook PCs, largely because the early silicon wasn't that great, especially regarding power consumption.
Can you also tell us the numbers from 2011?
Do they show better which one is more significant for the IT world, Macs or usb3?
Why would Apple spend time developing and testing a driver for silicon that didn't meet their design criteria and was not available in sufficient quantity? Seriously, not even joking, the earliest Apple could have adopted USB 3.0 was mid 2011, and they would have had to lean hard on their suppliers to do so. Whether they should have done so or not at that point is entirely debatable, but saying that they could have added USB 3.0 in 2010 or even early 2011 is just not in line with reality.

The problem with standards is, well, you need to wait until things are standardized. Despite USB 3.0 being ratified in December of 2008, the first 4-port controller wasn't even certified until April of 2011.
C'mon, we all know how much Apple can do about the things they care about, with their 35% of the industry's profits. Too bad that they have a monopoly on OsX machines and can choose not to do what customers want. E.g., for years I was able to buy from Apple a fullHD laptop with a matte screen; now, no matter what size of screen, I can't do it anymore. Hopefully Apple has grown so big that anti-trust laws will force it to split OsX from the gadgets, and others can start making machines for it too. On the other hand, the worst thing that could happen to OsX is that Apple decides to kill it: they decide they don't need the halo from the pro segment anymore, then they don't need the halo from the macs to iOS, and iOS makes the best profits, so goodbye OsX... Maybe there will be some sort of iOS-macs in the future, but...
If Apple had cared about what's most beneficial for most mac users, they could have used a tiny fraction of the resources they devoted to tb to push usb3 out sooner and better.
Can you come up with any other reason why Apple was totally passive with usb3 than that they had to put tb in macbooks first, because otherwise tb would have been considered almost useless in price-performance terms against usb3?
My guess is that you'd be solidly into 6-digit territory for the number of Thunderbolt devices sold. And discounting the Apple Thunderbolt Display would be asinine. As would not including Ethernet dongles, unless you can point to a currently available USB 3.0 GbE adapter. And replacing the functionality of FireWire gear you already own with USB 3.0 stuff is not only potentially impossible at this point, but also probably more expensive than buying a $29 adapter. However, the adapters aren't even available yet, so it makes no difference whether you include them or not.
The ATD does not need tb for the connections it has on the back. Those could all be handled with a single usb3 connection. If there were an Apple USB3 Display $100 cheaper than the ATD, the ATD would sell about as well as the 17" MBP did compared to other MBPs.

The only thing the ATD does beyond that is allow daisy-chaining a second external display, and even that feature is taken away if/when an external retina display is introduced.

Btw, now that the new Airs have usb3, why is Apple not offering a usb3-GbE dongle?
Even when this is available:
http://electronicdesign.com/article/digital/USB-to-GbE-Controller-Sits-On-One-Chip-
But to be more realistic, the attach rate numbers are probably at least 2% for Thunderbolt, and according to the USB-IF were only around 60% for USB 3.0 after the first 18 months. The number of certified devices was only 250 for USB 3.0 vs. 50 for Thunderbolt, so that would be 5x more different products and 30x the attach rate.
So usb3 has only 5x the variety in products and 30x the attach rate!
That makes tb as successful as what? RDRAM?
Also, there is still only a very limited range of device silicon available for USB 3.0. You have thumb drives, card readers, SATA bridges, 4-port hubs, cameras and a media player. In the first 18 months, how many USB 3.0 to 6Gb/s SAS/SATA bridges, GbE or 10GbE adapters, HD video interfaces, Fibre Channel adapters, PCIe or ExpressCard expansion chassis, or displays remotely like the ATD were available? Despite some initial overlap, USB 3.0 and Thunderbolt will end up being used for very different purposes. Basically, if a job can be done just as well with USB 3.0, why would you pay extra to do it with Thunderbolt?
Exactly!
If a job can be done just as well with USB 3.0, why would you pay extra to do it with Thunderbolt?
That's why other manufacturers have been using usb3 and not tb.
Manufacturers other than Apple don't think an external display is the best available docking station. If they make a docking station, they let the user decide what kind of display suits their needs.
Most things that use tb would be pretty much as snappy with usb3, but a whole lot cheaper.
Do you really think there will be crowds of angry customers with pitchforks when they discover they can only run a single external display with a resolution higher than 6MP? I’d love to see the performance of an MBA trying to drive more than that many pixels. According to your logic, consumers should only be allowed to have 1920x1080 displays anyway, because they are so much cheaper due to economies of scale.
The angry one here would be Apple. They think it confuses their customers too much if you can attach only one display in certain conditions and two displays in others. And of course Apple would have to scrap the current displays the same second they announce new external retinas, so that users who'd like to use 2 external displays can't do it any more ("We have this new great FcpX, which is buggy and has totally no support from 3rd parties, but no, you can't buy the FCS 3 upgrade anymore, because we know you don't want it anymore...").
Even driving 2 daisy-chained 2560x1440 ATDs, PCIe bandwidth over Thunderbolt is only reduced by about 16%, and just in the outbound direction and only for devices attached to that chain. You constantly exaggerate the impact of DP on Thunderbolt's PCIe performance, when the odds of it significantly affecting any real-world workflows are slim to none. How would a USB 3.0 controller fare trying to drive two 2560x1440 displays while writing more than 8.4 Gbps to an external RAID array? Oh right, it can't do either of those things anyway...
Again, use your imagination and look to the future.
Care to count how much one 4k 10-bit 3D display would take from tb?
Usb3 wouldn't have to drive this, because there's this dp connection next to it, which is dedicated to handling displays.
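For what it's worth, that can be counted out with raw pixel rates (a rough sketch; assumes 10 bits per color channel, i.e. 30 bpp, frame-sequential 3D at 2 x 60 Hz, and ignores blanking intervals):

```python
# How much would one 4K, 10-bit, 3D display demand from Thunderbolt v1?
# Raw pixel rate only (blanking ignored); 3D assumed frame-sequential,
# i.e. double the refresh rate.
def raw_gbps(width, height, hz, bpp):
    return width * height * hz * bpp / 1e9

demand = raw_gbps(4096, 2160, 2 * 60, 30)  # 30 bpp = 10 bits x RGB
tb_v1 = 20.0                               # 2 channels x 10 Gb/s outbound
print(f"video demand: {demand:.1f} Gb/s vs TB v1 outbound: {tb_v1} Gb/s")
```

On those assumptions the raw video alone comes to roughly 31.9 Gb/s, exceeding Thunderbolt v1's entire 20 Gb/s outbound capacity before any PCIe data is even considered.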
But it is far more likely that an inexpensive Thunderbolt to USB 3.0 dongle based on the Port Ridge controller and leveraging Apple's drivers will see the light of day as soon as Mountain Lion and/or the 10.7.5 update become available to the general public.
Wishful thinking of the day!
Apple is trying to sell the current models of MBP, and usb3 is the one thing the older models don't have. Why would they give a new model's feature to the old ones? They would lose some sales, so they would never do that.
Did they offer an update to my 2009 MBP when the next model could switch GPUs on the fly?
Did they offer a new EFI to old MPs when new GPUs would have needed it?
Will they give an update to my MP1,1's EFI when ML ships?
All I was saying is that mid-2011 is about as early as they could have added USB 3.0 to Macs, which was still over a year ago. Claiming that they could have done it years before that is ridiculous. The product would have sucked if they had done that.

How many PCs have you seen from 2010 where all of the USB ports are SuperSpeed? There really weren't any, because people need at least a couple of USB ports that actually work. The early silicon and drivers were flaky and had compatibility issues.
Again, if Apple had had any interest in adopting usb3 sooner than the point where they had to adopt it without losing face, they could have done a lot to help usb3 mature faster. Instead they did a whole lot bigger, harder and more complex thing with tb, in secrecy with intel. And that secrecy of course postponed 3rd-party tb devices by a few years. Why didn't Apple want more 3rd parties from the start? Maybe even they realized that they took too expensive and too long a step from the beginning. There is a window in time, money and tech to do things with the biggest impact, and I do think tb got it wrong in all 3: too expensive, too early for the chosen tech (+ usb3 too late), and the future upgrade roadmap blocked in some ways (retina tb).

Saying that silicon fabs couldn't have produced 10 million additional usb3 chips for Apple in 2010, if Apple had ordered them, is as ridiculous as saying that Apple couldn't have written non-flaky drivers for usb over the last decade.

You might have some intelligent explanation for why a $5k MP has usb2 ports half the speed of those on a crappy $500 windoze ultrabook?

EDIT: new things in bold.

AidenShaw
Jul 18, 2012, 10:14 PM
...

Toke, just give up.

Repoman just won't acknowledge that T-Bolt is a technology doomed to near irrelevance by its high cost and marginal value for the 99% (compared to USB 3.0).

And I don't think that I've seen him agree that tying T-Bolt and DisplayPort together on the same connector/cable was a monumental folly.

Maybe T-Bolt v2.0 will drop DisplayPort, include real optical support, and survive. T-Bolt v1.0 is looking very much like a DOA technology.

repoman27
Jul 19, 2012, 12:28 AM
...

20 pins seems like a good number. Adding more materials to the BOM rarely makes something cheaper, yet Moore's law shows us that the silicon used in last year's Thunderbolt cables will approximately halve in cost every couple of years. (Hey, just like the original article implies.) Dual-link DVI is one of the few video connections that isn't easily converted to other common formats without the use of a ~$100 adapter... because it uses 6 lanes. And you'll note that nobody makes Type B HDMI gear. We'll see if anyone decides to use that connector before they bump the single-link speed again. I think you're missing the point that Intel had created an ASIC based on four 10 Gbps channels, and Apple had already developed the mini-DP connector. They are both good designs. Considering their capabilities, $30 to the OEM for the controller and $49 to the consumer for the cable are absolutely reasonable for a first-generation I/O technology like this. The thing about Thunderbolt is that it has no baggage. It is at the same time the most advanced and the most future-proof I/O interface you can find on a PC these days.

USB 3.0 is not the biggest thing in interconnects at the moment. USB 2.0 is, and massively so. See how several hundred million is still an order of magnitude less than several billion? USB 3.0 is the 3rd generation of the most popular I/O interface in history. USB is cheap and common and nothing to get excited about really.

About 77m USB 3.0 controllers and 17.8m Macs shipped in 2011. USB 3.0 missed their projections for the year, while Apple exceeded theirs. If you're in IT, you should probably be paying attention to Apple, since pretty much everyone else is following their lead these days. USB is a standard designed by consensus. If you like watching C-Span, the USB-IF's keynotes might be right up your alley.

To be honest, I think Steve Jobs wanted to see Thunderbolt in the world before he departed it. He knew USB 3.0 would be in every PC once AMD and Intel integrated it into their chipsets. It didn't matter to him. Thunderbolt was something he really wanted to see become reality, though. It's the type of thing that's pretty much insanely ahead of its time, hence it's expensive and most people don't get it. That's the type of stuff Apple loves. Everyone else in the industry will reach for the lowest common denominator—USB 3.0 needed no support from Apple to become ubiquitous.

The ATD needs Thunderbolt for the GbE and FireWire ports. USB 3.0 to GbE still hasn't arrived yet, although I'm not sure exactly why, and USB to FireWire generally isn't possible due to the differences in architecture.

Thunderbolt may possibly be the most successful 1st generation PC I/O port in history, due solely to Apple's commitment to it. I'd be curious if anyone can point to statistics that show otherwise.

Other manufacturers were not able to ship PCs or motherboards with Thunderbolt until a little over a month ago due to Apple's exclusivity agreement. Sony did use what was essentially Light Peak for their Vaio Z, because USB 3.0 alone was not sufficient for their Power Media Dock.

I find it odd that you have so much faith in what is currently $10,000 display technology suddenly becoming affordable in the next two years, but you have issues with Apple taking what is currently $500-$1000 per port I/O technology and bringing it to the desktop for about $100 per port. As for driving a 4K display over Thunderbolt using both DP 1.1a streams... Generally one goes for color accuracy or 3D / high refresh rates, but at 4096x2160, 24 bpp, 48 Hz (that would be the 3D refresh rate at 4K) you'd still have more than twice the bandwidth of USB 3.0 left over for PCIe. At 4096x2160, 30 bpp, 60 Hz you'd still have almost 3 Gbps outbound and of course your full 10 Gbps inbound. Oh, and the 2012 Macs have a USB 3.0 port right next to the Thunderbolt port in case you need that too.
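Working those 4K figures from raw pixel rates gives roughly the same picture (a sketch only; blanking intervals are ignored here, which is why these come out slightly more optimistic than the numbers in the post):

```python
# Sanity check of the 4K-over-Thunderbolt-v1 scenarios above, using raw
# pixel rates (no blanking or protocol overhead included).
def raw_gbps(width, height, hz, bpp):
    return width * height * hz * bpp / 1e9

tb_outbound = 20.0  # Thunderbolt v1: 2 channels x 10 Gb/s per direction
for hz, bpp in [(48, 24), (60, 30)]:
    video = raw_gbps(4096, 2160, hz, bpp)
    print(f"{hz} Hz, {bpp} bpp: video {video:.1f} Gb/s, "
          f"{tb_outbound - video:.1f} Gb/s left outbound")
```

At 48 Hz / 24 bpp the raw video is about 10.2 Gb/s, leaving roughly 9.8 Gb/s outbound (comfortably more than twice USB 3.0's effective throughput); at 60 Hz / 30 bpp it is about 15.9 Gb/s, leaving around 4 Gb/s before blanking overhead, consistent with the "almost 3 Gbps" figure once that overhead is counted.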

Apple most likely won't make a Thunderbolt to USB 3.0 dongle, but that will not stop someone else from doing it in a heartbeat once all the heavy lifting has already been done.

I was almost going to suggest earlier that the real reason Apple hasn't given us Macs with USB 3.0 before now is because they refused to release their driver until it could at least match the performance of the Windows ones.

Toke, just give up.

Repoman just won't acknowledge that T-Bolt is a technology doomed to near irrelevance by its high cost and marginal value for the 99% (compared to USB 3.0).

And I don't think that I've seen him agree that tying T-Bolt and DisplayPort together on the same connector/cable was a monumental folly.

Maybe T-Bolt v2.0 will drop DisplayPort, include real optical support, and survive. T-Bolt v1.0 is looking very much like a DOA technology.

Maybe the 1% look at things differently. I'll take better over cheaper any time I can.

Not even 18 months out and over 20 million ports shipped, 50 devices on the market, currently more than 2 new products coming to market every week...

And by the way, the only thing the 99% generally use a more than 6 Gbps external I/O connection for is to drive digital displays. HDMI and DP are the only common interfaces with that kind of bandwidth. That's why it makes sense to combine them. Thunderbolt would actually be useless to most people if it wasn't also their video out port.

Since you've never actually used Thunderbolt, how can you be so sure it isn't a good technology?

MagnusVonMagnum
Jul 19, 2012, 02:26 PM
I just saw my first Thunderbolt hard drive product at a Best Buy (BTW, 3TB drives have gone UP since I bought my last two; I paid probably $115 on average for the last two and this one was $150 for the exact same drive/brand WD USB3). Anyway, the Thunderbolt drive was huge (basically 2x the size of the 3TB WD drive which was already 2 drives thick since they're basically two 1.5TB drives raided together) and it was 4TB in size (I'm guessing four 1TB drives raided for speed) and cost $576 if I recall correctly. That's pretty pricey for 4TB, but if it really is raided for speed, it should be pretty quick. But even so, they're not SSD drives and if they're just RAID 0, they're not protected against errors. I didn't really look that closely since I don't have Thunderbolt and wow that's pricey. I've got 9TB for $380 now. I'm using them for backup and off-site backup respectively, so I guess I'd have to compare 3TB for $380, but even if that array is a higher RAID setup (doubt it since it'd be 4.5TB if they were 4 1.5TB drives RAIDED), it still doesn't solve an off-site backup solution (they have "LIVE" drives for a "CLOUD" off-site option these days on top of the local storage, but that would take awhile to transfer that kind of data).

AidenShaw
Jul 19, 2012, 03:56 PM
I just saw my first Thunderbolt hard drive product at a Best Buy (BTW, 3TB drives have gone UP since I bought my last two; I paid probably $115 on average for the last two and this one was $150 for the exact same drive/brand WD USB3). Anyway, the Thunderbolt drive was huge (basically 2x the size of the 3TB WD drive which was already 2 drives thick since they're basically two 1.5TB drives raided together) and it was 4TB in size (I'm guessing four 1TB drives raided for speed) and cost $576 if I recall correctly. That's pretty pricey for 4TB, but if it really is raided for speed, it should be pretty quick. But even so, they're not SSD drives and if they're just RAID 0, they're not protected against errors. I didn't really look that closely since I don't have Thunderbolt and wow that's pricey. I've got 9TB for $380 now. I'm using them for backup and off-site backup respectively, so I guess I'd have to compare 3TB for $380, but even if that array is a higher RAID setup (doubt it since it'd be 4.5TB if they were 4 1.5TB drives RAIDED), it still doesn't solve an off-site backup solution (they have "LIVE" drives for a "CLOUD" off-site option these days on top of the local storage, but that would take awhile to transfer that kind of data).

Was it this one? (it's 2 2TB drives or 2 3TB drives, with RAID-0 or RAID-1)

http://www.wdc.com/en/products/products.aspx?id=630
http://www.newegg.com/Product/Product.aspx?Item=N82E16822236217

If so, the USB/1394/eSATA version is $200 cheaper than T-Bolt.

MagnusVonMagnum
Jul 19, 2012, 06:43 PM
Was it this one? (it's 2 2TB drives or 2 3TB drives, with RAID-0 or RAID-1)

http://www.wdc.com/en/products/products.aspx?id=630
http://www.newegg.com/Product/Product.aspx?Item=N82E16822236217

If so, the USB/1394/eSATA version is $200 cheaper than T-Bolt.

Yeah, I think that's it. I guess that means my 3TB models are just one drive, since that one is definitely twice as thick. As for the price, I always figured TB would cost $150-200 more, since that's what FireWire drives tended to cost above just USB 2.x or even eSATA + USB 2.0. I paid quite a bit for a 500GB FW800 drive a few years ago for my Macbook Pro. I think the internal 7200 RPM 500GB drive I put in cost me like $80 compared to like $300 for the external micro FW800 drive, but I did buy the former over a year after the external, since that is when I was setting the MBP up for Logic Pro (also upgraded memory to 4GB).

Now lucky me I've got a noisy fan on the left side. It started a few months after the upgrade. I thought maybe I screwed something up when I put it back together, but I've read it's extremely common on these things. Mine is apparently far less noisy than most. I plan to take it apart and try to lube it. I didn't want to attempt it until my music project was done (took 2.5 years) just in case I screwed something up somehow. It hasn't really gotten any worse since then, though. Kind of a "wub wub" sound that's just loud enough to be annoying and not much else but definitely only on the left side (apparently the most common to go bad since most of the heat is over there; maybe it dries up the grease or something). The only good thing is my 8600M GT GPU has never gone bad, but then I keep the fan higher than normal and rarely play games on it. Certainly, I was very happy with Logic on it. Now my FW stuff will be a PITA on newer models that dump all that stuff in favor of dongles.

I've been debating whether to grab a 17" MBP while I can still get one. It's only missing USB 3.0, and I don't really need that on a notebook, necessarily, since it has FW800 on it and you could technically add it with a TB adapter at some point. The 17" size would be nice for portable music project work, but then if this album I'm about to release doesn't sell, I probably won't bother with a second project.

Logic, though, is wonderful. I couldn't have asked for better results for sound quality, and the effects were good enough that I didn't need external guitar boxes, etc. (they could be improved upon, though; a lot of sounds need tweaking). It'd be a darn shame if Apple drops the ball on it. It's already been a good long time since the last update. Why bother with professional interfaces like TB if you don't properly maintain your Pro software lines? They cheesed a lot of people off with Final Cut X. I hope any Logic Pro X doesn't screw things up. It's already fantastic, but I'm sure many Pros still have features they'd like to see. I'd rather see improved default sound banks for the soft-synths and whatnot. I was pretty happy with editing for the most part, although cutting/pasting bits for smooth extensions or loops could have been a bit easier, but then I don't know how to use every single feature, it's so extensive. I managed to do everything I set out to do, though, so it wasn't too bad even so.

toke lahti
Jul 19, 2012, 07:33 PM
Dual-link DVI is one of the few video connections that isn't easily converted to other common formats without the use of a ~$100 adapter... because it used 6 lanes. And you'll note that nobody makes Type B HDMI gear. We'll see if anyone decides to use that connector before they bump the single-link speed again. I think you're missing the point that Intel had created an ASIC that was based on four 10 Gbps channels, and Apple had already developed the mini-DP connector. They are both good designs.
I think you are missing the point: how well can you convert tb to other formats? I've used dl-dvi for years without any need to convert it to anything.
Maybe you can tell us why dp supports only sl-dvi without any conversion?
USB 3.0 is not the biggest thing in interconnects at the moment. USB 2.0 is, and massively so. See how several hundred million is still an order of magnitude less than several billion? USB 3.0 is the 3rd generation of the most popular I/O interface in history. USB is cheap and common and nothing to get excited about really.
You could also say that Windows XP is the biggest thing in OSs at the moment, but I think you know what I meant. Usb3 is the biggest new & fast interconnect. And maybe you are not excited by the best ratio of speed per buck, but many of us are.
The ATD needs Thunderbolt for the GbE and FireWire ports. USB 3.0 to GbE still hasn't arrived yet, although I'm not sure exactly why, and USB to FireWire generally isn't possible due to the differences in architecture.
Usb3 could easily handle GbE and fw.
Also, if you checked the link I gave, usb3-to-GbE is already in silicon, so I bet it will be on the shelf in a few months. Usb3 has everything necessary for usb3-to-fw. Check out the specs. I guess the reason they still don't exist is the low demand for them.
I find it odd that you have so much faith in what is currently $10,000 display technology suddenly becoming affordable in the next two years, but you have issues with Apple taking what is currently $500-$1000 per port I/O technology and bringing it to the desktop for about $100 per port. As for driving a 4K display over Thunderbolt using both DP 1.1a streams... Generally one goes for color accuracy or 3D / high refresh rates, but at 4096x2160, 24 bpp, 48 Hz (that would be the 3D refresh rate at 4K) you'd still have more than twice the bandwidth of USB 3.0 left over for PCIe. At 4096x2160, 30 bpp, 60 Hz you'd still have almost 3 Gbps outbound and of course your full 10 Gbps inbound. Oh, and the 2012 Macs have a USB 3.0 port right next to the Thunderbolt port in case you need that too.
Apple has introduced high pixel density screens now in phones, tablets and laptops. How logical would it be to stop here?

4k is standard in DCI and there are already even consumer 4k video cameras. Still photography also benefits from HiDPI. 4k televisions are coming. How much has HiDPI increased Apple's devices' prices so far? Economy of scale just works here. It's nothing new that when mass production adopts a new tech, one or two zeros drop from the price.
When 4k computer screens go mainstream, nobody will accept 24 Hz for 3D. The same display has to be good for sports TV, video games and movies. So it will have to be at least 4096x2160, 30 bpp, 60 Hz, and you can double that for 3D. All of a sudden a usb3 port is faster for data when these 4k 3D displays are connected via tb.
So tb has to have a new revision very soon. This will be very problematic for Apple, since they are obsessed with over-simplifying, and two versions of tb this close together in time, before the first version even gets widely accepted, will be very bad. Mac customers will be pissed off when they notice that their 1/2/3-year-old flagship is all of a sudden obsolete because of its "most advanced and most future-proof I/O". There's your baggage.
Apple most likely won't make a Thunderbolt to USB 3.0 dongle, but that will not stop someone else from doing it in a heartbeat once all the heavy lifting has already been done.
And that dongle will cost the customer 10 times more than including a usb3 port in a mac would have cost Apple.
I was almost going to suggest earlier that the real reason Apple hasn't given us Macs with USB 3.0 before now is because they refused to release their driver until it could at least match the performance of the Windows ones.
Why do windows machines perform so well?
Can't Apple write drivers for their own OS?
Or did they just start writing them that much later?
Maybe the 1% look at things differently. I'll take better over cheaper any time I can.
That 1% wasn't enough for the 17" MBP. Apple will ditch tb as soon as it has a more profitable way to do the same things.
Not even 18 months out and over 20 million ports shipped, 50 devices on the market, currently more than 2 new products coming to market every week...
Still, usb3 is doing 5x/30x better, like you calculated...
And by the way, the only thing the 99% generally use a more than 6 Gbps external I/O connection for is to drive digital displays. HDMI and DP are the only common interfaces with that kind of bandwidth. That's why it makes sense to combine them. Thunderbolt would actually be useless to most people if it wasn't also their video out port.
Combining two high-speed ports into one is slower than providing two separate ports. But you are right; it seems that Apple's biggest fear is an expensive port that has no use. Now that macs are more widely getting hdmi ports, tb might just end up like that...
Since you've never actually used Thunderbolt, how can you be so sure it isn't a good technology?
Just lookin' at the price tag is enough!

AidenShaw
Jul 19, 2012, 10:18 PM
Originally Posted by repoman27
Since you've never actually used Thunderbolt, how can you be so sure it isn't a good technology?

Just lookin' at the price tag is enough!

Didn't the late turtlenecked overlord teach you that it was good to pay $500 for a $300 disk drive because it used Apple-only technology - even though the real world performance of the $300 drive was the same?

:rolleyes:

repoman27
Jul 20, 2012, 12:17 PM
I think you are missing the point: how well can you convert tb to other formats? I've used dl-dvi for years without any need to convert it to anything.
Maybe you can tell us why dp supports only sl-dvi without any conversion?

If you have a display that requires DL-DVI and lacks DP or HDMI, you need an expensive adapter. This is true for both DP and HDMI outputs because they reduced the pin count to 20 and 19 pins respectively from DVI's 25 in order to make more compact and mobile device friendly connectors. DL-DVI uses 6 signaling pairs, whereas DP only has 4 and HDMI only has 3. Then again, the only displays that present this issue cost over a grand at the time they were originally sold, so paying 10% of that for an adapter can be rationalized I guess.

You could also say that windowsXP is the biggest thing on OS's at the moment, but I think you know what I meant. Usb3 is the biggest new & fast interconnection. And maybe you are not excited for the best ratio for speed per buck, but many of us are.

I had to restrain myself yesterday from impulse buying 2 new USB 3.0 drive enclosures that I came across because they were less than $100 combined. I tend to have to deal with a lot of legacy equipment in the field, so I'm not as excited now as I will be when the majority of that gear finally has a USB 3.0 port on it. (I also still have to deal with a fair number of PC's running XP.) I recommended a USB 3.0 backup device that came bundled with a free USB 3.0 PCIe adapter to a client a few months ago, and they decided that they could save about $50 by just going with a USB 2.0 only version because they didn't need it to be fast. Total face-palm.

Also, if you checked the link I gave, usb3-2-GbE is already in silicon, so I bet it will be on the shelf in few months. Usb3 has everything necessary for usb3-2-fw. Check out the specks. I guess the reason why they still don't exist, is so low demand for them.

Actually, I'm not sure why the USB 3.0 GbE adapters aren't shipping already. I wonder if they're having difficulty getting them to fit in the power envelope of a standard USB device.

There is no device silicon yet to bridge FireWire to USB 3.0, and I wonder if there would ever be enough demand to make a dongle that could actually do so. USB not allowing DMA makes FireWire to USB a bit tricky though. I'm not sure it would ever be able to offer all of the same functionality.

Apple has introduced high pixel density screens now in phones, tablets and laptops. How logical would it be to stop here?

4K is the standard in DCI, and there are already even consumer 4K video cameras. Still photography also benefits from HiDPI, and 4K televisions are coming. How much has HiDPI increased the prices of Apple's devices so far? Economy of scale just works here. It's nothing new that when mass production adopts a technology, one or two zeros drop off the price.
When 4K computer screens go mainstream, nobody will accept 24 Hz for 3D. The same display has to be good for sports TV, video games and movies, so it will have to be at least 4096x2160, 30 bpp, 60 Hz, and you can double that for 3D. All of a sudden a USB 3.0 port is faster for data when one of these 4K 3D displays is connected over TB.
So TB has to get a new revision very soon now. This will be very problematic for Apple, since they are obsessed with over-simplifying, and two versions of TB this close together, before the first version is even widely adopted, will be very bad. Mac customers will be pissed off when they notice that their one-, two- or three-year-old flagship is suddenly obsolete despite having "the most advanced and most future-proof I/O". There's your baggage.

I realize 4K displays are coming, and I'm all for them. I am really excited about the "retina" trend. I'll be very happy to see the day when it is common for displays less than 30" to be 300-600 ppi. The problem is that there are significant barriers to pushing this type of technology into the mainstream right now. For instance, trying to broadcast sports at 4096x2160, 30 bpp, 60 Hz presents significant problems for the content providers. If there is no adequate means of content delivery, there will be no content for the end user, and thus no demand for the higher resolution displays. That's why digital cinema projectors are hitting those resolutions, but it hasn't really made it to the home yet. We can't all have the studios mail us a hard drive with a movie on it, and the intertubes aren't ready for Netflix to start offering 4K streaming to the masses.

In the PC/tablet space, screen resolution is currently bound by the graphics capabilities of the device. The MBPR has a half decent GPU, and it has just enough horsepower to maintain fluidity driving a single panel at 2880x1800, 24 bpp, 60 Hz. 4096x2160, 30 bpp, 120 Hz would require 35 Gbps of pixel data! That's more than two DisplayPort 1.2 or four HDMI 1.4a connectors could drive. What GPU could keep up with that? Even the AMD Radeon 7970 and NVIDIA GeForce GTX 690 cannot push that many pixels.
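The arithmetic behind that figure can be checked directly; this sketch counts active pixels only, ignoring the blanking overhead that pushes the real link requirement toward 35 Gbps:

```python
# Uncompressed pixel-data rate for the hypothetical 4K 3D panel:
# width x height x bits-per-pixel x refresh rate.
def pixel_rate_gbps(w, h, bpp, hz):
    """Active pixel data rate in Gbps (no blanking overhead)."""
    return w * h * bpp * hz / 1e9

rate = pixel_rate_gbps(4096, 2160, 30, 120)
print(f"{rate:.1f} Gbps")   # ~31.9 Gbps of active pixel data alone

# For comparison, DisplayPort 1.2 carries 17.28 Gbps of payload per
# connector, so even two connectors (34.56 Gbps) are marginal once
# blanking overhead is added.
```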

So for the foreseeable future, Thunderbolt is looking pretty good. Intel has said that the first speed increase will likely occur in 2014, and it would stand to reason that it will jump to 25 Gbps per channel at that time. Plenty to deal with the increased burden of large, hi-dpi digital displays when they do arrive.

And that dongle will cost the customer ten times more than including a USB 3.0 port in a Mac would have cost Apple.

Yep.

Didn't the late turtlenecked overlord teach you that it was good to pay $500 for a $300 disk drive because it used Apple-only technology - even though the real world performance of the $300 drive was the same?

:rolleyes:

Well, it's more like you're paying $389 (plus $49 for a cable) for a $169 disk drive (which is even more of a delta) because it uses Intel-only technology to deliver about the same performance.

However, if you're looking at multiple disks or SSDs, or at non-storage applications, Thunderbolt offers far better real-world performance than USB 3.0 for your extra $250. Pretty much anything that falls under the 275 MB/s threshold would be more economically served by USB 3.0, though.
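As a rough sanity check on that threshold, the theoretical payload ceilings work out as follows; this is a sketch, and the 275 MB/s figure above is a real-world number that sits below the USB 3.0 theoretical ceiling:

```python
# Theoretical payload ceilings, before protocol overhead.
def usb3_payload_mb_s(raw_gbps=5.0):
    """USB 3.0: 8b/10b line coding leaves 80% of the raw rate as payload."""
    return raw_gbps * 0.8 / 8 * 1000          # -> MB/s

def tb_channel_mb_s(raw_gbps=10.0):
    """Thunderbolt: 10 Gbps of packetized payload per channel."""
    return raw_gbps / 8 * 1000                # -> MB/s

print(usb3_payload_mb_s())   # 500.0 MB/s theoretical ceiling
print(tb_channel_mb_s())     # 1250.0 MB/s per channel
```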

toke lahti
Jul 22, 2012, 06:58 AM
If you have a display that requires DL-DVI and lacks DP or HDMI, you need an expensive adapter. This is true for both DP and HDMI outputs because they reduced the pin count to 20 and 19 pins respectively from DVI's 25 in order to make more compact and mobile device friendly connectors. DL-DVI uses 6 signaling pairs, whereas DP only has 4 and HDMI only has 3. Then again, the only displays that present this issue cost over a grand at the time they were originally sold, so paying 10% of that for an adapter can be rationalized I guess.
You are missing the point. Have there ever been any problems for DL-DVI users caused by DL-DVI using six signal pairs? Cables too thick, too short, too expensive, too heavy?
I never had any problems with the size of the full DVI connector on PowerBooks, and there was also mini-DVI available. If the only problem is how to convert DL-DVI to other (newer) formats, then aren't those other (newer) formats the problem?

So once again, if the DP and TB designers had left room for a few additional pins, all cables and adapters would be affordable. I don't believe the BOM cost of a few pins plays any significant role here. Designers just design these things for present-day needs, without looking to the future.
Actually, I'm not sure why the USB 3.0 GbE adapters aren't shipping already. I wonder if they're having difficulty getting them to fit in the power envelope of a standard USB device.
I guess there are so few computers with USB 3.0 but no RJ45 that there just hasn't been much demand for these in the past.

USB 3.0 increased the bus power supply from 500 mA to 900 mA. It would be pretty surprising if GbE used more than double what 100 Mb Ethernet uses.
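The bus-power budgets in question, assuming the nominal 5 V VBUS (a sketch):

```python
# USB bus-power budget: VBUS voltage x maximum configured current.
def usb_power_w(volts, milliamps):
    """Available bus power in watts."""
    return volts * milliamps / 1000

print(usb_power_w(5, 500))   # USB 2.0: 2.5 W
print(usb_power_w(5, 900))   # USB 3.0: 4.5 W
# A GbE PHY plus USB bridge chip would have to fit inside that
# 4.5 W envelope to work as a bus-powered adapter.
```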
There is no device silicon yet to bridge FireWire to USB 3.0, and I wonder if there would ever be enough demand to make a dongle that could actually do so. USB not allowing DMA makes FireWire to USB a bit tricky though. I'm not sure it would ever be able to offer all of the same functionality.
There are so many advanced techniques in USB 3.0 compared to USB 2.0 that I don't think DMA access is so important any more. Computers and devices also have a lot more processing power, across multiple cores, to spend. And DMA has security problems.
I realize 4K displays are coming, and I'm all for them. I am really excited about the "retina" trend. I'll be very happy to see the day when it is common for displays less than 30" to be 300-600 ppi. The problem is that there are significant barriers to pushing this type of technology into the mainstream right now. For instance, trying to broadcast sports at 4096x2160, 30 bpp, 60 Hz presents significant problems for the content providers. If there is no adequate means of content delivery, there will be no content for the end user, and thus no demand for the higher resolution displays. That's why digital cinema projectors are hitting those resolutions, but it hasn't really made it to the home yet. We can't all have the studios mail us a hard drive with a movie on it, and the intertubes aren't ready for Netflix to start offering 4K streaming to the masses.
But we can have play.com or Amazon mail us a Blu-ray. I'd guess that within two years we will have a new BD version that supports 4K.
4K is essentially for motion pictures for the first decade, but other uses will follow.
3D broadcasts will be mainstream in a few years, and 4K is next.
Anyway, broadcast standards didn't stop Apple from selling the rMBP. The display industry is moving to higher pixel densities, and there's nothing to stop it. Television and computing are blending more and more all the time, so pretty soon broadcast standards won't matter much when buying a display.
In the PC/tablet space, screen resolution is currently bound by the graphics capabilities of the device. The MBPR has a half decent GPU, and it has just enough horsepower to maintain fluidity driving a single panel at 2880x1800, 24 bpp, 60 Hz. 4096x2160, 30 bpp, 120 Hz would require 35 Gbps of pixel data! That's more than two DisplayPort 1.2 or four HDMI 1.4a connectors could drive. What GPU could keep up with that? Even the AMD Radeon 7970 and NVIDIA GeForce GTX 690 cannot push that many pixels.
Half decent isn't enough then, so this may be a big problem for Apple. They'd have to upgrade their GPU offerings and drivers to fully decent!

There's already dual-DP in use, and the next upgrades to the DP and HDMI specs will come soon; doubling the speed with every revision is business as usual. GPU power will keep increasing like it has for decades. You sound just like the people who said you simply can't put high pixel density into phones, tablets or laptops. A big screen on a desktop computer is the easiest option here, but once again, that seems to be a bit outside Apple's focus.

So for the foreseeable future, Thunderbolt is looking pretty good. Intel has said that the first speed increase will likely occur in 2014, and it would stand to reason that it will jump to 25 Gbps per channel at that time. Plenty to deal with the increased burden of large, hi-dpi digital displays when they do arrive.
You keep repeating that TB is looking good. Is it really?

So far it seems the first year when wide usage is even possible is 2013, and for big retina displays TB needs a new version by 2014 at the latest, maybe even next year. That means TB v1 might be widely used for under two years, maybe even less than a year. That kind of upgrade path will not lead to a happy ecosystem.

How much do you think those 25 Gbps TB devices and cables will cost in 2014? Twice today's price? Quadruple? Add a zero to the price? None of those will work. TB's main objective should be getting the price down ASAP, and increasing the channel speed will do just the opposite.
Well, it's more like you're paying $389 (plus $49 for a cable) for a $169 disk drive (which is even more of a delta) because it uses Intel-only technology to deliver about the same performance.

However, if you're looking at multiple disks or SSD's, or non-storage based applications, Thunderbolt offers far better real-world performance than USB 3.0 for your extra $250. Pretty much anything that falls under the 275 MB/s threshold would be more economically achieved via USB 3.0 though.
And at the same time we have to keep ignoring the most reliable (S.M.A.R.T.-capable) $1 solution for connecting storage: the native connection, eSATA(p).

When 99% of high-bandwidth needs are either display or storage, I can't stop thinking how much easier and cheaper it would have been if Apple had just used DP and eSATA.

AidenShaw
Jul 22, 2012, 09:52 AM
When 99% of high-bandwidth needs are either display or storage, I can't stop thinking how much easier and cheaper it would have been if Apple had just used DP and eSATA.

I can give you an example.

I recently added 12 TB as a 9 TB RAID-5 array to my home PC for backup storage.


$100 Sans Digital TR4M 4 drive hot-swap eSATA external cabinet, includes free cable! (newegg (http://www.newegg.com/Product/Product.aspx?Item=N82E16816111177))
$600 WD WD30EZRX 3 TB Intellipower 64 MiB cache (4)(newegg (http://www.newegg.com/Product/Product.aspx?Item=N82E16822136874))
--------
$700 Total (before sales tax)


Apple sells the 12 TB Pegasus for $2499 (without cable) - and while it's more performant than my setup on paper, for backups it would be more than 3 times the cost for no added value.
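The arithmetic behind that comparison, as a sketch; the $150-per-drive figure is just the $600 four-drive total split evenly:

```python
# RAID-5 over n drives yields (n - 1) drives of usable capacity;
# one drive's worth of space goes to distributed parity.
def raid5_usable_tb(drives, size_tb):
    return (drives - 1) * size_tb

drives, size_tb = 4, 3
enclosure, drive_cost = 100, 150

usable = raid5_usable_tb(drives, size_tb)        # 9 TB usable from 12 TB raw
total = enclosure + drives * drive_cost          # $700 before sales tax
print(usable, total, round(total / usable, 2))   # cost per usable TB
```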

Lance-AR
Jul 22, 2012, 01:24 PM
The Pegasus is also upgradable to 24 TB.

It's human nature to defend our choices. I chose TB for my DAS. I chose USB3 for my portable drive. In my mind, TB to USB isn't a straightforward comparison.

pdjudd
Jul 22, 2012, 01:30 PM
The Pegasus is also upgradable to 24 TB.

It's human nature to defend our choices. I chose TB for my DAS. I chose USB3 for my portable drive. In my mind, TB to USB isn't a straightforward comparison.

From what I read, they aren't meant to be. The only thing they have in common is an interface used to transmit data; they are very different implementations with different applications. USB 3.0 is meant to supplement and replace USB 2.0. TB isn't meant to do that; it isn't meant to replace or "kill" USB at all. If anything, it's meant to replace and expand on FireWire.

AidenShaw
Jul 22, 2012, 01:50 PM
The Pegasus is also upgradable to 24 TB.

It's human nature to defend our choices. I chose TB for my DAS. I chose USB3 for my portable drive. In my mind, TB to USB isn't a straightforward comparison.

From what I read, they aren't meant to be. The only thing they have in common is an interface used to transmit data; they are very different implementations with different applications. USB 3.0 is meant to supplement and replace USB 2.0. TB isn't meant to do that; it isn't meant to replace or "kill" USB at all. If anything, it's meant to replace and expand on FireWire.

The topic I was addressing was eSATA vs T-Bolt, not USB.

...and I could add a second $100 cabinet to the spare eSATA port, upgrade to eight 4 TB drives, and have 32 TB for still a lot less than the 12 TB Pegasus. (Although I wouldn't do that; the 4 TB drives are still having teething pains, with lots of DOA units and infant mortality.)

...and, like I said, the Pegasus is more performant - but it's wasteful overkill for my purpose. If I needed high performance DAS and cost weren't important, T-Bolt is a good solution.

Since the bottleneck for my purpose is Cat6 GbE Ethernet, T-Bolt is a waste of money.
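The bottleneck arithmetic is straightforward (a sketch; wire rate only, before TCP/IP overhead):

```python
# Gigabit Ethernet caps sequential backup throughput at the wire rate.
def gbe_mb_s(bits_per_s=1_000_000_000):
    """GbE wire rate expressed in MB/s."""
    return bits_per_s / 8 / 1e6   # -> MB/s

print(gbe_mb_s())   # 125.0 MB/s ceiling; a single modern hard drive
                    # can approach this on sequential reads, so a
                    # faster DAS link buys nothing over the LAN.
```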

toke lahti
Jul 23, 2012, 06:54 PM
When there is a TB-to-eSATA dongle for <$50, I'll get interested in TB...

AidenShaw
Jul 23, 2012, 07:12 PM
When there is a TB-to-eSATA dongle for <$50, I'll get interested in TB...

But even that could be over-priced.

(since a dual-port 6 Gbps eSATA PM-enabled PCIe hardware RAID 0/1/5 controller is only $45 (http://www.newegg.com/Product/Product.aspx?Item=N82E16816115076))

G51989
Jul 24, 2012, 12:31 AM
While USB 3.0 may align with the goals of most people, Apple doesn't necessarily try to sell to the majority. They target a wealthier more educated demographic. One that thinks differently. Even if it's a bunch of BS, it seems to be working for them.

* Looks at college degree... makes really good money... simulation engineer... does all my work on a PC *... I must be poor and stupid for using a PC, huh? Or could it be because Apple makes underpowered hardware?

USB 3.0 is going to coexist with Thunderbolt on Macs, because USB 3.0 will be the industry standard for 95% of devices, while TB will be like FW and find a niche.

The Mac as it stands now, I think, caters to computer users who have money to spend but don't really want to do anything serious on their machines, or who just like the OS X interface.

I mean, if we're talking about the "Pro" market, the recent "update" shows exactly how much Apple cares about "Pros".


Not that anyone in their right mind should buy a modern Mac. I've been buying new Macs since the Pismo (my first new tower was a G4; I bought G4s over the years, and a new G5). New Macs are total crap build-quality-wise compared to the old ones.

toke lahti
Jul 24, 2012, 05:07 AM
But even that could be over-priced.

(since a dual-port 6 Gbps eSATA PM-enabled PCIe hardware RAID 0/1/5 controller is only $45 (http://www.newegg.com/Product/Product.aspx?Item=N82E16816115076))
Oops, I overstated. A MacBook with TB also needs at least a full-HD matte screen before I'm willing to pay for it...
USB 3.0 is going to coexist with Thunderbolt on Macs, because USB 3.0 will be the industry standard for 95% of devices, while TB will be like FW and find a niche.
Also 95% of tb users will use tb just for display and/or storage, which would both be more efficiently handled by dp & esata.

AidenShaw
Jul 24, 2012, 03:43 PM
Oops, I overstated. A MacBook with TB also needs at least a full-HD matte screen before I'm willing to pay for it...

Also 95% of tb users will use tb just for display and/or storage, which would both be more efficiently handled by dp & esata.

Probably closer to 99.9% - or 0.1%, depending on how you count.

If you connect an mDP display to the T-Bolt port, you're not using T-Bolt - so you're not a T-Bolt user.

thekev
Jul 26, 2012, 12:53 AM
Also 95% of tb users will use tb just for display and/or storage, which would both be more efficiently handled by dp & esata.

USB 3.0 should gain some popularity too. Regular old DisplayPort is great. It's a pain in the ass to find a Mini DisplayPort-to-DisplayPort cable, and dongles suck. Apple has some of the most worthless adapters imaginable.

takao
Jul 26, 2012, 05:03 AM
Hardly surprising; it is now really looking like FireWire (especially 800) all over again... incredible performance at an incredible price...

FireWire was perhaps 30% more expensive and it didn't make it on the market, and TB turns out to be even more expensive... the idea of putting the interface chips into the cables to make devices cheaper was a rather huge failure, IMHO.

It has been 1.5 years, and the stores are filled with USB 3.0 devices while TB devices are as common as unicorns.

Luckily I can use a Mini DisplayPort-to-HDMI adapter; otherwise this would have been the most pointless port any of my computers ever had, IMHO.

toke lahti
Jul 26, 2012, 08:54 AM
It has been 1.5 years, and the stores are filled with USB 3.0 devices while TB devices are as common as unicorns.
If only someone had calculated their probability of existence at over 0.5, maybe they'd have started to appear...