Absolutely right - Light Peak will allow 10G connectivity to a 2012 model MBA without spoiling the look - so why bother with an interim technology?

How does USB spoil the look of your MBA? Why do you even care how your electronics look? Looking at my MBA right now, I don't even see the ports. :confused:

USB 3.0 will supplant LP for one reason that has been stated ad nauseam in this thread: backward compatibility. That's pretty big.
 
I'd enjoy getting an HP ePrinter if only to take advantage of AirPrint in Mac OS X 10.6.5 and iOS 4.2.

http://www.apple.com/pr/library/2010/09/15airprint.html

But I have never liked Bluetooth anything. It's an inadequate technology for computers: clunky, unreliable, with lousy latency and bandwidth. Kill it already!
:mad:

Such terrible latency that Sony uses it as the protocol for their wireless PS3 controllers.
 
Apple will support it when Intel chipsets support it. That's my guess.

That said, Steve Jobs is a moron. They could be installing USB 3.0 right now with a third-party chip. They didn't have a problem with that in the PPC Macs.

Yes, Apple could easily add a third-party chip for lots of things.

It's obvious Steve Jobs has made a command decision to save money and hope no one will notice, and/or Apple is just totally concentrating on the iPhone/iPad/iPod platforms.

Meanwhile Apple is one of the richest companies in the world and Steve Jobs is one of the richest men in the world.

It reminds me of an old quote my Dad used to say...
"You don't get rich by spending lots of money."

Of course Dad didn't have a BILLION DOLLARS in the bank like Apple though.

It kind of amazes me that while Apple has continually neglected to equip the Mac platform with top-notch graphics cards, Blu-ray, and now USB 3.0, and is restricting third-party companies on certain accessories, Mac sales continue to remain very strong.

However, a lot of that has to do with Microsoft fumbling the ball on Vista.

If I were Steve Jobs this holiday season, I'd say grace before Thanksgiving dinner thanking Microsoft and on Christmas, I'd visit the Microsoft campus in Redmond dressed as Santa and hand out presents to every employee. :D
 
It amazes me that you think that third-party chips are like Legos: you just snap them together with no consequences. I already posted why that's not the case. The only current solutions are big, power-hungry, and take up motherboard resources that are needed by other things. Go look at the datasheets for the chips. When a decent technical solution is available (i.e., integrated chipsets support it), there will be USB 3.0 across the Mac range.


 
Yes, Apple could easily add a third-party chip for lots of things.

It's obvious Steve Jobs has made a command decision to save money and hope no one will notice, and/or Apple is just totally concentrating on the iPhone/iPad/iPod platforms.

Meanwhile Apple is one of the richest companies in the world and Steve Jobs is one of the richest men in the world.

It reminds me of an old quote my Dad used to say...
"You don't get rich by spending lots of money."

Of course Dad didn't have a BILLION DOLLARS in the bank like Apple though.

It kind of amazes me that while Apple has continually neglected to equip the Mac platform with top-notch graphics cards, Blu-ray, and now USB 3.0, and is restricting third-party companies on certain accessories, Mac sales continue to remain very strong.

For the record, the lion's share of Steve Jobs's wealth comes from his Disney holdings.

He bought The Graphics Group from Lucasfilm for $10 million and renamed it Pixar. At first, Pixar sold a line of animation computers; then they started creating animated shorts and moved on to some very successful feature films. Pixar was bought out by Disney in a stock transaction, and in the process Steve Jobs became the largest DIS shareholder (he holds more shares than Eisner).

But none of that means anything to Apple shareholders. His job is still to run Apple in a way that gets a decent return on investment to AAPL shareholders.

The reality is that you don't build an MBA (or even a 13" MBP) by adding a bunch of additional chips or chipsets to what is already a tiny mobo. That's the same reason those laptops don't have i3 or i5 processors: suitable integrated graphics chips aren't available from Intel for the newer Arrandale processors, whereas Nvidia 320M integrated graphics paired with a C2D processor can actually deliver better overall performance.

Didn't your dad ever tell you life was full of tradeoffs? :p

There is little evidence that USB 3.0 has that problem if you use a modern multicore computer that isn't in the Atom/ARM marketplace. It was not the CPU that was the major issue; it was the lack of bandwidth to ship the low-latency isochronous data traffic over.

USB 3.0 doesn't necessarily have that problem. (If you force a revert to USB 2.0 circuitry it will, but then you are strictly running USB 2.0 at that point.)


a. The SuperSpeed bus is a completely different set of wires. So if you plug in some pokey USB 1.1 keyboard and a chatty USB 2.0 web camera, it has no impact at all on the SuperSpeed channel. (You might see very minor impact since they share the same PCIe connection, but that isn't likely to be significant.) Separate sub-controllers run each set of wires independently for the most part (apart from some minor connection/metadata issues).

b. The SuperSpeed bus is full duplex, so the device and the computer can talk at the same time. Again, a completely different protocol from USB 2.0, where the computer's hub makes each device speak only when spoken to. If the computer doesn't ask an audio device for data updates often enough, you may lose data. That doesn't happen on SuperSpeed.


c. The bandwidth that can be reserved for isochronous traffic is larger. Audio data (unless you bundle numerous tracks together) isn't that much larger now than 3-4 years ago. That means more bandwidth than you had before, meaning fewer dropouts, meaning cleaner data. (There is a rough sketch of the numbers after this list.)

d. The increase in data traffic doesn't need anywhere near a 1:1 increase in CPU consumption (in part because of b, but also because of other factors incorporated into the new design).
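
To put rough numbers on point c, here is a minimal back-of-the-envelope sketch in Python. The signaling rates are the published USB figures; the 80% usable fractions are illustrative assumptions standing in for encoding overhead and isochronous reservation limits, not values taken from the spec.

```python
# Rough headroom comparison: how many 24-bit / 96 kHz audio channels fit
# in each bus's bandwidth? Signaling rates are the published USB figures;
# the usable fractions are illustrative assumptions, not spec values.

BITS_PER_CHANNEL = 24 * 96_000  # one 24-bit / 96 kHz stream, ~2.3 Mbit/s

buses = {
    "USB 2.0 High-Speed": (480e6, 0.80),  # 480 Mbit/s signaling, assumed 80% usable
    "USB 3.0 SuperSpeed": (4e9, 0.80),    # 5 Gbit/s signaling; 8b/10b coding leaves 4 Gbit/s
}

for name, (raw_bits_per_s, usable_fraction) in buses.items():
    usable = raw_bits_per_s * usable_fraction
    channels = int(usable // BITS_PER_CHANNEL)
    print(f"{name}: ~{usable / 1e6:.0f} Mbit/s usable, ~{channels} channels")
```

Even with deliberately pessimistic fractions, the SuperSpeed side comes out with roughly ten times the headroom, which is the whole point of c: the isochronous reservation stops being the bottleneck.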

The primary high-end audio complaint left is when you stretch FW out to its maximum distance (dozens of meters). LP will get much more traction there than USB 3.0 will.

The other, self-inflicted problem is when you overload the computer's CPU generating audio (why? DSPs are more effective, but perhaps you're strapped for cash) and run some other software component at 90+% CPU utilization while capturing a large music data stream at the same time. Besides, if you are pressing keys and fiddling with the trackpad, you are generating USB traffic (and CPU overhead) too. The duration of the CPU overhead is also a major contributor, and again, since the bus is faster, the duration is shorter (if you isolate the SuperSpeed channel to primarily just audio capture).

If you plug a USB 3.0 capable audio device into a USB 2.0 socket, then yeah, you will get the same old USB 2.0 problems. However, if you plug a USB 3.0 device into a USB 3.0 socket, you get a new network between the two devices, one not saddled with dubious design constraints and not limited by backward compatibility. It is a brand new game. Trotting out the old USB 2.0 litany of complaints against the new 3.0 is lame.

What ARE you talking about? The problem with USB 2.0 isn't the bus; the problem is that moving data around requires CPU cycles that would be better utilized doing analog-to-digital encoding. (Which FireWire solved by adding its own controller instead of stealing cycles from the CPU.)

Do you actually use this stuff? Honestly, it sounds to me as if you know nothing about digital recording. If that's the case, why play it off as though you are some kind of expert? :rolleyes:

I've not seen a USB 3.0 audio interface yet. Not from the big players. Not from the niche market. I know that the USB 2.0 devices (which is all we have at this point outside FireWire audio interfaces) aren't going to be "saved" by a USB 3.0 port operating in a backward-compatible USB 2.0 mode.

Which only points out how immature USB 3.0 is, except maybe as a data storage topology. Which is exactly where we came in...

(SHEESH, some people's kids!)
 
My worry about Light Peak is that, as I understand it, Apple holds some of the patents on it. I could easily see Apple preventing Light Peak from being widely used in PCs. That would hurt its adoption.
A huge reason FireWire never took off is that its patent licensing was very costly compared to USB's. Hence the reason USB took off.

So in terms of cost, Light Peak is way too much for most things.
 
A huge reason FireWire never took off is that its patent licensing was very costly compared to USB's. Hence the reason USB took off.
I see that repeated a lot, as though it were a fact. Can you define "very costly"?

The reason FireWire never took off is that it required an additional chip on the mobo and an additional port on the side (or back) of the laptop. Sound familiar? :p
 
The reason FireWire never took off is that it required an additional chip on the mobo and an additional port on the side (or back) of the laptop. Sound familiar? :p

Sounds like Light Peak to me.

The reason is also partly the FireWire trademark itself. Apple refused to license it out, forcing other vendors to use different names, hence the IEEE 1394s and the i.LINKs. This caused confusion in the market.
 
I see that repeated a lot, as though it were a fact. Can you define "very costly"?

Very costly = $0.25

Multiply that by the millions of chipsets that Intel manufactures per week, and it is "very costly" for the price-sensitive portion of the market (see the sketch below).
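
For a sense of the scale being argued here, a quick hypothetical in Python. The $0.25 royalty is the figure quoted in this thread; the weekly volume is an invented round number, not an actual Intel production figure.

```python
# Illustrative only: what a per-unit royalty costs a chipset maker at volume.
ROYALTY_PER_UNIT = 0.25        # US$ per 1394 device, as quoted in this thread
CHIPSETS_PER_WEEK = 5_000_000  # hypothetical volume, not an Intel figure

annual_cost = ROYALTY_PER_UNIT * CHIPSETS_PER_WEEK * 52
print(f"~${annual_cost / 1e6:.0f}M per year")  # ~$65M/year at that volume
```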
 
My worry about Light Peak is that, as I understand it, Apple holds some of the patents on it. I could easily see Apple preventing Light Peak from being widely used in PCs. That would hurt its adoption.
A huge reason FireWire never took off is that its patent licensing was very costly compared to USB's. Hence the reason USB took off.

So in terms of cost, Light Peak is way too much for most things.

Light Peak is being developed by Intel. There is absolutely no chance that Intel would develop something that cannot be used on PCs. Besides, I am not aware of anything optical-related developed by Apple, whilst Intel has been working in this field for many years.
 
I see that repeated a lot, as though it were a fact. Can you define "very costly"?

The reason FireWire never took off is that it required an additional chip on the mobo and an additional port on the side (or back) of the laptop. Sound familiar? :p

Well, FireWire licensing cost was $1 per device manufactured; I think MPEG LA is now offering it at $0.25 per device manufactured.
USB is $1,000 per year per vendor. Apple, for example, pays $2,000 for the vendor ID but can manufacture an unlimited number of devices for that.
It does not take much volume to make USB a lot cheaper.

If you produce more than 4,000 or 8,000 items a year, USB is cheaper, and the gap only widens after that.
Those 4k and 8k figures assume the 25-cent royalty; see the sketch below.
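
To make that break-even arithmetic explicit, here is a minimal sketch using only the figures quoted above (the thread's numbers, not verified fee schedules):

```python
# Break-even volume: flat annual USB vendor fee vs. per-unit FireWire royalty.
FIREWIRE_ROYALTY = 0.25  # US$ per unit, the MPEG LA figure quoted above

def breakeven_units(flat_usb_fee: float) -> float:
    """Units per year above which the flat USB fee is the cheaper option."""
    return flat_usb_fee / FIREWIRE_ROYALTY

print(breakeven_units(1_000))  # 4000.0 -> the "4,000 items a year" figure
print(breakeven_units(2_000))  # 8000.0 -> the "8,000" figure (vendor-ID cost)
```

Past those volumes, the flat fee only pulls further ahead of the per-unit royalty, which is why the cost advantage grows so quickly with scale.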
 
Light Peak is being developed by Intel. There is absolutely no chance that Intel would develop something that cannot be used on PCs. Besides, I am not aware of anything optical-related developed by Apple, whilst Intel has been working in this field for many years.

Go read up on Light Peak; hell, just look at the wiki. Apple brought it to Intel. So chances are really good Apple holds some of the patents on it.
 
I searched with Yahoo! and found "Sources: 'Light Peak' technology not Apple idea" and many similar stories.

I got mine from the wiki:

"Apple brought the concept of Light Peak, an interoperable standard which could handle large amounts of data and replace the multitudinous connector types with a single universal connector, to Intel in 2007 with the intention of Intel producing and developing the technology.[1]"

Wiki source: http://www.engadget.com/2009/09/26/exclusive-apple-dictated-light-peak-creation-to-intel-could-be/

That is what led me to believe Apple has something to do with the patents.
 
As expected, I see different people quoting different licensing costs for FireWire. I'm sure the truth is out there somewhere...
 
As expected, I see different people quoting different licensing costs for FireWire. I'm sure the truth is out there somewhere...

Grabbed from the wiki page on FireWire:
However, the expensive hardware needed to implement it (US$1–$2) has prevented FireWire from displacing USB in low-end mass-market computer peripherals, where product cost is a major constraint.[3]
Under the license offered by MPEG LA, a royalty of US$0.25 per unit is payable upon the manufacture of each 1394 product.
 
Someone hasn't heard of Itanium, it seems. :rolleyes:

How has this experiment turned out? In any case, it's not a valid comparison. Itanium was developed for everyone to use. Intel developing something exclusively for Apple would represent a totally different precedent.
 
How has this experiment turned out?

Not that good, though all my HP Integrity servers use Itanium processors. You know... the kind of hardware that costs $20,000 for a 2-core, 1.2 GHz system with 8 GB of RAM.

Not your typical PC fare. Itanium was never aimed at the PC, contrary to all the hot air being pushed about it.

In any case, it's not a valid comparison. Itanium was developed for everyone to use. Intel developing something exclusively for Apple would represent a totally different precedent.

That's not what you said. You said:

There is absolutely no chance that Intel would develop something that cannot be used on PCs.

I responded that Itanium is something that cannot be used on a PC, and yet Intel made it. Sure, they touted it as the 64-bit PC architecture while developing it; then, when it came time to push it to market, it ended up in high-end server systems only because of prohibitive costs, and AMD's x86-64 (amd64) became the de facto standard on the PC platform.

For all we know, Light Peak might end up being a server-only technology as well, making its way into the high-end platforms, not the PC-based servers. I can very much see this happening. USB 3 is backwards compatible and uses the same ports currently in use. It is the most consumer-friendly transition the industry can take.
 
How has this experiment turned out? In any case, it's not a valid comparison. Itanium was developed for everyone to use. Intel developing something exclusively for Apple would represent a totally different precedent.

The 4th most deployed chip in the enterprise server market.

It did really well at its target. Intel knew that chances were good it would never make it into the PC world, because it did not support the 32-bit x86 world.
 
The 4th most deployed chip in the enterprise server market.

It did really well at its target.

You consider that doing well? By the time Itanium reached the market, what other high-end CPU architectures were there in the server market?

- PPC
- Sparc
- x86/x86-64
- Itanium

And they managed to place fourth out of... hmm... 4? Alpha (DEC) died in the '90s when DEC was acquired by Compaq and subsequently HP. MIPS saw its demise when SGI failed to turn a profit for a few quarters and switched over to Itanium/Pentium systems, never managing to recover. HP killed PA-RISC in favor of Itanium, which was a given considering the technology originated with HP in the first place.

Nowadays, who remains? HP? Itanium was a big failure in the end. I think this graph best demonstrates how much of a failure it was:

[Chart: Itanium sales forecasts vs. actual sales (Itanium_Sales_Forecasts_edit.png)]
 
The 4th most deployed chip in the enterprise server market.

It did really well at its target. Intel knew that chances were good it would never make it into the PC world, because it did not support the 32-bit x86 world.

Intel had no idea, which is why we caught them flat-footed with K8. Their only 64-bit plan was Itanium (with the cbox to run x86 badly).

Similarly, we had no idea K8 would win, but we had no license for Itanium, so we did what we could.
 