I will dig up the link for you

I understand that it's all of that together that determines bandwidth, but who has a DP 1.2 4K 3D monitor? Yeah, that's what I thought. And since the current MBP doesn't even support DP 1.2, you wouldn't even be able to hook it up.

You need DP 1.2 on both ends (computer and monitor).

The MBP does not have DP 1.2 because of stupid Thunderbolt. The discrete graphics chips are DP 1.2 capable.

It doesn't have to be 4K 3D to need DP 1.2. Even 2560x1600 3D needs DP 1.2.

That it's not available now doesn't mean it couldn't be available in two years. The MBP would not have fully depreciated by then.
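A quick sanity check on that claim. This is a rough sketch counting raw pixel payload only (no blanking intervals); the link rates are the usual four-lane DP 1.1a and DP 1.2 data rates:

```python
def payload_gbps(width, height, bits_per_pixel, refresh_hz):
    """Raw pixel payload in Gbit/s, ignoring blanking overhead."""
    return width * height * bits_per_pixel * refresh_hz / 1e9

# Frame-sequential 3D doubles the effective refresh rate: 2 x 60 Hz
need = payload_gbps(2560, 1600, 24, 120)   # ~11.8 Gbps

DP11A_GBPS = 8.64    # DP 1.1a max data rate over 4 lanes
DP12_GBPS = 17.28    # DP 1.2 (HBR2) max data rate over 4 lanes

print("DP 1.1a enough?", need <= DP11A_GBPS)  # False
print("DP 1.2 enough?", need <= DP12_GBPS)    # True
```

So 2560x1600 3D needs roughly 11.8 Gbps of pixel data, which is over DP 1.1a's limit but comfortably inside DP 1.2's.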
 
OK, fair enough, but that's getting away from the original argument anyway: knightwrx didn't want to replace his "several thousand" dollar monitor, so I'm still wondering what need he has for 10 Gbps for one display.
 
I remember seeing that. I was interested in how it worked; didn't you have to boot the Mac in a special mode to access that feature?


That isn't a fault of the GPU; that is Apple's fault. Eyefinity (AMD) is a 3+ display technology, with the top-end cards able to drive 6 displays. The use of DP 1.1 is what is keeping us from running more than one monitor per DP plug at this point. Notice how only the 27" iMac can run two additional monitors, because it has two ports. What I think people may not realize is that each Thunderbolt controller can only have two mDPs. So if Apple wanted to enable 6-monitor Eyefinity support for the top-end GPU, they would have to include 3 Thunderbolt controllers (per Intel's diagram), because they don't support more than 2 DP connections per controller.
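The controller count above is just this arithmetic; a minimal sketch, assuming (as the post says) at most two DP sinks per Thunderbolt controller:

```python
import math

def tb_controllers_needed(displays, dp_per_controller=2):
    """Controllers required when each TB controller exposes at most two DP sinks."""
    return math.ceil(displays / dp_per_controller)

print(tb_controllers_needed(6))  # full 6-display Eyefinity -> 3 controllers
print(tb_controllers_needed(2))  # 27" iMac's two extra monitors -> 1 controller
```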

Even with 3 TB ports, the top iMac still wouldn't be able to support ONE 30" 3D monitor. It could if they had put in 1 TB port and 1 DP 1.2 port. Worst-designed computer of all.
 
And you don't think that Thunderbolt tech can improve in a few years? It's supposed to be able to do 100 Gbps when switched to optical... your argument is meaningless.

And also, in the tech demo from Intel they were running a 4K ACD plugged into a LaCie LBD array running 4 1080p streams simultaneously, so I don't think your problem is valid at all.

Crap, man, I forgot that the interface is only PCI Express x4. If this is version 2 then the max is 16 Gbps; if version 3, then about 32 Gbps. Much less than advertised.
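Where those two figures come from: the per-lane rates and encoding overheads are from the PCIe 2.0/3.0 specs (8b/10b vs 128b/130b line coding):

```python
def pcie_x4_gbps(gen):
    """Effective data rate of a 4-lane PCIe link in Gbit/s."""
    if gen == 2:
        # 5 GT/s per lane, 8b/10b encoding (80% efficient)
        per_lane = 5.0 * 8 / 10
    elif gen == 3:
        # 8 GT/s per lane, 128b/130b encoding (~98.5% efficient)
        per_lane = 8.0 * 128 / 130
    else:
        raise ValueError("only gen 2 and 3 handled here")
    return 4 * per_lane

print(pcie_x4_gbps(2))  # 16.0
print(pcie_x4_gbps(3))  # ~31.5, usually rounded to 32
```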

Unless you are using an SSD (or writing/reading to/from multiple drives at once), Thunderbolt will not give you any advantage over USB 3.0. The speed limit will be the drive, not the cable and interface. Apple advertises TB as reaching speeds of 800 MB/s. The fastest mechanical drives manage maybe 200 MB/s, so you would need four to saturate the connection. And how many average people write to four external hard drives simultaneously? (Note: 864 MB/s works out to a real-world 6.9 Gbps. THIS INFO IS FROM APPLE.)

At 800 MB/s it would take 320 seconds to completely fill a 256 GB SSD (assuming the SSD could write at 800 MB/s; the fastest currently is around 520 MB/s). That is ridiculously fast. Even if I doubled that to 640 seconds it would still be ridiculously fast. And how often do you write your drive completely full at once?

5 Gbps is around 625 MB/s; reduce that by roughly 30% for USB protocol overhead and you get around 440 MB/s. That is still almost fast enough to saturate the fastest SSD, and when you consider that these drives will cost hundreds, it's not much of a deal to the average consumer. Consumers mainly care about capacity in external hard drives, not speed.
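Putting numbers on the three claims above in one place. The 800 MB/s Thunderbolt and 200 MB/s mechanical-drive figures come from the posts; the ~30% USB 3.0 protocol overhead is a rough assumption, not a measured value:

```python
import math

TB_MBPS = 800    # Apple's advertised Thunderbolt throughput, MB/s
HDD_MBPS = 200   # a fast mechanical drive, MB/s

# Mechanical drives needed to saturate the Thunderbolt link
drives = math.ceil(TB_MBPS / HDD_MBPS)

# Time to fill a 256 GB SSD at full link speed
fill_seconds = 256 * 1000 / TB_MBPS

# USB 3.0: 5 Gbps raw, minus an assumed ~30% protocol overhead
usb3_effective = 5e9 / 8 / 1e6 * 0.70   # ~437 MB/s

print(drives, fill_seconds, usb3_effective)
```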

I personally think it would have been better for Apple to have announced this first, allowed the market to start producing peripherals, notified the PC world about Thunderbolt (because if it is to take off, it needs to be common), used the space the Thunderbolt chip takes up to put dedicated graphics in the 13-inch MBP, and then released Thunderbolt in the next generation.
 
Well, you could always do a RAID array of drives to compensate for the speed of mechanical drives; since hard drives are pretty cheap nowadays, you could get a lot of storage at really fast write speeds.

Also, you wouldn't want drives already maxing out the throughput of the tech; that would make it obsolete far quicker. You always want room for improvement!

And I don't think the TB controller takes up enough space that they could have fit a dGPU in instead, but maybe?

That isn't a fault of the GPU, that is Apples fault. eyefinity (AMD) is a 3+ display tech. With the top end cards able to drive 6 displays.
OK, thanks, you're right: I was reading an old article saying the GPUs couldn't support more than two displays. I should have looked at the date *dohh*
 
What I think people may not realize is that per Thunderbolt controller you can only have two mDPs. So if Apple wanted to enable the 6 monitor eyefinity support for the top end GPU they would have to include 3 Thunderbolt controllers (per Intels graphic) because they don't support more than 2 DP connections per controller.
So, if Apple had been smart they would have replaced today's ports (Ethernet, FW, TB, 3x USB) with a DP 1.2 port and 4 TB+USB 3 ports. Ethernet and FireWire could be handled with dongles to TB, and this would have given a future upgrade path to 10G Ethernet. On the motherboard, the FW controller would have been replaced with another TB controller.

As a result, there would be one port fewer than now and only two types, so if Apple's long-term goal is fewer, simpler ports, this would be it.
If they chose to keep the number of ports the same, they could even add micro-HDMI to the mix for user convenience.
Maybe the stand-alone DP port could use the discrete GPU and the TB ports the integrated GPU.

You could attach that über-geek hyper display with 20 Gbit/s bandwidth a decade from now, or you could attach 8(!) screens to your new MBP today!
Otherwise you could also daisy-chain 24 TB devices to one MBP!
And you could have a dock connector for all these ports and devices that you could undock in one click, leaving that room full of hardware behind when heading home from work.

But no, someone said that we need shiny curves and reflective screens and fingerprints and internal batteries and FaceTime HD more than that...

TB in its present state is almost as big a joke as not including USB 3 in "the most advanced computer on the planet"!
 
Wtf?

So the people here claiming that Apple is fragmenting anything clearly do NOT know the back-story of Thunderbolt, and the author of this story clearly fails to understand the difference between Lightpeak (Intel's implementation and brand) and Apple's Thunderbolt implementation and brand. Intel does not refer to the I/O as Thunderbolt, they refer to it as and have branded it Lightpeak.

Thunderbolt / Lightpeak was Apple's idea; they went to Intel and asked their engineers to make it happen, so claiming that Apple is somehow fragmenting a new standard they helped design and develop is unfounded and silly!
 

Wrong.

http://www.intel.com/technology/io/thunderbolt/index.htm
 
It's right there with "BunnyPeople"

Intel does not refer to the I/O as Thunderbolt, they refer to it as and have branded it Lightpeak.

Wrong, and wrong.
 

Intel does not refer to the I/O as Thunderbolt, they refer to it as and have branded it Lightpeak. Thunderbolt / Lightpeak was Apples idea, they went to Intel and asked their engineers to make it happen...
Light Peak = code name
Thunderbolt = final/official product name

It was developed by Intel with other partners, Apple among them (mainly providing software support, which is why the early demonstrations were done under OS X). Most, if not all, of the other partners were involved on the optical side (companies such as Dow Corning), but that aspect has either been dropped entirely or will be implemented later, since they've been able to get it to work over copper (it was originally developed as an optical signal).

Take a look at the Thunderbolt wiki, or even go back and search for the Light Peak thread/s here in MR for further details.
 
Dude, do you have Comic Sans as your default font?

Of course - except for fixed-pitch fonts, which are Lucida Console.

(In Comic Sans, numerals are taller than letters, so a password or other key with mixed letters and numerals is much easier to read, and one is much less likely to confuse "1" (one) with "l" (el), or "O" (oh) with "0" (zero). Plus, I prefer sans-serif fonts in general.)
 
And also, in the tech demo from intel they were running a 4k ACD plugged into a lacie lbd array running 4 1080p streams simultaneously, so i'm not thinking your problem is valid at all.

And how much bandwidth do you think 4 1080p streams need? What was the bitrate on those? 10 Mbps... so 40 Mbps total?

Color me impressed. :rolleyes:

Also, link to the 4K ACD, please. 4K at 24-bit color and 60 Hz requires over 12 Gbps to drive. Thunderbolt = no dice. DP 1.1a = no dice.
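The "over 12 Gbps" figure checks out for DCI 4K (4096x2160); this counts raw pixel payload only, with no blanking overhead:

```python
def payload_gbps(width, height, bits_per_pixel, refresh_hz):
    """Raw pixel payload in Gbit/s, ignoring blanking overhead."""
    return width * height * bits_per_pixel * refresh_hz / 1e9

need = payload_gbps(4096, 2160, 24, 60)   # ~12.7 Gbps
print(need > 10.0)   # exceeds one 10 Gbps Thunderbolt channel -> True
print(need > 8.64)   # exceeds DP 1.1a's 8.64 Gbps data rate -> True
```

Even UHD 3840x2160 at the same depth and refresh comes out to roughly 11.9 Gbps, so the conclusion is the same either way.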
 
Is using a USB port really that hard? Whenever you put the female USB port in, make sure the two square holes are facing up.
how often are you plugging in female ports? I think male ports are way more prevalent... ;)

And how much bandwidth do you think 4 1080p streams need ? What was the bitrate on those ? 10 Mbps... so... 40 Mbps total ?

Color me impressed.

Also, link to the 4k ACD. Since 4k, 24 bit color, at 60 hz requires over 12 Gbps to drive. Thunderbolt = no dice. DP 1.1a = no dice.
Well, that's what the article stated. I searched YouTube and all I can find is the 2009 IDF demo; they show a display "higher than HD, roughly by 2 times" (it doesn't look like an ACD) and say it is running at 8 Gbps. But I don't see 4 movie clips in that video, so I'm not sure what demo the PCMag article was talking about. You can go watch the video of the 2009 tech demo on YouTube if you want.

Also found this video from CES 2010; go to the 1:50 mark, where he says you could extend capabilities to the display with other display/video ports and other I/O technology and have all the different protocols transferring over one cable... seems to me that pretty much confirms a hub, or at least a "hub monitor", is possible with TB.

http://www.youtube.com/watch?v=ZDHHM-NsGOo&feature=related

And another video where he specifically mentions docking stations:

http://www.youtube.com/watch?v=izNoF1SWtSg
 
FYI, ACD = Apple Cinema Display. There has never been a 4K ACD.
And for those demos, it makes a big difference whether they are streaming a raw display data stream or a compressed/uncompressed video stream.
 
This is interesting:

http://www.brightsideofnews.com/new...es-thunderbolt-apple-trademark-situation.aspx

Regarding Sony and the use of a USB connector for Thunderbolt, it seems that Sony is seriously pushing it out with their systems:

Dave Salvator, Senior Communications Manager at Intel, wrote the following:

in a nutshell, Apple filed for the original trademark and is now transferring that trademark to Intel. At the same time, Apple will continue to have unrestricted use of the technology. 3rd party implementations such as Sony's desire to use USB Connector instead of DisplayPort one and the eventual change of technology branding (Sony's IEEE1394 a.k.a. Firewire implementation was named i.LINK) will have to be ironed out as the time passes by.
 
FYI, ACD = apple cinema display. There has not been 4k ACD.
And for those demos, it is pretty different thing if they are streaming raw "display data stream" or compressed / uncompressed video stream.
I know what an Apple Cinema Display is, and I know there hasn't been a 4K one. PCMag is the one who said it, not me; I said it doesn't look like a 4K display to me, but they could also be talking about a different demo. The PCMag article quote is:

As such, Thunderbolt can drive both an external display as well as serve as a data connection. In a demonstration, Intel showed off a 4K Apple Cinema Display being driven by a Thunderbolt cable from a LaCie Little Big Disk array, with four 1080p uncompressed videos being displayed. The DisplayPort, data, and audio data are all intermuxed (or mixed) across the same channel, with data priorities given where needed, executives said. If necessary, Thunderbolt uses the second channel for additional data.
The demonstration showed a peak throughput rate of about 700 MBps. Thunderbolt transmits data with 8 ns of accuracy, and low latency; up to seven devices can be daisy-chained together, executives said.

And if things are being demoed/prototyped, you don't think Apple could have given them a high-res display? It's not like Apple doesn't have things in their labs that haven't been released to the public.
 
pcmag said:
As such, Thunderbolt can drive both an external display as well as serve as a data connection. In a demonstration, Intel showed off a 4K Apple Cinema Display being driven by a Thunderbolt cable from a LaCie Little Big Disk array, with four 1080p uncompressed videos being displayed. The DisplayPort, data, and audio data are all intermuxed (or mixed) across the same channel, with data priorities given where needed, executives said. If necessary, Thunderbolt uses the second channel for additional data.
The demonstration showed a peak throughput rate of about 700 MBps. Thunderbolt transmits data with 8 ns of accuracy, and low latency; up to seven devices can be daisy-chained together, executives said.
Hmm, the article says that video from the LaCie was shown on the monitor.
So they were demonstrating both a data connection to storage and a display connection to a screen. 4K could have been just the frame size of the video on the LaCie, and the screen was probably just a normal ACD with an mDP connection.

4 uncompressed 1080p streams can mean 4:2:2 with 8-bit color @24 fps, which is only about 800 Mbit/s per stream, or it could mean 4:4:4 with 16-bit color @60 fps, which would be about 6000 Mbit/s per stream (where TB could not carry even 2 streams...).
Anyway, I guess that display was driven at its native resolution and color depth.
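The two bitrates quoted above can be reproduced like this. A 4:2:2 stream carries 2 samples per pixel (luma plus alternating chroma), a 4:4:4 stream carries 3:

```python
def stream_mbps(width, height, samples_per_pixel, bits_per_sample, fps):
    """Uncompressed video bitrate in Mbit/s."""
    return width * height * samples_per_pixel * bits_per_sample * fps / 1e6

low = stream_mbps(1920, 1080, 2, 8, 24)    # 4:2:2, 8-bit, 24 fps  -> ~796 Mbit/s
high = stream_mbps(1920, 1080, 3, 16, 60)  # 4:4:4, 16-bit, 60 fps -> ~5972 Mbit/s
print(round(low), round(high))
```

Four of the low-end streams fit easily in one 10 Gbps Thunderbolt channel; four of the high-end ones would need almost 24 Gbps.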
and if things are being demoed/prototyped you don't think apple could have given them a high res display? It's not like apple doesn't have things in their labs that haven't been released to the public.
Are you new to following Apple? Apple never shows anything from their labs. Ever.
 
Also what I was thinking, but there is still an IDF demo from 2009 which shows a 4K display; just because it's not from Apple doesn't really mean anything.

Odd, considering every 4K TV/monitor I have been able to find uses HD-SDI/DVI/HDMI ports. I am still trying to find one that uses DP (or even mDP).
 