OK, all we need is a third party to make a high-res 27" monitor with a built-in powerhouse PCIe graphics card that supports Thunderbolt and Win7 at the same time. I'm not going to hold my breath. Of course, if Apple makes a Thunderbolt monitor with a built-in card, it would have native OS X support. But I guarantee they would make it so you could not upgrade the card.
 
Not enough bandwidth in Thunderbolt yet. Thunderbolt = 4x PCIe. Modern video cards use 16x PCIe.
 
Not enough bandwidth in Thunderbolt yet. Thunderbolt = 4x PCIe. Modern video cards use 16x PCIe.

Please see the multiple posts in this thread disputing this claim. It's annoying when people spread information that is not correct. To summarize: modern GPUs support 16x but do not utilize that potential bandwidth, except for the highest-end cards. Even then, 4x is enough to get over 90% of the potential out of the highest-end GPUs, so this implementation could greatly benefit those who are willing to pay for the option.
 
I don't think you can, currently. The video would be sent to the monitor directly connected to the GPU's output. The only way around that would be some sort of SLI setup where the GPU in the iMac acts as the master: frames are rendered on the slave GPUs, then composited and displayed by the master GPU. Chances are that would introduce some input lag.

What about Thunderbolt's target mode? Couldn't it be done that way? Perhaps not, because, as you say, the iMac's GPU would need to be "shut off." :(
 
Not enough bandwidth in Thunderbolt yet. Thunderbolt = 4x PCIe. Modern video cards use 16x PCIe.

I'd rather have 4x PCIe than what I've got going with the Intel HD 3000. Plus, if you'd read the article Arn posted earlier, you'd have seen that the performance differences between 4x and 16x are negligible. Still, even if there were a reduction in speed, I'd rather have 75% of the performance of a mid-range graphics card than what my 2011 13" MBP gets from the Intel HD 3000.
 
I wonder how many different graphics cards would actually be used with any such chassis before the built-in graphics of the cheapest Mac matches the external graphics. Given two years of usage per card, I suspect most people would use a single card, or two at most, before upgrading to a new computer altogether, making the external option redundant. Perhaps external graphics solutions should have the GPU integrated in them to lower costs and improve their form factor, making them compelling for a short-term upgrade, which is all most of these external chassis will ever be.
 
as long as PCIe exists

I wonder how many different graphics cards would actually be used with any such chassis before the built-in graphics of the cheapest Mac matches the external graphics. Given two years of usage per card, I suspect most people would use a single card, or two at most, before upgrading to a new computer altogether, making the external option redundant. Perhaps external graphics solutions should have the GPU integrated in them to lower costs and improve their form factor, making them compelling for a short-term upgrade, which is all most of these external chassis will ever be.

As long as PCIe exists, you'll be able to upgrade the external card to something better than whatever Apple embeds in a system.

PCIe cards will advance at the same rate as internal GPUs - they won't fall behind.
 
Not enough bandwidth in Thunderbolt yet. Thunderbolt = 4x PCIe. Modern video cards use 16x PCIe.

Shame Thunderbolt isn't fast enough to support full-speed PCIe cards; otherwise it'd be a lot more interesting.

Actually, it doesn't matter. Even a GTX 580 cannot fully saturate a PCI Express 2.0 4x slot, which offers 16 Gb/s in each direction.

Thunderbolt will do 20 Gb/s bidirectionally, so it can fully support a 4x slot with room to spare.

The highest CPU-to-GPU bandwidth I have ever seen is around 7 Gb/s bidirectional.

You may see a slight hit in performance, but no current GPU is going to bottleneck on a 4x slot. It's still going to be leaps and bounds faster than anything you can currently get with mobile graphics.
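
To put rough numbers on that (a back-of-the-envelope sketch; the per-lane figure assumes PCIe 2.0's 5 GT/s with 8b/10b encoding, and the ~7 Gb/s is just the observed peak mentioned above, not a spec):

[CODE]
# Rough per-direction bandwidth comparison, using commonly cited figures.
pcie2_lane_gbps  = 5.0 * 8 / 10         # 5 GT/s with 8b/10b -> ~4 Gb/s usable per lane
pcie2_x4_gbps    = pcie2_lane_gbps * 4  # ~16 Gb/s for an x4 link
pcie2_x16_gbps   = pcie2_lane_gbps * 16 # ~64 Gb/s for a full x16 slot
thunderbolt_gbps = 2 * 10.0             # two 10 Gb/s channels per Thunderbolt port
observed_gpu_gbps = 7.0                 # rough peak CPU<->GPU traffic cited above

print(f"PCIe 2.0 x4 : {pcie2_x4_gbps:.0f} Gb/s")
print(f"PCIe 2.0 x16: {pcie2_x16_gbps:.0f} Gb/s")
print(f"Thunderbolt : {thunderbolt_gbps:.0f} Gb/s")
print(f"Typical GPU traffic: ~{observed_gpu_gbps:.0f} Gb/s")
[/CODE]

So even an x4 link leaves headroom over what the card actually pushes across the bus in practice.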

Really, the only limitation will be how much power the cards require. Currently they are using two external bricks to allow up to 225 watts (75 W for the PCIe slot, and 75 W for each 6-pin 12 V connector). The 6-pin power connectors have a max rating of 75 watts, so as long as your video card has two 6-pin power connectors or fewer, you won't have to worry about power issues.

The GTX 570 only has two 6-pin power connectors, so it should work fine with their current power implementation. The GTX 580, however, has one 6-pin and one 8-pin connector (which supplies up to 150 W), so it can draw up to 300 W. So depending on whether they beef up their power bricks or not, you may have to "settle" for GTX 570 speeds.
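
Spelling out the same power budget (assuming the standard 75 W slot / 75 W 6-pin / 150 W 8-pin limits mentioned above):

[CODE]
# PCIe power budget sketch, using the connector limits quoted above.
SLOT_W, SIXPIN_W, EIGHTPIN_W = 75, 75, 150

def card_budget(six_pin=0, eight_pin=0):
    """Max power a card can draw from the slot plus its auxiliary connectors."""
    return SLOT_W + six_pin * SIXPIN_W + eight_pin * EIGHTPIN_W

print("Enclosure (slot + 2x 6-pin):", card_budget(six_pin=2), "W")              # 225 W available
print("GTX 570 (2x 6-pin):        ", card_budget(six_pin=2), "W")               # 225 W - fits
print("GTX 580 (6-pin + 8-pin):   ", card_budget(six_pin=1, eight_pin=1), "W")  # 300 W - over budget
[/CODE]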

The biggest problem, however, will be driver support: either drivers for the graphics cards themselves on OS X, or Thunderbolt drivers for Windows 7. While Thunderbolt drivers for Windows may take another year, they will definitely come eventually. The bigger question is whether NVIDIA or AMD will start making OS X drivers. One glimmer of hope for OS X users is that it seems you can currently get full GeForce 5xx-series card support in Lion with only minor text edits.

This is my current list of parts for my next gaming rig when this product launches (rough total tallied below). I already have a monitor, keyboard, and mouse:

$599 - Mac Mini www.apple.com
$80 -- 8GB DDR3 1333 http://eshop.macsales.com/item/Other World Computing/1333DDR3S08S/
$204 - 120GB OWC SSD http://eshop.macsales.com/item/Other World Computing/SSDEX6G120/
$20 -- 2.5" External Enclosure http://eshop.macsales.com/item/Other World Computing/ES2.5BPU2S/
$18 -- Torx Tool Set http://eshop.macsales.com/item/Newer Technology/TOOLKIT11/
$300 - Thunderbolt ViDock 4plus http://www.villageinstruments.com/tiki-index.php?page=Store
$330 - Geforce GTX570 http://www.newegg.com/Product/Product.aspx?Item=N82E16814130593
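
A quick sanity check on the total (list prices only, before tax and shipping):

[CODE]
# Rough total for the parts list above.
parts = {
    "Mac Mini": 599, "8GB DDR3": 80, "120GB OWC SSD": 204,
    "2.5in enclosure": 20, "Torx tool set": 18,
    "ViDock 4plus": 300, "GTX 570": 330,
}
print(sum(parts.values()), "USD")  # 1551
[/CODE]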
 
Wow, this is really cool. Things might be looking up for gaming on Apple machines in the future...
 
As I understand it, the Thunderbolt port allows a device to tap directly into the RAM. Would it be possible to have a RAM extension in the form of a Thunderbolt memory stick? That would be a very interesting upgrade for, say, a MacBook Air limited to a meagre 4GB of RAM.
 
It's 10Gb/s per channel and only the PCIe channel can be used to transfer data required by the GPU.

I've heard one of the channels CAN be used for DisplayPort, but this is the first I've heard that one of the channels can ONLY be used for DisplayPort and nothing else, even if it's not being utilized by any devices.

How then does it work on MBAs that only have a single channel?

I thought the whole point of Thunderbolt was that the channels can be used for anything, at any time, all together on the same channel. So you may share bandwidth with the DisplayPort protocol while using a monitor in the daisy chain along with other things in the chain, like an external graphics card. Graphical output doesn't use 10Gb/s of bandwidth; it would be a huge waste if you had to dedicate an entire 10Gb/s line that could only do that and nothing else.
 
I've heard one of the channels CAN be used for DisplayPort, but this is the first I've heard that one of the channels can ONLY be used for DisplayPort and nothing else, even if it's not being utilized by any devices.

Everything that I have read points to the opposite. An example from AnandTech:

Apple claims that one of the channels is used for DisplayPort while the other is used for PCIe.

http://www.anandtech.com/show/4489/promise-pegasus-r6-mac-thunderbolt-review

How then does it work on MBAs that only have a single channel?

The MBA is dual-channel. Other Macs have quad-channel controllers, although only the iMac uses all four (the other Macs have only one port, thus only two channels).

I thought the whole point of Thunderbolt was that the channels can be used for anything, at any time, all together on the same channel. So you may share bandwidth with the DisplayPort protocol while using a monitor in the daisy chain along with other things in the chain, like an external graphics card. Graphical output doesn't use 10Gb/s of bandwidth; it would be a huge waste if you had to dedicate an entire 10Gb/s line that could only do that and nothing else.

A 2560x1440 display at 60Hz and 24-bit color needs 7.87Gb/s of bandwidth. That's already pretty close to the 10Gb/s mark. Also, you can't mix signals: it's either DisplayPort or PCIe; you can't use 8Gb/s for DP and then the remaining 2Gb/s for PCIe.
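
If anyone wants to see where a figure in that ballpark comes from, here's the rough arithmetic (this sketch assumes CVT reduced-blanking timings and DisplayPort's 8b/10b line encoding; the exact number depends on which timing standard you plug in):

[CODE]
# Back-of-the-envelope display bandwidth for 2560x1440 @ 60 Hz, 24-bit color.
# Assumes CVT-RB total timings (2720 x 1481) and 8b/10b line encoding.
h_active, v_active, refresh, bpp = 2560, 1440, 60, 24
h_total, v_total = 2720, 1481   # active pixels + blanking intervals

raw_gbps  = h_active * v_active * refresh * bpp / 1e9        # ~5.3 Gb/s of pixel data
link_gbps = h_total * v_total * refresh * bpp / 1e9 * 10 / 8 # ~7.3 Gb/s on the wire

print(f"raw pixel data: {raw_gbps:.2f} Gb/s")
print(f"link rate     : {link_gbps:.2f} Gb/s  (most of one 10 Gb/s channel)")
[/CODE]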
 
Apple claims that one of the channels is used for DisplayPort while the other is used for PCIe.

I read that article too, but it doesn't specify that one channel is ONLY for DisplayPort, only that one channel carries DisplayPort. It doesn't say you can't have anything else on that channel as well.

I'm not sure you can take a quote from a review site, which is itself quoting an arguably obscure and unclear comment from a random Apple representative, as gospel without technical proof to back it up.

Hellhammer said:
A 2560x1440 display at 60Hz and 24-bit color needs 7.87Gb/s of bandwidth. That's already pretty close to the 10Gb/s mark. Also, you can't mix signals: it's either DisplayPort or PCIe; you can't use 8Gb/s for DP and then the remaining 2Gb/s for PCIe.

Yes, it is close to the mark, but what if you're not using a Thunderbolt display? Then that entire 10Gb/s channel would be completely wasted. If I get a Mac Mini and use the Thunderbolt port for my graphics card, then my display would be plugged into the external graphics card, so I wouldn't need DisplayPort, and I could fully utilize the potential of the port for graphics performance alone.

Who says you can't mix signals? We mix signals across physical media in computer technology all the time, and there are lots of ways to do it. You can package protocols inside another "universal" protocol for transmission, then unpack and route them where they should go at the other end. You could use time-division or frequency-division multiplexing to separate different protocols on the same physical medium. It all depends on how the Thunderbolt chipset is designed to handle things, but nothing physically or electrically keeps different protocols from using the same physical medium at the same time.
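
Purely as an illustration of that "package and route" idea (a toy sketch of the general concept, not how Thunderbolt is actually specified), conceptually it's as simple as this:

[CODE]
# Toy protocol multiplexing: tag each packet with a protocol ID, interleave them
# onto one link, and demultiplex at the far end. Illustration only.
from collections import defaultdict
from itertools import zip_longest

def mux(streams):
    """Round-robin interleave (protocol_id, payload) packets onto one link."""
    tagged = [[(proto, p) for p in pkts] for proto, pkts in streams.items()]
    return [f for group in zip_longest(*tagged) for f in group if f is not None]

def demux(frames):
    """Route each packet back to its own protocol handler."""
    out = defaultdict(list)
    for proto, payload in frames:
        out[proto].append(payload)
    return dict(out)

link = mux({"DisplayPort": ["frame0", "frame1"], "PCIe": ["tlp0", "tlp1"]})
print(demux(link))  # {'DisplayPort': ['frame0', 'frame1'], 'PCIe': ['tlp0', 'tlp1']}
[/CODE]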

Also, remember these things are built to be daisy-chained up to 7 devices. How do you daisy-chain 7 different Thunderbolt devices, all with different protocols, over just 2 channels? I haven't read anything anywhere that says those 7 daisy-chained devices would be limited to only 2 different types of protocol.

You also have to remember that this is tech from Intel, and it is not exclusive to Apple. It is going to come out for PCs soon. Do you think everyone is going to switch from DVI/HDMI monitors to DisplayPort? I highly doubt it. So is Intel going to push an interface to PC manufacturers that gimps itself to only 50% of its potential because the PC industry doesn't use DisplayPort? Or do you think Intel is going to produce a completely separate chip just for Apple, and then a separate chip for the PC industry? Any way you look at it, Intel would take a hit. It's much more likely that these are generic channels that can be utilized by any protocol.
 
Another AnandTech quote:
http://www.anandtech.com/show/4528/the-2011-macbook-air-11-13inch-review/4
Apple clarified that you can in fact mix video and PCIe traffic on a single channel or across multiple channels.

So it seems my interpretation of the earlier quote from Apple was correct, but I understand how you could have interpreted it the way you did. It wasn't stated very clearly.

One thing I read, though, is that the Thunderbolt controller is connected to the PCH through a PCIe 2.0 4x link, which is limited to 16Gb/s. With Light Ridge it's dual 4x links, and with Eagle Ridge it's only a single 4x link. So even if you only have a single Thunderbolt port but have a Light Ridge chip, you would be able to fully utilize your 20Gb/s pipe, since you'd have 32Gb/s of bandwidth behind the chip to the PCH. But if you have Eagle Ridge and a single Thunderbolt port, you'd be limited to 16Gb/s of bandwidth on your one port.

So you could potentially bottleneck your single Thunderbolt port by having an Eagle Ridge chip instead of a Light Ridge chip. And if you have two ports on a Light Ridge chip and use both, you could also potentially be bottlenecked.

It's still a minimum of 16Gb/s per Thunderbolt port, which will allow you to fully utilize a PCIe 2.0 4x slot for external graphics.

One thing I'm not sure of, but wonder about, is whether you could have a device with two Thunderbolt ports and utilize both for either a PCIe 2.0 8x slot or dual PCIe 2.0 4x slots for SLI or CrossFire setups with external graphics.
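
To spell out that host-side bottleneck with the numbers above (assuming the dual-4x Light Ridge and single-4x Eagle Ridge uplinks as described; these are the thread's figures, not anything official):

[CODE]
# Host-side PCH uplink vs. Thunderbolt link bandwidth, per direction.
PCIE2_X4_GBPS = 16          # one PCIe 2.0 x4 uplink
TB_PORT_GBPS  = 2 * 10      # two 10 Gb/s channels per Thunderbolt port

configs = {
    "Eagle Ridge, 1 port used":  (1 * PCIE2_X4_GBPS, 1 * TB_PORT_GBPS),
    "Light Ridge, 1 port used":  (2 * PCIE2_X4_GBPS, 1 * TB_PORT_GBPS),
    "Light Ridge, 2 ports used": (2 * PCIE2_X4_GBPS, 2 * TB_PORT_GBPS),
}
for name, (uplink, ports) in configs.items():
    print(f"{name}: uplink {uplink} Gb/s vs. ports {ports} Gb/s "
          f"-> effective {min(uplink, ports)} Gb/s")
[/CODE]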
 
As I said before: Thunderbolt is going to destroy a good portion of the Mac Pro market. After this, I won't buy another Mac Pro in the future. iMacs will be fine, or even Mac Minis in a few years as the processors step up more and more.
 
Looking at that prototype picture, I can't help but think Apple could make such a device the 'base' of an iMac monitor and thus have a full-blown desktop GPU and extra ports on an iMac without putting all the extra heat in the back of the monitor, where it presents a problem for the other components. Imagine an iMac that's an actual desktop equivalent for once.

I do like the idea of a MacBook Pro that could dock and become a regular desktop. I would no longer have to own separate computers in the future and just might not need that Hackintosh I was planning on building in the next year. OTOH, a new MBP plus such a device would inevitably cost nearly as much as a base-model Mac Pro, and given that my current MBP is still working fine, maybe it's not so attractive (unless you already own a recent Thunderbolt MBP). I can build a Mac Pro-killer Hackintosh (in the areas that count for consumers) for the price of a base-model 13" MacBook Pro. Now maybe a Mac Mini plus such a hub could work as a reasonable price combination to create a 'gaming Mac/PC combo' that isn't so far-fetched in price relative to a Hackintosh.
 
I wonder how many different graphics cards would actually be used with any such chassis before the built-in graphics of the cheapest Mac matches the external graphics. Given two years of usage per card, I suspect most people would use a single card, or two at most, before upgrading to a new computer altogether, making the external option redundant. Perhaps external graphics solutions should have the GPU integrated in them to lower costs and improve their form factor, making them compelling for a short-term upgrade, which is all most of these external chassis will ever be.

Interesting point; perhaps the need for upgradability is somewhat overblown. At least by using a solution such as this (whether as a chassis, or with an integrated GPU) a Mac user can be cutting edge at some point, since if you buy a Mac you start off with a low- to mid-range card to begin with.

Sadly, there's not a snowball's chance of Apple ever supporting this. They seem to be less and less interested in modular, upgradable designs; it's too complex and ungainly, and most importantly 'hardcore' gaming just doesn't feature on their radar. :(
 
Interesting point; perhaps the need for upgradability is somewhat overblown. At least by using a solution such as this (whether as a chassis, or with an integrated GPU) a Mac user can be cutting edge at some point, since if you buy a Mac you start off with a low- to mid-range card to begin with.

Sadly, there's not a snowball's chance of Apple ever supporting this. They seem to be less and less interested in modular, upgradable designs; it's too complex and ungainly, and most importantly 'hardcore' gaming just doesn't feature on their radar. :(

No mobile graphics card even comes close to a true desktop-grade one, so this idea that it doesn't matter two years later is patently absurd. It matters on day 1, because mobile graphics simply don't cut it if you want to game. Apple doesn't have to directly support it. It could still let you use anything Apple already has a graphics driver for (anything made for the Mac Pro, for example), and if NVidia or AMD stepped in to fill the slack, they could probably ensure at least one decent card a year. It doesn't have to support everything, just something decent for computers that are otherwise nearly useless for any kind of serious gaming (whether in OSX or in Windows).

Imagine connecting a $999 bare-bones MacBook or even a $599 bare-bones Mac Mini up to this thing and being able to play games that normally only a Mac Pro could handle. Then there's Windows. Even if Apple won't support this device in OSX, it could still be a Mac user's best friend for gaming in Windows on the same machine with Boot Camp (it would just need a Windows driver for the device itself and you could use standard NVidia and Radeon drivers afterwards). In other words, it would save you having to buy a separate machine to play Windows games (and maybe OSX ones as well, at least for some cards). I know if I had the latest Mac Mini, I'd rather spend $600 on this with a good graphics card than $1200+ on a whole separate PC with the same card.
 
Very good post, but unlikely that The Steve is listening. Maybe "The Tim" will listen, though. Perhaps Apple will focus a bit more on "Macs" in the post-Steve era.


...it would just need a Windows driver for the device itself and you could use standard NVidia and Radeon drivers afterwards)....

I don't think that I've ever seen a graphics card that didn't work in a generic VGA/SVGA mode under Windows. No worries about driver support - the generic driver is fine for installation and simple use.

Of course, any Windows user with a clue knows to immediately go to the Nvidia website (or ATI if the Windows user has 3/4 of a clue) and download the full Nvidia driver soon after the initial login.
 
@Magnus

I totally agree. I would much rather have a $600 Mac Mini with a $300 graphics card + $300 vidock than a completely separate PC for gaming.

The other great thing about this is its modularity. I know many Apple fans who have multiple products: MBP, MBA, Mac Mini, etc. You could invest in a single $300 external enclosure and then use it with any of them, switching it between them depending on where you are and what you're doing.

So one day it may be sitting on your desk hooked up to your Mac Mini and your 27" monitor, and the next you take it with you to use with your MBA.

I hear they are working on the possibility of using a single TB port both to drive external graphics and to send the output back up the line to display on your laptop/iMac screen. While you would effectively be cutting your bandwidth in half to accommodate the video output signal, it would give you portability, so you don't have to carry a monitor around with you while on the road.

It just allows for so many different possibilities and options. The convenience factor alone makes it way better than building a second PC just for gaming.
 
it would surprise me

I hear they are working on the possibility of using a single TB port both to drive external graphics and to send the output back up the line to display on your laptop/iMac screen.

Any links to add to that rumour?

It seems like a very difficult problem to solve, for very few potential beneficiaries. (If you have a GPU in a TBolt expander, you're guaranteed to be on mains power. Connect a monitor to the GPU rather than trying to push the display upstream to the laptop. The use cases where you have mains power but cannot use a monitor are few.)

A similar thing would be SLI - but that uses an external cable to allow the GPUs to team, it isn't done over the PCIe bus.

Adding an external GPU for CUDA, of course, would be a piece of cake. Nvidia even sells "GPU cards" without monitor connections for just that situation - you want the GPU for CUDA, not for the screens.
 
Any links to add to that rumour?

It seems like a very difficult problem to solve, for very few potential beneficiaries. (If you have a GPU in a TBolt expander, you're guaranteed to be on mains power. Connect a monitor to the GPU rather than trying to push the display upstream to the laptop. The use cases where you have mains power but cannot use a monitor are few.)

A similar thing would be SLI - but that uses an external cable to allow the GPUs to team, it isn't done over the PCIe bus.

Adding an external GPU for CUDA, of course, would be a piece of cake. Nvidia even sells "GPU cards" without monitor connections for just that situation - you want the GPU for CUDA, not for the screens.

Not necessarily. The current crop of eGPUs can use the internal screen, especially when using NVIDIA GPUs, thanks to Optimus.
 
Any links to add to that rumour?

It seems like a very difficult problem to solve, for very few potential beneficiaries. (If you have a GPU in a TBolt expander, you're guaranteed to be on mains power. Connect a monitor to the GPU rather than trying to push the display upstream to the laptop. The use cases where you have mains power but cannot use a monitor are few.)

A similar thing would be SLI - but that uses an external cable to allow the GPUs to team, it isn't done over the PCIe bus.

Adding an external GPU for CUDA, of course, would be a piece of cake. Nvidia even sells "GPU cards" without monitor connections for just that situation - you want the GPU for CUDA, not for the screens.

I've been discussing it with the ViDock rep on their Facebook page. He specifically described how internal mobile graphics already use the PCIe bus both to transfer data to the graphics card and to simultaneously send the rendered video back over the same PCIe bus.

While, granted, you'll still need some sort of external power (possibly vehicle power on the go), it's much easier to put a small external graphics box in your notebook bag and take it with you to use the built-in notebook display than it is to also carry a large LCD panel with you. It just makes the whole setup more portable, even though it will still require some sort of mains power.
 
I've been discussing it with the ViDock rep on their Facebook page. He specifically described how internal mobile graphics already use the PCIe bus both to transfer data to the graphics card and to simultaneously send the rendered video back over the same PCIe bus.

Back to where? The display is connected to the "graphics card"; where would video sent over the PCIe bus go?

Link to the Facebook conversation?


While, granted, you'll still need some sort of external power (possibly vehicle power on the go), it's much easier to put a small external graphics box in your notebook bag and take it with you to use the built-in notebook display than it is to also carry a large LCD panel with you. It just makes the whole setup more portable, even though it will still require some sort of mains power.

But what are you gaining?

I don't disagree with the possibility (except for the issue of the external GPU driving the internal display), but I don't see any advantages to having a mains-powered GPU connected to a portable - except for the "docking station" situation of having the large LCD display at the desktop.
______

Thinking about it for a while, I realized that perhaps hardware video decode could be a case for two-way communication with the GPU. The OS sends an encoded stream to the GPU, the GPU sends individual frames back to the OS. The application then sends the frames back to the display engine in the GPU - perhaps after post-processing.

However, since even the HD 3000 can easily do hardware decode of a 1080p source, I fail to see the need to add a mains-powered TBolt GPU to a portable for on-the-road display.
 