Probably a daft question but i'll ask anyhows so forgive my techie noobness!

With the advent of Thunderbolt and its high bandwidth, will it be possible for a gfx card to be sited externally in some kind of cradle and used as the main gfx card, or wouldn't the internal "plumbing" allow it to happen?

/noob mode off

;)

It would very well be possible. Remember, Thunderbolt is derived from Light Peak. One of the reasons to develop Light Peak was to transmit data at very fast rates over a distance. Essentially, not to have everything so close together.

In other words, you could put the CPU in room A and the RAM in room B, 20 feet away, and get the same result. This is one of the reasons Intel developed Light Peak; there were obviously many other reasons as well.

However, Thunderbolt in its current form is not suited for such lengthy runs due to its copper cabling. That said, if you had a GFX cradle on your desk, you could well use Thunderbolt's current implementation to feed it data, though you'd need multiple Thunderbolt links for it to work well. Many current GFX cards use a PCIe 2.0 x16 interface, which provides about 8 GB/s of bandwidth, while a first-generation Thunderbolt channel carries 10 Gbit/s, roughly 1.25 GB/s, so a single link falls far short. The newer PCIe 3.0 interface pushes around 16 GB/s, widening the gap even further.
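For anyone who wants to sanity-check those numbers, here's a rough back-of-the-envelope sketch in Python. All figures are per direction, and the encoding overheads are the published ones for PCIe 2.0 (8b/10b), PCIe 3.0 (128b/130b), and first-generation Thunderbolt:

```python
# Rough per-direction bandwidth figures (published line rates and encodings).

# PCIe 2.0: 5 GT/s per lane with 8b/10b encoding -> 4 Gbit/s usable per lane
pcie2_lane_gbps = 5.0 * (8 / 10)            # 4.0 Gbit/s per lane
pcie2_x16_gbs = pcie2_lane_gbps * 16 / 8    # 8.0 GB/s for an x16 slot

# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding -> ~7.88 Gbit/s per lane
pcie3_lane_gbps = 8.0 * (128 / 130)
pcie3_x16_gbs = pcie3_lane_gbps * 16 / 8    # ~15.75 GB/s, usually quoted as 16

# First-gen Thunderbolt: 10 Gbit/s per channel
tb_channel_gbs = 10.0 / 8                   # 1.25 GB/s per channel

print(f"PCIe 2.0 x16:        {pcie2_x16_gbs:.2f} GB/s")
print(f"PCIe 3.0 x16:        {pcie3_x16_gbs:.2f} GB/s")
print(f"Thunderbolt channel: {tb_channel_gbs:.2f} GB/s")
```

As the numbers show, a single Thunderbolt channel carries well under a sixth of what a PCIe 2.0 x16 slot does, which is why an external GPU over one link would be bandwidth-starved.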
 

toddybody said:
next step amd cpus

*Children Screaming in background

I'm no snob against AMD GPUs...but their CPUs are nearly two generations behind Intel's. I don't think Bulldozer is going to match the 1155 SB, much less the upcoming socket 2011 chips.

What I want to see is a 27-inch iMac with an HD 6970 2GB...Whoa whoa wee wow:eek:

AMD is not perfect by any means, but then again Intel doesn't really deserve the credit they get. Just look at the SB GPU and the bugs in SB in general. Since one can get superior GPU performance from AMD, and that is critical for some users, why not go with an entire AMD system? Yes, I know the CPU is a little behind what Intel offers, but that isn't a problem in Apple's low-end systems. Let's face it, the Mini has never had a bleeding-edge processor.

This discussion gets even more interesting when you consider AMD's coming Fusion processors. If you are about to buy a system with an integrated SoC solution, which would you rather have: an AMD GPU or an Intel one? Yeah, I realize that some people need the fastest CPUs they can get, but for many, a fast GPU delivers a better experience.

On top of all of that, AMD seems to have the same vision of the future, where the GPU becomes a more equal partner to the CPU on SoCs. AMD is all in with OpenCL support today and has future plans to make such code much lower in overhead. Right up Apple's alley.

In any event, I see a number of reasons for Apple to split sales between AMD and Intel. Long term, a few AMD-based machines from Apple would be better for both Apple and the industry.
 
First ....

They'll need to do something about the power connectors, though.

The last couple times I decided to go with a re-flashed PC version of a graphics card for my Mac Pro, I had to buy special 6-pin power connectors to go between the card and the motherboard since the PC version assumed you had a different type of power connector to use.

When you consider you often need 2 of these cables, typically priced at upwards of $20 each, that can start to make a re-flashed PC version of a given card look a lot less attractive compared to the official Apple version.


I would love to buy an off-the-shelf GPU for half the price of a Mac-branded AMD card. Please let this be true; then I will not sell my 2008 Mac Pro.
 
This is HUGE, ginormous news. If Lion, or even a later release of Snow Leopard, has this kind of support, it would revitalize the Mac gaming scene. Even 3D artists would have more options, especially when you consider how well the high-end consumer cards stack up against their FireGL competition.

Now all they need is complete 6900 series support-- yeah, I'm lookin' at you, 6990. ;-)
 
The one thing I wonder about is DRM. As it is now the connection to the display (and through DP) are protected (with either HDCP or DPCP). Do we know if LP/TB supports that protection (especially since the DP stream is actually separate from the PCIe stream)?
 
Looks like NVIDIA is going to be out of the picture for a while. After the Mac Mini, MacBook and MacBook Air are updated to Sandy Bridge/Ivy Bridge it will be all Intel/AMD graphics across the board. Apple should really think about implementing hardware acceleration for AMD/ATI cards and Intel's IGP. Hopefully it will be there in Lion.
 
The one thing I wonder about is DRM. As it is now the connection to the display (and through DP) are protected (with either HDCP or DPCP). Do we know if LP/TB supports that protection (especially since the DP stream is actually separate from the PCIe stream)?

It has to, seeing as Intel is pushing DRM protection into the CPU itself.
 

Full of Win said:
I wonder if this may imply the coming of that unicorn rider we all know and love, the 'headless Mac' (aka xMac).

Removable drives, no screen, more powerful than an iMac, $1,499.99.

That is exactly what I'm thinking! Seriously, there is no need for that many GPUs in the Pro, and the iMac requires a custom card. So where would all of these cards go? The xMac is my guess.

Or it could simply be a sign of a unified driver from AMD. That would make sense as it is a smarter approach than the highly targeted drivers of the past.
 
It has to, seeing as Intel is pushing DRM protection into the CPU itself.

But the GPU still has to decode what was sent and put it on the screen, which is why I asked whether Thunderbolt itself can do the encoding. If it can, how much overhead will that add (again, since it has to happen over the PCIe side)?

Or can you send graphics information over DP that still needs to be processed, i.e. raw frames?
 

Fusion is not just about graphics. Fusion has a DirectX 11 class GPU with true OpenCL, while Sandy Bridge and the next Atom have DirectX 10.1 class GPUs with an alpha of OpenCL which runs on the CPU side.
 
Not if they redesign the MacBooks so the video signal goes back the other way down the Thunderbolt cable and directly to the display.

So wait, you'd have to dongle a video card to the thunderbolt port to get a decent GPU for the internal monitor, if the signal can travel both ways (going out the port to get processed by this external GPU and then come back to get displayed on the internal screen).

No, just no. That's a terrible idea.

Although using a 2GB HD 6970 on a 1280x800 display is a bit silly.

How is it silly ? We're talking about a GPU. Even at 1280x800, the Intel GPU sucks, why would it be silly to want to run games on high settings ?
 
That's not clever at all. You'd still be stuck with the Intel GPU on the internal screen.

So what? Play your game on the external screen then. This will allow third displays on Macs that don't have slots. Imagine having three displays on your MacBook Pro. Or if you are a video editor, two displays and an SD or HDMI output.
 
If Apple do move to supporting off the shelf ATI cards, what are we betting that it will require a revised 2011 Mac Pro ;)
 
Is it unusual for Apple to start supporting so many graphics chips in one release?

If it was just to support a new line of iMacs/Mac Pros, they surely would only have 2 or 3 new chips. But 10? Please oh please tell me that with the increasing importance of games to Apple's bottom line, they're starting to take gaming more seriously.
 
I wonder if support is really there or just the ability to identify the cards.
 
But the GPU still has to decode what was sent and put it on the screen, which is why I asked if the TB itself can do the encoding. If it can how much overhead will that add (again as it has to happen over the PCIe side)?

Or can you send graphics information over DP that still needs to be processed, ie raw frames?

The GPU can do that; there's no need for the CPU. The CPU is just there to tell the GPU what to crunch, assuming no flags were thrown for a particular piece of DRM-protected data.

Thunderbolt is just the transmission protocol; there is no actual decoding or encoding besides what is hard-wired at the ports.
 
Thunderbolt carries only a few PCIe lanes' worth of bandwidth. You cannot do heavy graphics over it.
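To put that in perspective, here is a quick sketch converting first-generation Thunderbolt's per-channel bandwidth into equivalent PCIe 2.0 lanes (using the commonly published per-direction figures):

```python
# Express one first-gen Thunderbolt channel in PCIe 2.0 lane equivalents.

pcie2_lane_gbs = 0.5    # PCIe 2.0: ~500 MB/s usable per lane, per direction
tb_channel_gbs = 1.25   # first-gen Thunderbolt: 10 Gbit/s = 1.25 GB/s

equivalent_lanes = tb_channel_gbs / pcie2_lane_gbs
print(f"One Thunderbolt channel ~ x{equivalent_lanes:.1f} of PCIe 2.0 bandwidth")
```

So a GPU hung off a single Thunderbolt channel would see roughly x2 to x3 worth of PCIe 2.0 bandwidth, a fraction of the x16 slot it was designed for.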
 