Hector said:
PCI-X is not PCI Express; they are two different things. The G5 has PCI-X slots (turbocharged PCI slots). PCI Express is more like the next AGP, think of it as AGP 16x, but it is also a replacement for PCI and PCI-X.
Actually, current PCI Express cards perform about the same as current AGP cards. In many cases PCI Express is actually slower. I've seen a couple of tests confirming this ... anandtech.com is just one.

As far as I have been able to understand, the advantage of PCI Express is that it is a serial design, as opposed to AGP, which is a parallel design. The serial design makes it easier (cheaper?) to produce and easier to speed up in the future.

PCI Express x16 just means that it uses 16 serial links to communicate. It does not mean that it's twice as fast as AGP 8x.
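
For anyone who wants the raw numbers, here's a rough back-of-the-envelope sketch in Python (the per-lane and AGP figures below are the commonly published theoretical peaks, not measurements from those tests):

# Rough theoretical bandwidth: PCI Express x16 vs. AGP 8x.
# PCIe 1.0 runs 2.5 GT/s per lane with 8b/10b encoding -> ~250 MB/s per lane, each direction.
PCIE_LANE_MBPS = 250

def pcie_bandwidth_mbps(lanes):
    """Peak one-direction bandwidth of a PCIe 1.0 link with the given lane count."""
    return lanes * PCIE_LANE_MBPS

# AGP 8x: 66 MHz base clock x 4-byte-wide bus x 8x data strobing ~= 2.1 GB/s, one way at a time.
agp_8x_mbps = 66 * 4 * 8

print("PCIe x16:", pcie_bandwidth_mbps(16), "MB/s each direction")  # 4000 MB/s
print("AGP 8x:  ", agp_8x_mbps, "MB/s shared")                      # ~2112 MB/s
# x16 buys headroom on paper, but as the tests above suggest,
# current cards don't come close to saturating either bus.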
 
ddtlm said:
Mav451:
Can't tell if you are confused or not, but dual-link DVI is not the same thing as dual DVI ports. Apple's 6800 has two dual-link DVI ports; I'm not sure what other cards do.

Yup, I think what the previous poster was seeing is a "dual head" setup, two ports for two monitors.

Does anyone know exactly how dual-link (or dual dual-link) works? I was originally under the impression that both DVI ports were linked up via a two-headed cable to the 30" monitor, but if the 6800 can handle two 30" monitors, how does that work?
 
whooleytoo said:
Does anyone know exactly how dual-link (or dual dual-link) works? I was originally under the impression that both DVI ports were linked up via a two-headed cable to the 30" monitor, but if the 6800 can handle two 30" monitors, how does that work?

Good question. When I talked to an Apple engineer yesterday, he told me both ports were 'standard DVI'.

I think Jobs made a mistake; I think it takes both ports to power the setup.

The 'dual link' probably refers to having the two ports function as one display software-wise (so when you 'Present Movie' in QT, for instance, it'll fill the whole screen using both ports).

I wonder if there will eventually be a PCI-X version that we can link to the first one using NVIDIA's newly re-released SLI technology.
 
slughead said:
Good question. When I talked to an Apple engineer yesterday, he told me both ports were 'standard DVI'.

I think Jobs made a mistake; I think it takes both ports to power the setup.

The 'dual link' probably refers to having the two ports function as one display software-wise (so when you 'Present Movie' in QT, for instance, it'll fill the whole screen using both ports).

I wonder if there will eventually be a PCI-X version that we can link to the first one using NVIDIA's newly re-released SLI technology.

Put the crack pipe down. If Steve said you can drive 2 of them on one card, then you can. That means each one only uses one DVI port, which means dual-link DVI is like a super DVI that is backwards compatible.


And don't get PCI-X confused with PCI Express; they are two very different things.
 
Hector said:
Put the crack pipe down. If Steve said you can drive 2 of them on one card, then you can. That means each one only uses one DVI port, which means dual-link DVI is like a super DVI that is backwards compatible.

Judging by Apple's tech docs, that seems to be the case. Although there's no indication of what exactly dual-link DVI is, my guess is it's just a "clock-doubled" DVI.
 
BrianKonarsMac said:
Like the guy above said... it's almost necessary for Doom 3, not to mention it's id's preferred card (which makes one wonder why... I thought the X800 was better? Less power draw, higher fps, smaller).

Regardless, it's worth the money. You already dropped $2.5k+ on the computer; is another $0.5k that much more to ensure the best performance? Not to mention it's damn sexy and will make you feel cool.

The GeForce 6800 beats the ATI X800 in most benchmarks, though it draws more power. What's more, the architecture of the GeForce 6800 is much more robust than ATI's and has far more "breathing room" for growth of the chipset. This speaks to superior platform viability for NVIDIA over ATI.



blakespot
 
Hector said:
put the crack pipe down if steve said you can drive 2 of them on one card then you can

Just like Jobs talking about 3GHz, right?

Anyway, I read up on dual link, and basically the limit depends on total pixels times the refresh rate in hertz.

I had no idea Apple would release a $3,200 monitor with a max of 60Hz, but I overestimated them! At 60Hz, dual link is more than enough for 4.1M pixels (and this screen runs at just under that, of course)!

Again, it looks like Apple didn't make a monumental mistake. Dual-link DVI can transmit enough data to run 5,500,000 pixels at 60Hz!

So yes, standard dual-link DVI + [the crappy] 60Hz covers way more than 2560x1600. At that res, they could get over 80Hz!
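
If anyone wants to check my math, here's the same arithmetic as a quick Python sketch (the 165 MHz per-link pixel clock is the published single-link DVI figure, and blanking intervals are ignored, same as above):

# Dual-link DVI pixel budget, ignoring blanking intervals (same simplification as above).
LINK_PIXEL_CLOCK = 165_000_000   # a single DVI link tops out at a 165 MHz pixel clock
LINKS = 2                        # "dual link" = two TMDS links driven in parallel

pixels_per_second = LINK_PIXEL_CLOCK * LINKS            # 330M pixels/s total

print("Max pixels at 60Hz:", pixels_per_second // 60)   # 5,500,000

width, height = 2560, 1600
print("Max refresh at 2560x1600: %.1f Hz" % (pixels_per_second / (width * height)))  # ~80.6 Hz

# Real DVI timings spend part of that budget on blanking, so the usable
# numbers are lower -- roughly why 2560x1600 at 60Hz is the practical ceiling.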
 
slughead said:
Just like Jobs talking about 3GHz, right?

Anyway, I read up on dual link, and basically the limit depends on total pixels times the refresh rate in hertz.

I had no idea Apple would release a $3,200 monitor with a max of 60Hz, but I overestimated them! At 60Hz, dual link is more than enough for 4.1M pixels (and this screen runs at just under that, of course)!

Again, it looks like Apple didn't make a monumental mistake. Dual-link DVI can transmit enough data to run 5,500,000 pixels at 60Hz!

So yes, standard dual-link DVI + [the crappy] 60Hz covers way more than 2560x1600. At that res, they could get over 80Hz!

Quick question... I thought that with LCD panels, refresh rate wasn't really an issue, as there isn't an electron gun sweeping across a phosphor screen painting the display line by line; when an LCD screen refreshes, all the pixels are updated at once, so no flickering. The only problem would be playing some kind of game at a frame rate higher than the panel could supply; then the actual FPS you saw on the screen would be limited to the refresh rate of the screen itself (in this case, 60fps).

Err... so my question is, does a 60Hz refresh rate on an LCD panel matter? 60Hz would be bad on a CRT and cause eye-strain, but is the same true for a 60Hz panel? Is there any discernible difference between an 80Hz panel and a 60Hz panel?
 
Will the NVIDIA 6800 make graphics applications, particularly Final Cut Pro and, I guess, DVD Studio Pro, any faster?

If so, by a lot?

THANKS
 
Confidence in 6800 performance?

I'm about to jump back into the Mac world with a dual 2.5GHz G5 and 23" HD Cinema Display, and I had been planning on going with the Radeon 9800XT before the WWDC announcement. I'm wondering about going for the 6800, since we don't have any indications of its capabilities... There are the PC reviews, but those aren't of the card that will be in the G5, and I've heard some statements saying that the Apple flavors of some video cards were downclocked. The card doesn't look like the other 6800 cards, and NVIDIA has nothing on their website about it.

Other downsides would be cost and availability... But if the card is enough of an improvement over the 9800XT, I can cope with that.

I am a gamer, though how much I'll do on my Mac vs PC I don't know.

Bob
 
Fordan said:
There are the PC reviews, but those aren't of the card that will be in the G5, and I've heard some statements saying that the Apple flavors of some video cards were downclocked. The card doesn't look like the other 6800 cards, and NVIDIA has nothing on their website about it.

Other downsides would be cost and availability... But if the card is enough of an improvement over the 9800XT, I can cope with that.

You're absolutely right to be concerned. The truth is we don't know.

The fact that it has dual-monitor support, each port with dual-link DVI, is a definite advantage over the PC version; however, we won't have any benchmarks until the cards come out.

The biggest problem, IMO, with Mac video cards is that the drivers aren't optimized and re-optimized every month (or even every six months) like they are on the PC. The second biggest problem is that Apple doesn't let third parties benchmark its hardware before release, so early adopters are always the ones getting hosed.

So basically, nobody knows if the 6800 will be as fast as the PC version. However, I assure you it will be at least 50% faster than the 9800XT for Mac.



------------

Final Cut Pro:

No, this will not increase performance until at least Tiger, and even then Apple would have to modify FCP to use the new Core Video. Core Video doesn't even exist yet.
 
slughead said:
You're absolutely right to be concerned. The truth is we don't know.

Actually, in https://forums.macrumors.com/threads/77363/, JNasty4G63 makes a pretty good case for the 6800 not being underclocked based on the specs that Apple published about the card.

And so, given that it's not underclocked, and that NVIDIA focuses more on OpenGL while ATI focuses on DirectX, the 6800 is probably the way I will go, even if it means no Mac before September. :(

(Besides, the 30" may become cheaper in the future. :D )

Bob
 