NVIDIA 6800 GT DDL coming in November!

Converted2Truth

macrumors 6502a
Original poster
Feb 6, 2004
608
0
Hell@HighAltitude
Apple's site says in red (when you configure a powermac G5) right under the video card selection that:

Note: The NVIDIA GeForce 6800 GT DDL graphics card will be available for order in early November.

Choose from the latest graphics card options from both ATI and NVIDIA, including the new, advanced NVIDIA GeForce 6800 Ultra DDL or NVIDIA GeForce 6800 GT DDL with dual DVI ports. All cards include dual display support and ship with a DVI to VGA adapter. Special note on the ATI Radeon 9800 XT and NVIDIA GeForce 6800 Ultra DDL or 6800 GT DDL: the larger size of these advanced graphics cards reduces the number of available PCI or PCI-X slots from three to two on the Power Mac G5.
Learn more
 

Converted2Truth

NVIDIA GeForce 6800 Ultra DDL or NVIDIA GeForce 6800 GT DDL
The groundbreaking NVIDIA GeForce 6800 graphics processor delivers the industry's first 16-pipe superscalar architecture and support for the world's fastest GDDR3 memory to raise the bar for 3D graphics performance. The specifications of the GeForce 6800 GT DDL GPU are stunning: using over 220 million transistors, it supports a 256-bit memory interface for an effective memory bandwidth of 32 GB per second, delivering 525 million vertices and 5.6 billion textured pixels per second. The specifications for the GeForce 6800 Ultra DDL take these numbers up yet another notch, to 35.2 GB per second of throughput, 600 million vertices, and 6.4 billion textured pixels per second. Both GPUs feature NVIDIA's industry-leading CineFX 3.0 technology, allowing unprecedented special effects to be processed in real time.

Both the GeForce 6800 Ultra and GT GPUs are built on an AGP 8X board and include 256MB of GDDR3 memory for use in the most demanding graphics applications. Both cards support the DVI standard dual-link digital signal specification on each of the two DVI ports that are included. This capability is required to drive the new Apple Cinema HD Display, the first high-resolution 30-inch LCD. The combination of a GeForce 6800 Ultra or GT DDL with a dual-processor Power Mac G5 driving two 30-inch Apple Cinema HD Displays is the definitive tool for the creative professional. Special note on the NVIDIA GeForce 6800 Ultra DDL and GeForce 6800 GT DDL: due to the larger size of these advanced graphics cards, the available PCI or PCI-X slots will be reduced from three to two.
The NVIDIA GeForce 6800 GT DDL graphics card will be available for order in early November.
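For anyone wondering where Apple's headline numbers come from, they fall straight out of pipes × core clock and bus width × effective memory clock. Here's a quick sketch; note the 350/500 MHz (GT) and 400/550 MHz (Ultra) clocks are the commonly reported stock clocks for these parts, not something Apple's page actually states:

```python
# Back-of-the-envelope check of the quoted numbers for the two DDL cards.
# Assumed stock clocks (not from Apple's copy): GT 350 MHz core / 500 MHz
# GDDR3, Ultra 400 MHz core / 550 MHz GDDR3.

def bandwidth_gb_s(bus_bits, mem_clock_mhz):
    """GDDR3 is double data rate, so effective transfers = 2 x memory clock."""
    return (bus_bits / 8) * (2 * mem_clock_mhz) / 1000  # bytes/transfer x MT/s -> GB/s

def fill_rate_gtexel_s(pipes, core_clock_mhz):
    """One textured pixel per pipe per clock."""
    return pipes * core_clock_mhz / 1000

for name, core, mem in [("6800 GT DDL", 350, 500), ("6800 Ultra DDL", 400, 550)]:
    print(name,
          f"{bandwidth_gb_s(256, mem):.1f} GB/s,",
          f"{fill_rate_gtexel_s(16, core):.1f} Gtexels/s")
# 6800 GT DDL 32.0 GB/s, 5.6 Gtexels/s      <- matches Apple's 32 GB/s, 5.6 billion
# 6800 Ultra DDL 35.2 GB/s, 6.4 Gtexels/s   <- matches 35.2 GB/s, 6.4 billion
```

Both sets of quoted figures line up exactly with 16 pipes and a 256-bit bus at those clocks, which is a nice sanity check that the marketing numbers are just theoretical peaks.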
 

iriejedi

macrumors 6502a
Oct 4, 2000
807
116
Nor Cal
go ultra

If you're paying that much money, pay the $450 (or $405 edu) and go for the ULTRA version... wow, it is amazing....


:)

Converted2Truth said:
It is also available as an upgrade kit for $499.
 

a2daj

macrumors member
Sep 14, 2004
88
2
I'm wondering if they added the 6800 GT due to inventory issues with the Ultra...
 

Converted2Truth

a2daj said:
I'm wondering if they added the 6800 GT due to inventory issues with the Ultra...
The NVIDIA chips are manufactured on the IBM process, just like the G5. And just like the G5 chip, the NV40 (or whatever it's called) has serious yield issues. NVIDIA can't get their hands on these chips because IBM can't figure out how to increase yield. So, just like the G5s, all the defective 6800 Ultras are clocked down until they work, and are then sold as 6800 GTs.

The G5s are the same way. A 1.6 G5 is a defective 2.0 down-clocked (to 1.6) so it works without error. Likewise (now here is when I start blowing smoke), the overly efficient 2.0 G5s manufactured at 90 nm are then overclocked to 2.5 GHz, water cooled, and sold as a 2.5 GHz G5.

If only IBM would figure out how to manufacture microchips... then we'd have G5s and 6800 Ultras! And even better, if they were efficient at it, then there would be no 1.6 G5s and 6800 GTs (unless they intentionally downclocked fully functional chips).
 

Converted2Truth

Xenious said:
So what are the technical differences between the two cards?
NVIDIA GeForce 6800 Ultra DDL: 35.2 GB/sec, 6.4 bil TP/sec, 600 mil vertices

NVIDIA GeForce 6800 GT DDL: 32 GB/sec, 5.6 bil TP/sec, 525 mil vertices

---and just for kicks---

ATI 9800 Pro Special Ed (rides on the short bus now... :eek: ): 3.0 bil TP/sec (aka 'gigapixels'), 380 mil vertices (aka MTriangles)

ATI X800 Pro: 5.7 bil TP/sec, 475 mil vertices
 

Dont Hurt Me

macrumors 603
Dec 21, 2002
6,056
6
Yahooville S.C.
Converted2Truth said:
NVIDIA GeForce 6800 Ultra DDL: 35.2 GB/sec, 6.4 bil TP/sec, 600 mil vertices

NVIDIA GeForce 6800 GT DDL: 32 GB/sec, 5.6 bil TP/sec, 525 mil vertices

---and just for kicks---

ATI 9800 Pro Special Ed (rides on the short bus now... :eek: ): 3.0 bil TP/sec (aka 'gigapixels'), 380 mil vertices (aka MTriangles)

ATI X800 Pro: 5.7 bil TP/sec, 475 mil vertices
Plus, it turns out that the 6800 GT is easy to bump up the clock on in the PC world; some sites have had it running at 435 MHz vs. 350 for stock. My BFG 6800 GT has two fans and is clocked at 370, just for anyone's information. There are many reviews of the 6800 GT on the net, and most I have seen have been very positive. The base 6800 has 12 pipes; the 6800 Ultra and GT both have 16 pipes.
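Since all these parts have the same 16 pipes, theoretical fill rate just scales linearly with core clock. A rough sketch, using the clocks mentioned above (350 MHz stock, 370 MHz on the BFG card, 435 MHz reported overclock, 400 MHz stock Ultra for comparison):

```python
# Theoretical textured-pixel fill rate on a fixed 16-pipe part scales
# linearly with core clock; clocks below are the ones quoted in the thread.

PIPES = 16

def gtexels(core_mhz, pipes=PIPES):
    """Peak textured pixels per second, in billions (one per pipe per clock)."""
    return pipes * core_mhz / 1000

for label, mhz in [("stock GT", 350), ("BFG GT", 370),
                   ("overclocked GT", 435), ("stock Ultra", 400)]:
    print(f"{label}: {gtexels(mhz):.2f} Gtexels/s")
# A 435 MHz GT (6.96 Gtexels/s) nominally outruns a stock 400 MHz Ultra (6.40).
```

Of course that's only the paper number; whether a given card actually holds 435 MHz stable is another matter.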
 

applekid

macrumors 68020
Jul 3, 2003
2,098
0
We probably won't see any good overclocking tools for the 6800 GT DDL.

LOL. I was reading the red print when I was checking out the latest low-end PowerMac and only assumed it was talking about the 6800 Ultra. Silly me. :)
 

Anarchy99

macrumors 6502a
Dec 13, 2003
858
758
CA
invaLPsion said:
Nope, they upped the price on the 6800 ultra to an extra $500 (or $450 edu).
What? When I look at the cards, it says $599.99 for the Ultra and $499.99 for the GT. Am I being stupid, or am I missing something? I don't see the $500 raise.
 

Converted2Truth

$499 is a hell of a lot better than $600. Expect the X800 to be the same. However, we might be able to get the X800 for cheaper, because Apple doesn't hold the monopoly on distribution of ATI cards as they do with NVIDIA cards.

I don't know though... when cards are this expensive, what's an extra $100 to have the best? Then again, 'the best' comes with outdated drivers and no override panels, so is it really the best?
 

Converted2Truth

Pre-orders are being taken now (well, yesterday and on).

I found a way to preorder the card. Click on Accessories -> Displays -> [click the 'Learn more about this product' link] below the NVIDIA 6800 GT DDL... and there's your 'Add to Cart' icon. I successfully placed an order, so it should work for all you guys.
 

Dont Hurt Me

Another example of how being the oddball doesn't help you. Blame Apple, the PPC, but you guys are talking about a card I have been using for a year and a half. When Apple goes all Intel there will be no more Mac cards; imagine the ability to use any video card with your new Power Mac, plus the ability to use newer cards. The Intel move should put an end to the "we are getting it for $100 more and a year and a half later" syndrome. :(
 

Eric5h5

macrumors 68020
Dec 9, 2004
2,406
357
Dont Hurt Me said:
When apple goes all Intel there will be no more Mac cards, imagine the ability to use any video card with your new powermac? Plus the ability to use newer cards. The Intel move should put a end to the we are getting it for $100 more and a year and half later syndrome.:(
Don't count on it. Aside from possible firmware issues (not sure how likely, but possible), there's the driver problem. Cards don't "just work," you need drivers to run them. Well, technically they all have generic fallback modes, but are you really going to spend $500 for the latest and greatest just to run it in 640x480 non-accelerated 16 color mode? Are ATI and nVidia suddenly going to start writing drivers for OS X themselves? Realistically, no. It will still be all about waiting for Apple.

--Eric