NVidia NV30 NV35 Specs?


arn
May 20, 2002, 05:03 PM
Per an Xlr8yourmac (http://www.xlr8yourmac.com/) link, F1Gamers posted (http://www.f1gamers.com/f1/apanel/view_news.php?id=2649) unofficial specs of upcoming NVidia graphics cards. The first cards appear to be aimed at the 3rd or 4th quarter of 2002.

A previous ZDNet article (http://www.macrumors.com/pages/2002/05/20020502135243.shtml) revealed that this new video card/chipset would be a fundamentally new architecture, distinct from the GeForce 4.

firewire2001
May 20, 2002, 05:53 PM
That's kinda interesting... my really good friend who used to live here just moved to San Jose, which really sucks...

His dad (I dunno if I'm supposed to say this...) works for NVidia... the reason he moved was because of work. His dad is really smart; he, for instance, found the one transistor out of millions of transistors in the GeForce 3 for the Xbox that was placed improperly on the board, which caused the chip to malfunction.

At NVidia he's in a high position and is one of the primary chip designers... anyways.

He's really a PC guy, and so is his son -- I helped him build his first PC a few years ago. One day I was over at his house (about two weeks before he would move) and he explained the main reason he couldn't work from so far away anymore (this is really amazing -- he did all the schematics for the cards from his laptop at home, would upload them to NVidia's servers, and then visit San Jose every so often). He realized I was so into Macs, so he told me one of the main reasons he couldn't work so far away anymore was that he had to be in closer contact with the guys who write the drivers, because NVidia was moving towards the Mac in general, and other platforms.

Maybe this was also in conjunction with the new GeForce 4 architecture... just a thought.

shadowfax0
May 20, 2002, 10:24 PM
How's this 8x AGP going to work with anything? I know AGP is backwards compatible, but I think this might be too much of a step backwards for someone to really squeeze all the juice out of it... My friend has a 6x in his P4 1.2 GHz, with that fancy RDRAM, so maybe he could get a lot of potential out of it, but I think I have better graphics anyway with Quartz, compared to whatever the hell XP uses (my guess is it's something along the lines of... crap... yeah, I think that would sum it up nicely) :)

Catfish_Man
May 20, 2002, 10:32 PM
...like a copy of the Matrox Parhelia 512. Maybe nVidia will get around to making a Mac version (Matrox hasn't).

Mr. Anderson
May 20, 2002, 10:33 PM
NV30 specs:
0.13 micron process
400MHz GPU
512-bit chip structure
AGP 8X
8 rendering pipelines
Supports 128-256MB of DDR SDRAM
900MHz DDR SDRAM
200 million polygons per second
Lightspeed Memory Architecture III
Supports DirectX 9 and OpenGL 1.3

NV35 specs:
0.13 micron process
500MHz GPU
512-bit chip structure
AGP 8X
8 rendering pipelines
Supports 128-256MB of DDR SDRAM
1000-1200MHz DDR or QDR
400MHz RAMDAC
Lightspeed Memory Architecture III
Supports DirectX 9.1 and OpenGL 2.0

These specs are insane, wow! I still remember when having 256MB of RAM in your computer was amazing, let alone in the friggin' video card.

Could someone tell me what RAMDAC is?

Rower_CPU
May 20, 2002, 10:40 PM
Originally posted by dukestreet
These specs are insane, wow! I still remember when having 256MB of RAM in your computer was amazing, let alone in the friggin' video card.

Could someone tell me what RAMDAC is?

RAMDAC is the frequency of the RAM...in this case 400 MHz would be quad-pumped to get 1200MHz overall.

Funkatation
May 20, 2002, 11:50 PM
To copy from whatis.com

"RAMDAC (random access memory digital-to-analog converter) is a microchip that converts digital image data into the analog data needed by a computer display. A RAMDAC microchip is built into the video adapter in a computer. It combines a small static RAM (SRAM) containing a color table with three digital-to-analog converters that change digital image data into analog signals that are sent to the display's color generators, one for each primary color - red, green, and blue. In a cathode ray tube (CRT) display, an analog signal is sent to each of three electron guns. With displays using other technologies, the signals are sent to a corresponding mechanism. "

Basically, it's the part that drives your display... It's not the frequency of the RAM. Most RAMDACs are rated at 350-400MHz (to provide clear, crisp 2D/3D on your screen).
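
For anyone who wants to picture that lookup-then-convert step in code, here's a rough C sketch of my own (not from the whatis.com article or from any real driver; the palette size, the 0.7 V full-scale figure, and the names are just assumptions for illustration):

/* Conceptual RAMDAC: a palette index from the framebuffer is looked up in a
   small SRAM color table, and each 8-bit channel is scaled to an analog
   level (roughly 0.7 V full-scale for VGA). Purely illustrative. */
#include <stdio.h>

struct rgb { unsigned char r, g, b; };   /* one entry in the SRAM color table */

static struct rgb palette[256];          /* the "RAM" part of the RAMDAC */

/* the "DAC" part: map an 8-bit code to an analog level in volts */
static double dac(unsigned char code) {
    return (code / 255.0) * 0.7;
}

int main(void) {
    palette[17] = (struct rgb){ 255, 128, 0 };   /* pretend the OS loaded an orange entry */

    unsigned char pixel_index = 17;              /* value read from the framebuffer */
    struct rgb c = palette[pixel_index];         /* SRAM lookup */

    printf("R=%.3fV G=%.3fV B=%.3fV\n", dac(c.r), dac(c.g), dac(c.b));
    return 0;
}

(In true-color modes the palette lookup is skipped and the 24-bit pixel goes more or less straight to the three DACs, but the RAM-plus-DAC idea is the same.)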

Rower_CPU
May 21, 2002, 12:05 AM
Sorry, my bad...I should have double-checked first.

But if video cards come with DVI and ADC ports now instead of VGA, what does the RAMDAC do? Changing "digital image data into analog signals" is unnecessary with digital outputs.

emdezet
May 21, 2002, 04:32 AM
Originally posted by Rower_CPU
If video cards come with DVI and ADC ports now instead of VGA, what does the RAMDAC do? Changing "digital image data into analog signals" is unnecessary with digital outputs.

The only Apple machine to come with DVI _instead_ of VGA is the Ti, which features a DVI->VGA adapter. All the other models _do_ have VGA right by the ADC. And as I understand it, the analog signal is always present on DVI- and ADC-connectors.

As long as CRTs are that much cheaper than TFTs, they will have to offer analog video output, or even the entry-level minitower would _require_ an Apple Display, making it unattractive to many potential customers.

Dr. Distortion
May 21, 2002, 06:46 AM
Hmm, didn't RAMDAC have to do with screen size and refresh rates? For example, I've got a Voodoo2 Banshee with a 250 MHz RAMDAC, and I've heard that's the reason why I can display resolutions like 1024x768 at 120 or 160 hertz refresh rate...

Rower_CPU
May 21, 2002, 10:27 AM
Originally posted by emdezet
The only Apple machine to come with DVI _instead_ of VGA is the Ti, which features a DVI->VGA adapter. All the other models _do_ have VGA right by the ADC. And as I understand it, the analog signal is always present on DVI- and ADC-connectors.

As long as CRTs are that much cheaper than TFTs, they will have to offer analog video output, or even the entry-level minitower would _require_ an Apple Display, making it unattractive to many potential customers.

Wrong. The new TiBooks have DVI only.

dantec
May 21, 2002, 10:41 AM
So basically, you're saying if I shoved one of these new cards in my Power Mac, I would never get the full potential, because I don't have AGP 8x. That is pretty damn mean... Maybe they should do an extension that plugs into 2 FireWire ports... That would be sweet.

When are they going to do multicore graphics cards?? That would be sweet, 4 NV35s in one GeForce 4 or GeForce 5 Ti...

dantec
May 21, 2002, 10:43 AM
The question is, will OpenGL 2.0 support my graphics card???

I just got my Quicksilver (July edition), and I'm screwed if I buy a new graphics card; it's as if Apple wants me to buy a new Mac to get AGP 8x and OpenGL 2.0 support. Soon we are going to see games that "need" OpenGL 2.0, and if it isn't available as an upgrade, we are all screwed!

emdezet
May 21, 2002, 12:52 PM
Originally posted by Rower_CPU
Wrong. The new TiBooks have DVI only.

Wrong. Read the f+cking tech specs!

DVI output port
VGA output support with included Apple DVI to VGA Adapter

Pants
May 21, 2002, 12:54 PM
Originally posted by dantec
The question is, will OpenGL 2.0 support my graphics card???

I just got my Quicksilver (July edition), and I'm screwed if I buy a new graphics card; it's as if Apple wants me to buy a new Mac to get AGP 8x and OpenGL 2.0 support. Soon we are going to see games that "need" OpenGL 2.0, and if it isn't available as an upgrade, we are all screwed!

No, the *real* question is, how long will you have to wait for a game that utilises OpenGL 2? If you're buying this card for its OpenGL support, you're wasting your money... best to wait for a game that uses all its bells and whistles, and then buy the card for a fraction of the price.

And no, Apple does not *want* you to buy a new box just for an AGP 8x card -- but I'm sure they'd like to take your money, and who can blame them if you're daft enough to feel you *need* this card? Do some of you guys actually *think* before posting? Or even play any games?? Sheesh! :)

'Multi-core' graphics cards... remember something called a Voodoo 5? Or twin Voodoo 2s?

emdezet
May 21, 2002, 12:59 PM
Originally posted by dantec
The question is, will OpenGL 2.0 support my graphics card???

I just got my Quicksilver (July edition), and I'm screwed if I buy a new graphics card; it's as if Apple wants me to buy a new Mac to get AGP 8x and OpenGL 2.0 support. Soon we are going to see games that "need" OpenGL 2.0, and if it isn't available as an upgrade, we are all screwed!

Well, so?
When I was complaining about X running too slowly on my 500/66 iBook, everybody flamed me :mad:

I don't like NVidia anyway. What about that rumored new ATI killer GPU that's supposed to keep NVidia at bay until 2004?

whawho
May 21, 2002, 01:22 PM
Taken from the 3Dlabs website:

Our goals for OpenGL 2.0 are to add support for pixel and fragment shaders, improve memory management and give applications more control over the rendering pipeline. In doing so, we still will provide compatibility with OpenGL 1.3 - so older applications will run on graphics accelerators with OpenGL 2.0 drivers.

I really don't think that any new video cards, or old ones for that matter, will get left behind. I think what will happen is that OpenGL 2.0 will add features to applications that are currently missing on the Mac platform because of OpenGL 1.3. This will set much-needed standards in the graphics industry. The good thing about OpenGL 2.0 is that all the major players seem to be involved (including Microsoft).
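
For the curious, here is a rough C sketch of the shader-object API the 3Dlabs proposal points at. This is my own guess at the shape of it, not code from 3Dlabs or Apple, and it assumes an OpenGL 2.0 context already exists; the fragment shader string is purely illustrative.

/* Sketch of the programmable-shader calls proposed for OpenGL 2.0.
   Illustrative only; assumes a GL 2.0 context has already been created. */
#include <OpenGL/gl.h>   /* Mac OS X framework header; <GL/gl.h> elsewhere */

static const char *frag_src =
    "void main(void) {\n"
    "    gl_FragColor = vec4(1.0, 0.5, 0.0, 1.0);\n"   /* flat orange */
    "}\n";

GLuint build_program(void) {
    /* compile the fragment shader */
    GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fs, 1, &frag_src, NULL);
    glCompileShader(fs);

    /* link it into a program object */
    GLuint prog = glCreateProgram();
    glAttachShader(prog, fs);
    glLinkProgram(prog);
    return prog;
}

/* glUseProgram(prog) switches the new pipeline on; glUseProgram(0) falls back
   to the old fixed-function path, which is the OpenGL 1.3 compatibility the
   3Dlabs quote is talking about. */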

dantec
May 21, 2002, 02:14 PM
Originally posted by emdezet


Well, so?
When I was complaining about X running too slowly on my 500/66 iBook, everybody flamed me :mad:

I don't like NVidia anyway. What about that rumored new ATI killer GPU that's supposed to keep NVidia at bay until 2004?

I was the only one who didn't. They did, however, flame me over Quartz Extreme, and the issue made it to the front page!

mc68k
May 21, 2002, 03:13 PM
Originally posted by emdezet
And as I understand it, the analog signal is always present on DVI- and ADC-connectors.
Yes, but it depends on the type of DVI connector. Analog is present in the ADC, but not used anymore AFAIK. The Apple Studio Display 17 CRT had an ADC connector, so there must be analog in there somewhere, but it is probably not utilized.

As for DVI, the analog component of the connector is the part that is shaped like a cross. That is the part that can be broken out to create the same signal as VGA, but from DVI-I. DVI-D lacks this extra part and carries just the digital signal.
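
To put that in quick code form (my own illustration, not from any spec; the names are made up): a passive DVI-to-VGA adapter only works when the connector actually carries those analog pins, i.e. DVI-I rather than DVI-D.

/* Sketch of the distinction above: only DVI-I carries the analog (VGA-style)
   pins, so only it can feed a passive DVI->VGA adapter. Names are invented. */
#include <stdbool.h>
#include <stdio.h>

enum connector { DVI_D, DVI_I, VGA_DB15 };

/* a passive adapter just rewires pins; the analog signal must already be there */
static bool passive_vga_adapter_works(enum connector c) {
    return c == DVI_I || c == VGA_DB15;   /* DVI-D is digital-only */
}

int main(void) {
    printf("DVI-I -> VGA adapter: %s\n", passive_vga_adapter_works(DVI_I) ? "works" : "no analog pins");
    printf("DVI-D -> VGA adapter: %s\n", passive_vga_adapter_works(DVI_D) ? "works" : "no analog pins");
    return 0;
}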

mc68k
May 21, 2002, 03:18 PM
I had problems with the PeeCee, hence this extra post on a Mac. Bleh.

Rower_CPU
May 21, 2002, 03:56 PM
Originally posted by emdezet
Wrong. Read the f+cking tech specs!

DVI output port
VGA output support with included Apple DVI to VGA Adapter

Sorry, an adapter does not make it analog.

Oh, by the way, the GeForce 4 Ti has ADC and DVI ports, not VGA... sounds like someone else needs to go back and read some tech specs. :rolleyes:

oldMac
May 21, 2002, 11:25 PM
Hi Rower,

I'm pretty sure that the DVI on the new TiBook is also pushing out an analog signal. Otherwise, you'd need a big converter with wires hanging off of it (not like the nice little in-line one that comes with the TiBook).

It's pretty standard these days to run both digital and analog signals out the DVI port and to bundle a VGA adapter. My ATI Radeon card came that way, too.

Rower_CPU
May 21, 2002, 11:34 PM
Originally posted by oldMac
Hi Rower,

I'm pretty sure that the DVI on the new TiBook is also pushing out an analog signal. Otherwise, you'd need a big converter with wires hanging off of it (not like the nice little in-line one that comes with the TiBook).

It's pretty standard these days to run both digital and analog signals out the DVI port and to bundle a VGA adapter. My ATI Radeon card came that way, too.

Hey, welcome to the "demi-God(dess) club"!

It seems that way, but since we are moving away from an analog signal, why should the RAMDAC play an important role?

mc68k
May 21, 2002, 11:35 PM
Originally posted by oldMac
Hi Rower,

I'm pretty sure that the DVI on the new TiBook is also pushing out an analog signal. Otherwise, you'd need a big converter with wires hanging off of it (not like the nice little in-line one that comes with the TiBook).

It's pretty standard these days to run both digital and analog signals out the DVI port and to bundle a VGA adapter. My ATI Radeon card came that way, too.
Yes, it is DVI-I; I'm with oldMac. Look at the picture above: all the adapter needs to do is break the signal out from the right-hand side of the connector to DB-15.

mc68k
May 21, 2002, 11:39 PM
Originally posted by Rower_CPU


Hey, welcome to the "demi-God(dess) club"!

It seems that way, but since we are moving away from an analog signal, why should the RAMDAC play an important role?
Marketing. You know the whole MHz thing; the same applies to the RAMDAC. If you don't have a CRT and there's no need for a RAMDAC, then it's just "we've got a fast RAMDAC, buy me!"

But they do still need to push the RAMDAC because of the large installed CRT base. Until that goes away, it's still a needed spec.

Rower_CPU
May 21, 2002, 11:41 PM
I stand corrected...I'll just slink back to my server room and plot on how to get back at you all...:p

mc68k
May 21, 2002, 11:43 PM
Originally posted by Rower_CPU
I stand corrected...I'll just slink back to my server room and plot on how to get back at you all...:p
Uh-oh... that might be worse for some of us than others ;) ... *runs and hides*

Spare me! Not another pie war!

emdezet
May 22, 2002, 02:46 AM
Originally posted by Rower_CPU
I stand corrected...I'll just slink back to my server room and plot on how to get back at you all...:p

About that GeForce Titanium with ADC and DVI... I don't care.
And excuse me for "f+cking" snapping atcha. :D
But I don't take kindly to "Wrong <period>" when I'm right.

Well, it's nice to know another stupid IT-industry acronym (RAMDAC), but to hell with it. My iBook has gotten me so used to TFTs by now that I don't plan on buying another CRT ever again, unless my SONY 29" TV breaks down :(

Anyway, a friend told me about ATI planning this next-generation GPU, which they seem to be mighty proud of. Anyone have any specifics, and advice on how to read them?