NVidia NV30 NV35 Specs?

firewire2001

macrumors 6502a
Apr 2, 2002
718
0
Hong Kong
that's kinda interesting... my really good friend who used to live here just moved to San Jose, which really sucks...

his dad (i dunno if I'm supposed to say this...) works for nvidia... the reason he moved was because of work... his dad is really smart, and he, for instance, found the one transistor out of millions in the GeForce 3 for the Xbox that was placed improperly, which caused the chip to malfunction.

at nvidia, he holds a high position and is one of the primary chip designers... anyways

he's really a PC guy, and so is his son -- I helped him build his first PC a few years ago. One day I was over at his house (about two weeks before he moved) and he explained the main reason he couldn't work from so far away anymore (this is really amazing: he did all the schematic work for the cards from his laptop at home, would upload them to nvidia's servers, then visit San Jose every so often). He knew I was really into Macs, so he told me one of the main reasons he couldn't work so far away anymore was that he had to be in closer contact with the guys who wrote the drivers, because nvidia was moving towards the Mac in general, and other platforms.

maybe this was also in conjunction with the new GeForce 4 architecture... just a thought
 

shadowfax0

macrumors 6502
May 2, 2002
408
0
How's this 8x AGP going to work with anything? I know AGP is backwards compatible, but I think this might be too much of a step backwards for someone to really squeeze all the juice out of it... My friend has AGP 4x in his P4 1.2 GHz, and that fancy RDRAM, so maybe he could get a lot of potential out of it, but I think I have better graphics anyway with Quartz, compared to whatever the hell XP uses (my guess is it's something along the lines of... crap... yeah, I think that would sum it up nicely) :)
 

Mr. Anderson

Moderator emeritus
Nov 1, 2001
22,561
0
VA
NV30 specs:
0.13 micron process
400MHz GPU
512-bit chip structure
AGP 8X
8 rendering pipelines
Supports 128-256MB of DDR SDRAM
900MHz DDR SDRAM
200 million polygons per second
Lightspeed Memory Architecture III
Supports DirectX 9 and OpenGL 1.3

NV35 specs:
0.13 micron process
500MHz GPU
512-bit chip structure
AGP 8X
8 rendering pipelines
Supports 128-256MB of DDR SDRAM
1000-1200MHz DDR or QDR
400MHz RAMDAC
Lightspeed Memory Architecture III
Supports DirectX 9.1 and OpenGL 2.0
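The listed memory clocks translate to raw bandwidth with some simple math. Note the bus width is an assumption here (the lists above don't give one; 128-bit is used purely for illustration):

```python
# Rough memory-bandwidth math for the listed effective clocks.
# The spec lists don't state a bus width, so the 128-bit default
# below is an illustrative assumption, not a quoted spec.

def bandwidth_gb_s(effective_mhz, bus_bits=128):
    """Effective clock (MHz) x bus width (bytes) -> GB/s."""
    return effective_mhz * 1e6 * (bus_bits / 8) / 1e9

print(bandwidth_gb_s(900))   # NV30: 900MHz effective DDR -> 14.4 GB/s
print(bandwidth_gb_s(1200))  # NV35: 1200MHz effective    -> 19.2 GB/s
```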


These specs are insane, wow! I still remember when having 256MB of RAM in your computer was amazing, let alone in the friggin Video card.

Could someone tell me what RAMDAC is?
 

Rower_CPU

Moderator emeritus
Oct 5, 2001
11,219
0
San Diego, CA
Originally posted by dukestreet
These specs are insane, wow! I still remember when having 256MB of RAM in your computer was amazing, let alone in the friggin Video card.

Could someone tell me what RAMDAC is?
RAMDAC is the frequency of the RAM...in this case 400 MHz would be quad-pumped to get 1200MHz overall.
 

Funkatation

macrumors regular
Dec 29, 2001
138
0
That's not the RAMDAC...

To copy from whatis.com

"RAMDAC (random access memory digital-to-analog converter) is a microchip that converts digital image data into the analog data needed by a computer display. A RAMDAC microchip is built into the video adapter in a computer. It combines a small static RAM (SRAM) containing a color table with three digital-to-analog converters that change digital image data into analog signals that are sent to the display's color generators, one for each primary color - red, green, and blue. In a cathode ray tube (CRT) display, an analog signal is sent to each of three electron guns. With displays using other technologies, the signals are sent to a corresponding mechanism. "

Basically, it's the part that drives your display... It's not the frequency of the RAM. Most RAMDACs are rated at 350-400MHz (to provide a clear, crisp 2D/3D image on your screen).
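The whatis.com blurb boils down to a lookup plus three DACs. Here's a toy sketch of that idea; the tiny palette and the 0.7V full-scale level (a common VGA analog level) are illustrative assumptions, not anything from a real chip:

```python
# Toy model of the RAMDAC described above: a small color table
# (the SRAM) feeding three digital-to-analog conversions, one per
# primary. Palette contents and the 0.7V full-scale output are
# assumptions for illustration only.

PALETTE = {0: (0, 0, 0), 1: (255, 0, 0), 2: (0, 255, 0), 3: (255, 255, 255)}

def ramdac(pixel_index, full_scale_v=0.7):
    """Look up an indexed pixel, then scale each channel to a voltage."""
    r, g, b = PALETTE[pixel_index]
    return tuple(round(c / 255 * full_scale_v, 3) for c in (r, g, b))

print(ramdac(1))  # pure red -> (0.7, 0.0, 0.0)
```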
 

Rower_CPU

Moderator emeritus
Oct 5, 2001
11,219
0
San Diego, CA
Sorry, my bad...I should have double-checked first.

But if video cards come with DVI and ADC ports now instead of VGA, what does the RAMDAC do? Changing "digital image data into analog signals" is unnecessary with digital outputs.
 

emdezet

macrumors member
Feb 12, 2002
55
0
cologne, germany
Originally posted by Rower_CPU
If video cards come with DVI and ADC ports now instead of VGA, what does the RAMDAC do? Changing "digital image data into analog signals" is unnecessary with digital outputs.
The only Apple machine to come with DVI _instead_ of VGA is the Ti, which features a DVI->VGA adapter. All the other models _do_ have VGA right by the ADC. And as I understand it, the analog signal is always present on DVI- and ADC-connectors.

As long as CRTs are that much cheaper than TFTs, they will have to offer analog video-output, or even the entry-level minitower would _require_ an Apple Display making it unattractive to many potential customers.
 

Dr. Distortion

macrumors regular
May 2, 2002
159
0
Eindhoven, the Netherlands
Hmm, doesn't RAMDAC have to do with screen size and refresh rates? For example, I've got a Voodoo Banshee with a 250MHz RAMDAC, and I've heard that's the reason why I can display resolutions like 1024x768 at 120 or 160 hertz refresh rate...
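That link between RAMDAC speed, resolution, and refresh rate is just pixel-clock arithmetic: the DAC has to emit one pixel per clock, plus extra clocks for the blanking intervals. The ~25% blanking overhead below is a typical CRT figure, assumed for illustration:

```python
# Why a 250MHz RAMDAC can cover 1024x768 at high refresh rates:
# required pixel clock = width x height x refresh, plus blanking.
# The 25% blanking overhead is a typical CRT value, assumed here.

def pixel_clock_mhz(width, height, refresh_hz, blanking=0.25):
    return width * height * refresh_hz * (1 + blanking) / 1e6

print(pixel_clock_mhz(1024, 768, 120))  # ~118 MHz, well under 250
```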
 

Rower_CPU

Moderator emeritus
Oct 5, 2001
11,219
0
San Diego, CA
Originally posted by emdezet
The only Apple machine to come with DVI _instead_ of VGA is the Ti, which features a DVI->VGA adapter. All the other models _do_ have VGA right by the ADC. And as I understand it, the analog signal is always present on DVI- and ADC-connectors.

As long as CRTs are that much cheaper than TFTs, they will have to offer analog video-output, or even the entry-level minitower would _require_ an Apple Display making it unattractive to many potential customers.
Wrong. The new TiBooks have DVI only.
 

dantec

macrumors 6502a
Nov 6, 2001
605
0
California
So basically, you're saying if I shoved one of these new cards in my Power Mac, I would never get its full potential, because I don't have AGP 8x. That is pretty damn mean... Maybe they should do an extension that plugs into two FireWire ports... That would be sweet.

When are they going to do multicore graphics cards?? That would be sweet: 4 NV35s in one GeForce 4 or GeForce 5 Ti...
 

dantec

macrumors 6502a
Nov 6, 2001
605
0
California
The question is, will OpenGL 2.0 support my graphics card???

I just got my Quicksilver (July edition), and I'm screwed if I buy a new graphics card, so Apple wants me to buy a new Mac to get AGP 8x & to support OpenGL 2.0. Soon we are going to see games that "need" OpenGL 2.0, and if it isn't an upgrade, we are all screwed!
 

Pants

macrumors regular
Aug 21, 2001
194
3
Originally posted by dantec
The question is, will OpenGL 2.0 support my graphics card???

I just got my Quicksilver (July edition), and I'm screwed if I buy a new graphics card, so Apple wants me to buy a new Mac to get AGP 8x & to support OpenGL 2.0. Soon we are going to see games that "need" OpenGL 2.0, and if it isn't an upgrade, we are all screwed!
no, the *real* question is: how long will you have to wait for a game that utilises OpenGL 2? if you're buying this card for its OpenGL support, you're wasting your money... best wait for a game that uses all its bells and whistles, and then buy it for a fraction of the price.

And no, Apple does not *want* you to buy a new box just for an AGP 8x card, but I'm sure they'd like to take your money, and who can blame them if you're daft enough to feel you *need* this card? Do some of you guys actually *think* before posting? or even play any games?? sheesh! :)

'multi core' graphics cards... remember something called a Voodoo 5? or twin Voodoo 2s?
 

emdezet

macrumors member
Feb 12, 2002
55
0
cologne, germany
Originally posted by dantec
The question is, will OpenGL 2.0 support my graphics card???

I just got my Quicksilver (July edition), and I'm screwed if I buy a new graphics card, so Apple wants me to buy a new Mac to get AGP 8x & to support OpenGL 2.0. Soon we are going to see games that "need" OpenGL 2.0, and if it isn't an upgrade, we are all screwed!
Well, so?
When I was complaining about X running too slowly on my 500/66 iBook, everybody flamed me :mad:

I don't like NVidia anyway. What about that rumored new ATI killerGPU, that they want to keep NVidia at bay with til 2004?
 

whawho

macrumors regular
May 7, 2002
134
0
Columbus, OH
OpenGL 2.0

Taken from 3dlabs website:

Our goals for OpenGL 2.0 are to add support for pixel and fragment shaders, improve memory management and give applications more control over the rendering pipeline. In doing so, we still will provide compatibility with OpenGL 1.3 - so older applications will run on graphics accelerators with OpenGL 2.0 drivers.

I really don't think that any new video cards, or old ones for that matter, will get left behind. I think what will happen is that OpenGL 2.0 will add features to applications that the Mac platform currently lacks because of OpenGL 1.3. This will create standards in the graphics industry (much needed). The good thing about OpenGL 2.0 is that all the major players seem to be involved (including Microsoft).
 

dantec

macrumors 6502a
Nov 6, 2001
605
0
California
Originally posted by emdezet


Well, so?
When I was complaining about X running too slowly on my 500/66 iBook, everybody flamed me :mad:

I don't like NVidia anyway. What about that rumored new ATI killerGPU, that they want to keep NVidia at bay with til 2004?
I was the only one who didn't. They did however flame me with Quartz Extreme, and the issue made it to the front page!
 

mc68k

macrumors 68000
Apr 16, 2002
1,996
0
Originally posted by emdezet
And as I understand it, the analog signal is always present on DVI- and ADC-connectors.
Yes, but it depends on the type of DVI connector. Analog is present in the ADC, but not used anymore AFAIK. The Apple Studio Display 17 CRT had an ADC connector, so there must be analog in there somewhere, but it is probably not utilized.

As for DVI, the analog component of the connector is the part that is shaped like a cross. That is the part that can be broken out to carry the same signal as VGA, but from DVI-I. DVI-D lacks this extra part and carries just the digital signal.
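As I understand it, that "cross" section is a set of extra analog contacts (usually labeled C1-C5 in DVI pinout diagrams), which is why a passive DVI-to-VGA adapter can exist at all. A small sketch of the distinction (the contact labels below are from my reading of the DVI pinout, so treat them as an assumption):

```python
# Rough map of the analog section of a DVI connector ("the cross").
# DVI-I includes these contacts; DVI-D omits them, which is why only
# DVI-I can feed a passive VGA adapter. Labels are assumed from
# common DVI pinout diagrams.

ANALOG_CONTACTS = {
    "C1": "analog red",
    "C2": "analog green",
    "C3": "analog blue",
    "C4": "analog horizontal sync",
    "C5": "analog ground (the blade)",
}

def has_analog(connector):
    """True if the connector variant carries the analog contacts."""
    return connector == "DVI-I"

print(has_analog("DVI-I"), has_analog("DVI-D"))  # True False
```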
 

mc68k

macrumors 68000
Apr 16, 2002
1,996
0
DVI Pic

I had problems with the PeeCee, hence this extra post on a Mac. Bleh.
 

Attachments

Rower_CPU

Moderator emeritus
Oct 5, 2001
11,219
0
San Diego, CA
Originally posted by emdezet
Wrong. Read the f+cking tech specs!

DVI output port
VGA output support with included Apple DVI to VGA Adapter
Sorry, an adapter does not make it analog.

Oh, by the way, the GeForce4 Ti has ADC and DVI ports, not VGA... sounds like someone else needs to go back and read some tech specs. :rolleyes:
 

oldMac

macrumors 6502a
Oct 25, 2001
522
1
Pretty sure it's running an analog signal in there...

Hi Rower,

I'm pretty sure that the DVI on the new TiBook is also pushing out an analog signal. Otherwise, you'd need a big converter with wires hanging off of it (not like the nice little in-line one that comes with the TiBook).

It's pretty standard these days to run both digital and analog signals out the DVI port and to bundle a VGA adaptor. My ATI Radeon card came that way, too.
 

Rower_CPU

Moderator emeritus
Oct 5, 2001
11,219
0
San Diego, CA
Re: Pretty sure it's running an analog signal in there...

Originally posted by oldMac
Hi Rower,

I'm pretty sure that the DVI on the new TiBook is also pushing out an analog signal. Otherwise, you'd need a big converter with wires hanging off of it (not like the nice little in-line one that comes with the TiBook).

It's pretty standard these days to run both digital and analog signals out the DVI port and to bundle a VGA adaptor. My ATI Radeon card came that way, too.
Hey, welcome to the "demi-God(dess) club"!

It seems that way, but since we are moving away from an analog signal, why should the RAMDAC play an important role?
 

mc68k

macrumors 68000
Apr 16, 2002
1,996
0
Re: Pretty sure it's running an analog signal in there...

Originally posted by oldMac
Hi Rower,

I'm pretty sure that the DVI on the new TiBook is also pushing out an analog signal. Otherwise, you'd need a big converter with wires hanging off of it (not like the nice little in-line one that comes with the TiBook).

It's pretty standard these days to run both digital and analog signals out the DVI port and to bundle a VGA adaptor. My ATI Radeon card came that way, too.
Yes, it is DVI-I; I'm with oldMac. Look at the picture above: all the adapter needs to do is break out the signal from the right-hand side of the connector to DB-15.
 

mc68k

macrumors 68000
Apr 16, 2002
1,996
0
Re: Re: Pretty sure it's running an analog signal in there...

Originally posted by Rower_CPU


Hey, welcome to the "demi-God(dess) club"!

It seems that way, but since we are moving away from an analog signal, why should the RAMDAC play an important role?
Marketing. You know the whole MHz thing; the same applies to RAMDAC. If you don't have a CRT there's no need for a RAMDAC, and then it's just "we've got a fast RAMDAC, buy me!"

But they do still need to push RAMDAC specs because of the large CRT install base. Until that goes away, it's still a needed spec.