NVidia NV30 NV35 Specs?

Discussion in 'MacRumors News Discussion (archive)' started by arn, May 20, 2002.

  1. arn
    macrumors god

    arn

    Staff Member

    Joined:
    Apr 9, 2001
    #1
    Per an Xlr8yourmac link, F1Gamers has posted unofficial specs for upcoming NVidia graphics cards. The first cards appear to be aimed at the 3rd or 4th quarter of 2002.

    A previous ZDNet article revealed that this new video card/chipset would be a fundamentally new architecture, distinct from the GeForce 4.
     
  2. macrumors 6502a

    Joined:
    Apr 2, 2002
    Location:
    Hong Kong
    #2
    That's kind of interesting... my really good friend who used to live here just moved to San Jose, which really sucks...

    His dad (I dunno if I'm supposed to say this...) works for NVidia... the reason he moved was because of work. His dad is really smart; for instance, he found the one transistor, out of millions on the GeForce 3 for the Xbox, that was placed improperly, which caused the chip to malfunction.

    At NVidia he holds a high position and is one of the primary chip designers... anyway.

    He's really a PC guy, and so is his son (I helped him build his first PC a few years ago). One day I was over at his house, about two weeks before he moved, and he explained why he couldn't work from so far away anymore. (This part is really amazing: he did all the schematic work for the cards from his laptop at home, uploaded it to NVidia's servers, and visited San Jose every so often.) He knew how into Macs I am, so he told me one of the main reasons he could no longer work remotely: he had to be in closer contact with the guys who write the drivers, because NVidia was moving toward the Mac and other platforms in general.

    Maybe this was also in conjunction with the new GeForce 4 architecture... just a thought.
     
  3. macrumors 6502

    Joined:
    May 2, 2002
    #3
    How's this 8x AGP going to work with anything? I know AGP is backwards compatible, but I think this might be too much of a step backwards for someone to really squeeze all the juice out of it... My friend has 4x AGP in his P4 1.2 GHz with that fancy RDRAM, so maybe he could get a lot of potential out of it, but I think I have better graphics anyway with Quartz, compared to whatever the hell XP uses (my guess is it's something along the lines of... crap... yeah, I think that sums it up nicely) :)
     
  4. macrumors 68030

    Catfish_Man

    Joined:
    Sep 13, 2001
    Location:
    Portland, OR
    #4
    It sounds...

    ...like a copy of the Matrox Parhelia 512. Maybe nVidia will get around to making a Mac version (Matrox hasn't).
     
  5. Moderator emeritus

    Mr. Anderson

    Joined:
    Nov 1, 2001
    Location:
    VA
    #5
    NV30 specs:
    0.13 micron process
    400MHz GPU
    512-bit chip structure
    AGP 8X
    8 rendering pipelines
    Supports 128-256MB of DDR SDRAM
    900MHz DDR SDRAM
    200 million polygons per second
    Lightspeed Memory Architecture III
    Supports DirectX 9 and OpenGL 1.3

    NV35 specs:
    0.13 micron process
    500MHz GPU
    512-bit chip structure
    AGP 8X
    8 rendering pipelines
    Supports 128-256MB of DDR SDRAM
    1000-1200MHz DDR or QDR
    400MHz RAMDAC
    Lightspeed Memory Architecture III
    Supports DirectX 9.1 and OpenGL 2.0


    These specs are insane, wow! I still remember when having 256MB of RAM in your computer was amazing, let alone on the friggin' video card.

    Could someone tell me what a RAMDAC is?
     
  6. Moderator emeritus

    Rower_CPU

    Joined:
    Oct 5, 2001
    Location:
    San Diego, CA
    #6
    RAMDAC is the frequency of the RAM...in this case 400 MHz would be quad-pumped to get 1200MHz overall.
     
  7. macrumors regular

    Joined:
    Dec 29, 2001
    #7
    That's not the RAMDAC...

    To quote whatis.com:

    "RAMDAC (random access memory digital-to-analog converter) is a microchip that converts digital image data into the analog data needed by a computer display. A RAMDAC microchip is built into the video adapter in a computer. It combines a small static RAM (SRAM) containing a color table with three digital-to-analog converters that change digital image data into analog signals that are sent to the display's color generators, one for each primary color - red, green, and blue. In a cathode ray tube (CRT) display, an analog signal is sent to each of three electron guns. With displays using other technologies, the signals are sent to a corresponding mechanism. "

    Basically, it's the part that drives your display. It's not the frequency of the RAM. Most RAMDACs are rated at 350-400MHz (to provide a clear, crisp 2D/3D picture on your screen).
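    To picture it, here's a toy model of that lookup-table-plus-DAC idea (my own sketch in C, purely illustrative; no real card works byte-for-byte like this): an 8-bit pixel index is looked up in a small color table, and each resulting R/G/B byte is converted to an analog voltage.

        /* Toy RAMDAC: the "RAM" is a 256-entry color lookup table,
           the "DAC" maps each byte to a 0.0-0.7V analog level
           (the usual VGA signal range). */
        #include <stdio.h>

        typedef struct { unsigned char r, g, b; } PaletteEntry;

        static PaletteEntry clut[256];          /* the SRAM color table */

        static double dac(unsigned char level)  /* one of the three DACs */
        {
            return 0.7 * level / 255.0;
        }

        int main(void)
        {
            unsigned char pixel = 42;           /* an incoming pixel index */

            clut[42].r = 255; clut[42].g = 128; clut[42].b = 0;
            printf("R=%.3fV G=%.3fV B=%.3fV\n",
                   dac(clut[pixel].r), dac(clut[pixel].g), dac(clut[pixel].b));
            return 0;
        }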
     
  8. Moderator emeritus

    Rower_CPU

    Joined:
    Oct 5, 2001
    Location:
    San Diego, CA
    #8
    Sorry, my bad...I should have double-checked first.

    But if video cards now come with DVI and ADC ports instead of VGA, what does the RAMDAC do? Changing "digital image data into analog signals" is unnecessary with digital outputs.
     
  9. macrumors member

    Joined:
    Feb 12, 2002
    Location:
    cologne, germany
    #9
    The only Apple machine to come with DVI _instead_ of VGA is the Ti, which features a DVI->VGA adapter. All the other models _do_ have VGA right by the ADC. And as I understand it, the analog signal is always present on DVI- and ADC-connectors.

    As long as CRTs are that much cheaper than TFTs, they will have to offer analog video output, or even the entry-level minitower would _require_ an Apple Display, making it unattractive to many potential customers.
     
  10. macrumors regular

    Joined:
    May 2, 2002
    Location:
    Eindhoven, the Netherlands
    #10
    Hmm, doesn't the RAMDAC have to do with screen size and refresh rates? For example, I've got a Voodoo Banshee with a 250MHz RAMDAC, and I've heard that's the reason I can display resolutions like 1024x768 at 120 or 160Hz refresh rates...
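    For what it's worth, the back-of-the-envelope math supports that (my own sketch; it assumes a rough ~32% blanking overhead, and the exact factor varies by timing standard):

        /* Rough pixel-clock requirement for a CRT mode:
           width x height x refresh x blanking overhead. */
        #include <stdio.h>

        int main(void)
        {
            double w = 1024.0, h = 768.0, hz = 160.0, overhead = 1.32;
            double need_mhz = w * h * hz * overhead / 1e6;

            printf("~%.0f MHz pixel clock needed\n", need_mhz); /* ~166 MHz */
            return 0;               /* well within a 250MHz RAMDAC */
        }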
     
  11. Moderator emeritus

    Rower_CPU

    Joined:
    Oct 5, 2001
    Location:
    San Diego, CA
    #11
    Wrong. The new TiBooks have DVI only.
     
  12. macrumors 6502a

    Joined:
    Nov 6, 2001
    Location:
    California
    #12
    So basically, you're saying that if I shoved one of these new cards into my Power Mac, I would never get its full potential, because I don't have AGP 8x. That is pretty damn mean... Maybe they should do an extension that plugs into two FireWire ports... that would be sweet.

    When are they going to do multi-core graphics cards? That would be sweet: four NV35s on one GeForce 4 or GeForce 5 Ti...
     
  13. macrumors 6502a

    Joined:
    Nov 6, 2001
    Location:
    California
    #13
    The question is, will OpenGL 2.0 support my graphics card?

    I just got my Quicksilver (July edition), and I'm screwed if I buy a new graphics card; it's as if Apple wants me to buy a new Mac to get AGP 8x and OpenGL 2.0 support. Soon we are going to see games that "need" OpenGL 2.0, and if it isn't available as an upgrade, we are all screwed!
     
  14. macrumors member

    Joined:
    Feb 12, 2002
    Location:
    cologne, germany
    #14
    Wrong. Read the f+cking tech specs!

    DVI output port
    VGA output support with included Apple DVI to VGA Adapter
     
  15. macrumors regular

    Joined:
    Aug 21, 2001
    #15
    No, the *real* question is: how long will you have to wait for a game that utilises OpenGL 2.0? If you're buying this card for its OpenGL support, you're wasting your money... best to wait for a game that uses all its bells and whistles, and then buy it for a fraction of the price.

    And no, Apple does not *want* you to buy a new box just for an AGP 8x card... but I'm sure they'd like to take your money, and who can blame them if you're daft enough to feel you *need* this card? Do some of you guys actually *think* before posting? Or even play any games? Sheesh! :)

    "Multi-core" graphics cards... remember something called a Voodoo5? Or twin Voodoo2s?
     
  16. macrumors member

    Joined:
    Feb 12, 2002
    Location:
    cologne, germany
    #16
    Well, so?
    When I was complaining about X running too slowly on my 500/66 iBook, everybody flamed me :mad:

    I don't like NVidia anyway. What about that rumored new ATI killer GPU that they want to use to keep NVidia at bay until 2004?
     
  17. macrumors regular

    Joined:
    May 7, 2002
    Location:
    Columbus, OH
    #17
    OpenGL 2.0

    Taken from 3dlabs website:

    Our goals for OpenGL 2.0 are to add support for pixel and fragment shaders, improve memory management and give applications more control over the rendering pipeline. In doing so, we still will provide compatibility with OpenGL 1.3 - so older applications will run on graphics accelerators with OpenGL 2.0 drivers.

    I really don't think that any new video cards, or old ones for that matter, will get left behind. I think what will happen is that OpenGL 2.0 will add features that applications on the Mac platform currently lack because of OpenGL 1.3. This will create much-needed standards in the graphics industry. The good thing about OpenGL 2.0 is that all the major players seem to be involved (including Microsoft).
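    For a taste of what those "pixel and fragment shaders" mean in practice, here's a minimal sketch of the kind of thing the OpenGL 2.0 shader API enables (hypothetical usage; the exact entry-point names and header are assumptions on my part, it assumes a GL 2.0 context already exists, and error checking is omitted):

        /* Compile and activate a trivial GLSL fragment shader that
           paints every pixel red, replacing the fixed-function path. */
        #include <OpenGL/gl.h>               /* <GL/gl.h> off the Mac */

        static const char *frag_src =
            "void main(void) {\n"
            "    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);\n"
            "}\n";

        GLuint load_red_shader(void)
        {
            GLuint shader  = glCreateShader(GL_FRAGMENT_SHADER);
            GLuint program = glCreateProgram();

            glShaderSource(shader, 1, &frag_src, NULL); /* hand source to driver */
            glCompileShader(shader);                    /* driver compiles for GPU */
            glAttachShader(program, shader);
            glLinkProgram(program);
            glUseProgram(program);  /* this shader now colors every fragment */
            return program;
        }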
     
  18. macrumors 6502a

    Joined:
    Nov 6, 2001
    Location:
    California
    #18
    I was the only one who didn't. They did, however, flame me over Quartz Extreme, and the issue made it to the front page!
     
  19. macrumors 68000

    mc68k

    Joined:
    Apr 16, 2002
    #19
    Yes, but it depends on the type of DVI connector. Analog is present in the ADC, but not used anymore AFAIK. The Apple Studio Display 17 CRT had an ADC connector, so there must be analog in there somewhere, but it is probably not utilized.

    As for DVI, the analog component of the connector is the part that is shaped like a cross. That is the part that can be broken out to create the same signal as VGA, but from DVI-I. DVI-D lacks this extra part and carries just the digital signal.
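    To keep the flavors straight, here's a quick summary as code (my own illustrative sketch; DVI-A, the analog-only variant, hasn't come up in this thread but exists in the spec):

        /* Which DVI flavors carry the analog (VGA-compatible) pins
           around the cross-shaped key. Illustrative summary only. */
        typedef enum { DVI_D, DVI_I, DVI_A } DviType;

        static int carries_analog(DviType t)
        {
            return t == DVI_I || t == DVI_A;   /* DVI-D is digital-only */
        }

        static int carries_digital(DviType t)
        {
            return t == DVI_D || t == DVI_I;   /* DVI-A is analog-only */
        }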
     
  20. macrumors 68000

    mc68k

    Joined:
    Apr 16, 2002
    #20
    DVI Pic

    I had problems with the PeeCee, hence this extra post from a Mac. Bleh.
     

    Attached Files:

    • dvi.jpg (48.5 KB)
  21. Moderator emeritus

    Rower_CPU

    Joined:
    Oct 5, 2001
    Location:
    San Diego, CA
    #21
    Sorry, an adapter does not make it analog.

    Oh, by the way, the GeForce 4 Ti has ADC and DVI ports, not VGA... sounds like someone else needs to go back and read some tech specs. :rolleyes:
     
  22. macrumors 6502a

    Joined:
    Oct 25, 2001
    #22
    Pretty sure it's running an analog signal in there...

    Hi Rower,

    I'm pretty sure that the DVI on the new TiBook is also pushing out an analog signal. Otherwise, you'd need a big converter with wires hanging off of it (not like the nice little in-line one that comes with the TiBook).

    It's pretty standard these days to run both digital and analog signals out the DVI port and to bundle a VGA adaptor. My ATI Radeon card came that way, too.
     
  23. Moderator emeritus

    Rower_CPU

    Joined:
    Oct 5, 2001
    Location:
    San Diego, CA
    #23
    Re: Pretty sure it's running an analog signal in there...

    Hey, welcome to the "demi-God(dess) club"!

    It seems that way, but since we are moving away from an analog signal, why should the RAMDAC play an important role?
     
  24. macrumors 68000

    mc68k

    Joined:
    Apr 16, 2002
    #24
    Re: Pretty sure it's running an analog signal in there...

    Yes, it is DVI-I; I'm with oldMac. Look at the picture above: all the adapter needs to do is break out the signal from the right-hand side of the connector to a DB-15.
     
  25. macrumors 68000

    mc68k

    Joined:
    Apr 16, 2002
    #25
    Re: Re: Pretty sure it's running an analog signal in there...

    Marketing. You know the whole MHz thing; the same applies to the RAMDAC. If you don't have a CRT, there's no need for a RAMDAC, and then it's just "we've got a fast RAMDAC, buy me!"

    But they do still need to push the RAMDAC because of the large installed base of CRTs. Until that goes away, it's still a needed spec.
     
