Clarifications About Apple's Choice of the GT 330M in the New MBPs

Discussion in 'MacBook Pro' started by BaronStein, Apr 20, 2010.

  1. BaronStein macrumors regular

    Joined:
    Feb 11, 2010
    Location:
    NY
    #1
    There are many topics and debates going on about Nvidia's GT 330M GPU in the new MBPs, and I wanted to share my thoughts on this new GPU.

    First of all, I don't think the 330M is a bad choice at all. There could be better options, but all together I think it is a good one. Nvidia is seemingly doing worse than current ATI GPUs; benchmarks and specs both point that way. However, I don't think benchmarks are everything. In day-to-day use, things change a lot. For example, most people think about the gaming performance of these cards, and benchmarks sometimes fail to demonstrate the actual performance. Also, good gaming performance doesn't mean a GPU is also good at rendering tasks.


    Secondly, many people are claiming that an ATI 5xxx GPU would have worked better. They are better, actually, in terms of performance. But they were announced in March, while the new MBP design was finalized in January/February. The April release was mostly because of Intel, I think, even though the design was finalized before March. So in this case, it was almost impossible for Apple to put these new GPUs in the new MBPs. We all know they are obsessed with testing, reliability, etc., so they couldn't just drop them in instantly. Therefore, for a product finalized in Jan/Feb, the alternative to the 330M was either a 4xxx series GPU or a better Nvidia one. The 4xxx series ATI GPUs are not very good, except for the ones from the 48xx series up, which are significantly different from the rest of the 4xxx line.

    At this point, a few more problems come up: 48xx GPUs are too high-end for Apple's standards, and they don't have automatic GPU switching. There is Optimus for Nvidia, but no good equivalent for ATI. I know Optimus was made for Windows, but it is easier for Apple to make something equivalent or better for OS X when there is a good working example on PCs. Also, the 4xxx series was too outdated compared to the Nvidia 3xx series, just looking at announcement dates.

    Moreover, Nvidia GPUs have been used in MBPs for many generations. So Apple simply wanted to stay with what they know rather than adopting a different architecture.

    In conclusion, I'm fine with Apple's choice, although I could have preferred something else. If the new MBPs had been released in February, I don't think there would be as many complaints as there are now. But Intel and the iPad changed the plans a little.
     
  2. therealseebs macrumors 65816


    Joined:
    Apr 14, 2010
    #2
    I have no information suggesting that the Optimus thing was particularly relevant; if anything, I'd think it came about because Apple developed the technology.

    I dunno; I'll see how it does. If it can handle some light gaming, I'll stop complaining.
     
  3. gordonyz macrumors member

    Joined:
    Nov 25, 2009
    #3
    They could have put in the GT 335M. :mad:
    The GT 335M is better; it has 50% more CUDA cores.
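    For reference, a quick back-of-the-envelope check, assuming the commonly quoted core counts (48 CUDA cores for the GT 330M, 72 for the GT 335M):
    Code:
    # Rough shader-count comparison, assuming the commonly quoted
    # figures: GT 330M = 48 CUDA cores, GT 335M = 72.
    gt330m_cores = 48
    gt335m_cores = 72

    extra = (gt335m_cores / gt330m_cores - 1) * 100
    print(f"GT 335M has {extra:.0f}% more CUDA cores")  # -> 50%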
     
  4. BaronStein thread starter macrumors regular

    Joined:
    Feb 11, 2010
    Location:
    NY
    #4
    This is the computer world; things depreciate fast. Yet people tend to complain about almost anything, so there is no end to what could have been put in. And there won't be.
     
  5. mikeo007 macrumors 65816

    Joined:
    Mar 18, 2010
    #5
    I see what you're trying to do with this thread, but unfortunately most of your logic is flawed. Benchmarks are what video cards are rated on; it's all they do. Rendering tasks and gaming are both measured by benchmarks, so they are the best way of measuring performance.

    ATI's GPUs are available to manufacturers before they are available to consumers; otherwise, how would they be released in new computers? Don't forget that the 330M is a re-branded card, so it is nothing new at all.

    Automatic switching is a moot point, since Apple created their own solution and didn't even use Optimus. If they wanted to do it with an ATI card, they could have.

    I can understand people defending Apple on some things, like the new 13" MBP, but in this case, anybody knowledgeable about computer technology would say that Apple cheaped out in the GPU department.
     
  6. therealseebs macrumors 65816


    Joined:
    Apr 14, 2010
    #6
    It's not that I blame them for making a business decision to go with a cheap part, or a part they already had the software for, or whatever. It's that I don't care for them pretending that this is a "killer" GPU when it's obviously not.
     
  7. DesmoPilot macrumors 65816

    Joined:
    Feb 18, 2008
    #7
    These two posts sum it up exactly.
     
  8. murdercitydevil macrumors 68000


    Joined:
    Feb 23, 2010
    Location:
    california
    #8
    All your points are totally spot on and I agree completely, but in regards to the above I must ask: "were we really expecting them to NOT cheap out in the GPU department?"
     
  9. Wolfpup macrumors 68030

    Joined:
    Sep 7, 2006
    #9
    Regarding doing the switching with ATi hardware: it's not possible. This is partially hardware-based. It's Optimus for OS X. An ATi GPU would require a manual switchover taking 10 seconds or whatever, like Nvidia's second-gen switchable graphics needed.

    That ATi currently has higher-end notebook hardware is a moot point too, as the GT 330M is low-end, and the equivalent ATi part used in its place would also be low-end. That would be a point if Apple were using a GeForce GTS 360 and you still wanted more; but given that Nvidia has much better stuff available, it doesn't matter.

    I'm personally not a fan of having ANY Intel graphics in a system, but...it doesn't surprise me at all Apple went this direction.
     
  10. therealseebs macrumors 65816


    Joined:
    Apr 14, 2010
    #10
    That's awesome that you found concrete evidence to support this, but sadly, you seem to have forgotten to link to it.

    Doesn't follow. They used the 330, rather than another part, because it fit their power budget. If they used an ATI card, it'd be a card in the same wattage range, which would be quite a bit faster at drawing graphics.
     
  11. mikeo007 macrumors 65816

    Joined:
    Mar 18, 2010
    #11
    This is totally wrong. ATI has had seamless switching for a while now; they've just only used it in their own solutions so far. The hardware capability is there.

    Also, ATI's offerings at the same price point and TDP as the 330M are better performers. It doesn't matter how you spin it: ATI would be the better solution. And as others have said, Apple is touting the 330M as a top-end GPU, which it isn't even close to.
     
  12. admiraldennis macrumors regular

    Joined:
    Aug 19, 2002
    Location:
    Boston, MA
  13. therealseebs macrumors 65816


    Joined:
    Apr 14, 2010
    #13
    You seem to be implying that Apple's solution, which they advertised and demonstrated long before Nvidia started doing Optimus, is secretly based on Optimus... but you haven't exactly supported this.

    Engineers from both companies have said that they're basically unrelated, so far as I know.
     
  14. admiraldennis macrumors regular

    Joined:
    Aug 19, 2002
    Location:
    Boston, MA
    #14
    Beat my ninja edit.

    They didn't advertise seamless switching before Optimus...

    I'm not saying Apple's system is Optimus at all, but if Nvidia added features to the graphics chips to partially facilitate Optimus, those features could probably be leveraged by Apple.

    I redacted it because this is indeed completely baseless speculation.

    edit:

    "Optimus is a completely new technology, one that would require next-generation Nvidia graphics cards, new motherboards, and Nvidia software to make the switch between graphics automatic and seamless. "
    src: http://www.pcmag.com/article2/0,2817,2358928,00.asp

    I'm sure Apple's software implementation doesn't share anything with Optimus, but it's hard to believe this is even possible without some form of hardware support.

    Apple's software *is* very different from Optimus: Optimus uses "profiles" to determine when to switch, while Apple's is on-demand with a use-based heuristic. But that doesn't mean there isn't some (even minor) hardware hook to facilitate this. See the sketch below.
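    Here is a toy sketch of the two policies; the app names, whitelist, and framework trigger set are my assumptions for illustration, not anything from either vendor's actual code:
    Code:
    # Toy sketch of profile-based vs. heuristic-based GPU switching.
    # Purely illustrative; not Apple's or Nvidia's real logic.

    DISCRETE_PROFILES = {"game.exe", "cad.exe"}  # hypothetical whitelist

    def optimus_style(app_name):
        """Profile-based: switch when the running app is on a known list."""
        return "discrete" if app_name in DISCRETE_PROFILES else "integrated"

    def apple_style(frameworks_in_use):
        """Use-based heuristic: switch when demanding APIs are requested."""
        demanding = {"OpenGL", "Core Animation", "OpenCL"}  # assumed triggers
        return "discrete" if frameworks_in_use & demanding else "integrated"

    print(optimus_style("game.exe"))   # -> discrete (on the list)
    print(optimus_style("editor.exe")) # -> integrated (not on the list)
    print(apple_style({"OpenGL"}))     # -> discrete (demanding API in use)
    print(apple_style({"AppKit"}))     # -> integrated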
     
  15. dusk007 macrumors 68040


    Joined:
    Dec 5, 2009
    #15
    You are right: all the reasonable people did not really expect Apple to put in a strong GPU. There was hope, and with the good-performing low-power ATI cards it was even reasonable, but I guess business relations are more important to Apple than getting the best tech.

    The whole GPU switching argument, though, is annoying. Some badly informed people, like the thread starter, act as if there was no GPU switching before Optimus and this Apple solution. Optimus has absolutely nothing to do with Apple's GPU switching solution. Also, the tech to switch between an Intel IGP and a dedicated Nvidia or ATI GPU has been there since Montevina. There was not that much of a solution Apple actually had to develop. All they did was change the drivers in OS X so that they can handle the switching on the fly and without a glitch. I bet all Apple really did was software work that would have worked just as well with an HD 5650 or anything else.
    The hardware side was already there; many manufacturers were just too lazy and cheap to put in all the demux chips and wiring to make it possible, because the chipset alone was never enough. The exception is Optimus, which is a lot cheaper and easier to implement, which is why many manufacturers will probably use it.

    There is actually one good thing about the 330M. It sucks in benchmarks, and even more in actual gameplay, but it is better than ATI at GPGPU work, including OpenCL. I don't believe this is the reason they stuck with Nvidia, though, as the iMacs have ATI too. More likely they want Fermi-based cards and don't want to break all ties with Nvidia, or they don't want to mix Nvidia and AMD in their MBP line; with the 320M they didn't have any choice.

    PS: The only extra feature Nvidia had in GT2xx cards and newer was a kind of copy engine that transfers completely rendered frames over PCIe to the Intel IGP, which forwards them to the display (without an impact on the performance of the rendering GPU; by driver means alone ATI could do the same, but they would suffer a performance hit). Since in Apple's solution the 330M has a direct output to the display, they don't need this at all; all those transistors are wasted. Hardware-wise, Apple uses the exact same solution as HP in the Envy or Dell in the XPS or other notebooks that can switch graphics.
    Really, there is enough information on the web about both technologies; there is no need to speculate so much nonsense. You really like to speculate in this forum.
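    To make the two data paths concrete, here is a toy model; everything in it is a simplified stand-in, not any vendor's real design:
    Code:
    # Toy model of the two switchable-graphics data paths.
    # A simplified stand-in, not any vendor's real architecture.

    class MuxSwitched:
        """Mux design (2010 MBP, HP Envy, Dell XPS style): both GPUs
        have a wired path to the panel; demux chips select one."""
        def display(self, frame, use_discrete):
            source = "dGPU direct output" if use_discrete else "IGP output"
            return "panel <- " + source + ": " + frame

    class OptimusStyle:
        """Copy-engine design: the dGPU renders, then copies the finished
        frame over PCIe into the IGP's framebuffer; the IGP, which always
        drives the panel, scans it out. No mux hardware is needed."""
        def display(self, frame):
            return "panel <- IGP output: " + frame + " (copied over PCIe)"

    print(MuxSwitched().display("frame 1", use_discrete=True))
    print(OptimusStyle().display("frame 2"))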
     
  16. admiraldennis macrumors regular

    Joined:
    Aug 19, 2002
    Location:
    Boston, MA
    #16
    How can you be "absolutely" sure? It *seems* unlikely: Apple never had seamless switching before Optimus, and Apple's first seamless switching comes right after Optimus and works on an Optimus-enabled chip...

    Don't forget: Apple doesn't "modify" or custom-print the graphics chips; they purchase them. They do use custom firmware, but again, it seems pretty damn unlikely that seamless switching is possible without some level of hardware support. And even if ATI and Nvidia both have this in some form, I don't know the hurdles involved in taking advantage of it.

    Obviously Apple's and Nvidia's OS-side software is very different; they don't even work similarly.

    How do you know what's involved with engineering something like this?

    This is very possible, and not mutually exclusive with "Nvidia's chips easily facilitating implementing this".

    That seems like a pretty baseless assertion.

    We're talking about seamless switching, not slow mux switching.
     
  17. cluthz macrumors 68040


    Joined:
    Jun 15, 2004
    Location:
    Norway
    #17
    I think Apple spoiled us on performance with the Nvidia 8600M GT.
    In mid-2007 the 8600M GT was a really good performer, and after that GPU performance didn't get any upgrade at all, until now.

    As far as I understand, the 330M is only a bit faster than the 8600M GT was, and much of the reason the 330M scores better is that it's tested on systems with twice the RAM, faster RAM, and much faster CPUs.

    If someone told you that there wouldn't be any faster macbook pros in terms of graphics power until 2013, what would you say?
     
  18. Wolfpup macrumors 68030

    Joined:
    Sep 7, 2006
    #18
    If for some reason you don't believe me, read up on the technology on Anandtech.

    No, it's not. ATi's tech is equivalent to Nvidia's second-gen tech. It's not seamless, which is apparently what Apple wanted.

    It has everything to do with it. It's hardware based (as well as software).

    Guys, read the relevant articles before claiming this. What Optimus does is NOT the same thing that's been available previously.

    Regarding the "Ati more powerful at given power level" comments, it's not likely there's any drastic difference one way or the other. They're both made on the same 40nm process by the same company.

    If this ISN'T relying on Optimus, I'd love to read more on it, but either way, you're not going to get a drastic performance difference out of two chips with the same transistor count on the same die process made by the same company.
     
  19. mikeo007 macrumors 65816

    Joined:
    Mar 18, 2010
    #19
    Are you serious? ATI has had dynamically switchable graphics for a while now. http://www.amd.com/us/products/technologies/switchable-graphics/Pages/switchable-graphics.aspx

    They have one tech that works with Intel integrated GPUs and another that works with their own chipsets. You can't physically yank out the GPU like Nvidia is so proud of, but it accomplishes the same thing: graphics switching that is invisible to the end user.

    It's the same thing: switch GPUs depending on the situation. ATI uses plugged-in-or-not to determine when to switch, but the conditions could easily be changed.
     
  20. Wolfpup macrumors 68030

    Joined:
    Sep 7, 2006
    #20
    No, it's not the same thing. Again, check out the Anandtech articles on Optimus. ATi's current solution is equivalent to Nvidia's last gen (2nd gen) solution. It can switch without rebooting, but it's not seamless as Optimus is.

    (Note: I don't WANT Optimus. If I had to have Intel graphics in a system at all, I would want to be able to disable it completely, not have the GPU I actually want just hanging off a PCIe connection. I would rather be able to switch over completely, like ATi's current and Nvidia's last-gen implementations. But in any event, that's not the same thing as what Nvidia is currently doing in their third-gen implementation.)
     
  21. mikeo007 macrumors 65816

    Joined:
    Mar 18, 2010
    #21
    You're arguing semantics. How it does it isn't as big a deal as what it does. Have you used ATI switchable graphics? It is seamless to the user; you don't have to log out like you did with Nvidia's last-gen stuff. There may be some delay on the back end, but it accomplishes everything Apple would need.
     
  22. Wolfpup macrumors 68030

    Joined:
    Sep 7, 2006
    #22
    That is not semantics. It is not the same thing. It is not seamless, nor instantaneous. As I keep saying, it's the equivalent of Nvidia's SECOND-gen switchable graphics, not their third.
     
  23. therealseebs macrumors 65816


    Joined:
    Apr 14, 2010
    #23
    I don't see anything showing that Apple's is based on nVidia's.

    You'd have to offer a precise definition of "seamless". Doesn't require reboot is good enough for me, though...

    Uh, "likely" doesn't enter into it. "Likely" is what we say with things that haven't had thousands of benchmarks run on them.

    In fact, yes, there's a drastic difference: comparable-wattage ATI mobile chips outperform the 330M by a huge margin.

    Sure you are. Architecture exists, and matters, and can make huge differences in real-world performance.
     
  24. mikeo007 macrumors 65816

    Joined:
    Mar 18, 2010
    #24
    Once again you completely ignore the substance of what I'm saying. In its raw state, ATI's tech is better than Nvidia's last-gen tech. It doesn't require a reboot or a log-out. It's literally one screen flicker, and even that only occurs on ATI/Intel setups; ATI's own solutions don't have the flicker at all. A tiny bit of engineering and even that would be gone. Apple could easily have implemented a solution, but they didn't, for the sake of saving a buck.
     
  25. Wolfpup macrumors 68030

    Joined:
    Sep 7, 2006
    #25
    What GPUs would you put in that category?

    Neither does Nvidia's last gen tech.

    You guys aren't listening: if you want to know how Optimus works, read the AnandTech articles on it. It is *NOT* possible with ATi's current hardware. Again, ATi's current hardware works like Nvidia's last-gen (second-gen) switchable graphics.
     
