Retina MacBook Pro 15 - With Haswell. Will it get rid of the discrete GPU entirely?

Discussion in 'MacBook Pro' started by luffytubby, Sep 19, 2012.

  1. luffytubby macrumors 6502a

    luffytubby

    Joined:
    Jan 22, 2008
    #1
At the end of this video review by Anandtech ( http://www.youtube.com/watch?v=94OenZ71ADY ), Shimpi asks: "Well, do you buy it?"

    He explains that the question is not if, but when. He mentions that in 2013 and 2014, Intel will release Haswell and Broadwell, both of which are said to dramatically increase integrated GPU (iGPU) performance.

    The HD 3000 in Sandy Bridge was doubled by the HD 4000 in Ivy Bridge, at least on paper. The consensus right now is that the HD 4000 was never made or intended to run at resolutions as high as the Retina display's. In other words, as Anandtech put it in their review:



    Remember the MacBook Pro 13? They took out the Nvidia 320M graphics and replaced it with the HD 3000, with no explanation either. From an engineering perspective it makes things easier.



    Now I am asking: when next year's update rolls around, is there any chance they might remove the 650M and decide not to have discrete graphics in the Retina MacBook Pro 15 line at all?


    Advantages:

    1) less heat
    2) cheaper price
    3) more battery



    The benefits are monumental, and given Apple's track record I am afraid they might do it. Haswell's iGPU might be twice as fast as the HD 4000 in Ivy Bridge, but I don't think that's enough. Not enough for video editing, and not for gaming. I am afraid that Apple might not care and just decide: "iGPU or go away".
     
  2. Cassadian macrumors regular

    Joined:
    Sep 4, 2012
    #2
    No. Haswell will greatly improve performance, yes, and allow Apple to improve battery life by extending the number of applications that can rely solely on the iGPU (at this moment even opening Skype activates the GT 650M, which I find ridiculous). Battery life will be extended because the computer won't have to rely on the power-hungry dGPU for as many processes. But getting rid of the dGPU is just a terrifying thought considering the amount of power you need for video editing. Sure, integrated graphics are making strides, but among educated consumers "iGPU" is still practically a taboo word.
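
    If you're curious which GPU is active at a given moment, here's a minimal sketch that scrapes system_profiler's text output. The parsing heuristic is my own assumption, not an official API; a tool like gfxCardStatus tracks switching properly.

    Code:
        import subprocess

        # On a dual-GPU MacBook Pro, system_profiler lists both adapters;
        # the one currently driving the panel gets a "Displays:" sub-block.
        out = subprocess.check_output(
            ["system_profiler", "SPDisplaysDataType"]).decode("utf-8")

        gpu = None
        for line in out.splitlines():
            text = line.strip()
            if text.startswith("Chipset Model:"):
                gpu = text.split(":", 1)[1].strip()
            elif text == "Displays:" and gpu:
                print("Currently driving a display: " + gpu)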
     
  3. theineffablebob macrumors regular

    Joined:
    Jul 29, 2012
    #3
    I hope not. Haswell will be fast, but the newest NVIDIA GPU out by then will be faster.

    The 650M in my rMBP is fast enough to run the latest games. I've been playing Borderlands 2 at high settings, FXAA on, 1080p, at around 40 FPS, which is pretty excellent.

    The GPU in Haswell is unlikely to surpass the performance of the 650M, so if the next MacBook got rid of discrete graphics, it'd be a step back in performance.

    Apple really focuses on the GPUs in their iPhone and iPad SoCs, so I'd be surprised if they regressed in that aspect in their line of notebooks.
     
  4. Stetrain macrumors 68040

    Joined:
    Feb 6, 2009
    #4
    I think Haswell might negate the need for a dedicated GPU in a 13" rMBP. It might also be one of the things that makes a Retina MacBook Air a reality in the future.

    I don't think they will sell a 15" Retina MBP without a dedicated GPU anytime soon. They did that once with the 2009 models to reach a lower entry price but quickly reverted to all 15" MacBook Pros having dedicated GPUs.
     
  5. stevelam macrumors 65816

    Joined:
    Nov 4, 2010
    #5
    Um, what do the iPhone and iPad have to do with MacBooks? Absolutely nothing, besides being made by the same company.
     
  6. luffytubby thread starter macrumors 6502a

    luffytubby

    Joined:
    Jan 22, 2008
    #6
    I did not know this! Thanks!
     
  7. Freyqq macrumors 68040

    Joined:
    Dec 13, 2004
    #7
    Even if an integrated GPU gets to be amazing, a dedicated GPU will always be more amazing. It's a simple argument if you look at die size: dedicated GPUs have way more transistors devoted to graphics than any integrated GPU. There is no way around that other than using dedicated VRAM and giving up a huge amount of the CPU die to GPU stuff. But there is always a tradeoff. There is no reason for Intel to build in an amazing GPU if most of their customer base has no need for that kind of power. Dedicated GPU vendors cater to a different audience. Apple may reduce the dependence on a dedicated GPU, but the highest-end products will use them. Who would buy a $2000 computer without a dedicated GPU anyway?
     
  8. Tankmaze macrumors 68000

    Tankmaze

    Joined:
    Mar 7, 2012
    #8
    I agree, Haswell and Broadwell would benefit space-constrained laptops such as the 13" Mac lineup.

    Getting rid of the discrete GPU really only makes sense for Apple if the integrated GPU can match or even exceed the performance of the available discrete GPUs at the time.
     
  9. theineffablebob macrumors regular

    Joined:
    Jul 29, 2012
    #9
    Well, Apple shares a lot of strategies between their various sectors. See how iOS features have been rolling into OS X. Hardware designs are shared, too, such as aluminum bodies and Retina displays.

    Every year Apple touts significantly improved graphics performance in their devices. How would it sound if they had to say that the graphics were weaker in their newest product, even if it did mean better battery life and a thinner design? Doesn't the rMBP need more graphics power to drive a smooth UI at that resolution, anyway?
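
    Quick pixel math on that last point (pixel counts only; the real GPU load depends on the workload):

    Code:
        standard = 1440 * 900   # pre-Retina 15" panel
        retina = 2880 * 1800    # Retina 15" panel
        gaming = 1920 * 1080    # common 1080p gaming resolution

        print("Retina pushes %.1fx the pixels of the old panel" % (float(retina) / standard))  # 4.0x
        print("and %.1fx the pixels of 1080p" % (float(retina) / gaming))                      # 2.5x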
     
  10. dusk007 macrumors 68040

    dusk007

    Joined:
    Dec 5, 2009
    #10
    What they might do is make the entry-level 15" an iGPU-only model, with the high end keeping a dGPU, instead of differentiating with those ridiculous VRAM games.

    In any case, battery life they could simply improve with a better switching mechanism that works like Optimus, with proxy drivers. That would be way easier and better in 99% of situations.

    In terms of speed, an iGPU always has the problem of bandwidth. They might put some extra GPU framebuffer cache on it, but depending on the engine there could be a serious bandwidth shortage the more EUs they throw in, and DDR4 is not on the horizon.
    If an extra-fast GPU cache were such a simple way to offload bandwidth demand, AMD and Nvidia would have done it already. I'd first like to see what it actually delivers.
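
    To put rough numbers on the bandwidth gap (theoretical peaks; the iGPU's figure is shared with the CPU, and the 650M number assumes the stock GDDR5 variant):

    Code:
        ddr3 = 2 * 8 * 1.6e9 / 1e9       # dual channel x 8 bytes x 1600 MT/s
        gddr5 = (128 / 8) * 4.0e9 / 1e9  # 128-bit bus x 4.0 GT/s effective

        print("Dual-channel DDR3-1600 (shared):  %.1f GB/s" % ddr3)   # 25.6
        print("GT 650M GDDR5 (all to the GPU):   %.1f GB/s" % gddr5)  # 64.0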

    They won't drop the dGPU. The Retina design can handle the heat, and for battery life they could do other things if they wanted. The price difference is negligible: 650Ms show up in $700 notebooks, so they don't cost all that much.
    On a $2,000 machine that really shouldn't be a deciding factor. Selling it at $1,800 instead wouldn't make up for how poor the specs would look.
    More likely they would make some 15" Air instead that sells on mobility.
     
  11. Archon macrumors member

    Joined:
    May 21, 2008
    #11
    Apple could drop the dGPU in favor of iGPU only. It's happened before.

    My opinion: they will if they can get close to 650M performance for video and photo editing. That's Apple's bread-and-butter pro target, not gamers, I'm afraid.
     
  12. kevink82 macrumors newbie

    Joined:
    Sep 26, 2010
    #12
    I don't think it will. If you look at the Haswell spec, the GT3 parts have 2x the performance of the current generation.

    GT3 is the top iGPU for Haswell, and 2x the performance puts it in the ballpark of a current GT 640M. But that is in theory, and it says "up to", not "at least", 2x the performance... The AVX2 instruction set looks promising though, since that sort of thing is part of what makes the Sandy Bridge series so fast at video encoding and decoding.
     
  13. terraphantm macrumors 68040

    Joined:
    Jun 27, 2009
    Location:
    Pennsylvania
    #13
    That's essentially what they did when they moved to the HD 3000 from the 320M.
     
  14. dusk007 macrumors 68040

    dusk007

    Joined:
    Dec 5, 2009
    #14
    Not really. The HD 3000 was about equal to the 320M: in some games one was faster, in others the other. At the same time, CPU speed made a two-generation jump.
    The OS X drivers for the Intel chip were also better than the Windows ones. On OS X the real-world speed of the HD 3000 was great, even if on Windows Nvidia's drivers still rule.
    It was about a 1:1 trade with a great payoff on the CPU side. In some new games on low settings the HD 3000 is actually better for gaming because of the accompanying CPU.

    Also, there was no alternative except a redesign, so it is not really comparable.
    They had a 25W CPU + a 12-15W GPU. Then they wanted the new 32nm 35W CPUs; without a redesign, adding a 15W GPU on top would have meant a slightly overheating notebook and some difficulty fitting all the chips on the logic board.
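
    Rough arithmetic with those approximate TDPs (exact figures varied by SKU):

    Code:
        cpu_2010, gpu_2010 = 25, 15  # Core 2 Duo + 320M chipset graphics
        cpu_2011 = 35                # 32nm Sandy Bridge, GPU on the die

        print("2010 budget:        ~%d W" % (cpu_2010 + gpu_2010))  # ~40 W
        print("2011 if dGPU kept:  ~%d W" % (cpu_2011 + 15))        # ~50 W, same chassis
        print("2011 as built:      ~%d W" % cpu_2011)               # iGPU inside the 35 W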
     
  15. cmChimera, Sep 20, 2012
    Last edited: Sep 20, 2012

    cmChimera macrumors 68040

    cmChimera

    Joined:
    Feb 12, 2010
    #15
    Haswell is really not that big of a deal. Broadwell is supposedly a major architecture change.
     
  16. terraphantm macrumors 68040

    Joined:
    Jun 27, 2009
    Location:
    Pennsylvania
    #16
    The CPU constituted a large upgrade, yes. But the integrated GPU was definitely weaker in most cases. Apple dropping the discrete GPU with Haswell/Broadwell would be a very similar move. Not saying they'll do it, but you can't cite iOS without considering what they've done to the actual Mac line.

    BTW, if they had really wanted to, Apple could've used one of the LV parts that go as low as 17 or 25W.
     
  17. RealEyes macrumors regular

    Joined:
    Jun 23, 2012
    #17
    OK, when does this come out? 2013? And Broadwell?
     
  18. Dark Void macrumors 68030

    Dark Void

    Joined:
    Jun 1, 2011
    Location:
    Cimmerian End
    #18
    I don't see the payoff in removing the dedicated GPU. It's a high-end 15" laptop; doing away with a dedicated GPU would be a step backwards.
     
  19. dusk007 macrumors 68040

    dusk007

    Joined:
    Dec 5, 2009
    #19
    In most, but not in all, and the difference was not all that significant. Mostly Nvidia was better off because of its drivers.

    The fastest Haswell GPU, with 40 EUs, still has to share memory bandwidth with the CPU. The HD 4000 is already half starving, which shows in how much faster memory gains you. 40 EUs instead of 16 on the same dual-channel DDR3 memory is going to have some trouble. It remains to be seen how well that actually scales and how much the sideport memory helps.
    They won't get anywhere near a 650M. Even if they double performance, it is still only half of a 650M.
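
    A toy Amdahl-style model of why more EUs on the same DDR3 scale poorly (the 50% bandwidth-bound share is a made-up illustration, not a measured figure):

    Code:
        # Scaling EUs only helps the compute-bound share of a frame;
        # the bandwidth-bound share is stuck on the same shared DDR3.
        def effective_speedup(eu_scale, bw_scale, bw_bound):
            compute = (1.0 - bw_bound) / eu_scale
            memory = bw_bound / bw_scale
            return 1.0 / (compute + memory)

        # 2.5x the EUs (40 vs 16), unchanged memory, half the frame bandwidth-bound:
        print("%.2fx" % effective_speedup(40 / 16.0, 1.0, 0.5))  # ~1.43x, nowhere near 2.5x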

    The step down would be far greater than between the 320M and the HD 3000, and there is nothing to make up for it. If they wanted maximum mobility they could just make a 15" Air, but that wouldn't have quad cores.

    They could have used LV parts, that is true, but I honestly don't get the GPU obsession anyway. My Intel Arrandale graphics, a fraction of an HD 3000, are fine for anything I ever do in OS X. I only need the 330M for games in Windows and when I attach an external monitor, but that is a design fault.

    Broadwell is a tick on Intel's plan: not many changes, just 14nm. Haswell is a tock.
    I don't know. From a speed perspective probably neither is going to be special, but Haswell is supposed to greatly reduce platform power. Most of the idle power draw today is not the CPU but the chipset and so on. If Haswell cuts down there, it will mean much better battery life.
    Especially for Ultrabooks, with the SoC design, this will mean a big change. Broadwell will apparently bring the SoC design to the faster chips too. That won't change much in speed by itself, just power consumption, cost, and the size of the logic board.
     
  20. bogatyr macrumors 65816

    Joined:
    Mar 13, 2012
    #21
    The 320M was an iGPU, not a dGPU; it was part of the Nvidia chipset. Intel forced Nvidia out of iGPUs by building their own iGPU into the CPU. An iGPU is not in the same ballpark as a dGPU.
     
  21. voyagerd macrumors 65816

    voyagerd

    Joined:
    Jun 30, 2002
    Location:
    Rancho Cordova, CA
    #22
    Ivy Bridge was also a tick, but there were still major changes in graphics.
     
  22. DTKblaster macrumors member

    Joined:
    Aug 3, 2012
    #23
    Yeah, and it's expected to be another "tick+", again with a huge jump in GPU performance.
     
  23. EnderTW macrumors 6502

    Joined:
    Jun 30, 2007
    #24
    Agreed, I don't think they can call this a "pro" machine without a dedicated GPU. It'll probably get whatever the fastest GPU out there is.
     
