Dream: 13" retina quad core

Discussion in 'MacBook Pro' started by Sifinity, Aug 3, 2014.

  1. Sifinity macrumors 6502

    Joined:
    Jun 11, 2014
    Location:
    Texas
    #1
    I just bought a 15" retina and I'm coming from a 13" cMB. All I have to say is: damn, the screen and the size are big, lol. I wish there was a 13" retina quad core; it would be as powerful as a 15" yet portable and light.
     
  2. anthorumor macrumors 6502a

    Joined:
    Jun 16, 2009
    Location:
    Sydney, Australia
    #2
    Keep dreaming, and enjoy your recent purchase already!
     
  3. Dilster3k macrumors 6502a

    Joined:
    Jul 20, 2014
    Location:
    Frankfurt, Germany
    #3
    I think packing a quad core processor with a retina display into a 13" Air chassis would be a true killer. It's probably years off... But I still prefer 15" over 13" for the extra screen space, even if it compromises portability.
     
  4. Sifinity thread starter macrumors 6502

    Joined:
    Jun 11, 2014
    Location:
    Texas
  5. leman macrumors G3

    Joined:
    Oct 14, 2008
    #5
    Well, that will take some time. It was rumoured that quad core would become standard with Intel's Skylake, but the latest leaks show that the suitable CPUs will still be dual-core. So it will probably take another 3-4 years for quad core configurations to hit the low-power segment. I am still skeptical about their usefulness there, to be honest. With a low TDP ceiling, a higher-clocked dual-core CPU might be just as efficient as a severely underclocked quad-core one.
     
  6. duervo macrumors 68020

    Joined:
    Feb 5, 2011
    #6
    Lack of a quad-core CPU in the 13" size is the only reason I went with a 15". (I run a lot of VMs, with nested virtualization ... dual-core just wasn't cutting it for me.)
     
  7. yjchua95 macrumors 604

    Joined:
    Apr 23, 2011
    Location:
    GVA, KUL, MEL (current), ZQN
    #7
    +1. I run at least 3 VMs simultaneously, and I have to assign at least 2 processor cores to each, because VMware treats one thread as one core; so to assign a full physical core to a VM, I have to assign at least 2 threads.
     
  8. leman macrumors G3

    Joined:
    Oct 14, 2008
    #8
    What do you mean by 'thread'? A virtual HT-core?
     
  9. SarcasticJoe macrumors 6502a

    Joined:
    Nov 5, 2013
    Location:
    Finland
    #9
    Considering the quad core chips they put into the 15" machines have a TDP of 47W while the dual core chips they put into the 13" machines are 28W, I don't think that would be a good idea.

    First of all, the battery life would be crap: a "hungrier" chip like that is going to draw more power even when it's not doing much, and the unibody machines already fit as big a battery as the frame allows. Second of all, it would probably run so hot that you'd be about as comfortable resting your hands on the palm rests as on an iron (one of those things you use to get wrinkles out of your laundry).

    The reason the Air gets such good battery life is that it runs a chip with an even lower TDP of 15W. Why doesn't Apple then use a quad core chip with the same TDP as the one currently in the 13" Pro? Simply because Intel doesn't make any. The lowest-TDP quad cores Intel makes are 37W parts with integrated graphics slower than what's in the MacBook Air.
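    The battery argument above can be sketched with some back-of-the-envelope numbers. The wattage and capacity figures below are illustrative assumptions, not Apple's specs: TDP is a sustained-load ceiling, and real average draw during light use is far lower.

    ```python
    # Rough battery-life sketch: runtime is roughly battery capacity divided
    # by average system draw. All figures here are assumed for illustration.
    def battery_hours(battery_wh, avg_draw_w):
        """Estimated runtime in hours: capacity (Wh) over average draw (W)."""
        return battery_wh / avg_draw_w

    # Assumed light-use averages (display plus near-idle CPU), not measurements.
    configs = {
        '13in dual-core (28W TDP)': (71.8, 8.0),
        '15in quad-core (47W TDP)': (95.0, 11.0),
    }
    for name, (wh, draw) in configs.items():
        print(f"{name}: ~{battery_hours(wh, draw):.1f} h")
    ```

    The point being: a hungrier chip shifts the denominator, and no realistic battery increase in the same chassis fully compensates.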

    Don't be fooled by the Razer Blade 14, because it's pretty close to the same size as the 15" due to the bigger bezels around the screen.
     
  10. maflynn Moderator

    Staff Member

    Joined:
    May 3, 2009
    Location:
    Boston
    #10
    I don't think it will happen; Apple positioned the 15" model as the more powerful alternative to the 13".
     
  11. yjchua95 macrumors 604

    Joined:
    Apr 23, 2011
    Location:
    GVA, KUL, MEL (current), ZQN
    #11
    Yes, that's a thread.
     
  12. leman, Aug 4, 2014
    Last edited: Aug 4, 2014

    leman macrumors G3

    Joined:
    Oct 14, 2008
    #12
    As someone with a CS diploma, I feel hurt inside when terminology is butchered in that fashion, but sure :p

    A side note: when you assign two virtual cores to your VM, it's not like you are dedicating a 'full' physical core. You aren't actually dedicating anything at all. You are simply enabling two VM-internal threads to be executed at the same time. Whether they are actually executed on the same physical core or on different cores is purely a matter of chance and of what is going on in your host OS. Edit: in OS X, there is an API to schedule a group of threads for execution on the same core, but I have no idea whether VM hypervisors use that API.

    At any rate, I would recommend allocating 2 cores for any VM, simply because modern operating systems expect multi-core CPUs and can operate more efficiently when they get them. In general, there is also not much sense in giving it more than 2 cores, unless it's a server that has to process a large number of requests (and is also able to do that in a non-blocking fashion).
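    The "you aren't dedicating anything" point is easy to see as an oversubscription ratio: total assigned vCPUs versus physical cores tells you how many guest threads can be runnable at once relative to what the host can actually execute. A minimal sketch; the VM names and counts are made up for illustration:

    ```python
    # vCPUs are scheduling hints, not reservations. If the total assigned
    # vCPUs exceed the host's physical cores, guests simply time-share them.
    def oversubscription(vms, physical_cores):
        """Ratio of total assigned vCPUs to the host's physical core count."""
        total_vcpus = sum(vms.values())
        return total_vcpus / physical_cores

    vms = {'win-test': 2, 'linux-build': 2, 'osx-ci': 2}   # vCPUs per VM
    ratio = oversubscription(vms, physical_cores=2)        # dual-core host
    print(f"oversubscribed {ratio:.1f}x")  # prints "oversubscribed 3.0x"
    ```

    A ratio above 1.0 isn't wrong per se, but it's why three 2-vCPU VMs feel cramped on a dual-core host.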
     
  13. dusk007 macrumors 68040

    Joined:
    Dec 5, 2009
    #13
    Intel says that 14nm delivers about 30% more efficient chips. Now somebody should try and see what 37W - 30% is. ;)
    A 28W quad core is very much possible. Considering that quad core SoCs are coming, it is far from unlikely that there will be a quad core 13" retina option. It just depends on whether Apple wants it. They never showed any interest in the 37W chips either. I guess it is always that fear that a quad core 13" might cannibalize 15" sales too much.
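    The arithmetic teased above works out like this. A quick sketch; the 30% figure is Intel's marketing claim, not a measured result:

    ```python
    # If 14nm really cuts power ~30%, a 37W quad-core envelope shrinks to
    # roughly the 28W class already shipping in the 13" rMBP.
    tdp_22nm = 37.0          # lowest 22nm quad-core TDP mentioned above
    efficiency_gain = 0.30   # claimed 14nm improvement
    tdp_14nm = tdp_22nm * (1 - efficiency_gain)
    print(f"{tdp_14nm:.1f} W")  # prints "25.9 W" - under the 28W envelope
    ```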

    With today's power gating, a quad core would not consume any more power unless it is used, and then it depends on the TDP and what the power management engine allows it to clock at. Today the cores themselves consume virtually nothing when shut off. The only reason the current quads draw more today is that they are still a 2-chip design. Once they go SoC, they can power down to the same levels the current 15W and 28W chips can.

    Actually, the HD 4600 beats the HD 5000 of the Air. The naming scheme is deceiving. The only reason Intel doesn't offer a 37W part with a bigger GPU is that nobody is asking for it. They intend those current chips for dual-GPU notebooks (where anything bigger than a HD 4600 would be a waste of money), and Apple obviously didn't show any interest in a special 37W quad for them. In theory, by manually setting the TDP lower on the 47W chips, one could even get one. It is just an expensive solution, and Apple preferred to go down to 28W rather than upgrade performance with the old chassis and the two fans.
     
  14. SarcasticJoe macrumors 6502a

    Joined:
    Nov 5, 2013
    Location:
    Finland
    #14
    You can always speculate about how products on the horizon are going to change things, but we won't be seeing any Broadwell chips until probably well into next year.

    Sure, it's possible if they cut down on the clock speeds, voltage and also performance, but there wouldn't be much point to a much bigger piece of silicon that performs the same as, and consumes as much power as, a smaller piece of silicon.

    If Intel's 47W quad cores ACTUALLY consumed as little power under low stress as the 28W dual cores, the Pro would be miles ahead in terms of battery life. The reality is that it's not miles ahead; it's pretty close, with the 15" machines having much bigger batteries.

    Sure, the HD 4600 does beat the HD 5000 in some tests, but it's still a modified HD 4000, while the HD 5000 is a new GPU, and both still fall well behind the HD 5100 (a.k.a. Iris) in the current Pro.

    Are you trolling me or just stupid? If they put in a bigger GPU it would become a 47W chip, it's that simple. The TDP figure also includes the GPU, not just the CPU, so complaining about why they don't have lower-wattage quad cores with good GPUs is like complaining about why there aren't any pickups with mileage as good as a Prius...
     
  15. Sifinity thread starter macrumors 6502

    Joined:
    Jun 11, 2014
    Location:
    Texas
    #15
    I'd say around Cannonlake we may get some 13" quad cores, as the chips will run cooler and require less power yet perform more efficiently.
     
  16. paolo- macrumors 6502a

    Joined:
    Aug 24, 2008
    #16
    I had the same feeling when going from 13" to 15"; a few weeks passed and I didn't mind the weight/size. Now I feel like I'm looking at a computer through a peephole when using a 13".
     
  17. dusk007 macrumors 68040

    dusk007

    Joined:
    Dec 5, 2009
    #17
    @ paolo- I always had 15" notebooks and never could quite fathom getting by with these tiny 13" ones. I guess you can get used to a lot of things, but I never wanted to get used to small 13" displays. That is tablet size for me.

    It would still perform better. You can always get some extra efficiency by running more cores at a lower speed.
    You do know that the quad cores get an extra PCH chip, while the 28W dual cores are a single chip with lower platform power? It is not the cores that are the problem; it is what was once called the uncore, plus the extra chip and the data connections between those chips. The cores themselves go down to 0.31W when they aren't used on the 47W quad core. They virtually don't matter.
    Intel said in some slides that Broadwell will bring the same one-chip solution for quad cores as well (at least some of them, most likely the HQ models).
    It beats it in all tests that matter. The HD 4600 runs at max clock on the quad core whenever it is needed. The 40-EU HD 5000 needs almost 25W for the GPU alone at max clock. That isn't even in the TDP budget for more than a few seconds. Unless the CPU has nothing to do, the HD 5000 might as well be a slightly more efficient HD 4400. It is just an example that a bigger chip can eke out some extra efficiency, but in that case I think it was never worth it. There is a reason almost nobody aside from Apple uses those chips. They aren't worth the money, and Apple probably asked for them, so they had to take them.
    Calling me stupid, LOL. Did I hit a nerve or something? All the Intel chips since Ivy Bridge have an adjustable TDP that isn't set in stone. There is a part of the chip, quite big actually, that does nothing but manage power distribution and shut off small portions of the chip, and it does so based on the TDP rating it is programmed for. That setting can be changed up and down. You can set a 47W chip (the big one with eDRAM and the big GPU) to run at a 30% lower TDP, and the power control unit will determine the clock rates it can deliver to each core and to the EU array of the GPU. That has been possible since Ivy Bridge, and Apple could just take any 4770HQ and run it at a TDP that the old 13" with the two fans could handle. 28W is probably not in the cards until 14nm, but 37W was absolutely possible. The clocks wouldn't even be that much lower. Efficiency scaling of clock rates isn't linear: lowering clock rates 20% can deliver 30% lower power or more.
    So you'd get a 1.8GHz quad core at 37W with an HD 5200.
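    The non-linear clock/power claim above follows from the usual dynamic-power relation, roughly C·V²·f: since voltage tracks frequency over part of the operating range, power falls much faster than clock speed. A toy model; it ignores static leakage and voltage floors, so treat the numbers as illustrative only:

    ```python
    # Crude dynamic-power model: P ~ f * V^2, with V assumed to track f
    # linearly. Real chips hit voltage floors, so actual savings are smaller.
    def relative_power(clock_scale, voltage_exponent=2.0):
        """Power relative to baseline when clock and voltage scale together."""
        voltage_scale = clock_scale  # simplifying assumption: V tracks f
        return clock_scale * voltage_scale ** voltage_exponent

    drop = 1 - relative_power(0.80)    # 20% lower clocks
    print(f"~{drop:.0%} lower power")  # prints "~49% lower power"
    ```

    Even with the caveats, the direction of the argument holds: a modest clock cut buys a disproportionate power reduction.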

    You sound like some Apple apologist with only a superficial understanding of the tech you talk about.
     
  18. SarcasticJoe macrumors 6502a

    Joined:
    Nov 5, 2013
    Location:
    Finland
    #18
    No it wouldn't... Just look at how many of Intel's dual core chips beat AMD's quad cores at the same or lower TDP figures. All it would do is offer more physical threads at the cost of real-life performance. The only way to get more power out of more cores is to ratchet up the power consumption at the same time, and even then you're looking at lower per-thread performance.

    PCHs don't consume anywhere near as much power as you might think, and with workloads as heavily multi-threaded as today's, you don't just turn cores off completely except during idle (not to be confused with sleep or being turned off). Sure, OS X does since 10.9 take "micronaps" through clever scheduling, but dual core chips also get to take "naps" the same way as quad core chips.


    Yes, it beats the GPU of the Air in most tests, but that still leaves it well behind the HD 5100 (which should really be the focus here), not to mention the HD 5200 (a.k.a. Iris Pro).

    Yes, it runs at max clocks when needed, like all modern GPUs; however, the point of the HD 5000 is that it consumes less during low workloads and is intended for machines that are taxed with low workloads most of the time. This is why the Air has such good battery life. Battery life tests are low-workload tests.


    Setting clocks, voltages and functional units on and off dynamically goes back much further than Ivy Bridge. The basic tech has gone under many names over the years: Intel introduced it as "SpeedStep" back in the Pentium III days (1999), while AMD introduced it as "Cool'n'Quiet" with the original Athlon 64 (2003).

    Also, taking a whole 10W off a 47W chip will lead to a much bigger clock drop under stress than just down to 1.8GHz. You may be able to run it at that frequency under low stress, but it will seriously hamper performance under heavy stress. It would basically be a quad core MacBook Air where you can brag about the number of cores, but it'll basically crumble to its knees under stress.

    TDP figures describe what the chip will draw when going all out; under moderate or low stress, chips generally don't consume even half of that. Actual power draw does not scale linearly with clock frequency; it scales MUCH worse than that, even when you also adjust the voltage at the same time.

    Yes, one that pretty much crumbles to its knees and downclocks itself to oblivion when you put it under stress... And even when it doesn't have to downclock itself to oblivion, it'll consume more power than an equivalent dual core.

    The only way to describe a machine like that is pointless, when a dual core consumes less power under low stress and probably even performs better whenever the quad has to downclock itself into oblivion under stress.

    ... and you come off as an Intel fanboy by droning on about what Intel is promising in chips that won't become available until well into next year, moving the goalposts by arguing about the HD 4600 vs the HD 5000 (rather than the HD 5100), ignoring any higher stress, and confusing idle with low computational load.

    Another way of putting it is that you're completely fixated on the number of cores rather than actual performance and power consumption. This kind of fixation on pure specs is the same kind of idiocy you get from the smartphone fanboys who like to go on about how the current iPhone is a low- or mid-range device because it's only got a dual core CPU, when in reality it can easily hold its own against the high-end Snapdragon 800 devices.
     
  19. Steve121178 macrumors 601

    Joined:
    Apr 13, 2010
    Location:
    Bedfordshire, UK
    #19
    No, most vendors will just shrink their notebooks in size. Most vendors don't give a toss about users' requirements. They think it's all about size & weight.

    However, look out for wireless charging and some other cool features starting with the Skylake architecture.
     
