Intel Launches Haswell Processors Ahead of WWDC Mac Updates


Actually Apple has two choices:

  1. Release a 45W 13" rMBP, because the cooling seems to be adequate (they use the same mechanism as the 15" version, which has to dissipate 45W from the CPU + 60W from the GPU if both are fully loaded)
  2. They can use the new cTDPdown stepping Intel introduced; with this, the CPU + Iris 5200 will be slightly downclocked but will fit into a 35W TDP envelope

Personally I believe they will do the latter, but I'm hoping for the former. If they do, I'll sell my current 13" rMBP and get the new one, because this would be a HUGE upgrade (quad core + a halfway decent GPU).

AnandTech even speculated that Apple would drop the discrete GPU in the 15" version and just use the cTDPup stepping (55W TDP envelope), thereby boosting the Iris Pro to roughly 650M level. While this would of course increase battery life substantially, it would be a step backwards for the graphics card, so I don't agree with Anand's assessment here, but who knows.

Also keep in mind how expensive those BGA Haswell chips with the Iris 5200 are: Intel wants $650 for them. That will most certainly mean the 13" rMBP goes back to the price it had before the (unusually massive) price cut earlier this year.

As for the desktop versions: it is very clear that Haswell was designed with mobile devices in mind. In fact, the desktop chips seem to be just "overclocked" versions of the mobile chips. Overclocking headroom on them (even the K series) is very limited, while the general improvements are in some cases nonexistent. I will stick with the overclocked Core i5 Ivy Bridge in my gaming rig for quite a while.
 
What sucks about having had to wait to get a Mac mini with USB 3 is that only six months later, the GPU is completely out of date even by Intel's standards, let alone standalone GPUs, and there is no way to just replace the GPU in it. But just because a better integrated GPU is now becoming available, Apple might take another year to update the Mac mini, so I can't just replace the machine either. The GPU is the weakest part of the mini, after all, so it's a bit of an irritating situation. Yes, other Macs face similar problems, but they get updated more often (well, the MBP does, at least).
 
Wrong!

Haswell (v3) is the generation after Ivy Bridge (v2, and Ivy Bridge Xeons have been available for a while). In fact, if you click on the Intel announcement, it actually mentions the Xeon E3-1200 v3 as part of today's introduction.

The E5 needed for dual-CPU Mac Pros is still a couple of months off, but next week Apple could announce a new Mac Pro with the E5/dual versions shipping later and the E3 quads shipping immediately.

The one downside to the new generation of chips is that so far there is no version of the single-socket Xeon with more than four cores. Are those expected later? Or will people have to buy the dual-socket versions even for a single-socket six-core (or more) machine?

The next-gen Mac Pro will use the Ivy Bridge Xeon E5, which is not yet released.
[Image: Intel Ivy Bridge EP launch roadmap]
 
Not arguing for or against the initial point (ARM-based Macs), but there are a few things you've missed in your response.

:rolleyes:

Moving to ARM would be a huge disaster in many ways. Here are a few:

1. ARM is a great power-sipping CPU but lacks the architecture to compete with x86 on performance. Having used both, there is just no comparison. Anything that is CPU-demanding will crush ARM. It just wasn't designed to do that kind of work. It is a lightweight, portable CPU.

I'm not denigrating ARM, but it is what it is. My lawn mower's engine is awesome, but I'm not going to put it in my car.

No doubt that a lawnmower engine isn't suitable for a car. But there are motorcycles which do just fine using engines in the same class as a lawnmower's. Not every PC has to be a 'car', just as not every PC has to be a 'truck', 'semi', or 'ore hauler'. Quite a few users would notice very little difference between an ARM-based Mac and a top-of-the-line Intel-based Mac, as demonstrated by the number of people who get by just fine with only an ARM-based tablet. My wife, for example, technically 'has' a laptop, but it got unplugged for about a week, and the battery ran down *unnoticed* because she uses the iPad for just about everything.

2. An ARM CPU would break all current OS X applications. Sure, you could do a Rosetta deal but talk about SLOW. A slow CPU doing instruction set translations on the fly. Yeah, that will be fast.

Yep. An ARM CPU won't run X86-* software. However, software written for OS X doesn't *have* to be for X86. It can also be for PowerPC. It would take very little effort for Apple to enable compiling OS X apps for ARM, and *most* software would compile to a new Universal binary without any significant effort, because *most* software doesn't bother with processor-specific code these days.
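To give a feel for how little the source itself would have to change, here's a minimal sketch in C (the file name is made up; the multiple `-arch` flags and `lipo` are the standard Mac fat-binary tooling, while an ARM `-arch` target in the Mac toolchain is the hypothetical part):

```c
/* hello.c - architecture-agnostic C source; nothing here is CPU-specific,
 * so the same code compiles unchanged for any target the toolchain supports. */
#include <stdio.h>

int main(void) {
    /* Compile-time macros are the only place the architecture shows up. */
#if defined(__x86_64__)
    printf("Running on x86-64\n");
#elif defined(__arm__) || defined(__aarch64__)
    printf("Running on ARM\n");
#else
    printf("Running on some other architecture\n");
#endif
    return 0;
}

/* Building a Universal (fat) binary is a matter of compiler flags, e.g.:
 *
 *   clang -arch x86_64 -arch i386 hello.c -o hello   # today's Intel slices
 *   lipo -info hello                                  # lists the slices
 *
 * Adding an ARM slice would just be one more -arch flag, assuming Apple
 * ever ships an ARM target for OS X; the source itself doesn't change. */
```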

Yes, going the Rosetta route would be an obviously bad idea. But Apple tends to spot and avoid those obviously bad ideas (especially when it comes to software solutions), so I don't expect we'd ever see an ARM->Intel 'Rosetta' implementation.

3. Whether you like it or not, it is a Windows world. Windows runs on x86. When Apple moved to Intel, many people (including me) jumped on the bandwagon because I can still run Windows apps via VM or Boot Camp. Moving to ARM would kill that. The number of people who would jump ship would really hurt Apple's bottom line.

Whether it's a Windows world or not has remarkably little impact on Apple's bottom line in the PC space. Only a small portion of Apple's PC sales go to people who use Windows. Those people would (presumably) be smart enough to buy an X86-based Mac, rather than an ARM-based one.

4. Intel knows that if they can get an x86 CPU to sip power like an ARM, they will rule the market (price being equal). An x86 tablet that can run Windows apps and last as long as an iPad would dominate the marketplace.

Completely true, but it's all predicated on that really important "if" in the opening sentence, and Intel is having issues there.

5. Intel is smart, and they will have a power-sipping CPU well before ARM can close the performance gap. Intel has the technology, the people, and the experience.

That goes contrary to current evidence. So far ARM has been closing the performance gap much more quickly than the power gap has been closing, and there's a distinct lack of evidence that points toward that changing any time soon.

Also, you can replace 'ARM' with 'Apple CPU' and it all stays the same.

Yep, because Apple's CPU designs are ARM designs.



This next part is entirely off point, but I'll address it anyway.

People think Microsoft is stupid because they didn't do an iPad before Apple. It is much more complicated than that. If you do x86 you get a wealth of applications that run from the start, but you lose on battery life.

If you do ARM you lose the wealth of applications but gain battery life.

So you have the Surface RT (ARM), which is a failure, and the Surface Pro (x86), which is somewhat of a failure.

People think Microsoft is stupid because there's quite a bit of evidence to support that opinion. I'll demonstrate that, using the Surface RT/Pro as an example. Suffice it to say that the CPU difference has been the *least* significant part of the Surface's failure so far.


Microsoft announces the Surface as a tablet finally done right, implying that the best-selling tablet of *all time*, the iPad, is in some way deficient.
Microsoft announces that the Surface RT will be available soon, followed shortly by the Surface Pro, which is the 'real thing'. They bungle the releases of both pretty badly, and produce a tablet which is more expensive than comparable iPads (the RT), and a full-fledged laptop which is awkward to use (the Pro). Their own stores don't do a good job of explaining the differences between the two, so *many* RT units get returned because they won't run people's software.
 
I'm not talking about costs, I'm talking about processing power and energy efficiency.
If I understood your post correctly, you were saying that for ARM to design a chip that had the equivalent processing power of an Intel chip, it would lose its edge in energy efficiency. I would argue that 1) we won't know until they do and 2) even if what you're saying turns out to be true (that at the equivalent processing power, ARM=Intel in energy efficiency), using ARM chips still has a cost advantage over Intel.

My point was why ARM chips might be a desirable route for Apple to go in *some* of its computers, if and when the day arrives that ARM (or Apple) designs a chip that is sufficient for MacBook use. Most people have no use for the processing power of Intel's chips. That is borne out by the projection that tablet sales will surpass notebook sales this year, and all PCs combined by 2015.

It stands to reason then that Apple might be thinking of building a notebook computer with the efficiency of a tablet. How they do that is up for debate, but it seems to me a foregone conclusion that if they can, they will do it. Intel sees this of course, which is why they're so hard at work to get into the mobile market. But they are at a cost disadvantage, and considering their x86 margins, I don't see how they'll ever be more appealing than ARM from the point of view of cost.
 
Actually Apple has two choices:

  1. Release a 45W 13" rMBP, because the cooling seems to be adequate (they use the same mechanism as the 15" version, which has to dissipate 45W from the CPU + 60W from the GPU if both are fully loaded)
  2. They can use the new cTDPdown stepping Intel introduced; with this, the CPU + Iris 5200 will be slightly downclocked but will fit into a 35W TDP envelope

Personally I believe they will do the latter, but I'm hoping for the former. If they do, I'll sell my current 13" rMBP and get the new one, because this would be a HUGE upgrade (quad core + a halfway decent GPU).

I really hope you are right - I'd love to see the 5200 (or at least the 5100) in the 13" rMBP. I doubt they'll go to 45W because of battery limitations, but Intel did release a quad core 35W chip (with 4600 graphics).

I wish I could say that Apple would surely not settle for 4600 graphics in its near-top-of-the-line MacBook but I'm not completely crazy. I'm still going to hope for a custom chip that has a 35W quad-core CPU and 5100/5200 graphics though.

Edit: And also an option for 16GB RAM please.
 
What sucks about having had to wait to get a Mac mini with USB 3 is that only six months later, the GPU is completely out of date even by Intel's standards, let alone standalone GPUs, and there is no way to just replace the GPU in it. But just because a better integrated GPU is now becoming available, Apple might take another year to update the Mac mini, so I can't just replace the machine either. The GPU is the weakest part of the mini, after all, so it's a bit of an irritating situation. Yes, other Macs face similar problems, but they get updated more often (well, the MBP does, at least).

Huh? The Mac mini had USB 3 support with the last refresh. The Mac mini (just like every other Mac) has received annual updates and will continue to. Apple keeps its update schedule in sync with Intel, not with Nvidia/AMD.

Also, the Mac mini is just that: a small-form-factor PC. If you want a gaming rig, I suggest you build your own, but it will still be bigger than the Mac mini and especially louder. There is only so much one can do in such a little enclosure. The updated integrated graphics in Haswell should be powerful enough to play games at low resolutions and medium settings, so they should suit 99% of potential Mac mini customers just fine.
If you want better graphics, either wait for Broadwell next year (which will have a new graphics uArch) or get an iMac, which is bigger in both size and price.
 
Iris Pro graphics have been released, BUT it is just a "soft release". The actual parts will become available near the end of Q3 2013. Iris Pro 5200 graphics are extremely unlikely.
 

I really hope you are right - I'd love to see the 5200 (or at least the 5100) in the 13" rMBP. I doubt they'll go to 45W because of battery limitations, but Intel did release a quad core 35W chip (with 4600 graphics).

I wish I could say that Apple would surely not settle for 4600 graphics in its near-top-of-the-line MacBook but I'm not completely crazy. I'm still going to hope for a custom chip that has a 35W quad-core CPU and 5100/5200 graphics.

Well, that's the point a lot of people are forgetting: Intel moved the VRM onto the chip die. It probably consumes like 5W easily, so this chip is more like a 40W chip. It really is not much more than the older chips at 35W, because those had the VRM on the logic board. Also, don't forget that TDP does not equal energy used. It is actually a measure of how much heat the component can possibly generate (of course this is directly linked to how much energy is used). Intel implemented a lot of energy-saving features, and chances are that a Haswell chip under normal loads will spend more time in its C-states than Ivy and hence save more energy.

Also don't forget that with the VRM integrated into the chip, the logic board can be somewhat smaller. Perhaps enough to make room for a bigger battery or a better cooling solution.
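To put rough numbers on the TDP-versus-energy point (every wattage and residency figure below is invented purely for illustration, not an Intel spec):

```c
/* tdp_vs_energy.c - illustrative arithmetic only; all numbers are made up.
 * TDP bounds the *heat* the cooling must handle at sustained load;
 * energy drained from the battery is average power multiplied by time. */
#include <stdio.h>

int main(void) {
    const double tdp_watts     = 35.0; /* what the cooling system must sink */
    const double active_watts  = 35.0; /* hypothetical draw while busy      */
    const double c_state_watts = 1.5;  /* hypothetical draw in a deep C-state */
    const double active_share  = 0.10; /* 10% busy under a light workload   */
    const double hours         = 5.0;

    double avg_watts = active_watts * active_share
                     + c_state_watts * (1.0 - active_share);
    double energy_wh = avg_watts * hours;

    printf("TDP: %.0f W, average draw: %.2f W, energy over %.0f h: %.1f Wh\n",
           tdp_watts, avg_watts, hours, energy_wh);
    /* -> average draw ~4.85 W, far below TDP. More time spent in C-states
     *    (a smaller active_share) cuts energy even though TDP is unchanged. */
    return 0;
}
```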


Iris Pro graphics have been released, BUT it is just a "soft release". The actual parts will become available near the end of Q3 2013. Iris Pro 5200 graphics are extremely unlikely.

Do you have proof of that? Anand said they are launched.
 
Well, that's the point a lot of people are forgetting: Intel moved the VRM onto the chip die. It probably consumes like 5W easily, so this chip is more like a 40W chip. It really is not much more than the older chips at 35W, because those had the VRM on the logic board. Also, don't forget that TDP does not equal energy used. It is actually a measure of how much heat the component can possibly generate (of course this is directly linked to how much energy is used). Intel implemented a lot of energy-saving features, and chances are that a Haswell chip under normal loads will spend more time in its C-states than Ivy and hence save more energy.

Also don't forget that with the VRM integrated into the chip, the logic board can be somewhat smaller. Perhaps enough to make room for a bigger battery or a better cooling solution.

I'm not an expert so I can't judge your claims, but I do like what you're saying!
 
Actually, Intel says so on their website for the CPUs with 5200 Iris Pro: "Launch Date: Q3 2013"
The other models with 4600 GPU are listed as "Launch Date: Q2 2013"
http://ark.intel.com/products/76086/Intel-Core-i7-4850HQ-Processor-6M-Cache-up-to-3_50-GHz

Also, you cannot find CPUs with the 5200 Iris Pro for purchase, can you?

OK, but that could mean anything between July and September :cool:
Also, it wouldn't be the first time Apple gets a heads-up. Intel understands full well the halo effect of Apple showcasing its new chips in outstanding products, so here's to hoping.
 
Also, you cannot find CPUs with the 5200 Iris Pro for purchase, can you?

One of the linked articles said that the 5200 comes only in a CPU form factor intended for soldering to the motherboard. So, assuming that's true, they would be selling that version only to OEMs and not making that chip available to the public.
 
One of the linked articles said that the 5200 comes only in a CPU form factor intended for soldering to the motherboard. So, assuming that's true, they would be selling that version only to OEMs and not making that chip available to the public.

Correct: it's BGA only, not LGA, so it won't be sold off the shelf without a mainboard.
 
I think your friend picked up the last revision of the latest chip. Haswell desktop chips haven't even launched yet, as far as I know.

You are wrong. Newegg, for one, sells Haswell desktop chips, and Sager already has the laptop ones in their laptops.
 
The most interesting part about the 5200 is the 128MB L4 cache, IMO. The 5200 is the only chip that has this, right?
 
Haswell E3s don't even support 20 lanes anymore, back down to 16 on the CPU.

That sucks. The "extra" 4 would have been good to use on a differentiator like 10GbE controller(s). I guess that accounts for some of the pins that disappeared.

----------

The most interesting part about the 5200 is the 128MB L4 cache, IMO. The 5200 is the only chip that has this, right?

That is primarily there for video. It effectively improves graphics performance the way faster RAM would. The x86 side can leverage it if the GPU isn't consuming it all, but under most normal workloads that isn't going to happen much.

----------

Well, that's the point a lot of people are forgetting: Intel moved the VRM onto the chip die. It probably consumes like 5W easily, so this chip is more like a 40W chip.

It is not on the die. It is two dies in one slightly bigger package; the eDRAM is a separate die.
You can actually see it here (the smaller rectangle is the eDRAM):
[Image: Haswell package photo showing the CPU die next to the smaller eDRAM die, from AnandTech's Iris Pro 5200 review: http://www.anandtech.com/show/6993/intel-iris-pro-5200-graphics-review-core-i74950hq-tested]


Eventually Intel may weave them together, but not for the foreseeable future. It is similar to what smartphone SoC packages do, weaving the RAM into the same package as the CPU die.
 
That is primarily there for video. It effectively improves graphics performance the way faster RAM would. The x86 side can leverage it if the GPU isn't consuming it all, but under most normal workloads that isn't going to happen much.

It's a general-purpose cache. What is noteworthy is its size! It isn't SRAM like a typical cache, however, but it sits right in the CPU package, so…

A lot of real-world algorithms exploit cache locality, which can have a huge impact on performance. Of course, with the usual frugal cache sizes, there are limits to what is possible.
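The classic way to see that effect is traversal order over a big array; here's a minimal sketch (the matrix size is arbitrary, just chosen to be far larger than any CPU cache):

```c
/* locality.c - sums a matrix twice: once row-by-row (cache-friendly,
 * sequential memory access) and once column-by-column (strided access
 * that misses the cache constantly). Same work, very different speed. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N 4096  /* 4096*4096 doubles = 128 MB, far bigger than any cache */

int main(void) {
    double *m = malloc((size_t)N * N * sizeof *m);
    if (!m) return 1;
    for (size_t k = 0; k < (size_t)N * N; k++) m[k] = 1.0;

    clock_t t0 = clock();
    double row_sum = 0.0;
    for (int i = 0; i < N; i++)       /* row-major: walks memory linearly */
        for (int j = 0; j < N; j++)
            row_sum += m[(size_t)i * N + j];
    clock_t t1 = clock();

    double col_sum = 0.0;
    for (int j = 0; j < N; j++)       /* column-major: 32 KB jump per step */
        for (int i = 0; i < N; i++)
            col_sum += m[(size_t)i * N + j];
    clock_t t2 = clock();

    printf("row-major:    %.2fs (sum %.0f)\n",
           (double)(t1 - t0) / CLOCKS_PER_SEC, row_sum);
    printf("column-major: %.2fs (sum %.0f)\n",
           (double)(t2 - t1) / CLOCKS_PER_SEC, col_sum);
    free(m);
    return 0;
}
```

On typical hardware the column-major pass comes out several times slower, even though the arithmetic is identical.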
 
That sucks. The "extra" 4 would have been good to use on a differentiator like 10GbE controller(s). I guess that accounts for some of the pins that disappeared.

----------



That is primarily there for video. It effectively improves graphics performance the way faster RAM would. The x86 side can leverage it if the GPU isn't consuming it all, but under most normal workloads that isn't going to happen much.

----------



It is not on the die. It is two dies in one slightly bigger package; the eDRAM is a separate die. Eventually Intel may weave them together, but not for the foreseeable future. It is similar to what smartphone SoC packages do, weaving the RAM into the same package as the CPU die.

Yes, sorry, I got chip confused with die... the VRM, as well as the Crystalwell eDRAM, is in the same chip but not on the same die.
 
Not arguing for or against the initial point (ARM-based Macs), but there are a few things you've missed in your response.
....
Yep. An ARM CPU won't run X86-* software. However, software written for OS X doesn't *have* to be for X86. It can also be for PowerPC. It would take very little effort for Apple to enable compiling OS X apps for ARM, and *most* software would compile to a new Universal binary without any significant effort, because *most* software doesn't bother with processor-specific code these days.

Part of the reason Apple ripped Universal binaries out of OS X is that the primary software distribution mechanism is now the Internet. "Fat binaries" (programs encoded multiple times) mean bigger bulk transfers.
Apple is a major software distributor now. I don't think they're going to be interested in doubling their bandwidth for "nothing".

Just because the Universal binary mechanism is there doesn't mean they have to use it. It is also more flawed (as I pointed out in another response) in that ARM is just 32-bit (from the application perspective) while OS X is 64-bit. It isn't a Universal binary of a single 64-bit app we're talking about here; it means splitting the app into separate implementations too. That isn't going to get many cheers at WWDC. It is basically piling more work on developers.




Yes, going the Rosetta route would be an obviously bad idea. But Apple tends to spot and avoid those obviously bad ideas (especially when it comes to software solutions), so I don't expect we'd ever see an ARM->Intel 'Rosetta' implementation.

Doubtful also. When iOS goes 64-bit (when ARM goes 64-bit), it is more likely it would just 'wipe out' OS X if there were a total shift to ARM. A shift to ARM implies both AMD and Intel have failed... which isn't very likely. One of them perhaps, but not both.
 
Part of the reason Apple ripped Universal binaries out of OS X is that the primary software distribution mechanism is now the Internet. "Fat binaries" (programs encoded multiple times) mean bigger bulk transfers.
Apple is a major software distributor now. I don't think they're going to be interested in doubling their bandwidth for "nothing".

The reason is that the PPC transition is now over. If it weren't, Universal binaries would still be in use, and if the need arises in the future, they will be used again.
 