To be fair, most Intel launches have slipped 6+ months in recent years. That doesn't prove they're sitting on unsold inventory. I suspect Microsoft have most of that tied up in unsold Surfaces. :D

https://forums.macrumors.com/showthread.php?p=18980930&broadwell#post18980930

https://forums.macrumors.com/showthread.php?p=18951356&broadwell#post18951356

https://forums.macrumors.com/showthread.php?p=18793315&broadwell#post18793315

Just to prove I wasn't making this up, hehe!

I agree it's not always down to an inventory stockpile, but it has been the reason for the last 3-4 years.
 
Well, if there's anything to be gained from this, it's that AMD, which is apparently now flush with cash from Sony and Microsoft, gets a bit of breathing room. I'm not saying this means we're going to see AMD chips in Macs, but rather that they'll be able to stay competitive, and we know what happens when Intel gets to run its own show for too long (Prescott).

As for Apple's own ARM-based solutions, I don't think they'll scale up that well, and Apple simply doesn't have the engineering resources to keep up with Intel on chips for larger systems; the higher-end the chip, the more work it takes to develop. Both Intel and AMD have fairly huge development divisions in countries with skilled but cheap labor: India for Intel, Malaysia for AMD.

The only place where I can see Apple trying their luck with in-house chips is the Air...

If they had the resources before, why not again, now that they are in a totally different place than 2005's "Chip-Switch"?

They can pair up with one of those to combine designs of an integrated chip, THEN stand to gain billions as a new chip provider!
 
Yes, let's stop progress just because everything is good enough for most people. We can use 2GHz dual-core chips for the next 20 years until AMD gets back in the game.

Personally I mostly use my computers to browse the internet; even a 5-year-old computer is good enough for me. But for the people who need the speed, prepare to pay over $2000 for a newer CPU. Oh, don't worry, if Haswell is good enough, those (slow) Haswell chips still sold in the future will keep going for a high price of $200-$300 each, just because Intel can.

Some people just don't have a clue.

14nm chips are not really about speed, although they are obviously quicker than their 22nm counterparts; they're about efficiency: power efficiency & the ability to make devices smaller & thinner.

Again, if people are unable to do something with a Haswell CPU then Broadwell isn't going to help them out very much.
 
What's wrong with Intel? Core 2 was a great leap, i7 and Sandy Bridge offered massive performance, and Ivy/Haswell drove power consumption down to much more manageable levels. Broadwell will finally bring it down to fanless levels on some ultrabooks.

Intel offer:
- a great process
- great circuits
- an adequate microarchitecture, because of
- a lousy architecture

As a consequence, Apple can match their performance at low frequency, in a smaller area, at lower power, and on a less advanced process. Apple, using 28nm and no FinFETs, can mostly match a Haswell i3 manufactured at 22nm.

Now this is not a fair comparison.
(a) Intel offer a few kickers to boost performance in some specialized cases, from hyperthreading and AVX2 to 4 (and more) cores to the ability to get to 4GHz.

(b) Apple could, if they wanted, easily enough throw in four cores (or five to match hyperthreading) and could extend their vector width (see the sketch below, after point (c)). Those are minor tactical decisions based on the target market.

(c) More interesting is the frequency question. COULD Apple, if they wanted to ditch Intel (eg because of this issue of being linked to their update schedule), run their core at 4GHz (using, of course, a desktop-like 35 or 45 or 65W rather than a phone-like 3W)? I honestly don't know. Doing this requires
- some microarchitectural tricks to split the pipeline nicely and to limit the damage from memory being so many cycles away (I'm confident Apple could do this)
- some circuitry tricks (not so sure)
- process tricks (not at all sure)
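
To make "vector width" concrete, here's a rough, hypothetical C sketch of the sort of loop AVX helps with: eight single-precision lanes per 256-bit instruction versus four on a 128-bit unit (eg the NEON unit in current ARM cores). The function is made up for illustration, not anyone's actual code.

Code:
#include <stddef.h>
#include <stdio.h>
#include <immintrin.h>   /* AVX intrinsics; build with -mavx on x86 */

/* Hypothetical hot loop: 256-bit AVX registers add eight floats per
   instruction; a 128-bit vector unit does four. That ratio is the
   "vector width" being talked about. */
static void add_arrays(const float *a, const float *b, float *out, size_t n)
{
    size_t i = 0;
    for (; i + 8 <= n; i += 8) {
        __m256 va = _mm256_loadu_ps(a + i);
        __m256 vb = _mm256_loadu_ps(b + i);
        _mm256_storeu_ps(out + i, _mm256_add_ps(va, vb));
    }
    for (; i < n; i++)            /* scalar tail for leftover elements */
        out[i] = a[i] + b[i];
}

int main(void)
{
    float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
    float out[8];
    add_arrays(a, b, out, 8);
    printf("%f\n", out[0]);       /* prints 9.000000 */
    return 0;
}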

On Apple's side, they have a much simpler design target, and because of this they have the flexibility to try some daring ideas that may be too much for Intel (eg the CPU consists of two layers of silicon rather than one --- ie very simple "3D" design. Schemes like this allow multi-ported structures to split the ports between the two layers with consequent substantial (30% or so) improvements in speed and simultaneously lower power.)
There are also a number of tricks we know Intel use which Apple apparently aren't yet using, so they have a much clearer path ahead of them.

All indications are Apple get access to a 20nm process this year, and FinFETs next year. Each of these should be good for around a 30% speed boost at the same power. Throw in some obvious micro-architectural improvements Apple is not yet using (eg split L2 caches and a much larger, faster L3 cache, better memory controller) and I'd expect Apple to be able to generate at least a 1.4x speed boost for the A8 and A9. Point is, Apple appears to be on its way to matching Intel's single-threaded performance a whole lot faster than Intel appears to be raising that performance.
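(Rough arithmetic behind that 1.4x, on my reading: roughly \(1.3_{\text{process}} \times 1.08_{\text{microarch}} \approx 1.4\) per generation, which would compound to about \(1.4 \times 1.4 \approx 2\times\) across the A8 and A9 together. The 1.08 microarchitecture share is my own back-of-envelope split, not a published figure.)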
Which is still VERY DIFFERENT from claiming that Apple can ship an Intel iMac replacement chip this year. But in late 2016???
 
You can't compare the two, they use different applications and OS, have different memory management systems, etc. Literally apples and oranges.

Besides, by switching to ARM and recompiling OSX and applications to run on it, what would we gain exactly?

Nevermind.
 
You get different scores running OS X or Windows on the same computer too.

Yes, but Windows and OS X run on the same x86 processor - ARM and x86 are different processor architectures, it would be like comparing MS Word 2010 for OS X to MS Word for Windows RT.
 
Well, I took your comment from another angle... Your comment suggested developers need to optimize code rather than constantly rely on new processors. That made me think about the Pentium 4, where Intel themselves weren't worried about optimizing anything and instead just sped the thing up (to the point of frying).

That would make sense if Intel continually pumped out garbage. However, that is not the case. NetBurst was only ever going to be decent at those high clocks, and NetBurst was the microarchitecture that Intel had bet on. I am unsure what you expected Intel to do at that time... scrapping the product was out of the question, so the "fix" was a higher clock rate to make things work as they had hoped.

The P4C 2.4GHz that I had at that time was a very good chip, and AMD was doing nothing that gave it a run for the money (unless you consider renaming their chips... that's all). The irony now is that while Intel is far ahead, AMD has picked up the MHz race with 4+ GHz chips to try and compete.

Regardless, there is still much room for code to be reviewed and to be made more lean. The point still stands.
 
Yes, but Windows and OS X run on the same x86 processor - ARM and x86 are different processor architectures, it would be like comparing MS Word 2010 for OS X to MS Word for Windows RT.
I wonder if you would be able to benchmark using Office macros for both platforms.
 
I tried Linux (Ubuntu). If operating systems were like health care options,

OS X is like a major regional hospital
Windows is like a walk-in commercial care facility
Linux is like performing an appendectomy on yourself

Unpleasant.

I love Linux. Tried plain Debian or Mint? Ubuntu is bloated.
 
I tried Linux (Ubuntu). If operating systems were like health care options,

OS X is like a major regional hospital
Windows is like a walk-in commercial care facility
Linux is like performing an appendectomy on yourself

Unpleasant.

Try Linux Mint 17 Cinnamon and see whether you feel the same way :)
 
So, where is this "Best Product Pipeline in 25 Years Coming Later This Year From Apple" going to start?
Maybe this pipeline/dream will consist of one single product: finally, an Apple product that doesn't initiate a self-destruct sequence the first time that slippery thing slips from your hands.
 
There is nothing wrong with Intel. Intel's "problems" are twofold.

First, Apple wants the yearly OS/hardware fashion show where perfectly good models of both items are replaced with (sometimes) slightly better ones.

Second, there are a bunch of whining consumers who think that a complicated industry, whose science they don't even understand, somehow owes them new products every year. You'll find them slobbering over all the latest gotta-have buzzwords like "Retina", "liquid metal", blah blah blah.

I'm not one of the whining idiot consumers who barely understand the tech in their devices, and I don't need new devices every year, but I don't think changes like Retina displays or using liquid metal are anything to scoff at.

High-DPI Retina displays are a huge leap forward and one of the best additions I've seen on either my phone or laptop. Intel should have new chips year on year; it's called tick-tock. It's just that AMD offers no competition, so Intel has no reason to push itself; just look at how quickly they advanced their Core 2 architecture after AMD handed them their ass.

Liquid metal is something I look forward to; it's quite an interesting blend of zirconium and more common elements like copper, aluminium, etc.
Apple does innovate and I enjoy their innovations. Most consumers are idiots; just look at the most popular reality TV and it says it all.
 
I tried Linux (Ubuntu). If operating systems were like health care options,

OS X is like a major regional hospital
Windows is like a walk-in commercial care facility
Linux is like performing an appendectomy on yourself

Unpleasant.

With a Linux forum, where when you ask "should there be this much blood?" you get "UGH WHAT A N00B! Everyone knows the answer to that <Extremely Sarcastic Rant Here>" and then nobody ever gives you the answer as you bleed out on the table...

I use Linux/UNIX (several server flavors) for work, but Linux communities have some of the biggest jerks I've ever seen. Not that everyone that uses it is, but it only takes a few...
 
I love linux. Tried plain Debian or Mint? Ubuntu is bloated.

Try Linux Mint 17 Cinnamon and see whether you feel the same way :)

It's just that, for the first time in decades, I felt totally lost. I didn't really know how to grasp the file structure, how to run a program (where's the .exe? where's the package file? where's the any key???). It just seemed too different. Plus, any Linux person on the web freely spouts convoluted terminal commands, with bashes and sudos and other things I don't get and don't really have the energy to learn.

Is Mint more . . . usable?
 

Personally I prefer to get my understanding of the world from people who aren't wild ranters obsessed with obtaining overclocked CPUs and certain that Intel is out to screw them (some sort of weird Stockholm syndrome there, with simultaneous hatred/anger at Intel and insistence that no-one else, AMD, IBM, ARM, will ever replace them).

But hey, that's just me. If you can point to an article discussing the Broadwell delay that's written by a normal human being with the interests and priorities of a normal human being (ie NOT obsessed with unlocked CPUs and certain that the only reason Intel hasn't released a 16GHz CPU is because they hate gamers and modders) I'd be happy to consider alternative explanations to the one I have proffered, namely that this is all about an inability to get 14nm to run at high frequencies.
 
Apple's custom silicon isn't at a point where it can power a Mac, but based on the pace at which Apple is developing its own CPUs, you have to wonder how far off that day really is.

Totally agree. I suspect Apple already has an internal roadmap for switching to their own processors for the Macintosh line, as this is not the first time Intel has delayed chips. Intel's chip offerings are starting to look like the PowerPC in the late '90s. My guess is Apple would test the waters first, and we might see an Ax chip appear first in the MacBook Air.

"Ax" for mobile, "Ax Pro" for Mac?

The only downside is this will make cross-platform apps from Windows products harder to maintain (e.g. Office, Adobe products, games) unless Apple works some magic.
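
To put a concrete (and entirely hypothetical) face on that maintenance burden: any hand-tuned x86 routine in those apps would need a second ARM implementation or a portable fallback, roughly like the C sketch below. The function is made up; the #ifdef split is the point.

Code:
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* Hypothetical hot loop from a cross-platform app (summing pixel bytes).
   The x86 build uses SSE2 intrinsics; an ARM build needs its own NEON
   version or falls back to the portable loop, so routines like this end
   up written and maintained twice. */
#if defined(__x86_64__)
#include <emmintrin.h>                      /* SSE2 intrinsics */
static uint64_t sum_bytes(const uint8_t *p, size_t n)
{
    __m128i acc = _mm_setzero_si128();
    size_t i = 0;
    for (; i + 16 <= n; i += 16) {
        __m128i v = _mm_loadu_si128((const __m128i *)(p + i));
        /* sum-of-absolute-differences against zero = horizontal byte sum */
        acc = _mm_add_epi64(acc, _mm_sad_epu8(v, _mm_setzero_si128()));
    }
    uint64_t total = (uint64_t)_mm_cvtsi128_si64(acc)
                   + (uint64_t)_mm_cvtsi128_si64(_mm_srli_si128(acc, 8));
    for (; i < n; i++) total += p[i];       /* scalar tail */
    return total;
}
#else                                       /* ARM or anything else */
static uint64_t sum_bytes(const uint8_t *p, size_t n)
{
    uint64_t total = 0;
    for (size_t i = 0; i < n; i++) total += p[i];
    return total;
}
#endif

int main(void)
{
    uint8_t data[32] = {1, 2, 3};           /* rest zero-initialized */
    printf("%llu\n", (unsigned long long)sum_bytes(data, sizeof data)); /* 6 */
    return 0;
}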
 
With a Linux forum, where when you ask "should there be this much blood?" you get "UGH WHAT A N00B! Everyone knows the answer to that <Extremely Sarcastic Rant Here>" and then nobody ever gives you the answer as you bleed out on the table...

I use Linux/UNIX (several server flavors) for work, but Linux communities have some of the biggest jerks I've ever seen. Not that everyone that uses it is, but it only takes a few...

If we judged an OS by their fans, few people would be using OS X.

It's just that, for the first time in decades, I felt totally lost. I didn't really know how to grasp the file structure, how to run a program (where's the .exe? where's the package file? where's the any key???). It just seemed too different. Plus, any Linux person on the web freely spouts convoluted terminal commands, with bashes and sudos and other things I don't get and don't really have the energy to learn.

Is Mint more . . . usable?

Nowadays, you just go to the software center and look up whatever it is you want. For the few things that weren't in there, the site had a lovely .deb file that was opened up WITH the software center and quickly installed. I also don't know what you mean when you say you "didn't grasp the file structure".

Actually, when you said something about "where's the any key" I quickly realized this is an attempt at humor.

----------

Totally agree. I suspect Apple already has an internal roadmap for switching to their own processors for the Macintosh line, as this is not the first time Intel has delayed chips. Intel's chip offerings are starting to look like the PowerPC in the late '90s. My guess is Apple would test the waters first, and we might see an Ax chip appear first in the MacBook Air.

"Ax" for mobile, "Ax Pro" for Mac?

The only downside is this will make cross-platform apps from Windows products harder to maintain (e.g. Office, Adobe products, games) unless Apple works some magic.

Oh, another (huge) downside is their chips aren't as good.
 
That future roadmap from Intel is just...depressing. Two more Haswell refreshes before Broadwell? And we're all fooling ourselves if we think for a second that they'll launch Broadwell and Skylake together. Skylake will be pushed out to 2016, whether it's ready or not. Intel isn't going to just scrap Broadwell by releasing its successor at the same time in the same market.

This is bad thinking on Intel's part. Broadwell isn't nearly enough of an upgrade over Haswell for anyone to go out and upgrade anything. After another year of Haswell, people will be itching to, though. They'd be better off realizing that they can't execute on tick-tock anymore, or at least are slower at it, and just skipping to Skylake to drive high-end sales.

It is typical corporate doubling down on strategy instead of really looking at sunk costs and potential benefits.
 
This is bad thinking on Intel's part. Broadwell isn't nearly enough of an upgrade over Haswell for anyone to go out and upgrade anything. After another year of Haswell, people will be itching to, though. They'd be better off realizing that they can't execute on tick-tock anymore, or at least are slower at it, and just skipping to Skylake to drive high-end sales.

It is typical corporate doubling down on strategy instead of really looking at sunk costs and potential benefits.

I thought that Broadwell was supposed to considerably reduce power draw. It's not a big CPU speed boost, but the power savings alone will certainly be a big plus for the ultrabook/MacBook lineups around the world, no?
 
That bites - so much for seeing new rMBPs this year :(
Or Mac Minis! July next year is really going to suck if that's what we're waiting for!

I guess we can hope Apple might get some early, but if there's no Mac Mini announcement by the end of this year I'm going to have to seriously consider a hackintosh…
 
This is good. We actually need less frequent updates to CPUs so we upgrade less. When we trash a computer, even though Apple claims it's recyclable, most of the rare metals are wasted. If we keep going through them like this, we won't be able to build computers in the near future.

PLEASE! Have you not heard of capitalism? A business must produce new products to satisfy the bottom line. No company can survive without new products and sales. I mean Apple makes its living off of a new iPhone/iPad every year.
 