More motivation for Apple to consider moving to ARM based chips

So they can go from having to wait to having to ramp up designs to keep up with what they're waiting for?

Or do you think Apple has designs ready for a chip that can keep up with even an i7 Sandy Bridge?
 
Running iOS apps on OS X would be a ****** experience, and you would be stuck with a one-app-at-a-time limit. It just does not work well. Also, Apple chose Intel not only because it had better performance than PowerPC back then, but also because of Windows compatibility. The reason Mac sales were up year over year AFTER the Intel switch was mainly Windows compatibility.

If Apple chooses to go down the ARM road, we will see the death of the Mac. I will certainly switch back to Windows.

Well, the iOS benefit would simply be a perk, and it would also enable developers to code for both platforms (OS X and iOS) at the same time. Perhaps iOS apps could run on the side or in a windowed environment. It's not perfect, but neither is running Windows.
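
As a rough illustration of the "code for both platforms" idea (purely my own sketch, nothing Apple has announced), a single Swift file can already share model logic between OS X and iOS and only branch where the UI frameworks differ:

// Hypothetical example: one Swift file compiled into both an OS X target and an iOS target.
import Foundation

struct Article {                          // shared model code, identical on both platforms
    let title: String
    let wordCount: Int
}

func summary(of article: Article) -> String {
    return "\(article.title) (\(article.wordCount) words)"
}

#if os(iOS)
let platformName = "iOS"                  // UIKit-specific view code would go on this side
#else
let platformName = "OS X"                 // AppKit-specific view code would go on this side
#endif

print("[\(platformName)] " + summary(of: Article(title: "ARM rumours", wordCount: 900)))

Compile it into an iOS target and the same file takes the UIKit branch; everything above the #if stays untouched.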

While I agree that the lure of Windows compatibility was a draw for people at the time of the last big switch (and still is to some extent), it was one that Apple initially and explicitly denied. If you recall, some nerds bundled together a prize fund for the first person who could get Windows booting on a Mac. It was a horrible mess, and soon after, Apple was forced to offer their own legitimate avenue (Boot Camp) for Windows users keen to explore this new Apple world. I mean, it would be an awful experience for first-time users if they had to use some hacked, emulated BIOS to get around the EFI booting that Macs do.

That is to say, despite the massive revenue spike, I don't think Apple was particularly keen to allow Windows booting on their hardware, which just undermines application development for their own OS and ecosystem.

Autodesk is a case in point: it took them seven years to port an Intel copy of AutoCAD, and they've done absolutely nothing else. Whilst much simpler, their iOS development is growing exponentially and likely makes much more money.
 
They're not a monopoly. Apple can use AMD or even their own processors if they wished to.

True, but the alternatives offer nowhere near the performance, power consumption, reliability, manufacturing plants, or acceptable temperatures. If Apple jumped ship and made worse products as a result of that, they'd upset a lot of people.

So in that sense, although Intel may not be a monopoly, they're essentially the only viable option for a company such as Apple.
 
I'll look forward to pulling this quote out in a few years. By making an overarching statement like this, you're going to look like one of those people who say "technology will never get to the point where ___". I mean, the fact that you say that just shows how limited your vision is.

I think he/she is speaking to the CURRENT ARM designs. While the CPUs are getting faster and faster, they are not yet to the point where I want them in a general purpose machine. ARM can be made to compete with Intel, but this means destroying the one thing that ARM has over Intel right now: power consumption.

He/she, I believe, is referring to why someone would want to buy something new (just for the sake of buying something new) that performs worse than the current iteration of products.
 
My MBP and iMac from 2008 are still going strong. Your 2010 MBP has got years of life left. :D

I went from a baseline early 2008 MBA (3 years) to a late 2012 Mac Mini, I feel like I'm set for life.

----------

So in that sense, although Intel may not be a monopoly, they're essentially the only viable option for a company such as Apple.

There's always VIA (hue hue). :rolleyes:
 

There's always VIA (hue hue). :rolleyes:

Strangely enough, Tom's Hardware has an article about VIA building a new x86 CPU.
 
I think you mean oligopoly.

Thanks for posting the proper term, and it's fun to say! :p

-------------
I fear the Intel > Apple A-chip switch because Windows is simply a requirement for many people. Even certain software that is designed for both Mac and Windows often falls dramatically short of the Windows counterpart.

The only reason I was able to push myself to Macs a few years ago was knowing that, if the occasional need presented itself, I could run Windows. If Apple takes this ability away I will either have to leave Mac (unlikely) or buy a second computer for Windows use (simply annoying).
 
So they can go from having to wait to having to ramp up designs to keep up with what they're waiting for?

Or do you think Apple has designs ready for a chip that can keep up with even an i7 Sandy Bridge?

Wasn't the A9 scoring in the 2009 MacBook Pro benchmark area? Even if Apple could create a chip that could match 2011 dual or quad core systems in raw scores I still don't think it would translate to real world use.

Intel's chips have been around a long time, software is highly optimized to take advantage of multiple cores and virtual cores, etc.

I think (but I'm not an engineer) that if you put the A9 in a MacBook Pro and ran it side by side with a 2009 MacBook Pro you would see a great difference in favor of the intel chip. Again, I don't know, just my speculation.
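
For what it's worth, here's a toy Swift sketch (my own example, not a real benchmark) of the kind of multi-core code desktop software leans on, which is where a mature chip with several fast cores still shows its worth:

// Sum a large array once serially and once split across all available cores with
// Grand Central Dispatch. Both give the same answer; only the second scales with core count.
import Foundation

let numbers = (1...5_000_000).map { Double($0) }

// Single-core version
let serialSum = numbers.reduce(0, +)

// Multi-core version: one chunk of the array per core
let chunkCount = ProcessInfo.processInfo.activeProcessorCount
let chunkSize = numbers.count / chunkCount
var partialSums = [Double](repeating: 0, count: chunkCount)

partialSums.withUnsafeMutableBufferPointer { buffer in
    DispatchQueue.concurrentPerform(iterations: chunkCount) { i in
        let start = i * chunkSize
        let end = (i == chunkCount - 1) ? numbers.count : start + chunkSize
        buffer[i] = numbers[start..<end].reduce(0, +)   // each core sums its own slice
    }
}

let parallelSum = partialSums.reduce(0, +)
print(serialSum == parallelSum)   // true; the difference shows up in wall-clock time, not the result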
 
I find the whole ARM alternative idea a bit absurd. With the significant focus and improvement we've seen from Intel with Haswell and the upcoming Broadwell, I feel like it will take less time for x86 to become as efficient as ARM than for ARM to become as powerful as x86.
 
So you want to pay the same amount of money but get much lower performance? That seems like a huge step backward to me... And then you're stuck in the same position Microsoft was in with Windows RT: you get no apps. Maybe you will be able to run all iOS apps, but you are limited to one app at a time? That isn't what a desktop OS should do.

ARM is fine for most things, but not fine for a laptop.



My point is that very soon an ARM processor will be able to run OS X with the levels of multitasking most entry-level MBA users require. It's not a backward step, and it could result in increased battery life or lower cost.

Running a browser, Lync, and a mail program for 90% of my day is hardly CPU intensive.
 
Battery Life

It seems like we've reached the point of diminishing returns on battery life. In most real-world use, the battery of current laptops will make it through a full offsite work day. We've probably reached the point where folks can do weekend trips and leave the laptop charger at home. I'm assuming that you are taking your trip to some new place and not planning on staring at the computer all weekend.

Your iPad will also make it through the weekend easily, even if the kids play with it. But that doesn't matter because your iPhone won't, so you will need that charger and it can also charge the iPad.

But you definitely still have to bring the laptop charger for a week long trip even if you aren't planning on being online too much. That would seem to me to be the next big step and I can't see us getting there anytime soon.

I've got a pre-Haswell HP, so I only get 3.5 hours of real-world work time on a charge. That means I can go to the coffee shop without a charger, but then I have to either bring the charger or head back home. And if I have day-long meetings off site, I need my charger with me. So all this awesome Mac battery life seems like nirvana to me, be it Haswell or Broadwell.
 
Wasn't the A9 scoring in the 2009 MacBook Pro benchmark area? Even if Apple could create a chip that could match 2011 dual or quad core systems in raw scores I still don't think it would translate to real world use.

Intel's chips have been around a long time, software is highly optimized to take advantage of multiple cores and virtual cores, etc.

I think (but I'm not an engineer) that if you put the A9 in a MacBook Pro and ran it side by side with a 2009 MacBook Pro you would see a great difference in favor of the intel chip. Again, I don't know, just my speculation.

I think the A7 was equal to ... I forget what in terms of Intel CPUs. But you can't just look at CPU performance. Intel has been making great strides with their GPUs, which I'm not sure ARM has been able to follow.
 
Disagree. Generally speaking, when computers are binned, it's not due to the processor revision -- it's when the computers stop working. More updates, constant work and a pipeline of innovative CPU revisions are paramount to engineering the products of the future; furthermore, it means you won't be caught out by a competitor's product due to laziness.

Deliberately holding off on developing the pipeline for future products/revisions, or feeling self-secured in your current position, is one of the reasons Nokia and Blackberry are where they are now. It also annoys the people who rely on your products and encourages them to look elsewhere.

If Intel adopted this mentality, I'd argue that it would push Apple towards developing their own in-house chips, similar to why Apple moved from PowerPC to Intel.

So what are you disagreeing with? With the amount of rare metal reserves on earth, or with how frequently we trash our computers? CPU increases are the main reason developers get lazy about writing optimised code. They let the hardware compensate for their bad code. If the hardware got updated less frequently, developers would do better work and we'd still get the same amount of performance. There just isn't enough metal on earth to go through silicon at the speed we do today. So this has to change at some point; the sooner the better.
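
To make that concrete with a contrived Swift example of my own (not aimed at any particular app): the "lazy" version below scans an Array inside a loop, the tidier one builds a Set first, and on a fast modern CPU both feel instant on small inputs, which is exactly how the sloppy version gets away with it:

// Membership tests: Array.contains inside a loop is roughly O(n^2) overall,
// while building a Set first brings the same job down to roughly O(n).
let ids = Array(1...20_000)
let lookups = Array(15_000...35_000)

// Lazy version: a linear scan for every lookup
var foundSlow = 0
for id in lookups where ids.contains(id) {
    foundSlow += 1
}

// Optimised version: a hash lookup for every query
let idSet = Set(ids)
var foundFast = 0
for id in lookups where idSet.contains(id) {
    foundFast += 1
}

print(foundSlow == foundFast)   // identical result; only the CPU time differs as the data grows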
 
True, but the alternatives offer nowhere near the performance, power consumption, reliability, manufacturing plants, or acceptable temperatures
I'll not contest that; Intel certainly has a superior product and the manufacturing muscle to back it up.
 
Running a browser, Lync, and a mail program for 90% of my day is hardly CPU intensive.

You'd be surprised, a lot of websites these days run pretty poorly on slow CPUs (though, that isn't always necessarily the fault of the slow CPUs).
 
They haven't run into physics yet, they've run into capitalism.

Oooo, cynical truth. I like it. Plus, it assuredly plays a large role.

I'm sure Intel is trying, but with limited competition and no real threat of losing the market they don't have the pressure to push out their products quickly.
 
I find the whole ARM alternative idea a bit absurd. With the significant focus and improvement we've seen from Intel with Haswell and the upcoming Broadwell, I feel like it will take less time for x86 to become as efficient as ARM than for ARM to become as powerful as x86.

I'm no fan of the rumour personally, not for professional use at least; tablets have come a long way and they've admittedly replaced home computers for many things.

And that's fine, I can see an iMac powered by one or several ARM chips to make a cheap machine that's sufficiently fast for Facebook and Youtube, while paired with a larger display than the one on the iPad.
 
You do realize that Intel makes good and bad decisions like every other company , right? Do you understand the reasons why AMD was able to catch up to Intel during the early P4 days, or are you just spouting off to spout off?

Of course, but they are also driven to compete by market conditions.
In a market where they have little competition, I believe, as others have said, that Intel is probably holding back, or at least not pushing the boundaries... why should they?


My desire is to bring back the incentive... I think with a competitor (and I doubt AMD's APU is really going to cut it) they would be doing more.

With the size of the A-series chips now and the internal chip-design teams Apple is building, I don't see why an A-series chip could not be used to drive an MBA to the same levels of performance seen today, but with better battery life or reduced cost to Apple (and hopefully end users). I have little desire to put iOS apps on my laptop; I just think that Apple is skilled enough to pull off a migration such that the entry level could be A-series and pro users could remain on Intel, but an Intel with the incentive to accelerate its roadmap.
 
Don't worry. The need to upgrade is very close to being over. We'll all be keeping our newly bought tech longer than ever before from here on out. I suspect Broadwell will be a nail in the coffin. Phones and tablets are actually there now. Manufacturers know it; they just don't want us to know it.

Totally disagree with you. I think hardware is going to go through major step changes in the next 20 years. There's tons of amazing tech in the R&D phase.
 
I'm no fan of the rumour personally, not for professional use at least; tablets have come a long way and they've admittedly replaced home computers for many things.

And that's fine, I can see an iMac powered by one or several ARM chips to make a cheap machine that's sufficiently fast for Facebook and Youtube, while paired with a larger display than the one on the iPad.

Very true, yet I don't see the Mac lineup switching to ARM in order to meet those needs when iDevices can perfectly accommodate said tasks. If anything, we should expect further development of iDevices' input, UI and ultimately UX (to enable multitasking and a better typing experience out of the box, without any third-party keyboards) that may completely get rid of the need for OS X.
 
I guess it's a good thing I decided to pick up a new iMac now instead of waiting. It would be a loong time without a computer. ;)
 