Do you really believe you have a desktop/workstation-level processor in your phone? Think about that for a moment. The same chip can only run a single app on screen at a time, two at most on an iPad. You really believe this chip can somehow be better than a $300-$1,000 CPU? o_O

I think he is comparing an A11 to something more comparable, such as a MacBook processor. I don't think anyone is thinking an A11 could replace a high end desktop/workstation, yet.

No problem and thanks for asking.

Here's a workflow example that would be killed by a change in architecture.

The databases I frequently work with, I may copy to my laptop for maintenance and testing. Currently, since it's x86 to x86, that's a simple process of taking last night's full backup (100 GB) and restoring it on my laptop, then getting to work. While the database engine I use does not have a macOS equivalent (and the vendor has said they will never support macOS), I can easily run the engine in a VM on the desktop, or even boot straight into Linux.

That would no longer be possible under an ARM architecture. First, should they emulate x86, the performance hit would be too big for efficient use. Second, because of the change in architecture, I would have to export the entirety of the data to flat text files and then reload it under the new architecture. Doing that with a 100 GB database is, well, not ideal. And there's no way to convince financial institutions to throw away 20+ years of software development that runs their entire business just for a macOS-compatible platform.
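The portable fallback described above (dump to flat text, reload on the new machine) can be sketched generically. This is a toy round trip using SQLite and an in-memory database purely for illustration, not the poster's actual engine; table and column names are made up:

```python
import csv
import sqlite3

def dump_to_csv(conn, table, path):
    """Export a table to a flat CSV file: the architecture-neutral path."""
    cur = conn.execute(f"SELECT * FROM {table}")
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([d[0] for d in cur.description])  # header row
        writer.writerows(cur)

def load_from_csv(conn, table, path):
    """Reload the CSV into a fresh database on the target machine."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        cols = next(reader)
        conn.execute(f"CREATE TABLE {table} ({', '.join(cols)})")
        conn.executemany(
            f"INSERT INTO {table} VALUES ({', '.join('?' * len(cols))})",
            reader,
        )

# Round-trip demo with in-memory source and target databases:
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE accounts (id, name)")
src.executemany("INSERT INTO accounts VALUES (?, ?)",
                [(1, "alice"), (2, "bob")])
dump_to_csv(src, "accounts", "accounts.csv")

dst = sqlite3.connect(":memory:")
load_from_csv(dst, "accounts", "accounts.csv")
print(dst.execute("SELECT COUNT(*) FROM accounts").fetchone()[0])  # 2
```

Note that even in this toy, type information degrades to text in transit, and at 100 GB the dump/reload round trip is the slow part; a same-architecture binary restore skips it entirely.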

Another task that would be impacted is VM testing and integration. While you can do many things on remote ESXi hosts, not all the maintenance tools have macOS versions (or full-featured ones), so Parallels and local VMs offer enough compatibility. These tools would need to be rewritten for macOS (which never really happened even now). Without them, Macs would also be unusable for many IT professionals.

But one of the benefits now: I can create a VM while I'm out on the road or at home for testing purposes, get into the office, and migrate it to the cluster (or vice versa). With an architectural change, this is likely not possible.

I think for the average home user who can get by on a MacBook today, the change would be invisible. As mentioned, most users who log in to cloud services don't care what's running those cloud services; they just know they put their email address and password into a box and they get their data/email. However, all those services are likely running Linux and Windows on the back end (Exchange is the most common email service platform, and it relies on Windows and Active Directory). Heck, even the back end of Apple's own cloud services is Linux-based (they currently run on Google's cloud platform).

As I said, from an end-user perspective, this is all invisible, "under the waterline" stuff. But for those of us who have to support these things daily and professionally, there are massive differences in how we operate versus the average home user. IT departments are massive investments to power and operate just so that the average user never has to think about it. And we're not a small group of people anymore, as the world has become more and more technical.

Makes sense, appreciate the perspective. In short, if the required apps/tools are created, then the possibility to use this platform is there. I get the sense from many on here that they just can't see any other possibility but x86. I am more so thinking about the next platform, not the one that exists today. I get it though, this is people's livelihoods and I respect that.
 
Well once you throw out "viewing animals on social media", it certainly feels that way. I respect your opinion, but I try not to engage with those who are so stuck in the past.

I'm stuck in the real world. Look at my tag - "use the best tools for the job".

If you can do what you need to on iOS, great. For others, iOS isn't enough; they need something more flexible.

I'm certainly NOT in favour of dumbing down macOS to the level of a mobile OS, which is most likely what an iOS desktop would bring.
 
Take a single Xeon core, limit it to the thermal envelope of an iPhone 7, and see how well it does. For all we know, in some hidden Apple lab, they have an experimental A12 or A13 processor core fabbed in TSMC's next-gen high-performance process, and packaged with a heat sink capable of 100W+ heat dissipation. The pounding could easily be going the other way. If so, that's why Apple is thinking of using their own processors, and sticking as many cores on a die as they think a product needs in 2020.
I would love to see an Apple server comeback based on these A13-A20 chips...
 
With your knowledge, if they were to make a true desktop processor, how likely do you think it is that they could get something significant, say 50%-100% more performance per watt than comparable Intel CPUs?

Physics is physics. Static power is mostly a function of the quality of the fabbed transistor, so let's assume Intel is slightly ahead there. Dynamic power is where most of the power is consumed, and that's CV²f. So assuming the same operating voltage, you want to drive less capacitance. There are two ways to do that: fewer switching wires, and better physical design. Better physical design may buy you 20 percent. And you get probably 20 percent fewer switching wires in RISC by not having the complex instruction decoder and memory management units you need in x86. So I'd expect Apple can do roughly 25% better performance per watt in total (it's 20 percent of 20 percent; the savings multiply, they don't add).
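The dynamic-power argument can be sketched numerically. This is a toy model, not a claim about any real chip: the 1.0 nF, 1.0 V, and 3 GHz figures are made-up round numbers, and how far the two 20% capacitance reductions overlap on a real die is exactly the judgment call in the post above.

```python
def dynamic_power(c_farads, v_volts, f_hz):
    """Dynamic CMOS switching power: P = C * V^2 * f."""
    return c_farads * v_volts ** 2 * f_hz

# Made-up round numbers purely for illustration:
C, V, F = 1.0e-9, 1.0, 3.0e9
baseline = dynamic_power(C, V, F)

# 20% from better physical design, 20% from fewer switching wires.
# If both applied to all of the capacitance, they would multiply:
combined_factor = 0.8 * 0.8          # 0.64, not 0.6 -- they don't add
reduced = dynamic_power(C * combined_factor, V, F)

print(f"baseline: {baseline:.2f} W, reduced: {reduced:.2f} W")
print(f"power saved: {1 - reduced / baseline:.0%}")
```

With full overlap the toy gives 36% savings; the post's more conservative 25% estimate reflects that the reductions don't apply to the whole die.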
 
I think he is comparing an A11 to something more comparable, such as a MacBook processor. I don't think anyone is thinking an A11 could replace a high end desktop/workstation, yet.



Makes sense, appreciate the perspective. In short, if the required apps/tools are created, then the possibility to use this platform is there. I get the sense from many on here that they just can't see any other possibility but x86. I am more so thinking about the next platform, not the one that exists today. I get it though, this is people's livelihoods and I respect that.
The A11 is not there yet, but it could very well be comparable to a dual-core i5 in a 2013 MacBook Air.
 
This is probably a fake article. I can tell because its headline isn’t “top 5 apps Mac Rumors recommends” or other non-news, non-rumor stuff which is the new norm.
 
TLDR; ...
These are very basic needs that are currently not being met by Apple. See to these first, and then go and innovate on top. Just please, make products that start with a customer focus, not just tech for its own sake.

A thoughtful and complete answer. I do believe, however, that ARM CPUs would have way better battery life in a laptop form factor than what we see from Intel, and would be far less expensive. Assuming the cost savings were passed along to the customer, there are clear technical and competitive advantages, and "not just tech for its own sake".
 
This is probably a fake article. I can tell because its headline isn’t “top 5 apps Mac Rumors recommends” or other non-news, non-rumor stuff which is the new norm.
Bloomberg reported it, so that's unlikely!
Ah crap! Better sell my iMac Pro 18-core while it still has value.

Yup, I hated it too!! Maybe I'm old school.
 
Physics is physics. Static power is mostly a function of the quality of the fabbed transistor, so let's assume Intel is slightly ahead there. Dynamic power is where most of the power is consumed, and that's CV²f. So assuming the same operating voltage, you want to drive less capacitance. There are two ways to do that: fewer switching wires, and better physical design. Better physical design may buy you 20 percent. And you get probably 20 percent fewer switching wires in RISC by not having the complex instruction decoder and memory management units you need in x86. So I'd expect Apple can do roughly 25% better performance per watt in total (it's 20 percent of 20 percent; the savings multiply, they don't add).

Thanks, that's just about as expected but still informative.
A thoughtful and complete answer. I do believe, however, that ARM CPUs would have way better battery life in a laptop form factor than what we see from Intel, and would be far less expensive. Assuming the cost savings were passed along to the customer, there are clear technical and competitive advantages, and "not just tech for its own sake".

Agreed. If cost savings are passed on to the consumer and they get the same performance for significantly less, then that's a clear win. If battery life indeed ends up being significantly better at the same performance, then again, a clear win. We'll have to wait and see.
 
Exactly, which is why we sometimes need to restart; but my experience with OS X has been pleasant and it is very stable. Linux, on the other hand, has a lot of user-space software that is not carefully designed, even though the Linux kernel is awesome.

The user-space issue has more to do with the nature of software development than with the platform. With Linux being mostly based around open source, you get community design, which can be messy (GIMP is super powerful; you can do nearly everything Photoshop can, but good luck figuring out HOW). That's the one bonus of commercial software: design focus. The platform is really irrelevant as long as those developing for it have clear direction and intentions.
I think he is comparing an A11 to something more comparable, such as a MacBook processor. I don't think anyone is thinking an A11 could replace a high end desktop/workstation, yet.



Makes sense, appreciate the perspective. In short, if the required apps/tools are created, then the possibility to use this platform is there. I get the sense from many on here that they just can't see any other possibility but x86. I am more so thinking about the next platform, not the one that exists today. I get it though, this is people's livelihoods and I respect that.


I'm honestly of the opinion that I don't give a crap what powers the platforms, as long as they do what they're supposed to.

But that's where I currently see the problem with Apple going ARM: there are barriers here, outlined by many of us, that Apple would have to get around.
 
Just buy a keyboard for your iPad Pro. If you are happy with iOS applications, which are cut-down versions of their PC counterparts, you really don't need a desktop/laptop computer. If you want more security on macOS today, you can have it: just install ALL your apps from the Mac App Store.

Mac OS has stagnated because Apple has allowed it to.

Many Mac applications cannot get onto the Mac App Store due to Apple's requirements (sandboxing, etc.), because it is not technically feasible. These applications, personally, I find the most useful and use day to day for home and work.

The company I work for uses mostly Macs. If macOS became iOS-like, as you suggest (a walled garden, because that's how you'd get better security), we'd be forced to stop using Macs, which would be a shame, because we wouldn't be able to use the software we need. Many other companies would be in the same boat.
I'm sorry, but iPad apps really don't have to be cut-down versions of their desktop counterparts, given their display size. Secondly, you can use size classes to handle screen-size issues.

I'd like to be able to develop one app that works across phones, tablets, and computers.
 
If Apple really is planning a move to in-house chips in 2 years, they're probably already working with big companies like Adobe and Microsoft and coming up with solutions for a smooth transition. Who's to say other companies won't see value in shaking up Intel's hold on the CPU market as well?
 



Apple is planning to transition from Intel chips to its own custom made Mac chips as early as 2020, reports Bloomberg.

Apple's initiative, reportedly code named "Kalamata," is part of an effort to make Macs, iPhones, and iPads work "more similarly and seamlessly together" according to unspecified sources that spoke to Bloomberg. Apple already designs its own A-series chips found in iPhones and iPads.


The Mac chip plans are said to be in the early stages of development and the transition from Intel chips to Apple chips could involve multiple steps, starting with the "Marzipan" initiative coming in iOS 12 and macOS 10.14 to allow developers to create a single app able to run on both iOS and macOS.

With its own chips, Apple would not be forced to wait on new Intel chips before being able to release updated Macs, and the company could integrate new features on a faster schedule. Apple has already begun using custom-designed T1 and T2 chips in its MacBook Pro and iMac Pro machines, and the company is said to be planning to integrate additional custom co-processors in Macs coming later this year. The custom chips will also be used in the upcoming Mac Pro, which is in development.

The T1 chip, included in the MacBook Pro, powers the Touch Bar and authenticates Touch ID. The T2 chip, in the iMac Pro, integrates several components, including the system management controller, image signal processor, SSD controller, and a Secure Enclave with a hardware-based encryption engine.

Previous rumors have suggested Apple is interested in creating its own ARM-based processors for its Mac lineup in order to reduce its dependence on Intel. Apple is also rumored to be pursuing development of its own modem chips to reduce reliance on both Intel and Qualcomm.

A move away would have a major impact on Intel, which gets approximately five percent of its annual revenue from Apple. Intel stock has already dropped following the news.

Article Link: Apple Plans to Ditch Intel and Use Custom Mac Chips Starting in 2020
Oh my. Pllleeeaze Apple, think twice and forgo your higher margins. Windows compatibility is key to your market success.
 
We knew it was coming.

Yes, we knew it was coming. But people are fools to think that Intel chips will suddenly "disappear" from all Macs once 2020 arrives. That's foolish at best. It will be a gradual "weaning off Intel" process. Apple will start the "weaning experiment" with the low-end Macs first: maybe the (future, if any) Mac minis, or low-end MacBooks, or even the entry-level education iMacs. And with the high-end desktop Macs, they'll most likely keep the best Intel chips for the expensive CTO (configure-to-order) models for a few more years to come.
 
If Apple really is planning a move to in-house chips in 2 years, they're probably already working with big companies like Adobe and Microsoft and coming up with solutions for a smooth transition. Who's to say other companies won't see value in shaking up Intel's hold on the CPU market as well?
No one can design a new architecture to rival Intel in two years; this surely has been in the development pipeline for many, many years.
 
Well once you throw out "viewing animals on social media", it certainly feels that way. I respect your opinion, but I try not to engage with those who are so stuck in the past.

Well, iOS is the past.
Its usability and interface are a throwback to the late '90s.

It's fine for what it is, a portable PA with a touchscreen, but in terms of computing it's not even on the map.
 
Yes, we knew it was coming. But people are fools to think that Intel chips will suddenly "disappear" from all Macs once 2020 arrives. That's foolish at best. It will be a gradual "weaning off Intel" process. Apple will start the "weaning experiment" with the low-end Macs first: maybe the (future, if any) Mac minis, or low-end MacBooks, or even the entry-level education iMacs. And with the high-end desktop Macs, they'll most likely keep the best Intel chips for the expensive CTO (configure-to-order) models for a few more years to come.
Of course this is what Apple will do. They are already using Intel's ARM-like chips in their MacBooks.
 
Rosetta was a dog. And it preceded that fun moment when recent ultra-premium hardware was no longer eligible for OS updates. And you had to buy your software again. It was fun.

People played along because a) personal computing was still relevant back then, b) Apple was going towards better, more open hardware (yay x86), which was exciting, and c) modern smartphones and tablets hadn't been invented yet so people used their personal computer a LOT.

Now your personal computer is just one of many devices you own.

Will people throw away their $3000 iMac because it can't get OS updates, then buy another one just because they are brand loyal?

Some will but out of the billions of computer buyers only a few tens of millions will be Mac buyers. If that.

And that, friends, is why Apple just can't get ahead. They shoot themselves in the foot at every opportunity.
Sez you.

Apple's move to Intel was VERY successful, and VERY wise.

Rosetta was a SOFTWARE emulation and thus was bound to be a bit "doggy", especially considering the Intel CPUs weren't really a match for the PPC CPUs they were attempting to emulate. But Rosetta wasn't hideous, certainly nothing like trying to run Windows under SoftPC at the time (although, in Microsoft's defense, the ONE version of SoftPC they published after their acquisition of Connectix was MUCH better, likely due to some massive code cleanup and some "secret sauce" calls that Connectix simply didn't have information on. It's a shame they killed it off, but it was instantly obsolete once the Intel Macs happened).

But now that Windows actually DOES run on ARM (and even with x86 emulation built in!), it might not be so bad this time around. It will suck for the Linux crowd, but they need to just give it up and get behind macOS anyway. Because if they don't, MS is going to go ahead with the Embrace, Extend, Extinguish plan they have already started for that platform.

But you're right that this is a different world than it was in 2006, when Apple switched to Intel. One of the most important differences is that Apple creams the rest of the planet in terms of ARM development, and being able to out-agile Intel on the processor roadmap may be exactly what Apple needs to make Macs the "must have" computers again.

Make Macs Great Again!(tm). You heard it here first, LOL!

But I also wouldn't ignore the possibility that Apple may end up putting an ARM chip in a lower-cost Chromebook killer as a way to cut their teeth on "real" CPU development, while keeping the current roadmap of "Intel with ARM helpers" that they seem to be following for the more "serious" Mac models. For example, I'm pretty sure the iMac Pro, modular Mac Pro, MacBook Pro, and maybe the iMac will have an Intel CPU for the foreseeable future; but the (non-pro) MacBook/MacBook Air, and possibly a low-end iMac, would get Apple ARM CPUs.

Possibly.
 
Second, because of the change in architecture, I would have to export the entirety of the data to flat text file and then reload it under the new architecture.

I might have misunderstood this part, but what does the data have to do with the CPU?

Ah crap! Better sell my iMac Pro 18-core while it still has value.

I find this strip extremely short-sighted. We're old folks, friend. We are comfortable in our "real work" OSes, with physical keyboards and mice. But these "retarded" kids are just in another reality, and a lot of them will never touch a desktop in their lives.
 
So the Apple ecosystem goes from iPhone/iPad/multi-purpose computers to iPhone/iPad/iPad-with-built-in-keyboard. It would make a niche market even nichier!

They've been going that way for years with the slimmer/smaller-is-better mantra. It's a shame, but unless they sold macOS as COTS for Hackintoshes, it would be the end of many current uses of their operating system. I can only assume that is their intent.
I might have misunderstood this part, but what does the data have to do with the CPU?

The architecture of the processor dictates the byte order of the data. Search for "endianness".
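As a concrete illustration (a minimal Python sketch, not tied to any particular database engine): the same 32-bit integer serializes to different bytes on little-endian and big-endian machines, which is why a raw binary dump generally can't be restored across byte orders without an export/reload step.

```python
import struct

value = 0x12345678

# Pack the same 32-bit integer with explicit byte orders:
little = struct.pack("<I", value)  # little-endian (x86)
big    = struct.pack(">I", value)  # big-endian (e.g. classic PowerPC)

print(little.hex())  # 78563412
print(big.hex())     # 12345678

# A raw dump written in one byte order and read naively in the other
# comes back as a different number entirely:
misread = struct.unpack(">I", little)[0]
print(hex(misread))  # 0x78563412
```

Worth noting: Apple's ARM cores run little-endian like x86, so byte order specifically may not change in an x86-to-ARM move; vendor binary-backup formats are often tied to a specific platform build for other reasons as well.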
 
The tighter macOS and iOS get, the more likely either one becomes a "hash". They are different beasts and should remain so. Abandoning Intel chips seems unlikely at the high end, but if it's going to become more of a Chromebook/App Store-only experience, then I can imagine even the most dedicated Apple fans will look elsewhere, if they haven't already given the many issues that have been cropping up. I don't know why a back-to-basics policy hasn't happened yet to solve issues in iOS and macOS, tidy up the product line, and do a Jobs-style clean-up.

Pro users have been given the shove, prosumers would move away with Intel, so who's left? Consumers... who, one might say, don't need a Mac in an iPad-iPhone-Apple Watch world. Even the original Apple fans (before the modern fan club arrived) seem to be wavering...

Hell it's long overdue.
 
Sales and market penetration. Global adoption. Any metric that a vendor would go by?

There is no metric in which Apple is not suckling at the hind teat in a market that they created and had a multi-year head start in.

That's how.
That's because people keep comparing a PLATFORM (Android) against an individual OEM's sales and "market penetration" (Apple's, Samsung's, LG's, etc.).

Samsung, the most successful Android OEM, has absolutely SUCKY sales (for any of their smartphone models) compared with Apple's iPhone models (any of them); but no one seems to notice. Instead, they add up all the freebie shitbox Android phones and say "See? Look how many units ANDROID (WHICH IS NOT AN OEM!) has sold!!!"

Not. The. Same.

Period.
 
No one can design a new architecture to rival Intel in two years; this surely has been in the development pipeline for many, many years.

The rumor is that they'll be using their own chips in 2 years. I didn't speculate about how long it's been in development.
 