Transistor count and performance are extremely weakly correlated.

And most of the transistors in the A11 are in arrays (memory structures like caches and buffers). A much larger percentage of the Xeon's transistors do actual computation.
That is true. My whole point was that, as far as complexity and dense CPU manufacturing are concerned, Apple has the means; what they may or may not lack is CPU design skill, which we have yet to see. Though if this rumour is true, Apple and Tim Cook would have thought it through before making any decision. I don't think Apple is naive enough to take decisions with such far-reaching consequences lightly.
 
Maybe, but the reason I switched from Windoze was those frequent reformats every month or so, or Windoze spontaneously slowing down on its own after a couple of months of updates; I never felt such things on a Mac or Linux desktop. On Windows I could never get uptimes of 1-2 months, but I rarely restart my Mac. OS X really is that good. Even my current uptime is 5 days.

Not much of a factor post-Win7. By Win8 there was a whole core rewrite and simplification leading to stability and uptime (though uptime isn't necessarily a great measurement for other reasons; sometimes it's not good to have long uptimes).

Windows Server 2012 is rock solid. Windows 8 and Windows 10 desktops are also rock solid (in most cases). As a user of Windows, Linux and OS X, there is a LOT of parity between them these days.

OS X is great. That's not discounting your use, but it has some significant caveats for enterprise... and 5 days? That's cute ;) I've got a Linux work server currently with an uptime of 900+ days.
 
No, it isn't shortsighted. Apple likes control. This would give Apple the opportunity to put everything behind a walled garden and, in the worst-case scenario, exert even more control over 3rd-party software. An iOS desktop would almost certainly be MORE restrictive for 3rd-party apps than what we have today, which would be fine for those who use their computers for viewing animals on social media, but for those who use their Macs for a living it would be a definite negative.

The quality of MacOS is purely down to Apple, and no one else.

There are still great applications out there that are not on the Mac App Store, and will never be on the Mac App Store.

Again, benchmarks. LOL.

Not real world. Different architecture. Apples and oranges.

Ah, makes more sense now. Another person who thinks no one can use an iPad for real work. Moving on.
 
Yup, the A10X is already on par with many Intel chips: https://forum.quartertothree.com/t/apple-cpu-vs-intel-cpu-fight/130182
The A11 Bionic should be faster than the A10X.
Yeah, and there were benchmarks back in the day showing how much faster the PPC was than Intel. Except it wasn't in real-world use, and we all knew it.

Apple is going to create an exodus of software engineers and scientists with this nonsense if they break x86 instruction compatibility anyway. I seriously doubt ARM cores can handle my data-analysis loads.
 
Some here forget that those ARM chips in iPhones and iPads operate under big energy and heat restrictions that are much smaller in a desktop Mac. Performance would be much higher.
Yup, the CPU clock of an A11 is much lower, to conserve battery and meet heat restrictions, compared to an Intel chip that runs hot at 95W+ with a heavy CPU cooling fan.
 
That is true. My whole point was that, as far as complexity and dense CPU manufacturing are concerned, Apple has the means; what they may or may not lack is CPU design skill, which we have yet to see. Though if this rumour is true, Apple and Tim Cook would have thought it through before making any decision. I don't think Apple is naive enough to take decisions with such far-reaching consequences lightly.

I disagree slightly. They definitely have the CPU design skills. I know many of the guys, and worked with them at Exponential or at AMD. And their results so far show it.

They haven't yet had to design anything as complex as a true desktop processor, because even though the transistor count is identical, it is a lot less work when those transistors are in big arrays. But they've done complex designs at former employers. The real test will be whether TSMC can keep up with Intel in the fab.
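To put a rough number on the "big arrays" point: a conventional SRAM cell is six transistors per bit, so cache alone eats a large slice of a 4.3-billion-transistor budget while being highly regular and comparatively cheap to design. Here's a minimal back-of-the-envelope sketch in Python; the 32 MB of on-die SRAM is an illustrative assumption, not a measured figure for either chip.

[CODE]
# Back-of-the-envelope: how many transistors large on-die SRAM arrays alone
# can soak up. Cache size is an illustrative assumption, not a die measurement.

SRAM_TRANSISTORS_PER_BIT = 6  # conventional 6T SRAM cell

def sram_transistors(cache_bytes: int) -> int:
    """Transistors needed just for the storage cells of a cache."""
    return cache_bytes * 8 * SRAM_TRANSISTORS_PER_BIT

total_budget = 4_300_000_000            # ~4.3 billion transistors (A11 / big Xeon class)
assumed_cache_bytes = 32 * 1024 * 1024  # assume ~32 MB of on-die SRAM (illustrative)

in_arrays = sram_transistors(assumed_cache_bytes)
print(f"{in_arrays / 1e9:.2f}B transistors ({in_arrays / total_budget:.0%} of the budget) "
      "sit in regular, repeated SRAM cells")
[/CODE]

Under those assumptions, roughly a third of the budget is repeated SRAM cells rather than hand-tuned logic, which is why two chips with equal transistor counts can be very different amounts of design work.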
 
Yeah, and there were benchmarks back in the day showing how much faster the PPC was than Intel. Except it wasn't in real-world use, and we all knew it.

Apple is going to create an exodus of software engineers and scientists with this nonsense if they break x86 instruction compatibility anyway. I seriously doubt ARM cores can handle my data-analysis loads.
Let's see. Do keep in mind that these are cold benchmarks, for a CPU running at a third of Intel's clock speeds without any CPU cooler.
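A hedged sketch of the clock-normalized comparison being implied here; the scores and clocks below are placeholder assumptions for illustration, not quoted benchmark results.

[CODE]
# Hypothetical single-core scores and clocks, purely for illustration.
chips = {
    "A11 (phone, no cooler)":       {"score": 4200, "ghz": 2.4},
    "Desktop Core i7 (active fan)": {"score": 5000, "ghz": 4.2},
}

for name, c in chips.items():
    # Normalizing by clock shows work done per cycle rather than raw speed.
    print(f"{name}: {c['score'] / c['ghz']:.0f} points per GHz")
[/CODE]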
 
And, of course, they won't lose their entire Mac customer base, because the percentage that cares about Windows is small.

As for gaming, there are a lot more hours spent in the world playing iOS games than Windows games.

Well, both responders so far looked only at the "gaming" part of my post, which I suppose is understandable. But read the rest of the post - I agree that gaming is just the "fringe" example (although, I would argue that Apple can ill-afford to lose ANY Mac user-base at this point, they should be doing everything they can to make the platform appealing).

And you are correct, only a small portion "care" about Windows, but unfortunately a much-larger group of people are forced to use it, even though they don't "care" about it and would prefer not to touch it. But that isn't the current real-world scenario, and it may never be.

So let's just go back to a single, but very important, sector: education. The education sector (unfortunately) relies on an array of apps that are, and may well remain, Windows-only (e.g. state reporting apps, testing apps, etc.). Apple really does risk losing a huge customer base if people can't run those one or two or three mission-critical Windows apps easily, reliably, and quickly in a good VM (NOT an emulated environment; there is a difference). What do you do then? Do you invest in a Mac and a Windows PC for every staff member in a school district? Of course not; you just go Windows. Then what about the students? Do you run a dual-OS ecosystem with staff on Windows and students on Macs? Of course not, that's nuts; all you are doing is increasing IT costs. A single ecosystem is almost always best from a cost point of view. (One simple example to illustrate the point: VM containers can be backed up and restored over the network simply using Mac backup software, CCC etc. Physical PCs need a separate backup solution, which means more hardware, more software, and more labor, increasing overall complexity and cost.)

This is NOT something Apple should be playing around with. Again, if the rumor is true, and Apple is smart, then they will build chips that are fully Intel-compatible and simply support additional functions accessible in the MacOS environment; that would be a fine way to handle things, as it wouldn't break Windows support. But if they kill proper Intel support, I anticipate they will have a big problem on their hands.
 
I disagree slightly. They definitely have the CPU design skills. I know many of the guys, and worked with them at Exponential or at AMD. And their results so far show it.

They haven't yet had to design anything as complex as a true desktop processor, because even though the transistor count is identical, it is a lot less work when those transistors are in big arrays. But they've done complex designs at former employers. The real test will be whether TSMC can keep up with Intel in the fab.
Wow! Good to know that they have a good silicon team. Fun fact: Intel uses TSMC to manufacture its mobile modem chips.
 
Ah, makes more sense now. Another person who thinks no one can use an iPad for real work. Moving on.

ROTFL. I never said that.

Depends on the work, doesn't it? An iPad isn't going to replace my laptop any time soon, or anyone else's in my office.

However, for the person in the coffee shop, sure - use it as a checkout, for example.
 
Not much of a factor post-Win7. By Win8 there was a whole core rewrite and simplification leading to stability and uptime (though uptime isn't necessarily a great measurement for other reasons; sometimes it's not good to have long uptimes).

Windows Server 2012 is rock solid. Windows 8 and Windows 10 desktops are also rock solid (in most cases). As a user of Windows, Linux and OS X, there is a LOT of parity between them these days.

OS X is great. That's not discounting your use, but it has some significant caveats for enterprise... and 5 days? That's cute ;) I've got a Linux work server currently with an uptime of 900+ days.
For me uptime is a factor; I don't really like to restart my machine, I would rather use it like an appliance that is always on. Nothing against Linux, I love Linux, but the software load on a server is very stable, whereas a desktop Linux distro keeps changing: we keep upgrading to new libs and software, and keep removing and uninstalling stuff. Servers sometimes don't change system software for decades.
 
Let's see. Do keep in mind that these are cold benchmarks, for a CPU running at a third of Intel's clock speeds without any CPU cooler.
Do you really believe you have a desktop/workstation-level processor in your phone? Think about that for a moment. The same chip that can only run one single app on screen at a time, two at most on an iPad. You really believe this same chip can somehow be better than a $300-$1000 CPU? o_O
 
Thanks for the info. So if Apple developed a new chip/architecture for their laptops and desktops and created the tools needed for development and support of the platform, you don't feel that developers would consider adopting that platform? I am trying to understand the details, but it seems like you wouldn't consider any other platform viable in the future. I'd imagine it is iOS-based, but built for the point-and-click environment. Your insight is valuable here, so I appreciate it; I know my views can be short-sighted.

No problem and thanks for asking.

Here's a workflow example that would be killed by a change in architecture.

The databases I frequently work with, I may copy to my laptop for maintenance and testing. Currently, since it's x86 to x86, that's a simple process of taking last night's full backup (100GB) and restoring it on my laptop, then getting to work. While the database engine I use does not have a MacOS equivalent (and they have said they will never support MacOS), I can easily run the engine in a VM on the desktop, or even boot straight to Linux.

That would no longer be possible under an ARM architecture. First, should they emulate x86, the performance impact would be too big for efficient use. Second, because of the change in architecture, I would have to export the entirety of the data to flat text files and then reload it under the new architecture. Doing this with a 100GB database is... well, not ideal. And there's no way to convince financial institutions to go out and throw away 20+ years of software development that runs their entire institution just for a MacOS-compatible platform.
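To give a feel for why the flat-text round trip is "not ideal" compared with just restoring a native backup, here's a back-of-the-envelope sketch; every throughput figure is an assumption for illustration, not a measurement of any particular database engine.

[CODE]
# Rough estimate: native restore vs export-to-text + reload for a 100 GB DB.
# All throughput figures are illustrative assumptions.

DB_SIZE_MB = 100 * 1024

native_restore_mb_s = 200   # assumed: restoring a same-architecture backup
export_mb_s = 30            # assumed: dumping everything to flat text files
reload_mb_s = 10            # assumed: bulk load plus index rebuilds

def hours(size_mb: float, rate_mb_s: float) -> float:
    """Time to move size_mb at rate_mb_s, in hours."""
    return size_mb / rate_mb_s / 3600

print(f"native restore : ~{hours(DB_SIZE_MB, native_restore_mb_s):.1f} h")
print(f"text export    : ~{hours(DB_SIZE_MB, export_mb_s):.1f} h")
print(f"reload         : ~{hours(DB_SIZE_MB, reload_mb_s):.1f} h")
print(f"migration total: ~{hours(DB_SIZE_MB, export_mb_s) + hours(DB_SIZE_MB, reload_mb_s):.1f} h")
[/CODE]

Under those assumptions the direct restore is minutes while the export/reload path is hours, and that is before you get into validating that nothing was mangled in the text round trip.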

Another task that would be impacted is VM testing and integration. While you can do many things on remote ESXi hosts, not all the maintenance tools have MacOS versions (or fully featured ones), so Parallels / VMs offer enough compatibility today. These tools would need to be written for MacOS (which never really happened even now). Without them, Macs would also be unusable for many IT professionals.

But one of the benefits now is that I can create a VM while I'm out on the road or at home for testing purposes, then get into the office and migrate it to the cluster (or vice versa). With an architectural change, this is likely not possible.

I think for the average home user who can get by on a MacBook today, the change would be invisible. As mentioned, most users who log in to cloud services don't care what's running those cloud services; they just know they put their email address and password into a box and they get their data / email. However, all of those services are likely running Linux and Windows on the back end (Exchange is the most common email service platform that relies on Windows and Active Directory). Heck, even Apple's own cloud services back end is Linux-based (it runs on Google's cloud platform currently).

As I said, from an end-user perspective, this is all invisible, "under the waterline" stuff. But for those of us who have to support these things daily and professionally, there are massive differences in how we operate versus the average home user. And we're not a small group of people anymore: as the world has become more and more technical, IT departments have become massive investments to power and operate just so that the average user never has to think about any of it.
 
ROTFL. I never said that.

Depends on the work, doesn't it? An iPad isn't going to replace my laptop any time soon, or anyone else's in my office.

However, for the person in the coffee shop, sure - use it as a checkout.

Well once you throw out "viewing animals on social media", it certainly feels that way. I respect your opinion, but I try not to engage with those who are so stuck in the past.
 
For me uptime is a factor; I don't really like to restart my machine, I would rather use it like an appliance that is always on. Nothing against Linux, I love Linux, but the software load on a server is very stable, whereas a desktop Linux distro keeps changing: we keep upgrading to new libs and software, and keep removing and uninstalling stuff. Servers sometimes don't change system software for decades.


I don't disagree; I hate rebooting. But long uptimes often come with unforeseen issues like memory leaks from some applications, and updates generally require restarts (when you've got anything that requires security and the latest patching, a long uptime can be evidence of a lack of patching).

Restart when necessary, but don't needlessly chase uptime as a measure of expertise or stability.
 
I disagree slightly. They definitely have the CPU design skills. I know many of the guys, and worked with them at Exponential or at AMD. And their results so far show it.

They haven't yet had to design anything as complex as a true desktop processor, because even though the transistor count is identical, it is a lot less work when those transistors are in big arrays. But they've done complex designs at former employers. The real test will be whether TSMC can keep up with Intel in the fab.

With your knowledge, if they were to make a true desktop processor, how likely do you think it is that they could get something significant, say 50%-100% more performance per watt than comparable Intel CPUs?
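For what it's worth, "performance per watt" here just means sustained benchmark score divided by sustained package power, so the question can be framed numerically. A hedged sketch with purely illustrative numbers (neither the scores nor the wattages are measurements of real parts):

[CODE]
# Performance-per-watt framing. All figures are illustrative assumptions.

def perf_per_watt(score: float, watts: float) -> float:
    return score / watts

hypothetical_apple_desktop = perf_per_watt(score=18_000, watts=35)  # assumed
comparable_intel           = perf_per_watt(score=20_000, watts=65)  # assumed

gain = hypothetical_apple_desktop / comparable_intel - 1
print(f"advantage under these assumptions: {gain:.0%} better perf/W")
[/CODE]

Whether a scaled-up A-series part would actually land in that 50%-100% band depends heavily on how the cores behave at desktop clocks and voltages, which a phone benchmark can't tell us.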
 
Well, the transistor counts of the A11 Bionic and the 15-core Xeon Ivy Bridge-EX are the same for starters: 4.3 billion transistors. And the A11 does a lot more than just being a processor; it is an SoC, with a GPU and other peripherals, and is only a 2+4 (6-core) chip. Imagine Apple producing a 15-core A20?
https://en.wikipedia.org/wiki/Transistor_count

Transistor count means nothing, as has been noted. Designing a chip for an iPhone is a lot different from designing one for the desktop.

With that said, it would be interesting to see how the A10 or A11 fares without the heat restrictions iPhones and iPads place on a device.

I'd say with this move, pretty much every developer that isn't Apple-only would stop buying Apple kit.
 
Do you really believe you have a desktop/workstation-level processor in your phone? Think about that for a moment. The same chip that can only run one single app on screen at a time, two at most on an iPad. You really believe this same chip can somehow be better than a $300-$1000 CPU? o_O
Why not? The CPU frequency is way lower, and there's no heat sink or CPU cooler. And besides, I don't think Apple is naive enough to decide on something without considering these factors, is it?
 
Well, both responders so far looked only at the "gaming" part of my post, which I suppose is understandable. But read the rest of the post - I agree that gaming is just the "fringe" example (although, I would argue that Apple can ill-afford to lose ANY Mac user-base at this point, they should be doing everything they can to make the platform appealing).

And you are correct, only a small portion "care" about Windows, but unfortunately a much-larger group of people are forced to use it, even though they don't "care" about it and would prefer not to touch it. But that isn't the current real-world scenario, and it may never be.

So let's just go back to a single, but very important, sector: education. The education sector (unfortunately) relies on an array of apps that are, and may well remain, Windows-only (e.g. state reporting apps, testing apps, etc.). Apple really does risk losing a huge customer base if people can't run those one or two or three mission-critical Windows apps easily, reliably, and quickly in a good VM (NOT an emulated environment; there is a difference). What do you do then? Do you invest in a Mac and a Windows PC for every staff member in a school district? Of course not; you just go Windows. Then what about the students? Do you run a dual-OS ecosystem with staff on Windows and students on Macs? Of course not, that's nuts; all you are doing is increasing IT costs. A single ecosystem is almost always best from a cost point of view. (One simple example to illustrate the point: VM containers can be backed up and restored over the network simply using Mac backup software, CCC etc. Physical PCs need a separate backup solution, which means more hardware, more software, and more labor, increasing overall complexity and cost.)

This is NOT something Apple should be playing around with. Again, if the rumor is true, and Apple is smart, then they will build chips that are fully Intel-compatible and simply support additional functions accessible in the MacOS environment; that would be a fine way to handle things, as it wouldn't break Windows support. But if they kill proper Intel support, I anticipate they will have a big problem on their hands.

Apple has already ceded the education market (from Macs, anyway). At their announcement in Chicago, it was all about iPads.
 
Of course, today's Apple might just decide that they can throw 20% of their customer base under a bus and still make more money by extracting even higher margins from the loyalists... We'll see how that plan works out over the next year or two.

As a litmus test, I'll wait and see if they launch - or at least pre-announce - the promised new Mac Pro at WWDC. Not that I'm likely to buy one (I might have done if it was out a year ago), but as an indication of their priorities. If they don't - or if it's another triumph of form over function - then, regardless of the ARM thing, I think enthusiasts and power users should start long-term planning for their current/next Mac being their last one. Not that the Windows/Linux world is notably more inviting, but at least you get a vast choice of hardware...

Exactly my thoughts over the last several "courageous" years. Apple has shifted away from the general-purpose computer market to luxury mobile customers. When folks expressed chagrin over the removal of headphone jacks from phones or the removal of MagSafe from laptops, the typical "loyalist" retort was that one should "get over it and move into the modern world", often comparing missing features like the headphone jack to griping about the lack of floppy drives. The current laptop selection has nothing to offer me (for the high price) that I can't get with a top-end PC laptop, except for MacOS. I haven't used Windows since retiring 10 years ago, but I have shifted largely to PC laptops on which I run Linux. My most recent, and possibly my last, Mac purchase was an iMac a few months ago, as I still find its advantages as a desktop worth the expense. If major CPU changes in the future further wall one into the Apple mobile-centric garden, I'll likely look elsewhere.
 
Pretty sure a 22-core Xeon will pound that little A10 chip into the dirt.

Take a single Xeon core, limit it to the thermal envelope of an iPhone 7, and see how well it does. For all we know, in some hidden Apple lab, they have an experimental A12 or A13 processor core fabbed in TSMC's next-gen high-performance process, and packaged with a heat sink capable of 100W+ heat dissipation. The pounding could easily be going the other way. If so, that's why Apple is thinking of using their own processors, and sticking as many cores on a die as they think a product needs in 2020.
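The "limit it to the thermal envelope of an iPhone 7" thought experiment can be roughed out with the usual dynamic-power approximation P ≈ C·V²·f, with voltage tracking frequency so power scales roughly as f³. The wattages and clock in this Python sketch are assumptions for illustration only, not measured figures:

[CODE]
# Rough sketch: clamp one desktop-class core into a phone's thermal budget,
# assuming dynamic power scales roughly as frequency cubed (V tracks f).
# All numbers are illustrative assumptions, not measurements.

xeon_core_watts = 15.0     # assumed power of one desktop core at full clock
phone_budget_watts = 3.0   # assumed sustained per-core budget in a phone
xeon_clock_ghz = 3.5       # assumed desktop core clock

freq_scale = (phone_budget_watts / xeon_core_watts) ** (1 / 3)
print(f"the core would have to drop to ~{xeon_clock_ghz * freq_scale:.1f} GHz "
      f"({freq_scale:.0%} of its clock) to fit the phone's budget")
[/CODE]

It's crude (leakage and the voltage floor make the real picture worse for the big core), but it shows why a desktop core can't simply be dropped into a phone's envelope at full speed, which is exactly the comparison being made here.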
 
I don't disagree; I hate rebooting. But long uptimes often come with unforeseen issues like memory leaks from some applications, and updates generally require restarts (when you've got anything that requires security and the latest patching, a long uptime can be evidence of a lack of patching).

Restart when necessary, but don't needlessly chase uptime as a measure of expertise or stability.
Exactly, which is why we sometimes need to restart, but my experience with OS X has been pleasant and it is very stable. Linux, on the other hand, has a lot of user-space software that is not carefully designed, even though the Linux kernel is awesome.
 