I swear, this thread is highly entertaining for its wealth of misinformation.
I know how that feels. Transitioning from 680x0 to PowerPC to Intel, as well as Palm to iPhone etc., has taught me that one can’t depend on HW/SW companies to keep one’s favourite apps and workflow the same over time, let alone the data. So whilst companies need to adapt and stay ahead, we as users need to be smart in designing our workflows to ensure data compatibility into the future. That, unfortunately, is life. Of course, there could always be a company called Orange that provides a better set of solutions, and we may all jump ship. Stay agile; even Steve Jobs knew that and adhered to it. Not an unhappy camper here, but it will likely ruin some of my workflow/production again, though I can keep using old/outdated hardware/software for years at transition time.
To everyone freaking out about Windows support: you do realise that Windows on ARM is already happening, right? There are products coming out now, such as the HP Envy X2. It’s the first generation, so they aren’t powerhouses, but by 2020 it’ll pick up.
So basically, you don’t care about us being screwed over even though you don’t even use the platform you want to be changed?
But you run into the same issue with desktop CPUs as workloads increase if you have thermal issues.
I don't think you will double the performance by adding a slug, heatsink, etc.
But Qualcomm and Cavium are examples of ARMv8 competing against Intel.
But once again, they aren't scaled up mobile chips but server class chips from the ground up.
That's the point I've been trying to make.
Can Apple do it? I have no doubt that they can.
But I'll go back to my previous points.
1. I have yet to see the hiring of the types of people it will take to get it done here in the valley or in Austin.
You only have two places where you are going to hire the talent in numbers great enough to get this done.
There is a finite pool of CPU and ASIC/SoC talent at the experience levels to get this done. You have to hire them from somewhere.
2. I haven't seen any announcements from those that supply fab capacity about long term agreements.
TSMC typically won't do those agreements and makes everyone get in line. Apple tried an agreement with TSMC for guaranteed capacity and TSMC said no. Apple isn't building a fab. I haven't seen any up for sale. Not at 10nm anyway.
AMD uses Global. But they got their agreement because they sold fab assets to Global. IBM sold their commercial foundry business to Global. Maybe Apple goes to Global Foundries?
For me it's about the logistics and timeline.
I know what it takes to make chips and replacing Intel in two years as touted by Bloomberg doesn't seem realistic.
These are the same folks that said the same thing two years ago or so. They are also the same group that said Apple would be using their own modems by now and would have switched from Qualcomm and Intel.
Bloomberg analysts just don't know jack about the silicon industry.
It’s not Windows RT, mate. It’s proper Windows. It runs both Windows apps coded for ARM and x86 apps emulated. Just google "Windows RT".
It’s not Windows RT, mate. It’s proper Windows. It runs both Windows apps coded for ARM and x86 apps emulated.
(Edit) What I am saying is that we will likely be able to still run windows on our Macs as full Windows is already available on ARM.
The "significant upside" to all this is for Apple to finally only allow apps onto it's new ARM platform from an app store which it takes 30% of. Apple is a business and (in Apple's eyes) the sooner it ditches user installed x86 code the better.I didn't say anything about PC gamers buying Macs to run Windows. It's more of a selling point for Apple, that customers know they have the option to run Windows as well. I suspect Apple customers who rely on this "convenience" is not as absurdly small as you make it out to be.
Besides, I'm trying to figure out what the benefit is for doing this. What does Apple gain by bringing Mac OS and iOS closer? Running touch apps on a Mac? Running Mac apps on touch devices? Gross. We've seen this done by Microsoft and it's a freaking mess as well as the kind of UI nightmare Apple avoids like the plague. Until someone can explain to me a significant upside to doing this, I will continue to view losing compatibility with the Intel world as a huge loss for Apple and its customers.
So basically, you don’t care about us being screwed over even though you don’t even use the platform you want to be changed?
Everyone does have a different need. The iPad works great for you. I love my iPad Pro. It isn't capable of doing any work for me. I could write a document or two... maybe a spreadsheet if needed... But it won't work for me and I would assume that it wouldn't work for a majority of those that have Macs.
Just remember. Those apps you love on the iPad are developed on a Mac.
I believe Apple is more than capable of doing that, probably several times over. But PowerPC also managed that; they had several winning processors. I think the fundamental question people are asking is: does Apple have the necessary CPU design skills to produce a desktop-class processor that could rival and outperform Intel's? And other people are trying to defend that the A11 Bionic is already desktop class, so designing new chips won't be a difficulty. That's the whole battle. I am not sure if Apple has the skills, but I am confident that Apple would have thought it through before taking this decision. After all, Tim's reputation is at stake here, and I don't think he would take any decision where his ass gets handed to him by Intel. If Apple is not confident of the outcome, they probably won't do it.
As I mull over this change and have time to digest the news, one thing I want to point out regarding Apple's prior platform change: PowerPC also managed that, with several winning processors.
But in the end, for lack of whatever, probably economies of scale, PPC wasn't able to keep up with Intel and its continually revolving lineup of ever-improving products.
I highly doubt that anyone is particularly interested in "running Windows" on its own. What people are interested in is running x86 windows apps without too many compromises. You can't do that today. It's highly unlikely that this would change substantially by 2020.
This forum is really funny.
People who have never written a line of code in their life are trying to explain to seasoned developers what's easy or difficult about an architecture transition and porting apps.
People who have never built a hardware component in their lives are trying to explain to experienced CPU designers that adding performance is just a matter of slapping on a few more cores.
Content consumers are trying to explain to producers/creators how production workflows work.
People who are not influenced by an architecture shift at all are trying to explain to people who have genuine needs of a particular architecture how the architecture doesn't matter.
The arrogance is absolutely amazing.
The amount of technology needed in the 1940s to be able to remotely view a video signal was amazing. But a TV was not a computer, and a modern TV that is entirely computer-based is still a TV. It's not a computer. It's not the tech inside that matters but what you do with it.
A web browser/video-watching appliance is not a computer. If you're saying it's the tech inside that counts and not what you do with it, is my electric toothbrush a computer? It even has inductive wireless charging. What about my coffee maker, which even has a computer-based programmable timer?
I agree.
I think people are really stretching on what a computer is.
It's true that digital watches, TVs, coffeemakers, etc have a computer. A computer by definition is just a device that stores and processes data in binary form.
I suppose if we want to get into proper terms, we're really talking about a personal computer, which is a laptop or desktop. The iPad I would not really count as a PC because you cannot do ALL the work from a PC on an iPad.
I appreciate both sides of this discussion, but those thinking that only the people who support this decision are the arrogant ones need to take a closer look. Every other comment seems to put down those who think differently on the subject and are open to the change. I know this can have a negative effect on people's livelihoods, so I am trying to keep an open mind to that as well. I know I can be part of the problem, but I am working to understand.
My intent and purpose is to use a computer to do parametric 3D modelling and send it to a 3D printer and develop microcontroller firmware.
For those intents and purposes, an iPad is as much a computer as a loaf of bread is a computer.
For consumers, it's quite possible that an ARM-based Mac will be very successful; for power users (many MacRumors members) it may be a poor fit, even though this change seems to be met with open arms by a lot of MR members.
I'm sure somebody has already made this point but...
This move carries very few benefits and a whole lot of risk for Apple.
I'm hoping this is just an exploratory/in case of emergency measure/something wild and crazy that none of us have thought of, and not something Apple is (internally) committed to (at least as reported).
The only real benefits are Apple getting to develop chips at its own pace and more potential profit for Apple by cutting Intel out.
The disadvantages however, are huge. Just to list a few:
1. Outside of phones, tablets, and some specialized servers, the world runs on x86. Ditching x86 not only harms compatibility, but robs Apple of the ability to take advantage of those economies of scale.
2. Even if Apple manages to develop a competitive (with Intel) A series chip suitable for high end notebook/desktop macs by 2020-2022 there's no guarantee they will continue to be able to do so. All it takes is some incorrect assumptions during architecture design to put a company at a major disadvantage that can take years to correct (see the G4, Pentium 4, Bulldozer), and would put Apple at a major competitive disadvantage.
3. How well can Apple's A series really scale up, and is the additional R&D expenditure worth it?
4. It ties the future success of the Mac platform to the continued success of the iPhone. If anything were to happen to iPhone sales (whether due to continued competition, or a paradigm shift in mobile devices), Apple could find itself in a position where it no longer makes sense to invest large sums into CPU R&D.
I can understand Apple is tired of being held hostage to Intel's roadmap, but the solution is actually really simple: just dual-source CPUs from Intel and AMD. Then Apple can play the two off each other, getting the benefits of higher performance, lower cost, and more innovation, without the huge risk that would come from switching away from x86.
I don't think there is any downside to this; Intel Macs only represent a fraction of what Apple sells. They have hundreds of millions of people on iOS, so it makes a lot of sense for Apple to utilize their own technology across the board. Apps are now being created on iOS that mimic Intel apps, so it's just a matter of time for Intel to die off at Apple. Also, these chips are low power, which is needed on desktops... Have you used an iPad Pro lately? It's blazingly fast for what it can do with graphics. I wish they had something today instead...
With the upcoming developer conference I'm sure they will start to push this somehow.
I think that's a fair point to make, but we need to consider how those apps will be developed. iOS has a much bigger user base and brings in the majority of Apple's revenue, so it is only logical to try to unify the platforms. If Apple is able to somehow make the transition seamless when it comes to existing apps and new app development, it could be a big win for everyone.
I can understand Apple is tired of being held hostage to Intel's roadmap, but the solution is actually really simple: just dual-source CPUs from Intel and AMD. Then Apple can play the two off each other, getting the benefits of higher performance, lower cost, and more innovation, without the huge risk that would come from switching away from x86.
There is very little logic in unifying the two platforms to that extent, because that means dumbing down the Mac. iOS apps have inherent limitations. What makes sense is the continued development of iOS for the mainstream, while preserving Macs and improving on them for those that need it. And that would not be a big win for many people. Again it would be the loss of the ability to run almost every desktop application out there.
But you can't do all the work from an iPad on a PC either. You also can't do all the work from a Windows machine on a Mac. Again, what's a computer to you?