The magic there was from a third party who provided Rosetta. ...

[snip]

... And then let's further hope that the Apple that neglects Macs for upwards of years at a time will find new motivation to upgrade them regularly when upgrading also adds the task of developing new CPUs, instead of just using CPUs created by an expert vendor heavily focused on exactly that. And let's imagine that all the non-Apple software we may lean on regularly can be upgraded to "just work" in only a day or three, perhaps with simple recompiles or similar, so that we don't find ourselves waiting months or years to finally get a version (if the devs even opt to bother for such a smallish niche) that works with a proprietary platform.

Let’s consider the possibility that, while Apple was “neglecting” Macs for years, it may have been internally developing the Rosetta 2 that you assume does not exist. No other company in the world would have more motivation for this than Apple, and Apple totally has the financial wherewithal to make it happen.

I am frequently amazed at how doom and gloomy people can be about the largest market cap company in the world.

And on top of that, they still make pretty kick ass hardware and software. Whether or not they extend that to chip manufacturing — I doubt it, because the cost is so high — they still have the wherewithal to do the hardware and the software redesign.
 

I don't necessarily agree on software, but hardware, yes. Keep in mind that the previous poster was not saying this is a "doom and gloom" situation, but using historical context as an indicator of what could happen if Apple were to continue down this route. There have been plenty of companies that have risen to the top and then fallen due to complacency.

I'm not saying Apple is complacent here, but I also wouldn't assume that the largest market cap company in the world is impervious.
 
Watching the decreasing emphasis/focus by MS on Windows in favor of VR and non-traditional PC form factors leads me to believe Apple is headed in similar fashion away from Mac desktops and laptops. It's still difficult to speculate how increasing interest in cloud and iOS/mobile software might affect Apple hardware offerings over the next decade. I can see them phasing out Mac hardware within that time - in some respects, they have already begun that process with lack of development of Mac Pros and Minis, as well as in slowing innovation in laptop and iMac offerings. A parallel question to yours: if Microsoft isn't too bothered about Windows anymore, why should Apple be bothered with Macs and MacOS? I should think the time is getting ripe for a major "sea change" with respect to what we've come to call desktop/laptop computers. VR and increasing portability of needed hardware may offer hints for the future, particularly if network bandwidth improves by orders of magnitude.

You’re right, we are probably heading into the endgame of the Mac. But I suspect it probably has two decades or so left.

Apple will most likely announce when they move to ARM that they’re laying the foundation for the next decade.

And indeed they will be:

  • It’s still important for professional productivity
  • It’ll be the engine that people use to create VR and AR apps etc. for forthcoming Apple products that use these technologies.
  • They’ll want the creation tool to share a similar architecture with these new products (glasses etc.)
  • And they won’t want to risk another platform being the primary place that AR and VR content is created on, thus losing control of key forthcoming tech.
  • Plus they can bring technologies created for iOS over to the Mac and cut the wastage of having to work with PC x86-related tech.
  • I’m sure they see a gradual migration of productivity pros to iOS devices, and as that happens, the Mac at least remains fairly efficient for the company to keep investing in, as it’ll share so much tech with iOS devices (the main cash cow).

As to why they’ve let various Mac product lines stagnate, I suspect it’s frustration with Intel being late in the first instance.

Then I think they seriously considered slowly phasing out the Mac and moving to iOS only, based on shipment numbers and the belief that iOS is the future.

However, I think they then realised that it would be a dumb move and have reinvested in the Mac, but a condition of that reinvestment was that they could only justify it by moving to ARM.

It would’ve been dumb for the reasons outlined above, but also because growing the iPad to take over the productivity tasks that a Mac can do would’ve been impossible in the desired timeframe, and rushing it would’ve risked turning iOS on the iPad into their Windows 8.

But I truly believe that Apple came very close to phasing out the Mac. Let’s all thank whoever the faction was in the senior leadership team there who made everyone else see sense.
 
Watching the decreasing emphasis/focus by MS on Windows in favor of VR and non-traditional PC form factors leads me to believe Apple is headed in similar fashion away from Mac desktops and laptops.

Microsoft's current CEO (Nadella) used to run Azure Cloud Services, so he was not indoctrinated in the "Windows everywhere" dogma that Ballmer was, and by its nature Azure was OS-agnostic, so it did not require Windows clients to use it. And Microsoft still focuses very heavily on Windows - especially the "Enterprise" server-side version of it.

As for the Mac, while it now accounts for only around 11% of the company's revenue, that's still five to seven billion dollars a quarter (roughly $20-28 billion a year), which is enough to put it on the Fortune 500 if it were an independent company, and therefore still more than worth keeping around for the long term.
 
You're the first person I've ever seen include the 1992-1997 time period in Apple's golden age. You do realize that Apple almost declared bankruptcy, right?

The computers were fun and solid though. They made the Newton.
 
They already have this technical expertise at their existing sites. An expansion of manpower, particularly at a distinct site, would indicate a new scope of work beyond their current design endeavors.

No, it doesn't necessarily indicate that at all. As process sizes go down, the number of transistors goes up. Apple hasn't done a 5-7B transistor processor; Intel has. The scale is growing, which means your verification task is growing also. Apple doing this with a fixed set of folks in Cupertino isn't necessarily sustainable (especially with normal attrition of folks looking for jobs elsewhere). So it wouldn't be particularly unusual to scale the team up with people who are comfortable with complexity at this magnitude. Intel has worked at larger transistor budgets and higher densities (than most of the rest of the market) longer than any other design house. (Intel is being passed this year with 7nm rolling out at other places, but Intel's densities have been higher for years, so it makes sense to tap folks who built their skill sets during those years.)

The iPad A--X SoCs are typically bigger than the mainstream iPhone ones. But like I said in an earlier post, if Apple forks off the iPad to a larger degree, this additional verification could be put on that. Apple is also folding other chips into their SoC, which also drives up complexity (transistor counts and verification complexity). There is very little "desktop" or Intel-specific work necessarily going on here.





While iOS device core counts could continue to increase, they will certainly increase in a design intended for higher power budgets. The job posting is pretty explicit about non-iOS usage cases as well. I would say surprisingly so.

Yes, team size goes up. They don't have to all live in Silicon Valley to do that. The "non-iOS" stuff is most likely where iOS has been; it is not indicative of its entire future. You couldn't run multiple apps at the same time several years ago. Now you can. The validation would be for an iOS chip 2-3 years out from now. Two more years of incremental improvements to iOS gets you what in respect to concurrent programs? Apple has only done their first iteration on the "AI" Bionic stuff. Crank that up and you will have higher concurrency at a level iOS has historically never seen.

They’ve already done this, with the early high resolution iPads. Those had 128-bit busses to off-chip RAM. Moving forward, I would expect them to keep the RAM on package in iOS devices, particularly with the forthcoming 2.5D and 3D packaging options that will lower the power cost per bit transferred.

Again, this presumes that the iPads are constrained to the same arch as the iPhones. There is space for RAM on the iPad Pro (teardown: https://www.ifixit.com/Teardown/iPad+Pro+10.5-Inch+Teardown/92534, Step 10: two RAM chips, not stacked). There is no good reason to stack those on top of an SoC with the larger GPU and extra TDP.

The familiarity comes from their iOS devices that feature LPDDR memory. The usage in the MacBook line is intended to show the fact that a transition to a lower power ARM MacBook would be straightforward from a memory perspective because they’d be using a LPDDR interface they were already familiar with from iOS devices.

There is "back end" to the memory controller that has do with either running multiple controllers and/or concurrency of the requests. Yes the memory chips are the same and no don't have to deal with std physical SO-DIMMs. It is the 'other' side that doesn't necessarily put this in the "already done it" context. Furthermore, if want to focus on the "outside" coupling to RAM chip components once again indicative that this is something where there is no "desktop" distinction here. Non only Apple but every other ARM implementor has a LPDDR integration. Which CPU (xARM , x86 , MIPS ) doesn't?
 

Transistor count is a very poor measure of verification complexity.
 
I know it's quite a hassle to switch CPU architecture, but this seems really exciting to me. Kinda feels like the old Apple.

I agree. The updates to all, if they indeed work as advertised, are all based on solving problems for people. They draw on changes under the surface, but they solve human problems. They're really fun, too.
Boy do you have the rose-tinted glasses on. It's pretty easy to have "radically new designs" for computers when your competitors just make beige boxes. Quality is higher now for almost all OEMs. We live in a better age.

It's good that we now have near-parity when it comes to aesthetics and quality. It means you get a fine product regardless of which ecosystem you're in.

The beginning of the '90s was exactly when Apple started its death march to destruction. Three months to bankruptcy, let's bring Steve back, maybe we screwed up when we thought success lay in business school and marketing horse crap.
The computers were fun and solid though. They made the Newton.

Steve axed the Newton, or what was left of it. He put his guys to work on 1. design, 2. OS X, 3. mobile, and 4. Intel. They all paid off. Come on, just from a business standpoint, he created the most dramatic turnaround of any corporation, ever, and it wasn't by paying attention to business school stuff. He showed the Apple way was better for most people. He changed the world. If you have some Power Mac running System 9, good on ya. But I'll take an iMac Pro any day of the week, and an iPhone X, too. And billions agree with me.
 
I don't see this as a disruptor until Apple makes it available outside of their ecosystem as an OEM.
The current surge in using ARM in mobiles, and the rapid upgrading of ARM CPU cores, is all because Apple chose this direction. I have no doubt that if Apple chose to move to ARM CPUs for laptops, the rest of the industry would follow with their own CPU designs, or Qualcomm would try to take that space. This might be a disruptor in the CPU space, especially considering that ARM CPU cores are much cheaper to manufacture than Intel's x86 CPUs, which cost in excess of $700.
 
I think I'll wait a few more years before commenting on any of this. I'm sure Apple can afford internal teams and secret deals with software companies to get them on board and not repeat painful transition periods between archs.
 
The current surge in using ARM in mobiles, and the rapid upgrading of ARM CPU cores, is all because Apple chose this direction. I have no doubt that if Apple chose to move to ARM CPUs for laptops, the rest of the industry would follow with their own CPU designs, or Qualcomm would try to take that space. This might be a disruptor in the CPU space, especially considering that ARM CPU cores are much cheaper to manufacture than Intel's x86 CPUs, which cost in excess of $700.

It's not primarily because Apple chose this direction. It's because ARM CPUs use smaller instruction sets, are physically smaller, and are more power efficient. That in itself makes them ideal for mobile use. Snapdragon was released back in 2007.
 
Apple never used Snapdragon, or any CPU from Qualcomm for that matter. As far as the CISC vs. RISC architecture debate is concerned, there are no real pure RISC designs anymore. Not even today's ARM.
 