What I took issue with is the notion that Boot Camp is used by a large majority of Mac users. I believe a very small minority of Mac users use Boot Camp, and then mostly for gaming. Even if you include everyone who uses Boot Camp or a virtual machine, I think it is less than 10% of Mac users. Maybe as few as 2-3%.

Nope, still not what I said. Please quit accusing me of cherry-picking. I said that I use it, as does everyone I know. That's not everybody who owns a Mac, just my circle. I will never use a virtual machine for my application because of the performance penalty.

I was only pointing out how Aspyr and the user may also be benefiting without realizing it; it's called a common thread. That's not a statement about anyone's workflow, just that it exists. Enjoy it, and leave me alone.
 
There are a lot of factors to process... not sure how I feel about this move.
  1. BootCamp and alternative OS support...
  2. Future macOS support and backwards compatibility...
  3. No hardware upgradeability, everything soldered to the logic board...
  4. Pricing (Apple's monopoly on using its own chips)...
I know the MacBook lineup uses all-soldered components, but this is not the case for Mac desktops.

There is one major advantage to this move: highly customized chips for each product segment.
Do you want more cores? Larger caches? Higher frequencies? More CPU memory bandwidth?
 
I wonder how these new chips will affect Logic Pro. I'm planning on buying a new iMac at some point this year.

Honestly, I've never used Logic Pro, and the little audio experience I have was with Adobe (basically none). But it looks like, if the Pro lines maintain their lifecycle, those in between should be good. That's just what I get from the overall picture, assuming it isn't just a rumor. Only time will tell for sure.
 
...from the POV of the lead developer of a monolithic OS kernel: AKA one of a shrinking pool of "developers" who still need to give a pair of foetid dingoes kidneys about what CPU instruction set they're running on.
I think you are doing a slight disservice to the man who still to this day has the final say about what changes get committed to the Linux kernel. Android is the world's most popular operating system and happens to run the Linux kernel, so I wouldn't dismiss him as "one of a shrinking pool of "developers"":

https://www.csoonline.com/article/3...the-worlds-most-popular-operating-system.html
 
I'm reading this on my MBP, and it's going to be my last for a while. Next laptop will be a Windows Workstation.
Running Boot Camp was a mistake, and after comparing Win10 to macOS... Win10 has become my favorite.
 
What’s your favorite part of win 10? The three completely different UIs for system settings, scattered across multiple locations? The weird text cursor that moves “fluidly,” but lags way behind your typing?
 
Well, I started with a Mac Classic on the 68000, then PowerPC, then G5, then Intel... Rosetta worked well. We are a production house here, and a modular Mac Pro with 4 x processor cards, each with 4 x A13s, would be a beast, probably with more than enough power to run any emulation you like. And if you need a native PC that badly, then just get one; they're cheap enough. I'd rather look to the future, and what a modern, scalable architecture could achieve.
 
This sounds like Rosetta 2.0 will have to be implemented to allow macOS Intel apps to continue to run on the new hardware. The Apple of 2005 could have pulled this off smoothly...

It was a fairly smooth ride when Apple transitioned from the 68k architecture to IBM PowerPC RISC in 1994 and then switched from PowerPC to Intel in 2005. Apple had real software and firmware engineering talent back then.

Today's millennial script-kiddy crowd at Apple kinda stinks at engineering, with no discipline and a lack of developer ethics. Function now follows Form, and art-dogs like Ego Ives run the show. Tack on Apple's next-to-nonexistent quality assurance, and this may well end up being a charlie foxtrot.

It will take a year or two for the inevitably aggravating hardware/firmware/macOS problems to shake out, and the 1.0 MacARM hardware purchasers will be abandoned in the process.

What would really make this a circus is if Apple seriously considers the duct-tape-and-baling-wire approach of licensing the bloated VMware Fusion/Oracle VirtualBox engines for the transition... whatcha bet me?

BatteryGate, ScreenGate, BendGate ain't gonna be nothin compared to this... break out the marshmallows and weenies boys and girls, this is going to be a bonfire!
 

“A lack of developer ethics.”

This place has clearly gone insane.
 
What’s your favorite part of win 10? The three completely different UIs for system settings, scattered across multiple locations? The weird text cursor that moves “fluidly,” but lags way behind your typing?

My favorite part... actually there are several... but to name only a few: built-in ads on a paid OS, random crashes and BSODs, system-wide telemetry, and signing in just to play a simple game of Solitaire?
 

Ooh. Those are good features. Apple lagging again. It’s also pretty neat that you need two web browsers, and they are about to rip up one of them and try again.
 
The "point" the article makes is the "user" architecture and "server" should ideally be the same. We don't use ARM servers and Apple transitioning to ARM desktop chips won't change this.

I'm just pointing out what someone said; I never gave an opinion that falls on either side of the argument.

You were agreeing with "ARM is not able to replace high-end Intel chips," which Linus was not arguing. And ARM-based Apple laptops would of course mean that the two kinds of machines could be the same, which changes that fact. I don't know who "we" is, but ARM servers have been on sale for years and have been available on cloud services for years.

So as I said, you didn't read the article (very carefully) and you're also contradicting yourself.
 
Wrong.

You are thinking of Windows RT or Windows 10 S. Windows 10 on ARM is NOT THAT.

Read THEN Post:

https://www.pcmag.com/news/353637/windows-10-on-arm-runs-all-win32-apps-unmodified

You (as a Software Dev.) certainly CAN recompile/rewrite Windows Applications for ARM, and can develop Native ARM Windows Applications; but, thanks to their JIT-Compiler-Based "Emulation", you don't HAVE to!

See:

https://docs.microsoft.com/en-us/windows/arm/
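For the curious, the core trick behind that kind of JIT-based emulation (Microsoft's x86-on-ARM layer, or a hypothetical Rosetta 2.0) is translate once, cache, reuse. Here's a deliberately toy sketch in C, with a made-up guest ISA and plain function pointers standing in for generated machine code; it shows the shape of a translation cache, not Microsoft's actual implementation:

```c
/* toy_translation_cache.c - a toy, not how Microsoft actually does it.
 * "Guest" instructions get translated the first time they are reached;
 * later visits reuse the cached translation. Real emulators translate
 * whole basic blocks into host machine code; C function pointers
 * stand in for that here.
 */
#include <stdio.h>

enum { OP_ADD, OP_PRINT };                    /* a made-up guest ISA */

typedef struct { int op; int arg; } GuestInsn;
typedef void (*HostFn)(int arg, int *acc);    /* a "translated" host routine */

static void host_add(int arg, int *acc)   { *acc += arg; }
static void host_print(int arg, int *acc) { (void)arg; printf("acc = %d\n", *acc); }

#define PROG_LEN 5
static const GuestInsn guest[PROG_LEN] = {    /* the guest "binary" */
    {OP_ADD, 2}, {OP_ADD, 40}, {OP_PRINT, 0}, {OP_ADD, 1}, {OP_PRINT, 0}
};

static HostFn cache[PROG_LEN];                /* translation cache, keyed by guest PC */

static HostFn translate(int pc) {             /* the expensive step, done once per PC */
    return guest[pc].op == OP_ADD ? host_add : host_print;
}

int main(void) {
    for (int run = 0; run < 2; run++) {       /* second run hits only the cache */
        int acc = 0;
        for (int pc = 0; pc < PROG_LEN; pc++) {
            if (cache[pc] == NULL)
                cache[pc] = translate(pc);
            cache[pc](guest[pc].arg, &acc);   /* cached "host code" runs directly */
        }
    }
    return 0;
}
```

The point is that the translation cost is paid the first time a block of guest code is reached; every later execution runs the cached host code, which is why hot code paths end up closer to native speed than naive interpretation.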
Why will it be a mess for a few years? Because the past TWO Architecture changes were?

Oh, wait...
And of course REAL COMPUTERS ONLY RUN INTEL, right?

Please.
Wrong.

Hackintoshes remain a rounding error as far as percentages go. Apple couldn't care less, so long as they remain a rounding error.

I stand corrected, I didn't know that they had built such good x86 emulation for the ARM chip.
 
Another courageous decision from Tim Cook? Moving Macs to ARM is of course possible, but I have serious doubts that it's worth the effort. Apple's in-house chip design has been able to stay lean and focused, and it drives their main cash machine, which depends on a low-power, super-efficient chip.

That's not the chip that would drive a hypothetical ARM Mac, no matter what the "CPU engineers" in this thread contend: a CPU/GPU is created for a particular performance envelope. Effectively, this means that Apple needs one chip for their iPhone/iPad and another for their Mac line, if not two.

It's been mentioned in this thread that Macs are now merely 10% of Apple's revenue, and that number is declining. The future for Apple is not the Macintosh: there's no growth there, and no point in investing engineering effort to support a declining 10% income sector when Intel and AMD offer decent enough CPUs.

The Macintosh will simply be allowed to sail into the sunset, and Apple will start offering ARM-based laptops. But they won't be called Macintosh.
 
So what Aspyr does is: Windows application on Intel -> Mac application on Intel

What an ARM Mac will require of the developer is Mac application on Intel -> Mac application on ARM. That is a completely different kind of problem, and a much simpler one, since the operating system, and therefore the APIs, are the same.

We do not know the details, but it would probably only require a recompile of the source code for most applications. A recompile takes a few minutes to maybe an hour. Even if you include all the work of getting the new executable into the Mac App Store or onto your website, we are talking a few days.
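To make the "just recompile" claim concrete, here is a minimal sketch. The file name, and the assumption that Apple would ship an arm64 macOS target for clang, are mine; the multi-arch clang/lipo workflow in the comment is the standard way universal binaries are built today:

```c
/* portable.c - nothing in this file cares which CPU runs it.
 *
 * Assuming Apple ships an arm64 macOS target, a universal ("fat") binary
 * could be built the same way multi-arch builds work today:
 *
 *   clang -arch x86_64 portable.c -o portable_x86
 *   clang -arch arm64  portable.c -o portable_arm    (hypothetical target)
 *   lipo -create portable_x86 portable_arm -o portable
 */
#include <stdio.h>

int main(void) {
    /* No inline assembly, no ISA-specific intrinsics: the Intel -> ARM
       "port" of code like this really is just a recompile. */
    printf("Hello from whichever architecture ran me.\n");
    return 0;
}
```

Source with inline assembly or ISA-specific intrinsics would need real porting work, which is presumably where games (the Aspyr case) get harder than this.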
Right, recompiling games for a completely different processor architecture is so simple that the work of updating 32-bit games to 64 bits on the same processor architecture is hardly even worth mentioning.

This is why almost every single 32-bit macOS game has already been updated to 64 bits in preparation for this September’s 64-bit-only macOS 10.15.

Oh, wait…
 
I think you are doing a slight disservice to the man who still to this day has the final say about what changes get committed to the Linux kernel.

But that's the point: His day job is managing the very code that sits between applications and the bare metal of the computer so of course he's going to be very concerned with what CPU(s) he's targeting.

When it comes to kernel development, I wouldn't presume to question his judgement (and not just because I don't want to get F-bombed back into the stone age :)). His judgement on what the 95% of developers who aren't managing the kernel of the world's most popular operating system think - now, that I'd question. He lives in a different universe to people pounding out games, database forms and social media clients. Even with more sophisticated applications you need a very good reason to be diddling with assembler today.

Android is the world's most popular operating system and happens to run the Linux kernel

That's a rather odd comment to make as part of an "x86 has won!" argument: the vast majority of those Android phones are running on ARM, but Android also runs happily on Intel, and one of its key features is that most Apps run as processor-independent bytecode on a virtual machine.

The Linux kernel has been ported to just about every CPU short of Minecraft redstone logic - which is part of the reason why an ARM server will likely be running the same software as an x86 server.
 
What’s your favorite part of win 10? The three completely different UIs for system settings, scattered across multiple locations? The weird text cursor that moves “fluidly,” but lags way behind your typing?
Well I can type at business speeds and I simply don't recognise your comment regarding the cursor. I have an iMac in one room and a self-built Windows 10 machine in the other, and I too now prefer the Windows 10 experience to macOS.

I think macOS has been starved of development by Apple, and when you look at Windows 10 it shows. As for system settings, I couldn't care less if there are 100 UIs. I know where the settings are located, and that's all that matters.
 
which is why almost every single 32-bit macOS game has already been updated to run on this September’s 64-bit-only macOS 10.15.

Unfortunately, if the developer has gone out of business, or the app in question is no longer making money, then even a few days' work isn't gonna happen, and it's bye-bye app. There are going to be a lot of old applications that will never run on 10.15. Mostly, though, they'll be "abandonware" anyway, and the good news is that many of the apps which are likely to be problematic on a future ARMintosh will have already been killed off by 10.15.

I'm not going to defend that as a positive thing - but the reality is that Apple have done it before so there's no reason that they wouldn't do it again.
 
Well I can type at business speeds and I simply don't recognise your comment regarding the cursor.

It is a "feature" implemented in MS Office for Windows - I guess you either don't notice it at all, or immediately throw up a little bit in your mouth and google for how to disable it (you won't work it out by yourself). What's hard to see is what value it adds for anybody, even for people it doesn't "trigger"...
 
It is a "feature" implemented in MS Office for Windows - I guess you either don't notice it at all, or immediately throw up a little bit in your mouth and google for how to disable it (you won't work it out by yourself). What's hard to see is what value it adds for anybody, even for people it doesn't "trigger"...

I spent an hour on the phone with our IT department before someone told me how to turn it off. It appears in Office and in various other system apps, here and there (nothing is consistent in Windows 10).
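For anyone else stuck on hold with IT: the fix that's commonly passed around for Office 2016 is a per-user registry value, DisableAnimations = 1 under the Office Graphics key. The key path and value name below are the widely cited ones (adjust "16.0" for other Office versions), not something I can vouch for on every build; a one-line .reg file does the same thing as this Win32 sketch:

```c
/* disable_office_anim.c - Windows-only sketch; link with Advapi32
 * (e.g. cl disable_office_anim.c Advapi32.lib). Writes the commonly cited
 * HKCU\Software\Microsoft\Office\16.0\Common\Graphics\DisableAnimations = 1,
 * which is reported to turn off Office's animated typing cursor.
 */
#include <windows.h>
#include <stdio.h>

int main(void) {
    HKEY key;
    DWORD one = 1;
    LONG rc = RegCreateKeyExA(
        HKEY_CURRENT_USER,
        "Software\\Microsoft\\Office\\16.0\\Common\\Graphics",
        0, NULL, 0, KEY_SET_VALUE, NULL, &key, NULL);
    if (rc != ERROR_SUCCESS) {
        fprintf(stderr, "could not open key (error %ld)\n", rc);
        return 1;
    }
    rc = RegSetValueExA(key, "DisableAnimations", 0, REG_DWORD,
                        (const BYTE *)&one, sizeof one);
    RegCloseKey(key);
    if (rc != ERROR_SUCCESS) {
        fprintf(stderr, "could not set value (error %ld)\n", rc);
        return 1;
    }
    puts("DisableAnimations = 1 written; restart Office apps to see the effect.");
    return 0;
}
```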
 
But that's the point: His day job is managing the very code that sits between applications and the bare metal of the computer so of course he's going to be very concerned with what CPU(s) he's targeting.

What I got from the article was that only once such a network was in place would there be any possibility of ARM servers becoming mainstream.

It's a "chicken and the egg" type situation.
 