False. Not only does Mathematica run on an ARM platform, you can get Mathematica for free on Raspberry Pis (a popular educational ARM platform) running Raspbian. See: https://www.wolfram.com/raspberry-pi/

Suppose that you have a weather simulation capable of predicting whether it's likely to rain in three days. If the simulation takes 4 days to run, it's not very practical, though the exercise may still be pedagogically useful.

Some people learn mathematics with the aid of Mathematica. Other users are more interested in using Mathematica to solve problems.
Notably, amd64 has modes where integer operations are 32 bits even though addressing is 64 bits. My point is that you can do the same thing to get to 128 bits. There's no need to define 128-bit integer operations.

As the Programmer's Manual states

"Most instructions default to 32-bit operand size in 64-bit mode. However, the following near branches instructions and instructions that implicitly reference the stack pointer (RSP) default to 64-bit operand size in 64-bit mode: jcc, jmp loop, loopcc, enter leave, popreg, popfs, popgs..."

A 128 bit CPU implies a 128 bit stack pointer.
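To make the first point concrete, here is a minimal C sketch (nothing Apple- or ARM-specific is assumed, just the standard LP64 model used on amd64 systems) showing how default integer operations can stay 32-bit while addresses are 64-bit, with wider arithmetic opted into via an explicit type:

```c
/* Minimal illustration of the LP64 model on amd64: plain ints stay 32-bit
 * even though pointers (addresses) are 64-bit; 64-bit arithmetic is opt-in. */
#include <stdio.h>
#include <stdint.h>

int main(void) {
    int      i = 0;   /* default integer operations: 32 bits */
    void    *p = &i;  /* addressing: 64-bit pointers */
    int64_t  w = 0;   /* wider arithmetic via an explicit 64-bit type */

    printf("sizeof(int)     = %zu\n", sizeof i); /* 4 on LP64 */
    printf("sizeof(void *)  = %zu\n", sizeof p); /* 8 on LP64 */
    printf("sizeof(int64_t) = %zu\n", sizeof w); /* 8 */
    return 0;
}
```

A hypothetical 128-bit mode could follow the same pattern: 128-bit addressing with operand sizes defaulting to something narrower.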
 
This happened with the PowerMac G5 back in the day too. The Intel switch was announced not long after the latest model G5 was announced, basically pissing off people who had just invested in new machines. Apple said they weren't leaving older machines behind just yet, but they did, very quickly. In less than a year nobody gave a **** about PowerPC anymore.

We're going to have the same thing happen if they transition to ARM. I'm pissed because this is unnecessary; the x86 architecture is fully capable. At least with PowerPC -> Intel they had a good reason for the switch.
It was a good move to switch from PowerPC to Intel. It improved performance on the Mac and allowed for dual booting via Boot Camp, and a lot of people who were worried about software availability in macOS bought the Intel Macs because they would be able to fall back on Windows if they needed to.
Two points I want to raise.
First is appearance/customer support. How would you feel if you spent 40,000 dollars on a computer from Apple at the beginning of June, only to have Apple announce that they're dropping Intel and shifting to a brand new platform? While it's reasonable to expect continued support, it's clear that someone who spends as much money on a single computer as on a car would be very disappointed. Just look at the uproar that occurred a few years ago when Apple rolled out a new iPad in the fall after announcing an iPad in the spring. That was a device that only cost a few hundred dollars; here we're talking several thousand.

The second point is that we have to sit and wait to see how the performance will be in a real-world situation. I don't believe Apple would shift to a new platform that is markedly slower, but it's way too early to say that ARM will beat Intel hands down, or even vice versa.
Apple will probably put ARM processors in the low-end Macs for a while and wait to see how that plays out. After the dust has settled, a couple of years have passed, and the ARM chips have proven themselves powerful enough, Apple will put them in the pro machines.
 
Suppose that you have a weather simulation capable of predicting whether it's likely to rain in three days. If the simulation takes 4 days to run, it's not very practical, though the exercise may still be pedagogically useful.

Some people learn mathematics with the aid of Mathematica. Other users are more interested in using Mathematica to solve problems.


As the Programmer's Manual states

"Most instructions default to 32-bit operand size in 64-bit mode. However, the following near branches instructions and instructions that implicitly reference the stack pointer (RSP) default to 64-bit operand size in 64-bit mode:"

A 128 bit CPU implies a 128 bit stack pointer.

Sure. That doesn't imply that you need 128 bit ALUs to have a 128 bit address space. Hell, the stack pointer doesn't even have to be a general purpose register. And, in fact, an instruction set doesn't even have to allow anything but 64-bit relative offsets to the stack pointer.

I designed the x86-64 integer instructions, but a future 128-bit ARM wouldn't have to make the same decisions I made.
 
I wonder if third parties will rewrite all their drivers. I'm thinking of my scanner, printer, Wacom tablet, and audio interface. Will they become obsolete?

I won't be surprised if Apple drops Thunderbolt; it's part of Intel's CPU chipsets.
 
Yeah, this is unlikely. A transition from x86 to ARM will break a lot of software. Nobody has time for that.

Now if they put both into the same computer, so there is a low power mode, I will buy it; otherwise no more Macs for me.
Apple made a transition from PowerPC to Intel and came out stronger as a result. There were growing pains, but it was a win for them. Most software companies, though not all, will update their software to run on ARM Macs.
 
That's a specious argument. Apple doesn't sell games. They barely sell TV or movies. You should be comparing revenue from TV sets to revenue from computers :)
I should have been more clear. If you look at revenue for the gaming market overall, it is bigger than TV or movies. And it is growing. What I was trying to imply is that Apple should make their devices first-class gaming devices and create their own first-party games. They have all the pieces to do it.
 
I should have been more clear. If you look at revenue for the gaming market overall, it is bigger than TV or movies. And it is growing. What I was trying to imply is that Apple should make their devices first-class gaming devices and create their own first-party games. They have all the pieces to do it.
I don't disagree.

My only point is that Apple, since the days after the Pippin, has been notoriously anti-game. Which is weird, because the Apple ][ was a huge game machine, iOS is huge for games, etc.

Maybe Apple Arcade will be the start of something. We'll see.
 
Er... if you're testing macOS software then it's rather easier to not use the touchscreen that is there than, on an Apple TV, to test apps with (e.g.) the Touch ID sensor, webcam, etc. that aren't there. An iPad Pro could also emulate the Touch Bar (much as some people here would like it to go away, Apple clearly doesn't subscribe to that point of view).

The built in Apple webcam is precisely the one that doesn't need longer term driver development to get working. It is the 3rd party drivers and webcams that need the most work and lead time. Same thing at the peripheral driver level.

The Apps that primarily just touch the high level macOS APIs don't need long lead times.

As for a Touch Bar... if they are reviving the MacBook I doubt it would be present anyway. Secondly, a Touch Bar rendered on a subset of the same screen is not testing the actual Sidecar subsystem that folks would use. If you want to test that out, then use it... not fake it. If you just want to fake it, you could run the whole thing in an emulator. The reason to have hardware to test on is to get to the "down to the iron" issues.


The Intel dev system was, basically, the first Hackintosh, with a PC motherboard built into a G5 tower case. If Apple builds an ARM Mac into a basic mini-tower then developers can go hang because I WANT ONE!!! :)

One that has zero PCI-e lanes, and hence a mini tower with no slots? I suspect you probably don't. Everything so far is an indication that the first iteration is going to be a derivative of the A14X, which probably won't have any "desktop"-like PCI-e bandwidth or lane widths.

The reason the first x86 one was an Intel board is likely that Apple had previously been running prototypes on some generic board they picked out of a bin at Fry's, along with a generic case to slap it in.

If they were doing prototyping on some derivative of an iPad Pro / Apple TV product, there is about zero chance that the prototype was constructed in an ATX board format.

It isn't going to happen the same way it did for the x86 transition; the "try outs" were likely on vastly different stuff. (Plus Intel was largely signed up to do the Mac Pro board work anyway, so it wasn't a big leap to get one of their boards to slap something out the door with. It's doubtful that some 3rd party CPU vendor is doing the port work this time around. More than likely Apple has exactly nothing ready for the Mac Pro in this activity; it has probably been assigned to some 'get to it later during copious spare time' resource allocation.)


...but, this time round it's probably going to need all the features of a MacBook Pro (maybe Thunderbolt or USB4, too) so they might actually need to build a MacBook-type developer machine. But an iPad Pro would probably come closest.

Pretty decent chance they could slap an iPad Pro logic board into a MacBook one-port-wonder case without too much modification. Same number of external ports.
 
It was a good move to switch from PowerPC to Intel. It improved performance on the Mac and allowed for dual booting via Boot Camp, and a lot of people who were worried about software availability in macOS bought the Intel Macs because they would be able to fall back on Windows if they needed to.

Apple will probably put ARM processors in the low-end Macs for a while and wait to see how that plays out. After the dust has settled, a couple of years have passed, and the ARM chips have proven themselves powerful enough, Apple will put them in the pro machines.
First ARM machine will be a 16"-ish MBP. Apple's thought is that developers and other early adopters won't buy a low end machine. You read it here first.
 
Yes. That's what this is all about. Performance per watt. Sexy laptops. Thinner.

The iPad is just the messenger...

If they can leverage Mac ARM margins for profits and more affordability for customers...

Apple could be selling more than 4 million MacBooks per quarter.

This could be huge.

Azrael.
I can see Apple putting ARM chips in entry level Macs and lowering the price a bit. Kind of like with the new iPhone SE. For several years, Apple was gradually increasing the prices on iPhones, but recently seems to be backtracking on prices. They may do the same with their Macs.
 
That's a specious argument. Apple doesn't sell games.

Err.....

https://www.apple.com/apple-arcade/

And the top 20 in the iOS App Store are what?
The Games tab in the App Store is for what?
I should have been more clear. If you look at revenue for the gaming market overall, it is bigger than TV or movies. And it is growing. What I was trying to imply is that Apple should make their devices first-class gaming devices and create their own first-party games. They have all the pieces to do it.

You are engaging in double counting. Strip out the gaming revenue that comes from phones and what do you have left?

The phones are driving gaming revenue for Apple. Apple didn't purposely push in that direction, but it grew organically and now they are most certainly monetizing it.


But skewing the Macs toward it? Not very likely at all. One, they are already making money with what they are currently doing. Two, the whole chasing of the latest trendy pop GPU card with the latest tweaked driver to eke out a few extra frames per second at the cost of stability elsewhere... not likely at the top of Apple's "to do" list. And when you get down to it... no, there is not way more money in that subset than the Mac market generates itself.
 
Well, for me a computer is a tool and not an object of religious devotion.

Same here. A mac lets me do what I need and Apple's support has been excellent.

I find it convenient that I have software tools on my Mac that 'transition' easily between different operating systems and I daily exchange files and data with very many people without regard of their specific computational platform. I couldn't care less what type of CPU runs in any of these machines, as long as it is an open platform that facilitates these types of workflow.

Then an ARM-based Mac, as long as the tools you use are available, would be completely transparent.

I have a 2011 iMac and a 2014 MacBook on my desk. Performance-wise these machines run just fine, and battery life is still good on the MacBook. What difference would an ARM processor make here? Other than faster obsolescence of current hardware and potentially even higher walls around the OS in the future...

As with any tech, better performance is an ongoing goal, and as a result obsolescence is part of the price of progress. At any rate, if your machine meets your needs today, chances are it will for long enough into the future. I still have a Mac mini that has long been declared obsolete by Apple but still does what I need running Snow Leopard.
Then you’ll also remember when software companies, including Apple, would charge for an OS release. First it dropped to $99, then to free.

I remember when all of Apple's OS releases were free on desktop machines.
 
Err.....

https://www.apple.com/apple-arcade/

And the top 20 in the iOS App Store are what?
The Games tab in the App Store is for what?


You are engaging in double counting. Strip out the gaming revenue that comes from phones and what do you have left?

The phones are driving gaming revenue for Apple. Apple didn't purposely push in that direction, but it grew organically and now they are most certainly monetizing it.


But skewing the Macs toward it? Not very likely at all. One, they are already making money with what they are currently doing. Two, the whole chasing of the latest trendy pop GPU card with the latest tweaked driver to eke out a few extra frames per second at the cost of stability elsewhere... not likely at the top of Apple's "to do" list. And when you get down to it... no, there is not way more money in that subset than the Mac market generates itself.

Wow. Apple has a games tab on their iOS store website? :OOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO

Azrael.
 
I rely on being able to run Parallels all day as I need a Windows environment for work. Probably a stupid question, but would Parallels (or Boot Camp) still work on an ARM machine?
I’m just guessing, but Parallels should be able to do this. Parallels has been around since before Apple switched to Intel x86. It was slower than now because it ran on older hardware AND had to run as an emulator (translating from PowerPC to x86). Of course Parallels would have to update their software to translate from x86 to ARM. This emulation could affect performance. But it might not have to do this emulation if Microsoft makes its ARM version of Windows easily available to consumers. In that case, you would also need ARM versions of the Windows software. There are already ARM versions of Windows 10 and Linux, but the software running in Windows also needs to be ARM.
 
Parallels has been around since before Apple switched to Intel x86.
You might be thinking of Virtual PC, which Microsoft bought many moons ago. Virtual PC is an x86 emulation program, i.e., it emulates the CPU, whereas Parallels and VMware Fusion utilize the virtualization technology baked into the Intel CPUs to run multiple versions of an operating system. While I have zero doubt that Apple's ARM CPU will include virtualization, it will be to virtualize what can run on ARM, such as certain distros of Linux, not x86 code.
 
The built in Apple webcam is precisely the one that doesn't need longer term driver development to get working. It is the 3rd party drivers and webcams that need the most work and lead time. Same thing at the peripheral driver level.

Who mentioned drivers? Good luck testing your video-conferencing app without a webcam...

I mean, ultimately, you don't even need an ARM-based Mac to develop for ARM - your x86 Mac will compile you an ARM binary and you can run it (slowly) in QEMU. That'll tell you whether it is going to crash. However, to really test it you need to be able to run it realistically - and that includes having all the peripherals that you'd have on the "real thing".
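As a rough sketch of that QEMU route (assuming a Linux-style aarch64 cross toolchain and user-mode QEMU; the exact targets and flags would differ on a real Mac/Xcode setup), the idea is simply to build an ARM binary on an x86 host and let QEMU translate it at run time:

```c
/* arch_check.c
 * Hypothetical workflow (toolchain names are assumptions, not Apple's tooling):
 *   cross-compile on an x86 host:  aarch64-linux-gnu-gcc -static arch_check.c -o arch_check
 *   run under user-mode emulation: qemu-aarch64 ./arch_check
 */
#include <stdio.h>

int main(void) {
#if defined(__aarch64__)
    puts("This is an ARM64 binary (here likely running under QEMU emulation).");
#elif defined(__x86_64__)
    puts("This is a native x86-64 binary.");
#else
    puts("Built for some other architecture.");
#endif
    return 0;
}
```

It proves the binary runs, but as noted above it tells you little about real-world behaviour with real peripherals.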

One that has zero PCI-e lanes and hence a mini tower with no slots?

...and we know that the chip used in the hypothetical dev system won't have PCIe because...? You might as well assume that we won't like the colour... Anyhow, I wasn't really saying a mini-tower ARM Mac was likely. However, if Apple are planning to switch the whole range over the next 3 years or so, this won't just be about porting personal productivity apps for a new 12" MacBook - developers will need a capable all-round system with things like PCIe and Thunderbolt...
 
Well, if they are planning a 12-core processor it could easily work for you. What are the problems you can imagine?

Many problems: First, in regard to VMs, if you have to emulate the actual CPU needed to run the client OS, you can just throw any and all hopes of decent performance out the window.

Second, software doesn't just magically re-write itself properly. In order to work on specific hardware as well as it possibly can, it has to be properly optimized. Some of the clearest examples I can point to are in the gaming world. The PS3 was notably hard to program for, hence most developers didn't bother using the hardware properly, while a few did. The really good PS3 titles look damn near as good as many titles do today on the PS4 and compared well with the best PC titles of the era; they even supported things like stereoscopic rendering. The worst titles looked like absolute crap and you would swear you suddenly had a PS2 again. Within the PC world this is why game "a" will look and run beautifully on moderately powerful hardware but game "b" will run like crap on better hardware and might not even look as good. So software needs to be properly optimized to run as well as it possibly can, and this takes time, and money, and talent.

Now, what happens when you are a developer writing software for the 85% or 90% Intel/AMD-based market? Do you actually go out and spend the money you have to spend to properly support that company that switched to their own CPUs? Welllllllll?? Probably not. You probably get the software working and that's it, because there isn't enough reason, financially, to do otherwise. A few companies will do it anyway, but most won't. So sure, that great "Apple CPU" might be 10% or 25% or hell, even 100% faster than an Intel/AMD CPU, but that doesn't mean the software will actually run faster. We are actually likely to see the exact opposite in many cases, unfortunately.
 
Many problems: First, in regard to VMs, if you have to emulate the actual CPU needed to run the client OS, you can just throw any and all hopes of decent performance out the window.

Second, software doesn't just magically re-write itself properly. In order to work on specific hardware as well as it possibly can, it has to be properly optimized. Some of the clearest examples I can point to are in the gaming world. The PS3 was notably hard to program for, hence most developers didn't bother using the hardware properly, while a few did. The really good PS3 titles look damn near as good as many titles do today on the PS4 and compared well with the best PC titles of the era; they even supported things like stereoscopic rendering. The worst titles looked like absolute crap and you would swear you suddenly had a PS2 again. Within the PC world this is why game "a" will look and run beautifully on moderately powerful hardware but game "b" will run like crap on better hardware and might not even look as good. So software needs to be properly optimized to run as well as it possibly can, and this takes time, and money, and talent.

Now, what happens when you are a developer writing software for the 85% or 90% Intel/AMD-based market? Do you actually go out and spend the money you have to spend to properly support that company that switched to their own CPUs? Welllllllll?? Probably not. You probably get the software working and that's it, because there isn't enough reason, financially, to do otherwise. A few companies will do it anyway, but most won't. So sure, that great "Apple CPU" might be 10% or 25% or hell, even 100% faster than an Intel/AMD CPU, but that doesn't mean the software will actually run faster. We are actually likely to see the exact opposite in many cases, unfortunately.

The good news is that if your code is written to Metal, AppKit, etc., then you don't actually need to emulate anything for those SDK calls. The "emulator" can just swizzle the call to call the native functions. In practice, for many apps, the "emulated" portion is a small portion of the overall working set.
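A very rough sketch of that idea in C (purely conceptual; the function names and table below are invented for illustration and are not how Apple's actual translation layer works): when the translated code reaches a known framework entry point, the layer can dispatch straight to the host's native implementation instead of emulating it:

```c
/* Conceptual sketch: bridge known framework symbols to native host code,
 * so only the app's own code pays the translation/emulation cost.
 * All names below are hypothetical. */
#include <stdio.h>
#include <string.h>

typedef void (*native_fn)(void);

static void native_draw_frame(void) { puts("native graphics path"); }
static void native_play_audio(void) { puts("native audio path"); }

/* Table mapping guest-visible symbol names to native implementations. */
static const struct { const char *symbol; native_fn fn; } bridge[] = {
    { "DrawFrame", native_draw_frame },
    { "PlayAudio", native_play_audio },
};

/* Invoked when the translated app calls a framework symbol by name. */
static void call_framework(const char *symbol) {
    for (size_t i = 0; i < sizeof bridge / sizeof bridge[0]; i++) {
        if (strcmp(symbol, bridge[i].symbol) == 0) {
            bridge[i].fn();  /* jump straight to native code */
            return;
        }
    }
    printf("'%s' is not bridged; it would have to be emulated\n", symbol);
}

int main(void) {
    call_framework("DrawFrame");  /* SDK call: runs natively */
    call_framework("Teleport");   /* unknown symbol: stays emulated */
    return 0;
}
```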
 
I wonder if third parties will rewrite all their drivers. I'm thinking of my scanner, printer, Wacom tablet, and audio interface. Will they become obsolete?

I won't be surprised if Apple drops Thunderbolt; it's part of Intel's CPU chipsets.
Thunderbolt is being incorporated into the USB4 standard. Apple has doubled down on Thunderbolt, making it standard across all its Macs, so I doubt that they will drop it anytime soon.
 