Probably not; it's usually not as easy as just recompiling. If it were, 32-bit would have disappeared ages ago.

I'm aware it's rarely just recompiling and nothing else (though sometimes it literally is). But the point remains that when it's open source you can rewrite the code and make it work.

They said 10.13 would be the last OS that would run 32-bit "without compromise," whatever that means. So they should run in 10.14 in some fashion, but that'll be it.

I believe it either means that the warnings about an app not being 64-bit will come every time you open it, and the "compromise" is that the system will nag you with a pop-up each time you open said app. Or that 32-bit apps will take an extra performance penalty because of some trickery. Or maybe just that they'll remove the least-used 32-bit libraries entirely but keep the more commonly used ones in the system, so 5% of 32-bit apps will stop working.
Yeah, but tell that to many of the posters on this site who think Apple going to Ax chips for their computers would be the best thing since sliced bread.

ARM chips typically have an advantage in low-power efficiency. So if Apple uses an A-series chip as a co-processor for certain tasks, that could work out alright. An ARM chip is already used to run the Touch Bar in the MacBook Pro (the T1 incorporates an ARM core). If small operations could be programmed to be sent to an ARM co-processor instead of running on the main Intel CPU, it could potentially save power. Then again, maybe not, since you'd then have two chips running instead of one, and that could negate the power benefit.
 
ARM chips typically have an advantage in low-power efficiency. So if Apple uses an A-series chip as a co-processor for certain tasks, that could work out alright. An ARM chip is already used to run the Touch Bar in the MacBook Pro (the T1 incorporates an ARM core). If small operations could be programmed to be sent to an ARM co-processor instead of running on the main Intel CPU, it could potentially save power. Then again, maybe not, since you'd then have two chips running instead of one, and that could negate the power benefit.

Plus, ARM does trade off performance for efficiency. If you scaled an ARM CPU up to the same performance levels as desktop-class x86 CPUs, you're likely going to run into the same thermal/power issues that Intel has.

So you either have a computer running a slower, low-power chip, or you run a computer that has more power but runs hotter and draws more power. It's not like using ARM is going to suddenly negate laws of physics that we haven't gotten around yet. We've gotten better at finding a middle ground of efficiency between full-blast heaters and battery sippers, but we still know that the higher you clock a CPU, the more power and heat it generates.
For creative professionals, 64-bit holds huge benefits, especially when it comes to RAM limitations (4 GB vs. roughly 17 billion GB).
Logic Pro and Final Cut dropped 32-bit plugins over a year ago. Pro Tools went fully 64-bit in v12, IIRC.
There are still a good number of people who use the Mac only because of their careers. For them, 64-bit is a huge improvement. There are plenty of freeware and cheap word processors out there.
Any end-of-life cycle hurts some users temporarily, but in the long term it aids progress.
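(For reference, here's a quick back-of-the-envelope sketch in Python of where those numbers come from; nothing macOS-specific, it's just the address-space math.)

[CODE]
# Address space a 32-bit vs. 64-bit pointer can cover, in bytes.
GB = 1024 ** 3

limit_32 = 2 ** 32   # what a 32-bit process can address
limit_64 = 2 ** 64   # theoretical ceiling for a 64-bit process

print(limit_32 / GB)   # 4.0     -> the familiar 4 GB wall
print(limit_64 / GB)   # ~1.7e10 -> roughly 17 billion GB
[/CODE]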
There are absolutely benefits to moving to 64-bit in MANY cases, especially those where you need a large memory address space, large file support, or extremely complex calculations.

But there's not a lot of benefit outside of that. Running a 32-bit app on a 64-bit operating system doesn't take away any of the benefits the 64-bit OS already provides.

Apple dropping 32-bit support doesn't change the OS's ability to run at 64-bit, which it already does now. 32-bit apps don't slow down the 64-bit apps that are running. That 32-bit app can still use up to 2 GB of file space or RAM on your 64-bit system with 120 GB of RAM and 30-terabyte hard drives.

Something like Word doesn't NEED to be 64-bit to be useful for basic word processing. It really comes down to what the application is doing and what sort of resources it requires.

A program that doesn't do a lot of calculations, uses very little RAM, and uses very little file-system storage is not going to see benefits from 64-bit. In fact, it can be a detriment, since such programs may require additional resources for the larger data types they end up using (a 64-bit integer takes twice the memory of a 32-bit one, for instance).
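(A tiny sketch if you want to check that doubling yourself; it just uses Python's struct module to report native type sizes on whatever machine runs it.)

[CODE]
# Native sizes of 32-bit / 64-bit integers and of a pointer on this machine.
import struct

print(struct.calcsize("i"))   # 32-bit int -> 4 bytes
print(struct.calcsize("q"))   # 64-bit int -> 8 bytes, i.e. twice the size
print(struct.calcsize("P"))   # pointer    -> 8 bytes on a 64-bit build
[/CODE]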

Overall, yes, we should move to 64-bit. Anyone creating anything NOW should just make it 64-bit.

But just because something IS 32-bit doesn't make it obsolete, or a bad program, or an inefficient program. That's a myth.
 
Plus, ARM does trade off performance for efficiency. If you scaled an ARM CPU up to the same performance levels as desktop-class x86 CPUs, you're likely going to run into the same thermal/power issues that Intel has.

Absolutely. In fact it'd not just hit the ceiling Intel is at, it'd hit a ceiling much quicker. Intel's Core architecture is primarily built to hit a different spot on the performance-per-watt curve than most ARM chips, so if you were to try and push something like the A11's design to hit the same performance metrics you'd get from a 45W Intel chip, well, I don't even think it'd be possible with that architecture.
You could of course create a different architecture around the ARM instruction set specifically made to target a different point on the efficiency curve, but I don't think there's any real point to that.

The co-processor approach I mentioned, however, could benefit from both. The Intel chip would be the brain of the system, and the ARM chip would be a 1W-or-so chip that handles tiny tasks, so the Intel chip can stay power-gated as much as possible during the low-power tasks it is less efficient at.

Again, the question is whether it's worth it, since you'd still need to feed a minimum amount of power to the Intel chip even while the co-processor does the work, and I don't know whether that would negate the efficiency gains, or whether they'd be worth the cost.
 
Why would Apple go to ARM on desktop computers? ARM doesn't offer anywhere near the performance of x86 chips even natively. If you have to emulate, it'll be like running Java on a 386.
Most people don't mind a slow computer; they aren't crunching numbers, they're browsing Facebook. ARM is perfect for this, as it easily provides 20-hour battery life. Using the big.LITTLE architecture, you can also get Intel-level performance out of a chip, at least for a little while.
 
Intel owns a lot of the x86 patents and they don't like to share.

Assuming that Apple plans to go to ARM and still support 32-bit apps, they would need to emulate the x86 instructions, which would slow everything down, or get Intel to license their IP, which will never happen. Furthermore, putting an x86 emulator on an ARM PC requires special hardware; at least, that's the road MS took.

And I suspect adding a shim to translate from x86-64 to ARM64 isn't as involved.

Edit: It could also be a loss of confidence in Intel's ability to secure the x86 instruction set.
I'm not actually sure that's accurate, especially if you're considering XP SP2, which made it unusable on anything below 256 MB of RAM. Meanwhile, Windows 10 will happily run on something with a 1.2 GHz processor, 1 GB of RAM, and 8 GB of storage.
Naturally that would be the case. For example, OS X 10.4 Tiger flies on a Power Mac G4 with as little as 512 MB of RAM. But I think the real issue here is how ridiculous things have become in the endless quest for superior specifications, which makes the iMac Pro an absurd machine that is complete overkill.

We all forget that it was not a million years ago when FireWire 400 and USB 1.1 were cutting edge and we all marvelled at the first 1 GB hard drive.

Technology has moved forward at a rapid rate in recent years, but in reality the game changer came when Intel introduced the i5 and i7 Sandy Bridge processors, which is what makes the 2011 iMac one of the finest Macs ever produced. Maxed out with 32 GB of RAM and an SSD (which the user could install), it became a killing machine.

But of course, back in those days the consumer could run a Mac on their own terms.
 
Most people don't mind a slow computer; they aren't crunching numbers, they're browsing Facebook. ARM is perfect for this, as it easily provides 20-hour battery life. Using the big.LITTLE architecture, you can also get Intel-level performance out of a chip, at least for a little while.

It's not just about how fast it is, though. Existing software doesn't just work on an ARM chip. It's one thing to explain to the average Joe that the ARM Mac is slower but lasts longer; it's another to explain why 50% of his software won't work on it, and to give him the information needed to understand why. Different architectures are about more than just performance. It's an entirely different instruction set we're talking about, and even software that is compiled for both could behave differently.

Something similar to big.LITTLE is also more complicated than it seems. big.LITTLE works because it's a single SoC that manages two different sets of cores, and those cores share the same instruction set. With a combination of Intel and ARM chips you could not build them on one chip (good luck getting Intel to agree to that), which means you couldn't have the hardware scheduler be aware of the ARM chip and intelligently shift workloads. You'd also have two chips that don't share the same instructions, so you couldn't switch between them on the fly. Thread migration would be impossible.

Naturally that would be the case. For example, OS X 10.4 Tiger flies on a Power Mac G4 with as little as 512 MB of RAM. But I think the real issue here is how ridiculous things have become in the endless quest for superior specifications, which makes the iMac Pro an absurd machine that is complete overkill.

The iMac Pro is not absurd or complete overkill for the crowd it is targeted at. For a regular consumer, absolutely, but for someone whose business's output is partially dependent on render speeds, there is no such thing as overkill.
 
It's not just about how fast it is, though. Existing software doesn't just work on an ARM chip. It's one thing to explain to the average Joe that the ARM Mac is slower but lasts longer; it's another to explain why 50% of his software won't work on it, and to give him the information needed to understand why. Different architectures are about more than just performance. It's an entirely different instruction set we're talking about, and even software that is compiled for both could behave differently.

Something similar to big.LITTLE is also more complicated than it seems. big.LITTLE works because it's a single SoC that manages two different sets of cores, and those cores share the same instruction set. With a combination of Intel and ARM chips you could not build them on one chip (good luck getting Intel to agree to that), which means you couldn't have the hardware scheduler be aware of the ARM chip and intelligently shift workloads. You'd also have two chips that don't share the same instructions, so you couldn't switch between them on the fly. Thread migration would be impossible.



The iMac Pro is not absurd or complete overkill for the crowd it is targeted at. For a regular consumer, absolutely, but for someone whose business's output is partially dependent on render speeds, there is no such thing as overkill.
So at what stage for whatever application does it become overkill then?
 
So at what stage for whatever application does it become overkill then?


Short answer; Never.

Longer answer:

Imagine, in a hypothetical scenario, that you get paid for each operation your processor completes. Surely you wouldn't say that there's a ceiling to how fast a computer you'd want then. The faster it is, the more money you make.

Now let's take financial modelling, because that industry actually works a lot like that. As of now it's not an industry suited to the iMac Pro specifically, but we're working our way up. Financial modelling is mostly single-threaded, and in this field they run processors at above 5 GHz to get the best single-threaded performance they possibly can; the faster you can process the data, the bigger the edge you have over your competition.

Similarly, there are multi-threaded workloads that benefit from as much hardware as you can throw at them. If you're paying a film editor to edit your movie, you're also paying him for the render times during which he basically isn't doing anything at all, just waiting for the computer to process data. If buying an iMac Pro for your studio cuts down the hours you pay this editor for, those savings are likely to exceed the cost of the iMac Pro itself. It also has the added benefit that you finish the project faster and get the return on your investment sooner, so you can start the next project.
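(To make that concrete, here's a toy break-even calculation; every number in it is made up purely for illustration.)

[CODE]
# Hypothetical numbers only: how long until saved editor hours pay for the machine.
editor_hourly_rate = 75.0      # assumed cost of the editor per hour ($)
hours_saved_per_week = 5.0     # assumed render time saved by the faster machine
machine_premium = 5000.0       # assumed extra cost of the faster workstation ($)

weeks_to_break_even = machine_premium / (editor_hourly_rate * hours_saved_per_week)
print(round(weeks_to_break_even, 1))   # ~13.3 weeks, after which it's pure savings
[/CODE]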

If you're working freelance, whether as a film editor or in any other field where the above applies, like programming, 3D modelling, etc., you could also benefit: you could finish the work for each client quicker and thus take on more clients.

And the expectations for our computational abilities also go up with time. A movie that used CGI in the '80s would have worse CGI than a movie using it now, not because today's artists are better, but because we have more computing power and it makes more sense to work with higher-fidelity particle effects and whatnot. And after a single artist has done his/her work on a single machine (which itself needs to be powerful to offer a good workflow), in the larger pro spheres that render job is then sent off to render farms with many, many more cores than 18.

There's also research. Several universities have quite powerful rigs that students and researchers can run simulations and calculations on. Some who are doing long-term research for the university get a personal machine to work on, and with a large enough grant to support their research, I'm sure many such researchers would happily take all the processing power they can get their hands on; in fact, I know of a fair few who'd prefer that to be a Mac. (A fair few meaning exactly three physicists who've done simulation work on non-Pro iMacs and the Mac Pro, respectively, in the past.)
There's no such thing as overkill for computer technology :p


I like the fact that when you quoted Guy Clark, you actually didn't quote him verbatim, but corrected the typo.... :p
 
If anyone installs this beta and has Photoshop CS6, can you please tell me what the plugin or whatever is called that the popup window shows? I didn't pay attention to it the first time and while Photoshop is a 64-bit app, it must use something that is 32-bit as the popup showed and it freezes when running with "nvram boot-args=-no32exec" enabled.
 
The performance hit comes from the OS having to keep both a 32-bit and 64-bit version of shared libraries in RAM.

Oh yeah, my 2010 Mac Pro with 128 GB of RAM feels really hit by the 32-bit libraries in RAM. It's 2018; [Apple], just put reasonable amounts of RAM into your computers and let people run whatever they need. If I want to run even a 16-bit app, so what? Today's computers have enough power to emulate/provide whatever would be needed.
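(If you're curious what that duplicated-libraries claim actually amounts to, here's a rough sketch. It assumes the High Sierra-era dyld shared cache location under /var/db/dyld, which may not match every system, and it only reports the on-disk size of the per-architecture caches that get memory-mapped into processes.)

[CODE]
# Rough look at the per-architecture dyld shared caches (assumed path below).
import glob
import os

CACHE_GLOB = "/var/db/dyld/dyld_shared_cache_*"   # assumption: pre-Big Sur layout

for path in sorted(glob.glob(CACHE_GLOB)):
    size_mb = os.path.getsize(path) / (1024 ** 2)
    print("%-60s %6.0f MB" % (path, size_mb))
[/CODE]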
 
If anyone installs this beta and has Photoshop CS6, can you please tell me what the plugin or whatever is called that the popup window shows? I didn't pay attention to it the first time and while Photoshop is a 64-bit app, it must use something that is 32-bit as the popup showed and it freezes when running with "nvram boot-args=-no32exec" enabled.


It's probably not what you're after, but even in CC, the updater and the desktop manager are 32-bit.

Have a look in System Information under Applications and sort by 64-bit. You can see all the programs that show "NO" in one nice view.

Rinse and repeat for frameworks.
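(If you'd rather script it than click through System Information, here's a rough sketch using the stock `file` tool. It assumes the usual Contents/Info.plist + Contents/MacOS/<CFBundleExecutable> bundle layout and just flags apps whose main binary has no x86_64 slice, so treat the output as a starting point rather than gospel.)

[CODE]
# Flag apps in /Applications whose main executable has no 64-bit (x86_64) slice.
import os
import plistlib
import subprocess

APPS_DIR = "/Applications"

for app in sorted(os.listdir(APPS_DIR)):
    if not app.endswith(".app"):
        continue
    bundle = os.path.join(APPS_DIR, app, "Contents")
    try:
        with open(os.path.join(bundle, "Info.plist"), "rb") as f:
            info = plistlib.load(f)
        exe = os.path.join(bundle, "MacOS", info["CFBundleExecutable"])
        arches = subprocess.check_output(["file", exe]).decode()
        if "x86_64" not in arches:
            print(app)   # likely 32-bit only
    except (OSError, KeyError, subprocess.CalledProcessError):
        pass             # skip bundles that don't follow the standard layout
[/CODE]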
 
It's probably not what you're after, but even in CC, the updater and the desktop manager are 32-bit.

Have a look in System Information under Applications and sort by 64-bit. You can see all the programs that show "NO" in one nice view.

Rinse and repeat for frameworks.
Ah. Also, the popup tells you which component PS is using that isn't updated, but I didn't pay attention to it and it only shows once. Thanks for that tip, btw.
 
I'm aware it's rarely just recompiling and nothing else (though sometimes it literally is). But the point remains that when it's open source you can rewrite the code and make it work.
Not...likely. ;) I mean, theoretically. But realistically you'd be reliant on the creators to know the codebase and what to do.

--Eric
 
Not...likely. ;) I mean, theoretically. But realistically you'd be reliant on the creators to know the codebase and what to do.


A good open source project is easily accessible, with well-commented code, an extensive changelog with version control, and a broad development team. So it shouldn't be hard to get your head around the codebase if it's a good project.
Not to say that someone who has already gotten intimate with it won't be able to make the changes much, much quicker, but it's not at all an impossible task, and even if you and I can't port a specific codebase to 64-bit operation, somebody will be able to.
 
People said the same thing when they went from 16-bit to 32-bit. Honestly, they did. Stubbornness about change just slows down progress. Just look at the cluster that is Windows, which still rocks a Program Files (x86) folder for legacy purposes.

Apple has always been happy to drop older standards; 32-bit is only the latest in a long list. They did this in iOS, and they already dropped 32-bit support/plug-ins years ago for their pro apps (LPX), so this shouldn't come as much of a surprise.

The reality is that 32-bit support is a handicap for OS X, but it is not for Windows. The 64-bit Windows kernel supports 32-bit apps directly. There are no performance or efficiency penalties to running 32-bit apps on Windows. It's a different story on the Mac, which is why Apple is pushing so hard to eliminate 32-bit support.
 
What is the actual wording of the new message? The only previous statement I'd seen from Apple said that there would be "compromises" with 32-bit apps; if they're being killed off completely then that's bigger news.

Edit: Apologies; the wording was shown in post 121 and I missed it.
 
The reality is that 32-bit support is a handicap for OS X, but it is not for Windows. The 64-bit Windows kernel supports 32-bit apps directly. There are no performance or efficiency penalties to running 32-bit apps on Windows. It's a different story on the Mac, which is why Apple is pushing so hard to eliminate 32-bit support.

Do you have documentation for this?
What is the actual wording of the new message? The only previous statement I'd seen from Apple said that there would be "compromises" with 32-bit apps; if they're being killed off completely then that's bigger news.


They said there'd be compromises after High Sierra. That doesn't mean 10.15 couldn't kill it off entirely.
 
Is anyone else seeing missing icons with the dark menu bar? Even the Apple logo is missing…

[Attached screenshots: Screen Shot 2018-01-27 at 11.54.37.png, Screen Shot 2018-01-27 at 11.56.27.png]
 
Yes, external GPU support is advertised on the macOS webpage, but you'll notice there is a small footnote 3 next to it, and if you scroll down, footnote 3 says "Planned for spring 2018." 10.13.4 is the spring release that includes that feature.

I hope it adds support for Nvidia too, so we have more options.
It uses Metal as its renderer. The speed difference diminishes at higher resolutions as well.



A lot of businesses rely on old software, and businesses generally don't like to update. Updating means new bugs and downtime, and you don't want downtime. Not only do I know businesses that rely on 32-bit binaries; I know businesses that keep computers from the early '80s around because their machinery only interfaces with the software on those computers.

Agreed. Microsoft is trying to get businesses to switch to Windows 10, and breaking compatibility with their applications is not a great way to accomplish that task.
 
I mentioned it once before, but I am going to do it again. We have less than 20 years to move away from 32-bit systems/code before we run into time/clock issues again. 20 years sounds like a long time, but it is better to start now rather than wait until sometime in 2037 and panic because we don't have the time or money to fix or replace it.
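(For anyone wondering what clock issue that is: it's presumably the Year 2038 problem, where a signed 32-bit time_t counting seconds since 1970 overflows. A quick sketch shows the exact moment.)

[CODE]
# A signed 32-bit time_t runs out at 2**31 - 1 seconds after the 1970 epoch.
from datetime import datetime, timezone

print(datetime.fromtimestamp(2**31 - 1, tz=timezone.utc))   # 2038-01-19 03:14:07+00:00
[/CODE]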
 
I hope it adds support for Nvidia too, so we have more options.

I'd imagine that could work with the web-drivers, but I don't think you'll get Apple to give you full Nvidia support in the immediate future. They've optimised around AMD GPUs for now, and seem to keep doing that for the moment. Final Cut is a good example, since it runs much, much better on an AMD GPU than an Nvidia one.
Is anyone else seeing missing icons with the dark menu bar? Even the Apple logo is missing…


Someone else mentioned this too, but it works for me.
 
Luckily, macOS can be virtualized these days, as I cannot see how to move away from a couple of indispensable 32-bit apps. I could do it, but I would lose time on it, as the alternatives are not as good.

The problem is that virtualized macOS does not support sound by default (and the workaround doesn't work very well here, at least, and I'm only running Snow Leopard in VMware Fusion), NOR does it offer any 3D hardware support whatsoever. It has about the same support as Windows 98, minus properly working sound. So no games are going to work worth a darn in a virtualized environment, and Mac gaming is where the most software will stop working. I know many on here couldn't care less about gaming, but some companies have gone to great lengths over the years to bring proper gaming to the Mac (especially Valve and Aspyr), and most of it will instantly stop working. You can HOPE that VMware or Parallels starts to offer better support for macOS virtualization, but you surely can't count on it.

I'm sure Steam will get updated, but what about all the games that run under it? The Mac needs a sandboxed 32-bit emulation layer so older software continues to function, even if a few things are lost. We'd still have Classic emulation today if Apple had purchased Rosetta instead of only temporarily licensing it! It's not like they couldn't have afforded to do so, and in turn all that Classic OS software wouldn't only be found in a museum or on old Macs.

Apple has had years to develop macOS, and all we got were more iOS-based features. They couldn't care less about how much application software the Mac has. Their only move has been to try to get you to buy it from them on their App Store (sales there are pitiful, as I understand it, so clearly most are NOT buying it there; Steam provided PC-compatible networked versions of Mac games for free that you could play online, while the Mac App Store provided broken networking, higher prices, and few buyers).

I am NOT a fan of Windows 10 (forced updates and Microsoft's built-in spyware), but this is the single worst move Apple could make for anyone who is more than a casual Mac user who only uses a browser, email, and the like, or Apple's own software packages.
 
Why would any app developer still keep a 32-bit version?
It's either updated to 64-bit or the software is too old to use. If you have relic software, then you can keep using it on relic hardware with a relic OS; it will still work. Only games will face this problem, as games don't get updated... at least not the retro ones.

I can't remember if we had the same situation with Windows when that moved to 64-bit... I guess that's easier because 32-bit *and* 64-bit can both coexist, *without a warning*.

Going to a website and checking, versus an in-your-face popup, puts the mind at ease better, I think.

But Apple does what it wants.

Not everyone has the patience to upgrade their apps to 64-bit either (if they're not there already), and some developers that have long retired them are not going to hunt around for them again because of this... More likely, they'll push users toward something else (because it's easier to do), unless it's a popular app.
 