Many of those applications you list already run on servers, either in the cloud or on premises, not on someone's desktop. No company or institution wants their research, product development, industrial operations, etc. impacted by someone losing a computer or a single computer dying.

Many of those applications also aren’t. There’s a difference between storing data on servers and computing on servers. Much engineering and scientific computing is done on the desktop (or distributed across many desktops using something like LSF) and not on a server.
 
Many of those applications also aren’t. There’s a difference between storing data on servers and computing on servers. Much engineering and scientific computing is done on the desktop (or distributed across many desktops using something like LSF) and not on a server.


Many companies I know do much of their computation in the cloud. It is just too easy to spin up a new server with a couple of clicks and get horsepower that would otherwise take a VP's signature, IT support, and days of paperwork to procure.

I work with biopharma companies that do most of their work in the cloud. They are developing new drugs and treatments and running scientific experiments, all in their private cloud. Having it in the cloud ensures the data and applications are always available to a researcher who might have a brilliant idea at 3 AM while on vacation.
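For a sense of what "a couple of clicks" amounts to in practice, here is a hypothetical sketch using the AWS CLI; the AMI ID, key name, and instance type are placeholders, not recommendations:

```shell
# Launch one large compute instance; all IDs below are placeholders.
aws ec2 run-instances \
  --image-id ami-xxxxxxxxxxxxxxxxx \
  --instance-type c5.24xlarge \
  --key-name researcher-key \
  --count 1
```

Tearing the instance down when the experiment ends is just as fast, which is the other half of the procurement argument.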
 
I hope you don't "re-imagine" it in a way that fundamentally changes how it operates on desktop/laptop computers. What I really would like to see is major work on optimizing and fixing what's already there. Wring out every last performance optimization, stability check, and compatibility check, and really fine-tune the existing codebase.

So if that's what you mean by "re-imagine", I'd agree. Every time Apple "re-imagines", they have to restart the entire optimization and fine-tuning process all over again. That's why Catalina has become a bit of a mess, as they write new functionality to move iPadOS apps and the like over to macOS.
 
Many companies I know do much of their computation in the cloud. It is just too easy to spin up a new server with a couple of clicks and get horsepower that would otherwise take a VP's signature, IT support, and days of paperwork to procure.

I work with biopharma companies that do most of their work in the cloud. They are developing new drugs and treatments and running scientific experiments, all in their private cloud. Having it in the cloud ensures the data and applications are always available to a researcher who might have a brilliant idea at 3 AM while on vacation.

And I know many companies that do computation on the desktop, because the data is always available on the network anyway, but distributed computation allows interactivity that you can’t get by dumping computation onto servers. Pretty much everyone in the semiconductor industry does it that way.
 
SOUND. If I had a dime for every time I heard that we don’t need higher clock rates or CPU power in general.
That person is definitely not a computer musician. They definitely don’t know what latency is. They haven’t had to make all sorts of compromises in their code to get their algorithm’s CPU usage low enough.

Live sound requires good coding, not CPU speed. These days, audio DSP latency is dominated by IO buffer sizes and filter lengths, not processing speed. A Raspberry Pi 4 can easily compute dozens of FFTs on multiple channels of 192 kHz audio in real time, if you can buffer the data in and out fast enough. A lot of live-performance pro audio processing boxes (autotuners, et al.) have used DSPs slower than the ARM core in a Raspberry Pi 4. And an Apple A-series chip is an order of magnitude faster (and has better audio IO).
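To put rough numbers on "dominated by IO buffer sizes" (the buffer lengths below are illustrative assumptions, not measurements from any particular device), the latency a buffer adds is simply its length divided by the sample rate:

```python
def buffer_latency_ms(frames: int, sample_rate_hz: int) -> float:
    """Latency in milliseconds added by one IO buffer of `frames` samples."""
    return 1000.0 * frames / sample_rate_hz

# At 192 kHz, even a generous 512-frame buffer costs under 3 ms per hop;
# the DSP math only has to finish within that window to keep up.
for frames in (64, 256, 512):
    print(f"{frames} frames -> {buffer_latency_ms(frames, 192_000):.2f} ms")
```

Seen this way, halving the buffer size does more for perceived latency than doubling the CPU speed, which is the point being made above.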
 
Many companies I know do much of their computation in the cloud. It is just too easy to spin up a new server with a couple of clicks and get horsepower that would otherwise take a VP's signature, IT support, and days of paperwork to procure.

I work with biopharma companies that do most of their work in the cloud. They are developing new drugs and treatments and running scientific experiments, all in their private cloud. Having it in the cloud ensures the data and applications are always available to a researcher who might have a brilliant idea at 3 AM while on vacation.

Well, I'm glad to know the health industry doesn't find privacy concerns important, but it's kind of neither here nor there.

It's not that "Apple missed the cloud revolution". It's that they've decided not to pursue a thin-client path. Why not? Because they'd be stupid to: the vast majority of their revenue comes from selling hardware. How do you keep hardware revenue high? By making good hardware, not by making the hardware cheap and low-end and barely good enough.

Apple's approach continues to be very "on-premise"-heavy, and they have no reason at all to change that. They can sell it (correctly) as a massive privacy advantage, too.
Live sound requires good coding, not CPU speed. These days, audio DSP latency is dominated by IO buffer sizes and filter lengths, not processing speed. A Raspberry Pi 4 can easily compute dozens of FFTs on multiple channels of 192 kHz audio in real time, if you can buffer the data in and out fast enough. A lot of live-performance pro audio processing boxes (autotuners, et al.) have used DSPs slower than the ARM core in a Raspberry Pi 4. And an Apple A-series chip is an order of magnitude faster (and has better audio IO).

But that still doesn't do you any good if you're running it In The Cloud™ as some are suggesting, because the latency would be untenable. So running it locally with adequate perf matters.
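A quick sanity check on "untenable" (the distances are assumed round numbers, and this ignores routing, queuing, and buffering, which only make it worse): even an ideal fiber round trip to a remote data center eats a live-audio budget of a few milliseconds:

```python
FIBER_KM_PER_MS = 200.0  # light in fiber travels roughly 200 km per millisecond

def ideal_rtt_ms(distance_km: float) -> float:
    """Best-case round-trip time to a server `distance_km` away."""
    return 2.0 * distance_km / FIBER_KM_PER_MS

print(ideal_rtt_ms(500))   # 5.0 ms, already most of a live-monitoring budget
print(ideal_rtt_ms(1500))  # 15.0 ms, clearly audible as delay
```

No amount of server horsepower can buy back that physics, which is why the computation has to stay local.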
 
And I know many companies that do computation on the desktop, because the data is always available on the network anyway, but distributed computation allows interactivity that you can’t get by dumping computation onto servers. Pretty much everyone in the semiconductor industry does it that way.

Curious. What would you run on individual desktop units that could not run in multiple VMs or containers? I have worked in the semiconductor field, at the semiconductor manufacturing level.
 
Curious. What would you run on individual desktop units that could not run in multiple VMs or containers? I have worked in the semiconductor field, at the semiconductor manufacturing level.
Anything interactive you don’t want the latency for. But we run interactive P&R, schematic capture, layout capture, automated P&R, DRC, LVS, static timing, buffer insertion, clock insertion, static checking, etc. all using desktop machines, sometimes via LSF when things are highly parallelizable. Pretty much the entire flow runs on the desktop machines, other than logic verification and a couple other things that just give you a true/false after 24 hours or whatnot.
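For readers unfamiliar with LSF, "via LSF when things are highly parallelizable" looks roughly like the sketch below; the `run_drc` tool, the `eda` queue, and the block names are hypothetical placeholders, not a real flow:

```shell
# Hypothetical sketch: farm out per-block DRC runs to an LSF cluster.
# 'run_drc', the 'eda' queue, and the block names are placeholders.
for block in cpu_core cache_ctrl io_ring; do
  bsub -q eda -n 4 -J "drc_${block}" -o "logs/drc_${block}.log" \
    run_drc --block "${block}"
done
```

Each desktop stays interactive while the embarrassingly parallel checks fan out across the farm.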
 
It's not that "Apple missed the cloud revolution". It's that they've decided not to pursue a thin-client path. Why not? Because they'd be stupid to: the vast majority of their revenue comes from selling hardware. How do you keep hardware revenue high? By making good hardware, not by making the hardware cheap and low-end and barely good enough.

Apple's approach continues to be very "on-premise"-heavy, and they have no reason at all to change that. They can sell it (correctly) as a massive privacy advantage, too.
Apple's revenue is indeed in selling hardware, and that's where their problem is, because they don't make (all) the software people need. Companies like Microsoft used to make a lot of money by selling software, but they discovered that they can make a lot more money if they have your data and you need a subscription to access it. They will include their online apps 'for free' with it. They don't care anymore whether a user runs Windows, Linux, or macOS, as long as they have a browser. Or do you really think Microsoft Office will exist as a local app in ten years' time?
And I'm not saying that this is a good thing (I don't think so), but it is certainly the direction we are all heading.
 
Apple's revenue is indeed in selling hardware, and that's where their problem is, because they don't make (all) the software people need. Companies like Microsoft used to make a lot of money by selling software, but they discovered that they can make a lot more money if they have your data and you need a subscription to access it. They will include their online apps 'for free' with it. They don't care anymore whether a user runs Windows, Linux, or macOS, as long as they have a browser. Or do you really think Microsoft Office will exist as a local app in ten years' time?

Depends on what you mean by "local app". They just doubled down on Office for iPad. That's very much a local app. They unified their macOS/Windows/iPad underlying architecture. Again, local. Yes, they have a web app that's increasingly powerful, but one doesn't exclude the other.

Do I think Word will run offline on a Mac ten years from now? Absolutely.
 
Anything interactive you don’t want the latency for. But we run interactive P&R, schematic capture, layout capture, automated P&R, DRC, LVS, static timing, buffer insertion, clock insertion, static checking, etc. all using desktop machines, sometimes via LSF when things are highly parallelizable. Pretty much the entire flow runs on the desktop machines, other than logic verification and a couple other things that just give you a true/false after 24 hours or whatnot.

Ok. This is package creation/assembly. Makes sense you need real time there. Most of my experience on the fab side was with research and supervisory systems. Much less time-critical.

Most of the other automation stuff I have done for biotech has also had dedicated PCs running robots, often unfortunately with software tied to a specific interface board and version of software.
 
Ok. This is package creation/assembly. Makes sense you need real time there. Most of my experience on the fab side was with research and supervisory systems. Much less time-critical.

Most of the other automation stuff I have done for biotech has also had dedicated PCs running robots, often unfortunately with software tied to a specific interface board and version of software.
Not package stuff - CPU design. But yeah.
 
I guess Catalyst is only a first step so developers can easily get their iOS apps to the Mac App Store. Why?
Because ARM Macs will be able to run iOS apps natively and that's why Catalyst is needed.
 
I guess Catalyst is only a first step so developers can easily get their iOS apps to the Mac App Store. Why?
Because ARM Macs will be able to run iOS apps natively and that's why Catalyst is needed.
Wrong. It's not about the processor, it's about frameworks and APIs. AMD64/x86 Macs run iOS apps natively as well, in the iOS Simulator (yes, it's running x86 natively, hence the name simulator rather than emulator); it just needs the iOS frameworks.
On ARM macOS the same principle applies, so no ARM advantage here whatsoever.
 
To respond simplistically, cloud computing works for the average user in the office, but not for people who have (for lack of a better term) specialized jobs that require minimal latency.

If this were true, the Chromebook would be much more popular than it is. I don't know a single person who works off of Chrome OS - but that's just my experience.

I personally do not like working with cloud based apps. They're too sluggish and unresponsive for my liking.

With more and more applications being cloud-based, you could wonder whether faster computers are really necessary. I have many customers where the only application they use on their computer is the browser. Everything else (mail, documents, financial applications, etc.) is a browser-based app (Google G Suite, Microsoft 365). And if you're only using a browser, then the ease of use of the OS and the speed and brand of the computer are not that important anymore. And when everything is browser-based, moving to ARM is not a big problem. The problem for Apple is why you should still use an Apple computer to open your browser.
Apart from that: just as Microsoft missed the mobile revolution, Apple missed the cloud revolution. iCloud is a big mess, and only for consumers. If you want to run your business on iCloud, it's just not possible.
So there you've got two challenges for Apple for how to stay relevant in a world that's moving to the cloud.
I'm a bit surprised Apple hasn't put more effort into that area. Early on, sound was one of Apple's focuses, even on the Apple ][.

It would seem developing high end studio solutions would be a natural market for the MP.

I'm not sure what you mean by this. Do you mean they should be producing audio interfaces and studio monitors for example? I'm not going to lie, seeing a set of black studio monitors with illuminated Apple logos below the woofer would be rather cool. :)

I feel Apple puts more resources into music production than most, particularly when you consider the development of Logic Pro. I'm using it more and more (over Pro Tools and Cubase etc) given its excellent value (no subscription, family sharing, free updates).
 
The K-12 education sector in the USA is dominated by ChromeOS use.
As a sidebar in response to your comment, in our local K-12 school system, the schools have Chromebooks and Macs for use during the day. Because of the COVID shutdown, the schools loaned the computers out to kids who did not have devices at home so they could participate in online learning. I know that the crates containing the Macs are empty. The kids wanted to use those devices. The Chromebook crates remain full and mostly unused even now. Perhaps I am being presumptuous, but I take this anecdote to mean that Apple does not need to worry about Chrome.
 
As a sidebar in response to your comment, in our local K-12 school system, the schools have Chromebooks and Macs for use during the day. Because of the COVID shutdown, the schools loaned the computers out to kids who did not have devices at home so they could participate in online learning. I know that the crates containing the Macs are empty. The kids wanted to use those devices. The Chromebook crates remain full and mostly unused even now. Perhaps I am being presumptuous, but I take this anecdote to mean that Apple does not need to worry about Chrome.
My wife was assigned a Chromebook by the school district she works at. She’s gone out of her way to be her own IT dept in order to use her 12” MB instead.
 
As a sidebar in response to your comment, in our local K-12 school system, the schools have Chromebooks and Macs for use during the day. Because of the COVID shutdown, the schools loaned the computers out to kids who did not have devices at home so they could participate in online learning. I know that the crates containing the Macs are empty. The kids wanted to use those devices. The Chromebook crates remain full and mostly unused even now. Perhaps I am being presumptuous, but I take this anecdote to mean that Apple does not need to worry about Chrome.
That is interesting.
 
Question: will the Windows world also switch to ARM? Or will it keep using Intel forever? 🤔
A lot of individual PC builders have already switched to AMD CPUs. The big computer companies like Dell and HP are also starting to add AMD chips to their products.
 
They did that in 1996, too. Then what happened?
Very true, but "past performance is no guarantee of future returns", and the AMD chips are excellent in terms of performance and thermals, as you know. If the PC makers decide to pair them with capable graphics cards, it may be a lasting game changer this time.
 
Very true, but "past performance is no guarantee of future returns", and the AMD chips are excellent in terms of performance and thermals, as you know.

They were, several times before. They never quite reached critical mass (or AMD management never quite knew how to take advantage of it).

And I really wouldn't extrapolate too much from Zen and Renoir in particular being great and Intel doing poorly. We might see things shift quite a bit with Tiger Lake and Alder Lake.
 