It looks like you think native support is any amount of work.

The reality is: You start Xcode, you go to project settings, you turn on the ARM architecture, build - that's it. Developers have the hardware for testing in their hands, so any reasonably important app will be ready when the first ARM Mac is released.
Here you were responding to a post about the need for native AS apps for major pieces of software like Adobe CC and MS Office.

I don't think it's as plug-and-play as you're portraying. I'm not a dev, but I've spoken with a couple of them who are working to create native versions of their apps for Apple Silicon, and they say it seems doable but challenging.

The challenges in porting large, complex software suites like CC and Office should be even more significant. Indeed, Apple reportedly began working with the makers of such suites (this has been specifically reported for Adobe CC) well before they publicly released the DTK, to ensure they would be fully ready by the time the first AS Mac was released.

This directly contradicts your plug-and-play picture: if it really were so easy to build native AS code for these complex software suites, the added lead time and added support from Apple would not be necessary.

This link provides a more real-world picture of what's involved for devs in porting their apps to AS:

 
Rocket Lake is 14nm, but with many of Tiger Lake's improvements (such as PCIe 4) backported.
From what I understand, Rocket Lake is essentially a 14nm Tiger Lake processor, but for desktops, not laptops.
 
From what I understand, Rocket Lake is essentially a 14nm Tiger Lake processor, but for desktops, not laptops.

Kind of.

It’s a bit like Comet Lake was to Ice Lake: same generation, but 14nm, higher yields, more cores, less advanced.

Alder Lake will hopefully unify this.
 
I think you're very mistaken if you think professionals want to switch to services and cloud and ARM based Operating Systems with less functionality. I think what we've seen if anything is people have realised they do not own their content on services.... so they want to keep everything local now.

I didn't suggest that professionals want cloud services. However, they are moving to cloud services.

This is probably because of a combination of what is being offered, the increasing importance of team tools, network effects that drag them along, and enterprises making the purchase decisions. I suppose if you narrow your definition of professionals and look only at the desktop market, you have a point, but more generally that's not where things are going.
 
It looks like you think native support is any amount of work.

The reality is: You start Xcode, you go to project settings, you turn on the ARM architecture, build - that's it. Developers have the hardware for testing in their hands, so any reasonably important app will be ready when the first ARM Mac is released.

In most cases, for well-written apps that use device-independent APIs, this will be the case.

The tricky bit is where apps are written using Intel-specific features. I've no idea what proportion of apps this applies to.
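
To give a flavour of what "Intel-specific" can mean, here's a minimal sketch of my own (not code from any app mentioned in this thread): hand-written SSE intrinsics exist only on x86, so an arm64 build needs a NEON or plain-C equivalent behind a compile-time check.

```c
#include <stddef.h>

#if defined(__x86_64__)
/* Intel Macs: SSE is always available, so code like this is common. */
#include <xmmintrin.h>
static void add_floats(float *dst, const float *a, const float *b, size_t n)
{
    size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        __m128 va = _mm_loadu_ps(a + i);      /* load 4 floats, unaligned */
        __m128 vb = _mm_loadu_ps(b + i);
        _mm_storeu_ps(dst + i, _mm_add_ps(va, vb));
    }
    for (; i < n; i++) dst[i] = a[i] + b[i];  /* scalar tail */
}
#elif defined(__aarch64__)
/* Apple Silicon: the SSE path won't even compile; it has to be ported to NEON. */
#include <arm_neon.h>
static void add_floats(float *dst, const float *a, const float *b, size_t n)
{
    size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        float32x4_t va = vld1q_f32(a + i);
        float32x4_t vb = vld1q_f32(b + i);
        vst1q_f32(dst + i, vaddq_f32(va, vb));
    }
    for (; i < n; i++) dst[i] = a[i] + b[i];
}
#else
/* Portable fallback for any other architecture. */
static void add_floats(float *dst, const float *a, const float *b, size_t n)
{
    for (size_t i = 0; i < n; i++) dst[i] = a[i] + b[i];
}
#endif
```

Apps that stick to device-independent APIs (e.g. Apple's Accelerate framework) avoid this entirely, which is presumably why the "just rebuild" story holds for most of them.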
 
That's always something where conspiracy theories fail. "They will do A, B and C to achieve X". "They don't need A, B and C to achieve X". But recognising that requires logical thought. "They will switch to ARM to lock down the Mac". "They don't need to switch to ARM to lock down the Mac". "They will develop a vaccine and use it to inject us all with microchips". "If there was a microchip small enough to be injected, which doesn't exist, you wouldn't need to put it into a vaccine".

Well, RFID chips for animals are "injected", so it's certainly conceivable you could inject a microchip into a muscle (but not into a vein...). And vaccination would be a good vehicle for carrying out such a personalized implant...

Without entering into any conspiracy discussion, there are valid reasons to chip humans, such as carrying medical alert information for patients with dementia, children, or anyone else who can't communicate. It gets a bit more questionable if they were used to track people's movements, but it seems valid as a means of ID - although arguably retina scanning is less prone to falsification and is less intrusive.
 
No, switching won't happen. Market share for PCs will start to decline soon, because of Chrome and Apple. It doesn't hurt that Intel's line-up is completely incomprehensible.

Professional software is moving to services so that they can charge more. More processing will occur in the cloud, which helps all companies lock in customers. Consumer software is already there.

There is still modest growth in the PC market according to Gartner (https://www.gartner.com/en/newsroom...rew-2point8-percent-in-second-quarter-of-2020), possibly helped by the uptake of remote working due to Covid-19.

I would agree that professional software is increasingly based on cloud services (SaaS & PaaS), which will potentially reduce the number of fully-fledged PCs (desktops, workstations & laptops) over time, or at least allow these to run more minimal operating systems, as Chromebooks do.

I think we will see an even greater reduction in the proportion of desktops & workstations, because fewer applications will require powerful local machines.

Apple Silicon Macs align well with this model at the lower end of the range (the machines that sell the most), and with Apple's own declared strategy of making iOS devices "primary computing devices" for most people (i.e. Steve Jobs's car and truck analogy).

With work becoming increasingly mobile (or remote), workers will want something that is portable and sufficiently powerful, and the operating system will become less important - because more apps will be cloud-based.
 
I think you're very mistaken if you think professionals want to switch to services and cloud and ARM based Operating Systems with less functionality. I think what we've seen if anything is people have realised they do not own their content on services.... so they want to keep everything local now.


Ok show me a benchmark of the iPhone 12 beating Threadripper?

This is not my experience in the world of enterprise computing.

Far more productivity applications are cloud-based (SaaS) now, such as office suites (Office 365, Google Apps, Apple apps), mail (Gmail, Outlook, iCloud), expenses & time (Xero etc.), wikis (Confluence, SharePoint etc.), task management (Asana, Trello, Jira), communication (Zoom, Skype, Slack, Teams), CRM (e.g. Salesforce), service management (e.g. ServiceNow), financials (Oracle etc.), HR, storage (OneDrive, Dropbox, Google Drive, iCloud, Amazon...), and diagrams (draw.io, Lucidchart).

I do a fair bit of development and IT infrastructure design and management - pretty much all of it runs in the cloud. My build processes? AWS CodePipeline, Jenkins, BitBucket, Bamboo. My code repos? GitHub. My hardware? AWS, MS Azure, Google Cloud Platform. I could even use IDEs as cloud services (e.g. Cloud9), although I prefer to run these locally out of habit.

Just looking at my installed apps on my MacBook Pro, there aren't many that I couldn't find a SaaS alternative for. Maybe Final Cut Pro and DaVinci Resolve (video editors) and Apple Photos - photo editing is just about OK online with a fast connection, but video editing (especially in 4K) still needs a powerful local machine. Some audio or other media software too. But if I didn't have these, I could realistically do 95% of my technical work on a Chromebook with external screens and a good network connection.

And I am not alone in this trend. A large part of my business is cloud migrations for enterprises. Non-IT companies don't want to manage their own IT infrastructure and locally installed software - they want to focus on their core business and are very happy to move all of the above-listed types of application to external services. Even governments (including defence) are happy to move to cloud providers with adequate security certifications.

I'd be happy to see some evidence of more people wanting to run and store their applications locally, but that is not the industry trend I am observing. Which is good for me, because this is my business :)
 
So the rumor is that Tiger Lake-H will go up to 8 cores, and default to 35W, configurable up to 65W.

I was expecting it to go only to 6. Maybe Rocket Lake-H will go up to 10?

Anyway, a Tiger Lake-H with 8 cores could be a contender for one last 16-inch MacBook Pro upgrade. It'd be a fairly major bump in single-threaded performance (I'm guessing somewhere around 20-30%), a massive improvement in the integrated GPU (which means the discrete one needs to spin up less, which helps battery life, too), and various features like faster memory (supposedly from 2933 MHz to 3200), PCIe 4.0 (Apple might put a 6 GiB/s SSD in!), Thunderbolt 4, etc.
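
For what it's worth, a ~6 GiB/s SSD is plausible on the link math alone (my own back-of-envelope, assuming the usual x4 SSD link): PCIe 4.0 signals at 16 GT/s per lane with 128b/130b encoding, so

$$
4~\text{lanes} \times 16~\text{GT/s} \times \frac{128}{130} \times \frac{1}{8} \approx 7.9~\text{GB/s}
$$

of raw x4 bandwidth, roughly double the ~3.9 GB/s ceiling that PCIe 3.0 x4 imposes on today's fastest drives.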
 
Well, RFID chips for animals are "injected", so it's certainly conceivable you could inject a microchip into a muscle (but not into a vein...). And vaccination would be a good vehicle for carrying out such a personalized implant...

Without entering into any conspiracy discussion, there are valid reasons to chip humans, such as carrying medical alert information for patients with dementia, children, or anyone else who can't communicate. It gets a bit more questionable if they were used to track people's movements, but it seems valid as a means of ID - although arguably retina scanning is less prone to falsification and is less intrusive.


 
This is not my experience in the world of enterprise computing.

Far more productivity applications are cloud-based (SaaS) now, such as office suites (Office 365, Google Apps, Apple apps), mail (Gmail, Outlook, iCloud), expenses & time (Xero etc.), wikis (Confluence, SharePoint etc.), task management (Asana, Trello, Jira), communication (Zoom, Skype, Slack, Teams), CRM (e.g. Salesforce), service management (e.g. ServiceNow), financials (Oracle etc.), HR, storage (OneDrive, Dropbox, Google Drive, iCloud, Amazon...), and diagrams (draw.io, Lucidchart).
Just because something uses cloud storage doesn't mean it doesn't require a local application. A prime example is Office 365. I don't know anyone who uses the web-based versions for serious work, and the Office apps have fairly beefy requirements in terms of memory and CPU (especially for long Word documents and large Excel spreadsheets). I also want to be able to work offline, e.g. when traveling or visiting a customer.

I also use my MacBook for development work, including Linux VMs, all local on the machine. Again, being able to work anywhere without depending on a good Internet connection is often important for me.

I think reports of the death of PCs are greatly exaggerated. ;) We've been hearing about thin clients for decades now, yet Chromebooks haven't made much of an impact outside schools. Rather, what's happening is that ultra-thin but full-featured laptops are getting more and more powerful. Tiger Lake is a good example of that trend.
 
Just because something uses cloud storage doesn't mean it doesn't require a local application. A prime example is Office 365. I don't know anyone who uses the web-based versions for serious work, and the Office apps have fairly beefy requirements in terms of memory and CPU (especially for long Word documents and large Excel spreadsheets). I also want to be able to work offline, e.g. when traveling or visiting a customer.

I also use my MacBook for development work, including Linux VMs, all local on the machine. Again, being able to work anywhere without depending on a good Internet connection is often important for me.

I think reports of the death of PCs are greatly exaggerated. ;) We've been hearing about thin clients for decades now, yet Chromebooks haven't made much of an impact outside schools. Rather, what's happening is that ultra-thin but full-featured laptops are getting more and more powerful. Tiger Lake is a good example of that trend.

I agree that the experience with many apps is better when they run as local apps, including MS Office. But this doesn't require a very powerful machine. It runs quite well even on a VDI desktop.

My response was aimed at @scaramoosh's comments:

1) "I think you're very mistaken if you think professionals want to switch to services..."

I disagree - I see a lot of businesses and professional users switching to cloud services, in all of the areas that I highlighted. Maintaining employee machines and software, and network, storage & compute infrastructure is a large effort requiring full-time technical staff ($$). Renting services that someone else manages is generally more cost-effective.

I recently joined a new IT company, and pretty much all of my daily software is service-based:
Google G Suite for e-mail, shared storage and Documents
Atlassian Confluence for project documentation
AWS for development / test environments
GitHub for source code control
Xero for expenses and timesheets
EmploymentHero for HR
Slack & Google Meet for team comms

The company doesn't have any of its own IT infrastructure at all (we are all BYOD) - and it's an IT company!



2) "... and cloud and ARM based Operating Systems with less functionality. "

Why does cloud or an ARM-based OS imply less functionality? Have a look at AWS, MS Azure or GCP and tell me they're less functional than running your own data center. You don't have to choose "managed services" with limited flexibility - you are totally free to install pretty much anything on the VMs they provide (with a few exceptions, like cryptocurrency mining and security threats).

An ARM-based Mac running macOS should be identical in functionality to the Intel version.

I don't think the "less functionality" statement has any validity.

3) "I think what we've seen if anything is people have realised they do not own their content on services.... so they want to keep everything local now."

We need to define "ownership" here. You don't actually own locally installed software in the legal sense - you license it. It's probably similar with other commercial media such as music and video. Are people worried that they don't "own" the media they consume on Netflix or Spotify? Not that I've seen. Ownership of physical media (e.g. DVD, Blu-ray, CD) is steadily declining, especially in the younger demographic.

A lot of folks are pretty happy to keep all of their photos on cloud services too, although I would recommend local backups for anything important. Data is probably much safer on cloud services which have multiple replicas and better security than many people are prepared to implement themselves. Maybe one exception is if you have vast amounts of local data that would be expensive to store online - but then you have to manage backups and replicas yourself.

So the statement "so they want to keep everything local now" doesn't appear to hold much water.
 
I agree that the experience with many apps is better when they run as local apps, including MS Office. But this doesn't require a very powerful machine. It runs quite well even on a VDI desktop.
Try opening one of the big 3GPP standard documents (which are published in .doc format) on a slow computer. ;) Even on a fairly well-equipped MacBook Pro this can be torture ...
 
Try opening one of the big 3GPP standard documents (which are published in .doc format) on a slow computer. ;) Even on a fairly well-equipped MacBook Pro this can be torture ...

I hear you... big Word docs can be huge resource hogs. Makes me wonder what the hell is going on. I've opened million-line files with "vi" on a Linux box and not had any problems.
 
I agree that the experience with many apps is better when they run as local apps, including MS Office. But this doesn't require a very powerful machine. It runs quite well even on a VDI desktop.

My response was aimed at @scaramoosh's comments:

1) "I think you're very mistaken if you think professionals want to switch to services..."

I disagree - I see a lot of businesses and professional users switching to cloud services, in all of the areas that I highlighted. Maintaining employee machines and software, and network, storage & compute infrastructure is a large effort requiring full-time technical staff ($$). Renting services that someone else manages is generally more cost-effective....

I think this pretty much sums it up. As you said, performance is often better locally. Hence the reason IT departments are switching to cloud-based apps isn't that it's better for the user; it's that it's easier for them. With cloud-based apps they don't have to worry about maintaining Suite X on many different local machines, often with different hardware and configurations, and thus with many different sets of issues. And pros who work in businesses run by those IT depts. have no choice but to go along.

I.e., while there are obvious exceptions (you have a job that requires a computer cluster, or your local machine is old and slow), I'd generally characterize cloud computing as convenience, consistency, and cost (for businesses) winning out over convenience and performance (for their employees).

The above were general comments. But let me also offer a specific example that I acknowledge won't have general applicability: Some of the computations I run in Wolfram Mathematica take several hours to complete (I'll just run those overnight). By contrast, for Mathematica online, Wolfram has a 10-minute computation time limit (and most computations in Mathematica are not parallelizable, so if it's several hours on my machine, it's also going to be in the hours range on their cluster). You can request additional computation time, but who wants to bother with that each time?

And it's not just apps that often perform better locally; this applies to storage as well. For one of my jobs, I'm required to save all files on a shared drive, which makes perfect sense. But it's also a PITA. When I try saving to the shared drive through their VPN, it takes several seconds, during which the document is unavailable. For my workflow, I want to be able to save repeatedly while I'm working on the document without any interruption, which I can only get locally, where saves are essentially instantaneous. Thus, if I'm going to be spending a day working on a document, I'll download it from the shared drive, work on it locally, and save it back to the shared drive at the end of the day, rather than working on it directly on the shared drive.

In many ways cloud computing seems like a regression to me, back to the days before we had PCs, when one used a dumb terminal connected to a mainframe.
 
Maintaining employee machines and software, and network, storage & compute infrastructure is a large effort requiring full-time technical staff ($$)
So what are your thoughts on the younger generation entering the workforce who are interested in network or system admin? Would you still recommend that field, or do you think cloud computing will make those roles obsolete? @theorist9
 
I think this pretty much sums it up. As you said, performance is often better locally. Hence the reason IT departments are switching to cloud-based apps isn't that it's better for the user; it's that it's easier for them. With cloud-based apps they don't have to worry about maintaining Suite X on many different local machines, often with different hardware and configurations, and thus with many different sets of issues. And pros who work in businesses run by those IT depts. have no choice but to go along.

I.e., while there are obvious exceptions (you have a job that requires a computer cluster, or your local machine is old and slow), I'd generally characterize cloud computing as convenience, consistency, and cost (for businesses) winning out over convenience and performance (for their employees).

The above were general comments. But let me also offer a specific example that I acknowledge won't have general applicability: Some of the computations I run in Wolfram Mathematica take several hours to complete (I'll just run those overnight). By contrast, for Mathematica online, Wolfram has a 10-minute computation time limit (and most computations in Mathematica are not parallelizable, so if it's several hours on my machine, it's also going to be in the hours range on their cluster). You can request additional computation time, but who wants to bother with that each time?

And it's not just apps that often perform better locally; this applies to storage as well. For one of my jobs, I'm required to save all files on a shared drive, which makes perfect sense. But it's also a PITA. When I try saving to the shared drive through their VPN, it takes several seconds, during which the document is unavailable. For my workflow, I want to be able to save repeatedly while I'm working on the document without any interruption, which I can only get locally, where saves are essentially instantaneous. Thus, if I'm going to be spending a day working on a document, I'll download it from the shared drive, work on it locally, and save it back to the shared drive at the end of the day, rather than working on it directly on the shared drive.

In many ways cloud computing seems like a regression to me, back to the days before we had PCs, when one used a dumb terminal connected to a mainframe.

You make some very good points, and the last sentence struck a chord.

I'm a big proponent of cloud computing, but accept that my view comes from my professional perspective and readily admit that it's not a silver bullet for all cases. However, the user experience is getting much better, and it will become much more widespread.

I generally deal with the creation or migration of enterprise applications to public cloud platforms. In nearly all cases, these applications would already be running on physical or virtual servers running in a data center (or on-site server room) and managed either directly by the owners, or by contracted service providers. The applications are mostly multi-tier web-apps using back-end services such as databases, network storage, integration services and virtualized networks. There are some apps that use remote desktops or Citrix-like interfaces.

For a lot of "business" software, particularly software that has any customizations from the standard product, this kind of remote access has been the norm for many years. It relies on low-latency network connections, but the actual performance can be very fast, because it depends on the power of the back-end servers (which can be scaled appropriately to give fast response times) and the speed of the client interface (often the web browser), which is also pretty snappy these days. There is no inherent reason why this should be any slower than software installed locally, assuming sufficient network speeds.

Consider streaming on-line gaming services such as Shadow, which provide an experience close to running a graphically intensive game installed locally. Remote desktops (Virtual Desktop Infrastructure) can be a bit underwhelming sometimes compared to a fast local machine, but they're mostly quite good if you give them enough memory. As you say, this is very similar to the client-server apps that used to run on mainframes.

Where cloud computing is less compelling is in cases where high storage or data bandwidth is required, i.e. moving large amounts of data very fast. Cloud computing is great where most of the data stays on the back-end server (which has fast direct or network storage), but if you have to push it to and from your client computer over the internet, it falls short in some cases. Saving a few documents to Dropbox, OneDrive, iCloud etc. might work quite well, but editing a 50GB video file will not be a great experience for most of us without a 5 Gbps internet connection (which almost nobody has).
Additionally, the cost of storing data online (currently) exceeds the cost of local storage by a fair margin. I recently bought an 8TB HDD for about $200, which I would hope will last about 10 years. Online storage over 10 years would cost a lot more - and be limited by my internet bandwidth of c. 5 MB/s, vs. 200 MB/s on my local machine - no contest!
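
As a rough comparison (the cloud rate is my assumption of typical object-storage list pricing at the time, around $0.02 per GB-month; actual tiers vary):

$$
\frac{\$200}{8~\text{TB} \times 10~\text{yr}} \approx \$2.50~\text{per TB-year}
\qquad\text{vs.}\qquad
\$0.02/\text{GB-mo} \times 1000~\text{GB} \times 12 \approx \$240~\text{per TB-year},
$$

and the bandwidth gap is similar: at 5 MB/s, a 50GB file takes 50,000 / 5 = 10,000 s (nearly 3 hours) to move, versus about 4 minutes at 200 MB/s locally.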

I'm not familiar with Wolfram Mathematica, but I'm surprised it only has a 10-minute online computation time limit. I'm used to cloud platforms being scalable "on demand" to whatever you need (provided you want to pay for it). A 128-vCPU machine with 2TB of RAM? No problem - yours for about $5/hour. Again, there is a break-even point depending on your computational needs. If we need to run a 16-core desktop machine flat out for 10 hours a day, 300 days a year, for 3 years, then it probably makes financial sense to own the machine. If you only need those 16 cores for a few weeks or months, then it's almost certainly cheaper to "rent" the time.
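
Putting rough numbers on that break-even (the 16-core hourly rate is my own illustrative assumption, not any provider's quote):

$$
10~\tfrac{\text{h}}{\text{day}} \times 300~\tfrac{\text{days}}{\text{yr}} \times 3~\text{yr} = 9000~\text{h},
\qquad
9000~\text{h} \times \$0.70/\text{h} \approx \$6300,
$$

which is well beyond the price of owning a comparable 16-core workstation, whereas a few hundred hours of rental costs only a small fraction of one.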

You are absolutely correct, that cloud computing is about providing benefits to businesses:
1) Reduce capital costs (buying expensive hardware up front) and convert to operational costs.
2) Greatly reduced management and administration overhead - up-time, security, patching, backup etc. all become someone else's problem (for a modest price)
3) Much easier centralized client management - only need to have a basic machine image with minimal client software
4) Cheaper client machines - most of the work is done server-side.

For typical business productivity tasks, the user experience isn't bad, and is only likely to get better as network infrastructure improves.

Having said all the above, I personally like to have a powerful client computer, because I hammer my machine with memory, storage and processor-intensive applications such as video editors, but most business users are not doing this during working hours ;-)
 
So what are your thoughts on the younger generation entering the workforce who are interested in network or system admin? Would you still recommend that field, or do you think cloud computing will make those roles obsolete? @theorist9

A lot of system and network administration requires a similar or identical skill set to that used in cloud platform administration.

You still need to understand networking topology and configuration, security, storage, operating system administration and service administration (e.g. databases, SaaS applications).

If you know how to administer Windows Server and Active Directory, you would be able to transfer those skills to Microsoft Azure for example.

Applications deployed to cloud services on a Windows or Linux VM will behave almost exactly the same as those on a physical server or desktop.

If I were entering the job market as a system admin, I would make sure I gained experience in cloud platforms in addition to any on-site skills. 95% of Fortune 500 companies are using at least some cloud services from Amazon, Google or Microsoft, and this will extend into small and medium businesses over time.

I think there will be a big decline in on-site IT administration though. Apart from local networking, some file servers, and printers, there may not be much left to manage, and none of this is exactly cutting edge tech. There is a relentless move to a small number of public cloud providers that already consume the majority of global data center capacity and these are expanding rapidly. One report by Cisco claims that by 2021 94% of all compute workloads will be in public cloud data centers rather than traditional, owner-managed centers (https://www.zdnet.com/article/cloud...-traditional-data-centers-within-three-years/).

Another factor is a potential change in working practices, kick-started by the Covid-19 pandemic. We may well see a trend to more remote working and smaller, decentralized offices or hubs, that don't require any of the traditional office IT infrastructure - or people to manage them.

There will still be a lot of client machines: desktops, laptops (the majority) and, increasingly, tablets or other thin clients, but there isn't a lot of well-paid professional work in maintaining these (Apple Genius Bar worker? :) ).

This doesn't mean there will be fewer jobs in IT systems administration - far from it - just that you will need a good knowledge of the cloud platform providers (e.g. AWS, Google, MS Azure, Rackspace etc.) as well as the underlying technologies (Linux, Windows, networks, databases, security, storage, application platforms).
 
A lot of system and network administration requires a similar or identical skill set to that used in cloud platform administration.

You still need to understand networking topology and configuration, security, storage, operating system administration and service administration (e.g. databases, SaaS applications).

If you know how to administer Windows Server and Active Directory, you would be able to transfer those skills to Microsoft Azure for example.

Applications deployed to cloud services on a Windows or Linux VM will behave almost exactly the same as those on a physical server or desktop.

If I were entering the job market as a system admin, I would make sure I gained experience in cloud platforms in addition to any on-site skills. 95% of Fortune 500 companies are using at least some cloud services from Amazon, Google or Microsoft, and this will extend into small and medium businesses over time.

I think there will be a big decline in on-site IT administration though. Apart from local networking, some file servers, and printers, there may not be much left to manage, and none of this is exactly cutting edge tech. There is a relentless move to a small number of public cloud providers that already consume the majority of global data center capacity and these are expanding rapidly. One report by Cisco claims that by 2021 94% of all compute workloads will be in public cloud data centers rather than traditional, owner-managed centers (https://www.zdnet.com/article/cloud...-traditional-data-centers-within-three-years/).

Another factor is a potential change in working practices, kick-started by the Covid-19 pandemic. We may well see a trend to more remote working and smaller, decentralized offices or hubs, that don't require any of the traditional office IT infrastructure - or people to manage them.

There will still be a lot of client machines: desktops, laptops (the majority) and, increasingly, tablets or other thin clients, but there isn't a lot of well-paid professional work in maintaining these (Apple Genius Bar worker? :) ).

This doesn't mean there will be fewer jobs in IT systems administration - far from it - just that you will need a good knowledge of the cloud platform providers (e.g. AWS, Google, MS Azure, Rackspace etc.) as well as the underlying technologies (Linux, Windows, networks, databases, security, storage, application platforms).
Thanks!
 
The above were general comments. But let me also offer a specific example that I acknowledge won't have general applicability: Some of the computations I run in Wolfram Mathematica take several hours to complete (I'll just run those overnight). By contrast, for Mathematica online, Wolfram has a 10-minute computation time limit (and most computations in Mathematica are not parallelizable, so if it's several hours on my machine, it's also going to be in the hours range on their cluster). You can request additional computation time, but who wants to bother with that each time?

Given the single-processor benchmarks of the M1, Apple's new Macs sound like spectacular machines for running Mathematica. Stephen Wolfram gave a detailed description of his work computing environment back in February 2019. At that time, the link to Apple's "Mac Pro" webpage was to the 2013 "trash can" Mac. He had the loaded version: 12 cores, D700 GPU, and 64GB of RAM. OTOH, any of those M1 machines would run circles around this setup. Stephen may have upgraded to the current Mac Pro, but I tend to wish he would just hop to a maxed-out M1 Mac mini. He could use an M1 MacBook Air for traveling and his walks in the woods:

[Image: 07-popcorn-rig1.png]


Besides trimming seconds from his workflows, we would all benefit from Stephen's excursions into Metal and Neural Engine code.

Does the M1 change your equations for running Wolfram in the cloud? It would seemingly be better to run the computations locally. OTOH, 60 seconds of CPU time should deliver far more computation as Wolfram updates their cloud resources to Apple Silicon or comparable server-based cloud resources.

I'd love to hear your feedback when Wolfram releases a Universal 2 version of their code and you get to run some trials on an M1 machine. In any case, Intel's September slogan of "The World’s Best Processor for Thin-and-Light Laptops" is now expired.
 
Given the single-processor benchmarks of the M1, Apple's new Macs sound like spectacular machines for running Mathematica. Stephen Wolfram gave a detailed description of his work computing environment back in February 2019. At that time, the link to Apple's "Mac Pro" webpage was to the 2013 "trash can" Mac. He had the loaded version: 12 cores, D700 GPU, and 64GB of RAM. OTOH, any of those M1 machines would run circles around this setup. Stephen may have upgraded to the current Mac Pro, but I tend to wish he would just hop to a maxed-out M1 Mac mini. He could use an M1 MacBook Air for traveling and his walks in the woods:

[Image: 07-popcorn-rig1.png]


Besides trimming seconds from his workflows, we would all benefit from Stephen's excursions into Metal and Neural Engine code.

Does the M1 change your equations for running Wolfram in the cloud? It would seemingly be better to run the computations locally. OTOH, 60 seconds of CPU time should deliver far more computation as Wolfram updates their cloud resources to Apple Silicon or comparable server-based cloud resources.

I'd love to hear your feedback when Wolfram releases a Universal 2 version of their code and you get to run some trials on an M1 machine. In any case, Intel's September slogan of "The World’s Best Processor for Thin-and-Light Laptops" is now expired.
His website made calculus-based physics back in college more tolerable.
 
Given the single-processor benchmarks of the M1, Apple's new Macs sound like spectacular machines for running Mathematica. Stephen Wolfram gave a detailed description of his work computing environment back in February 2019. At that time, the link to Apple's "Mac Pro" webpage was to the 2013 "trash can" Mac. He had the loaded version: 12 cores, D700 GPU, and 64GB of RAM. OTOH, any of those M1 machines would run circles around this setup. Stephen may have upgraded to the current Mac Pro, but I tend to wish he would just hop to a maxed-out M1 Mac mini. He could use an M1 MacBook Air for traveling and his walks in the woods:

[Image: 07-popcorn-rig1.png]


Besides trimming seconds from his workflows, we would all benefit from Stephen's excursions into Metal and Neural Engine code.

Does the M1 change your equations for running Wolfram in the cloud? It would seemingly be better to run the computations locally. OTOH, 60 seconds of CPU time should deliver far more computation as Wolfram updates their cloud resources to Apple Silicon or comparable server-based cloud resources.

I'd love to hear your feedback when Wolfram releases a Universal 2 version of their code and you get to run some trials on an M1 machine. In any case, Intel's September slogan of "The World’s Best Processor for Thin-and-Light Laptops" is now expired.
I don't know where WRI is on a native version of MMA for AS. They say they're doing compatibility testing now, but don't say whether this is for a native version or through Rosetta 2. But based on discussion on the Wolfram forums, it seems it's the latter, since the one benchmark that has been posted uses that.

Someone on the Mathematica Stack Exchange mentioned there's not yet a native open-source Fortran compiler for AS (the only one available is from NAG, and WRI might not want to pay the fee). I don't know if this affects MMA's performance.

Once they do release a native version, it will of course be very interesting to see how it benchmarks on AS. Though regardless of how well it performs, I don't think that will change my current view of running in the cloud - it's unlikely they'll replace their servers with AS anytime soon (I don't know if they even use macOS on their servers). Thus, if anything (assuming you have an AS Mac), it becomes even more desirable to run locally. And even if they do eventually upgrade their servers to AS, and even if AS is, say, 50% faster, 2/3 of several hours is still several hours, which means the cloud remains unsuitable.
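
To spell out the arithmetic behind that 2/3:

$$
t_{\text{new}} = \frac{t_{\text{old}}}{1.5} = \frac{2}{3}\,t_{\text{old}},
$$

so a 50% speedup turns, say, 6 hours into 4 - still far beyond a 10-minute window.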
 
Someone on the Mathematica Stack Exchange mentioned there's not yet a native open-source Fortran compiler for AS (the only one available is from NAG, and WRI might not want to pay the fee). I don't know if this affects MMA's performance.

Once they do release a native version, it will of course be very interesting to see how it benchmarks on AS. Though regardless of how well it performs, I don't think that will change my current view of running in the cloud - it's unlikely they'll replace their servers with AS anytime soon (I don't know if they even use macOS on their servers).
Why do you think a Fortran compiler is required to accomplish a native build of Mathematica? Wolfram has had the Wolfram Player running on iOS/iPadOS since 2017. That was a port to the very same instruction set that Apple is running on the M1 processor. How did they get that to run, and why wouldn't they do exactly the same thing to build for macOS?

Thus, if anything (assuming you have an AS Mac), it becomes even more desirable to run locally. And even if they do eventually upgrade their servers to AS, and even if AS is, say, 50% faster, 2/3 of several hours is still several hours, which means the cloud remains unsuitable.

Exactly. That's what I said.
 