I’m at a university, we don’t have people who write apps for us or enough money to buy new things :(
My iMac Pro should last another 5-6 years, and hopefully in that time x86 emulation on Apple silicon will be fast enough to run a virtual machine. I do have physical hardware running XP as a backup, but it's much easier, and I prefer, to do the work on my Mac.
No money to "buy new things". Right. But I bet there is money to pay salaries. And to waste money on iMac Pro. Fire at least one guy at the IT Dept, preferably the manager, since he allowed this very sorry state of affairs, good riddance. That will free some wasted budget.
 
I have a very complicated system (screen comes down from the ceiling, front projector and flat screen, overhead lighting, etc.)

$10 remote won’t cut it - you’d have to press a dozen buttons just to power on the amp, select the source, select which display to use, adjust the lighting, etc. My wife would divorce me.
Here I was thinking you were Tim Cook's charismatic alter ego with all your Apple evangelism.

Or maybe he has D.I.D. and one of his personalities has a wife?


5% of my work requires Windows, 10% requires 32-bit apps, and 10% requires copious amounts of RAM. Right now a 2019 iMac with Mojave handles all that.

This M1 is fantastic but even if it had 10x the performance at 1% of the cost all at 0.5w TDP, so long as it means I can't do 25% of my work and thus requires a 2nd machine, it's worthless to me.
 
They've been beating the ARM world for years. Android tech has always had "better" tech on paper. More cores. More GHz per core. More RAM. And always lost.
"More cores. More GHz per core. More RAM" - all of that was papering over the cracks of the deficiencies/inefficiencies in the OS and SoCs. Classic tactic.
 
You say "repairability" but you really mean "can stick more ram in it and a faster hard drive later on to save me money"

I don't understand the obsession with being able to upgrade computers. There's literally no other piece of electrical equipment I own that I can upgrade; many of my single-purpose pro devices that cost far more than my Apple devices (which do 1000 things) can't be upgraded. Neither can my £4000 OLED TV, or the PS5, Xbox, audio interface, hi-fi components, £1000 Sennheiser headphones, my fridge, my freezer, my cooker, my heat pump, I could go on and on. In all those things I have to buy the spec I want at the time of purchase and stick with it. Just because computers could be fiddled with in the '80s doesn't mean it's something that should stay forever.
The examples you cited are pretty much all single-purpose machines. And it's not true that you can't upgrade your fridge. I did with my fridge's fan and compressor. Even headphones can be upgraded with better ear cushions and cables. I could go on and on.

Most people are in the habit of constantly replacing things, not knowing that most things can be fixed, upgraded, refurbished, reused, or have their parts replaced.
 
I'm talking about multicore score / number of cores. Multicore is the only measure of relevance here.

EDIT: Like this:

Mac mini (Late 2020)
Apple M1 @ 3.2 GHz (8 cores) 7643

Mac mini (Late 2018)
Intel Core i7-8700B @ 3.2 GHz (6 cores) 5476

7643 / 8 = 955
5476 / 6 = 913

A 5% difference for a machine that is 2 years newer. So what is the hoopla about, again??
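For what it's worth, the per-core arithmetic above is easy to reproduce (the scores are the Geekbench 5 multicore numbers quoted in the post):

```python
# Reproducing the per-core math from the post (Geekbench 5 multicore scores).
scores = {
    "M1 Mac mini (Late 2020)": (7643, 8),
    "i7-8700B Mac mini (Late 2018)": (5476, 6),
}

per_core = {name: total / cores for name, (total, cores) in scores.items()}
for name, value in per_core.items():
    print(f"{name}: {value:.0f} points per core")

ratio = per_core["M1 Mac mini (Late 2020)"] / per_core["i7-8700B Mac mini (Late 2018)"]
print(f"M1 per-core advantage: {(ratio - 1) * 100:.1f}%")  # ~4.7%
```

(Dividing a multicore score by the core count also glosses over the M1's 4 performance + 4 efficiency core split, which is part of what the reply below is getting at.)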
Well, for Harmony, Animate CC, and After Effects, single-core performance is key; beyond 4 cores it doesn't matter, as renders and in-app performance are non-parallel tasks. Of course, a lot of video and audio editors would take advantage of multicore. But Intel only gained about 50% in single-core performance in 10 years, so this is a huge step forward.

Anyway, your math is off base, as it is not simply 8 vs 6 cores anymore. Different tech, so you should compare ALL of it:
15 W vs 45 W
Real multicore advantage
Cheaper chip!!
Think about it: comparing price, heat, and energy, for the same cost as that i7 you could fit a 12-core M1-class chip in a space where Intel was already exhausted.
The 2020 Mac mini is much better, and it's cheaper, runs cooler, and saves energy and money.
 
Yep. I think people are forgetting that the whole hybrid Mac/PC Bootcamp thing was never a fundamental part of the Intel transition game plan anyway. More like Apple saw an obvious advantage that came along with switching to x86, and chose to exploit it officially as a bonus feature instead of just ignoring it. Why not tempt some potential customers who would normally never even look at a Mac if the capability for booting natively into Windows was literally already built in?

End of the day, it was just a capability that was available for a time due to a (temporary) convergence of architectures. Just like Hackintoshes also have been an (unofficial) thing all this time too. If being able to sell natively dual-boot capable machines was a key strategy for the Mac, they'd have never even considered moving away from Intel. But clearly: no, it wasn't.
You're wrong. Bootcamp was there since the very beginning, having been introduced in Mac OS X Tiger, the first Mac OS X to work on the Intel architecture. It certainly is not a "bonus" feature as supporting Bootcamp requires writing all the custom drivers that go along with it. It's revisionist history you're espousing here.

You have to realize Apple wasn't in the same dominant position back in 2005 that it is in now. They almost had to do it in order to bring in new converts from the PC world. It was an advertised feature of the Macintosh. And strategies do change, but it was part of their grand strategy at the time.
 
Of course, RAM in the same package as the SoC has numerous technical advantages. First, you can GREATLY reduce the power of the drivers (on both the RAM and the SoC) because they have far less capacitance to drive. (Capacitance is proportional to the distance between the CPU and the RAM.) Though that undersells it, because motherboard traces are also much wider (and taller) than the in-package traces, which means it's really a much bigger improvement.

Additionally, latency is greatly reduced. It takes signals around 6ps per mm to travel within the package. Slightly slower on the PCB (depending on dielectric materials, etc.). But that’s a lot more millimeters to travel.

That also means that the penalty of a cache miss is much higher. So to keep performance reasonable, if memory is not inside the package, apple would probably have had to increase the size of the L2 cache by quite a bit. Which would, of course, take more die area, burn more power, etc.

In fact, distance between the RAM and CPU is so important that I wrote a section in my PhD dissertation on a scheme to stack the RAM together with the CPU in order to minimize that distance.
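To put the 6 ps/mm figure above in perspective, here is a back-of-the-envelope flight-time comparison. The two distances are illustrative round numbers, not measured trace lengths from any actual Apple design:

```python
# One-way signal flight time at ~6 ps/mm (the in-package figure quoted above).
# Distances are illustrative, not measurements.
PS_PER_MM = 6.0

def flight_time_ps(distance_mm: float) -> float:
    return distance_mm * PS_PER_MM

in_package = flight_time_ps(10)  # ~10 mm: die to in-package DRAM
on_board = flight_time_ps(50)    # ~50 mm: across a motherboard to a DIMM
                                 # (PCB propagation is actually a bit slower per mm)
print(f"in-package: {in_package:.0f} ps  on-board: {on_board:.0f} ps")
```

Even before driver, termination, and controller delays are added, the raw flight time alone grows several-fold when the RAM moves off-package.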
So M-series will definitely take a hit either on performance or power efficiency if Apple ever moves the RAM outside of the package. It also stands to reason that, based on what you said, the advantages of Apple Silicon will rapidly diminish on high-end machines where performance is of more importance than power efficiency.

I'd be really surprised if Apple can offer the same kind of battery life on ASi-based MacBook Pro 16" while maintaining the same kind of lead on performance they currently enjoy with the ASi-based MacBook Air.
 
The M1 doesn't really approach backwards compatibility; it's all done in software - by macOS.

Thanks for the explanation and sorry if I wasn't more specific.
My question was regarding how ARM handles backwards compatibility with previous versions of the ARM architecture and instruction set.

One of the limitations regularly cited for Intel is the baggage of the x86 design, and that even if they invested tons of money in new designs they would still never be able to match ARM in the long run, because they need to keep compatibility with existing software.
They tried creating the Itanium and did a terrible job of emulating existing x86, probably because they had to do it in real time, unlike Rosetta 2, which can do ahead-of-time translation because Apple also controls the operating system.

I was curious whether, on the ARM side, things are more flexible and have evolved over time, or whether it also has to carry some baggage from when it was first designed in 1985. For example, the M1 only needs binary compatibility with 64-bit iOS apps dating from 2013.
 
I'd be really surprised if Apple can offer the same kind of battery life on ASi-based MacBook Pro 16" while maintaining the same kind of lead on performance they currently enjoy with the ASi-based MacBook Air.
Oh ye of little faith. :) I bet an AS-based 16” does both. Bear in mind that the 16” has a 72% larger battery than the 13” so there is some headroom.
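The headroom arithmetic is easy to check. The "72% larger" figure matches roughly 100 Wh (16" MBP) vs ~58 Wh (13"); the runtime and wattage below are illustrative assumptions, not Apple specs:

```python
# Battery headroom sketch. Capacities are approximate published figures;
# the runtime and power-draw numbers are illustrative assumptions.
battery_13_wh = 58.0
battery_16_wh = 100.0
print(f"capacity ratio: {battery_16_wh / battery_13_wh:.2f}x")  # ~1.72x

# If a 13" M1 machine averaged ~5 W over ~11.6 h of mixed use, a 16"
# could sustain ~72% more average draw for the same runtime:
runtime_h = 11.6
print(f'13" budget: {battery_13_wh / runtime_h:.1f} W  '
      f'16" budget: {battery_16_wh / runtime_h:.1f} W')
```

So a beefier chip could draw meaningfully more average power and still match the smaller machine's battery life.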
 
No money to "buy new things". Right. But I bet there is money to pay salaries. And to waste money on iMac Pro. Fire at least one guy at the IT Dept, preferably the manager, since he allowed this very sorry state of affairs, good riddance. That will free some wasted budget.
The iMac Pro is my personal machine, which I bought because I wanted a really hefty Mac. I use it for Logic Pro in my personal life. I also use it for work.

You're serious?

The applications/code you need to run on Windows XP to spit out data cannot run, or has not yet been made to run, on Windows 8 or Windows 10 thus far? Is the company that sourced such tools dead? Have you searched any particular Windows tech forums/sites digging for a more modern solution?

Last time I heard of something like this it was with IBM's mainframe 3270 sessions and even those were running in a hosted VM environment and employees of Burlington Northern accessed it via legacy browser support over a decade ago. Took about 4yrs and finally they were able to move away from that.

Right now this is more than just a technical sticking point; it's a critical matter, as Windows XP is not receiving any support whatsoever, so if anything bad happens in that environment ... sorry to say, but that's your butt out of the frying pan and into the fire.

Yes, this is correct. We have a number of tools running Windows XP, Windows 3, and even one running OS/2 Warp. Semiconductor fabrication tools are expensive and are sold with dedicated control computers. The tools last >30 years, and manufacturers generally do not support computer/software upgrade paths for older pieces of equipment. We do lose old computers, but replace them with similar vintage when necessary (purchased used from eBay). But again, it's cost-prohibitive, and frankly unnecessary, for a university to replace working equipment just because the computers are out of date.
 
A computer is something I live with for half-decades or more. I don't agree with your "I like to pay today for what I may or may not need tomorrow or a few years from now" argument (even if Apple gives me no choice... this is all academic).


No. I refuse to live like you.

Ok, when it comes to Apple I do, but that's because I have no choice. Doesn't mean I have to like it.

At least we can agree that macOS, iPadOS, and iOS, etc., are worth it. Yes?

Fair, but I don't like to keep a computer for more than 3 years (generally 1). There's more resale value, and you lose a LOT less money if you sell something when it's still valuable. You get the newest tech quicker, you're always up to date, and within 3 years the computer industry has moved on a lot; you're missing out on all sorts of things you can't just "upgrade" to - there's more to a computer than just RAM and SSD.

I live with my music production gear and my OLED TVs longer than I live with the same Macs.
 
I am thoroughly perplexed by those who claim that their unique workflows (which wouldn't have worked on the Air, Pro 13, or Mini even before the M1 models) somehow make the M1 models irrelevant. The point is that the M1 models of those computers are faster and more efficient than the Intel versions - by a significant amount. Full stop. No need to go further with the analysis.

But if you also cannot extrapolate the M1 to a higher TDP with more CPU cores, GPU cores, or unified memory to understand where this goes then you really shouldn’t be commenting.
 
So M-series will definitely take a hit either on performance or power efficiency if Apple ever moves the RAM outside of the package. It also stands to reason that, based on what you said, the advantages of Apple Silicon will rapidly diminish on high-end machines where performance is of more importance than power efficiency.

I'd be really surprised if Apple can offer the same kind of battery life on ASi-based MacBook Pro 16" while maintaining the same kind of lead on performance they currently enjoy with the ASi-based MacBook Air.

I’m not sure how you reach the conclusion about it taking a hit on high-end machines. It will do fine on high end machines.

If the memory is moved off the package, then, yes, power consumption will go up because the drivers will have to have increased current. But given the TDP of high end machines, this would be a small percentage of that. Additionally, latency will increase greatly. But the cure for that is to increase the sizes of the caches. This results in a smaller number of RAM accesses, so the penalty is cancelled out. Of course that means the caches consume more power, but, again, on a high performance machine you have a higher TDP.
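The cache trade-off described above is just the standard average memory access time (AMAT) relation; here is a toy comparison with made-up numbers, not measurements of any Apple part:

```python
# Average memory access time: AMAT = hit_time + miss_rate * miss_penalty.
# All values are illustrative placeholders.
def amat_ns(hit_time_ns: float, miss_rate: float, miss_penalty_ns: float) -> float:
    return hit_time_ns + miss_rate * miss_penalty_ns

# On-package RAM: short traces, low miss penalty.
on_package = amat_ns(3.0, 0.05, 60.0)
# Off-package RAM: higher miss penalty, partially cancelled by a larger
# (slightly slower, more power-hungry) cache that cuts the miss rate.
off_package = amat_ns(4.0, 0.03, 100.0)

print(f"on-package: {on_package:.1f} ns  off-package + bigger cache: {off_package:.1f} ns")
```

The bigger cache claws back most of the lost latency, at the cost of die area and power, which is exactly the trade the post describes.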

As for the 16” MBP, all they need to do is add more RAM *in* the package, add more I/O bandwidth for more ports, and probably add 2 or 4 CPU cores. Since the battery in a 16” is bigger, they will still have amazing battery life, and the performance enhancements will keep the performance lead.

The one big question I have is re: graphics. There is a “discrete” (separate chip) GPU in the works, but I don’t know if they are reserving that for the desktop or if it will make an appearance in high end MBPs. If not, then I suspect they may also add a couple graphics cores to the ”M1x” for the 16” MBP.
 
Many still need Windows for work. MacBooks offered the perfect solution for those who wanted a Mac for personal use while not needing to own a separate Windows system for work. I know it'll be a few years before the full transition, but it already has businesses wary of investing time and money into Macs that will offer fewer options and less flexibility, and consumers don't want to spend $2,499+ on a MacBook Pro when they will need a Windows system as well.

I would think if Windows is needed for work, then work would provide the computer. I used to use my MBP (early 2011, 8GB RAM, 512GB SSD; the GPU has since died) for work all the time at home, because I just needed to be able to run the Office apps and VPN/VNC. That worked well. But a few years back my work decided they didn't want any non-work-managed computers to be able to connect to work via VPN, so I had to start bringing my work laptop home every day. In theory you can bring your own laptop into work and have certificates and other software installed on it so work can manage it, but I didn't want to go that route. So at home I have both my work Dell and my MBP connected to a monitor/keyboard via a KVM switch. (I used to do that anyway, but since the GPU died on the MBP I can't connect to external monitors anymore. Looking forward to more M1 releases!!)
 
5% of my work requires Windows,
Out of curiosity, for what applications do you require Windows?
10% requires 32-bit apps,
I presume you mean macOS 32-bit apps? What applications do you require that were never upgraded? I presume they are completely unsupported at this point, does it seem like a business risk to be dependent on software long dead?

and 10% requires copious amounts of RAM. Right now a 2019 iMac with Mojave handles all that.
How much RAM do you need? How much do you have on your iMac?
This M1 is fantastic but even if it had 10x the performance at 1% of the cost all at 0.5w TDP, so long as it means I can't do 25% of my work and thus requires a 2nd machine, it's worthless to me.
If it had 10x the performance at 1% the cost a QEMU emulator would solve all your problems. :) However, no matter what, I would be looking for replacements for the unsupported macOS software as at some point, that is going to be a problem for you.
 
The iMac Pro is my personal machine, which I bought because I wanted a really hefty Mac. I use it for Logic Pro in my personal life. I also use it for work.
It is a great machine. I got my B/F’s hand-me-down when he moved to the Mac Pro.
Yes, this is correct. We have a number of tools running Windows XP, Windows 3, and even one running OS/2 Warp.
Seems like an opportunity for someone (I mean that seriously). One of two solutions: Reverse engineer the software and port it to something modern or figure out how to support them via something like QEMU on modern hardware that is still supported.
Semiconductor fabrication tools are expensive and are sold with dedicated control computers. The tools last >30 years, and manufacturers generally do not support computer/software upgrade paths for older pieces of equipment. We do lose old computers, but replace them with similar vintage when necessary (purchased used from eBay). But again, it's cost-prohibitive, and frankly unnecessary, for a university to replace working equipment just because the computers are out of date.
I am curious what you use this fab gear for. Given its age (anything supporting Windows 3 is ancient), you are not researching fab techniques, and it seems that teaching people how to fab chips with ancient tech would have issues. Given the cost to maintain this gear and all the hazmat issues, I wonder why you do not outsource the fab rather than doing it in house. You would have access to more modern processes, etc., with fewer issues.
 
You're wrong. Bootcamp was there since the very beginning, having been introduced in Mac OS X Tiger, the first Mac OS X to work on the Intel architecture. It certainly is not a "bonus" feature as supporting Bootcamp requires writing all the custom drivers that go along with it. It's revisionist history you're espousing here.

You have to realize Apple wasn't in the same dominant position back in 2005 that it is in now. They almost had to do it in order to bring in new converts from the PC world. It was an advertised feature of the Macintosh. And strategies do change, but it was part of their grand strategy at the time.
While Apple's Mac product line is surely in a much better position than it was back in 2005, I'd say a market share of under 10% is far from being in a "dominant position". Being able to run Windows will still attract a lot of potential customers from the Wintel camp; however, I guess the advancements in virtualization technology and raw hardware power in the past decade may have made Bootcamp obsolete. Dual-booting was never an elegant solution, but running Windows in a VM back then was just not practical for even the most basic everyday tasks. Right now, with a 13" MacBook Pro 2018, I'd say running Windows in a VM already offers a better user experience than dual-booting with Bootcamp, although it's still unclear how Apple Silicon will handle running Windows in a VM.
 
Fair, but I don't like to keep a computer for more than 3 years (generally 1). There's more resale value, and you lose a LOT less money if you sell something when it's still valuable. You get the newest tech quicker, you're always up to date, and within 3 years the computer industry has moved on a lot; you're missing out on all sorts of things you can't just "upgrade" to - there's more to a computer than just RAM and SSD.

I live with my music production gear and my OLED TVs longer than I live with the same Macs.
So what you're talking about is more similar to leasing your computer devices. I've always been the type to always buy, whether cars or phones or PCs.

I just don't want to lease, but I have to admit that due to Apple's hardware decisions (since 2012 at least) it might be the better option, and something I will consider doing going forward.

My needs have stabilized as I increasingly become an empty-nester, and my Apple devices become single-user ones.

That and the fact that Apple's prices for storage have been getting (slightly) less ridiculous in recent years.
 
Clearly there are some enthusiasts that do it. But IMHO there is no way that anything like 1 in every 100 Mac users uses Boot Camp. I'd be extremely surprised if it's even as high as 1 in 1000.
Not sure we have the same definition of "enthusiasts", but IMHO enthusiasts will own a second machine if they want to play games or do serious work in Windows. Or they will run Windows in a VM if it's for running a certain piece of specific software.

As far as I can see, a lot of Bootcamp users are those who like the Mac's hardware design and quality but are still used to the Windows UI and software, so they buy a Mac but still mainly use Windows. And there could be an increasing number of those kinds of people recently, as more and more Macs are sold in countries like India and China.
 
I'm already envisioning a 12" MacBook - similar in appearance to the 2017 model, but with a less bezel-ly mini-LED screen, PCIe 4.0 drives, and a 1.5x perf improvement over the M1, because it will ship with an M2...

But they will still charge $200 for an 8GB memory upgrade...
 
While Apple's Mac product line is surely in a much better position than it was back in 2005, I'd say a market share of under 10% is far from being in a "dominant position". Being able to run Windows will still attract a lot of potential customers from the Wintel camp; however, I guess the advancements in virtualization technology and raw hardware power in the past decade may have made Bootcamp obsolete. Dual-booting was never an elegant solution, but running Windows in a VM back then was just not practical for even the most basic everyday tasks. Right now, with a 13" MacBook Pro 2018, I'd say running Windows in a VM already offers a better user experience than dual-booting with Bootcamp, although it's still unclear how Apple Silicon will handle running Windows in a VM.
I never said Apple is in a dominant position in the PC market (even now). I meant dominant as a tech company. Back in 2005, Mac was still Apple's main business so they had to do whatever they had to do to grow. This ties in with the suggestion that Bootcamp wasn't and couldn't be just a bonus feature.

You seem to have missed my main point entirely. It was never a debate on Bootcamp vs. Dual-booting. It was a debate on whether Bootcamp was "a fundamental part of the Intel transition game plan." The fact is without an Intel chip, you can do neither dual-booting nor virtualization. Parallels Desktop even allows you to start up Windows that's on your Bootcamp drive. They're, for all intents and purposes, one and the same as they rely on the same underlying technology. The kind of technology that's not available on ASi-based devices.

Virtualization isn't the same as emulation. It's just not possible for Apple Silicon to "handle running Windows in VM."
 
I was curious whether, on the ARM side, things are more flexible and have evolved over time, or whether it also has to carry some baggage from when it was first designed in 1985. For example, the M1 only needs binary compatibility with 64-bit iOS apps dating from 2013.
You’re right, it’s flexible for Apple in that they don’t HAVE to support the 32-bit instructions. So, they’re 64-bit clean :D
 
And they don’t need to be.

Premiere Pro may be undeniably powerful, but that is also its weakness, because it continues to be horribly unoptimised for pretty much every platform (often relying on pure specs to bulldoze its way through), and its full-featuredness just means added bloat and complexity for those who don't need the added functionality.

I believe there continues to be a sizeable user base who will benefit from more lightweight video editors like LumaFusion and Final Cut Pro. This is where Apple is uniquely positioned to cater to a user base that is being overserved by Premiere.
I highly doubt Premiere’s user base is underserved. It’s a full-featured editing program. The people that have to use Premiere for work don’t care about LumaFusion or Final Cut.
 
So what you're talking about is more similar to leasing your computer devices. I've always been the type to always buy, whether cars or phones or PCs.

I just don't want to lease, but I have to admit that due to Apple's hardware decisions (since 2012 at least) it might be the better option, and something I will consider doing going forward.

My needs have stabilized as I increasingly become an empty-nester, and my Apple devices become single-user ones.

That and the fact that Apple's prices for storage have been getting (slightly) less ridiculous in recent years.

Well, I mean, in a way - but then if you sell anything, haven't you always leased it? Obviously it's not leasing, as it's mine; I pay for it and I can do what I want - I just choose to sell it while it still has a decent market value and upgrade. It saves me money through my business too. I've always wanted to stay on top of technology anyway; I'm not a fan of using 7-year-old tech, I feel it really begins to show. The longest I've ever owned an Apple product is probably the 2011 Mac mini; technically I've lost all my money on it, as now it's just sat on the floor unused and it's not worth enough to bother selling.
 