I'm still on 2011/2012 Mac hardware because of Apple's move towards non-repairability back then.

And I RAGED against that machine until maybe last year.

But even my stubborn jackass self had to concede that the repairability ship had sailed and just wasn't coming back to Apple.

Finally, I came to realize that I love Apple products more than I (STILL) hate the fact that they're sealed, so it's going to be (more) expensive going forward, but at least for me it'll still be worth it (until it isn't).
If you have a very legitimate reason to "bend over," as I call it (not insulting you, simply pointing out being locked into that situation), then you have to stay.

I simply can't justify it anymore.

And as I always say, I currently have way too much hardware from them, but none of the disposable models.
 
They can more closely tune their silicon for real-life or specialized software. Intel will do this for some customers. I'd love hardware to do Huffman decoding or JPEG zig-zag decoding, and I don't see why you couldn't have transistors to do it, but it's too specialized for general-purpose CPUs.
Imagine if Apple starts putting those afterburners in the 16” Pro
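For what it's worth, the JPEG zig-zag scan mentioned above is a good example of how simple some of these candidate operations are: it's just a fixed index permutation that a small fixed-function unit could bake into hardware. A minimal Python sketch (an illustration only, nothing Apple-specific):

```python
def zigzag_indices(n=8):
    """Return the (row, col) visit order for an n x n JPEG-style zig-zag scan."""
    order = []
    for s in range(2 * n - 1):
        # Each anti-diagonal has a constant row + col == s.
        diag = [(r, s - r) for r in range(n) if 0 <= s - r < n]
        if s % 2 == 0:
            # Even diagonals run bottom-left to top-right.
            diag.reverse()
        order.extend(diag)
    return order

# Flattened indices into an 8x8 block; the sequence starts 0, 1, 8, 16, 9, 2, ...
flat = [r * 8 + c for r, c in zigzag_indices()]
```

In software this is a table lookup per coefficient; in silicon it could be hard-wired routing, which is exactly the kind of thing that's cheap in transistors but too niche for a general-purpose CPU's instruction set.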
 
As far as I can tell, there has been no support at all for Windows XP since April 2014. Running critical infrastructure on such a system is a bad idea even if the device is not net-connected, and is simply malpractice if it is (a net-connected system with no security updates for six years is beyond problematic).
Microsoft was selling Custom Support Agreements for Windows XP as recently as 2017. I don't know if they still are, but it wouldn't surprise me. Windows 7 reached End of Extended Support in January this year, but Microsoft sells Extended Security Updates for that too. Many large enterprises buy that (sadly). It's a similar model for the Windows Server operating systems.

These things are expensive though, and the costs increase very quickly.
 
I'm still on 2011/2012 Mac hardware because of Apple's move towards non-repairability back then.

And I RAGED against that machine until maybe last year.

But even my stubborn jackass self had to concede that the repairability ship had sailed and just wasn't coming back to Apple.

Finally, I came to realize that I love Apple products more than I (STILL) hate the fact that they're sealed, so it's going to be (more) expensive going forward, but at least for me it'll still be worth it (until it isn't).
Non-repairable hardware is going to go bye-bye on the PC side eventually too, IMO.
 
Far more - I ran an entire business just selling already set up Boot Camp installations - a LOT of people are coming over from Windows or want a Windows option on the Mac.
It's been my experience as well. I'm the go-to guy for Mac-related things in my circle, and although many people don't end up using Boot Camp or VMs all that much, they take comfort in me telling them that their Mac has that capability.
 
There is precedent for it. I forget which manufacturer it was, but one of them was found to be "cheating" at benchmarks.

Indeed, some were detecting a benchmark and clocking higher, or something like that.

But if Apple added custom silicon to run Geekbench faster, which they could, would that be cheating any more than adding designs in the silicon to run Final Cut faster?
If you look at all the programs people run and alter your silicon to run the most popular ones faster, is that cheating, or simply designing your chip to run the most popular/used titles as fast as it can?
It's a debatable point..... :)
 
In any event, they have power well under control, and there will be no problem easily scaling the M1 to fit into even their highest-end desktops (i.e. something like an iMac Pro or Mac Pro).
I'm actually very curious about the mid-range lineup between the Mac mini and Mac Pro. It's almost 2021 and ultra-widescreen monitors are the future. Personally, I will never go back to a small screen, and I would have absolutely no use (or space) for a 27" iMac. But I DO have use for a powerful Mac. Just not for the price of a new car.
 
As far as I can tell, there has been no support at all for Windows XP since April 2014. Running critical infrastructure on such a system is a bad idea even if the device is not net-connected, and is simply malpractice if it is (a net-connected system with no security updates for six years is beyond problematic).

Speaking as someone in an industrial machine control function: there are still XP machines on factory floors (hell, there are still DOS machines on factory floors). Someone other than me set them up, but I can get MS support for the ones I deal with. The computers are literally from that time period, but they aren't used for office work or data entry; they control servo motors and positioning. And as long as a computer can be repaired or adapted to run XP, most of these businesses will keep running them. The total machine can cost several hundred thousand dollars or more. Even paying 10 grand and getting back into production is cheap.
 
Please have realistic expectations and educate yourself about how heavy some workloads are if you want to talk tech ✌️
I was just thinking the same thing about your post.

It is obvious that you completely missed the point of my response. Laugh all you want, but this is an under-15-watt-TDP CPU/iGPU achieving results on par with a 75-watt-TDP dGPU. This is Apple's first iteration of a CPU/iGPU designed to go into desktop and laptop computers. Staggering results.
 
Why the big deal about upgradeability? This is nothing new for Apple, and many other hardware vendors. The form factors of thin'n'light machines often preclude removable components. Larger devices or desktop workstations provide an upgrade path should you prefer this to portability.

I view these computers as appliances that will have a certain lifespan, much like my home appliances or my car. They will either develop faults that are not economical to repair, or I will desire greater performance. If the latter I will sell or gift the device to someone who will be satisfied with it. Everything has a lifespan (us included!) and sometimes upgrading doesn't make much sense. For example, I have a 2012 Dell Xeon workstation, which I could technically upgrade but not to the latest Xeon CPUs, or make use of newer PCIe4 GPUs. Sure I could replace the motherboard, CPU & RAM and re-use my existing GPU & SSD/HDDs, but this would probably cost more than just buying a new machine.

Maybe the Mac Pro will offer plug-in cards with multiple SoCs, and this is the upgrade path?
Why do people continue to make excuses for Apple?

There are absolutely no reasons to justify this.

For example, Intel NUCs are faster and WAY smaller than Mac minis, and yet you can upgrade the memory and SSD.

HP Z Mini workstations are the same size as a Mac mini, yet they can be upgraded without tools.

Dell OptiPlex All-in-One, same thing: same dimensions as an iMac, yet everything can be upgraded/replaced.

I know, they don't have the magical macOS, but I am willing to leave the magical kingdom for that.
 
But if Apple added custom silicon to run Geekbench faster, which they could, would that be cheating any more than adding designs in the silicon to run Final Cut faster?
If you look at all the programs people run and alter your silicon to run the most popular ones faster, is that cheating, or simply designing your chip to run the most popular/used titles as fast as it can?
It's a debatable point..... :)
You mean, like Apple could for example include an AI chip in the design to run the AI tests of Geekbench much faster than any other PC which relies on AI support in an external graphics card?

Oh wait...
 
You mean, like Apple could for example include an AI chip in the design to run the AI tests of Geekbench much faster than any other PC which relies on AI support in an external graphics card?

Oh wait...

Well, simply that: you naturally will want to design a chip to run the things you want it to run as well as it possibly can.
You could say it's cheating.
But you could also simply say it's being optimized for the selection of functions you wish it to perform well in.
 
(FCP X is the only disaster I recall, when they lost half an industry to Adobe and other NLEs overnight.)
But now there are more seats of FCP than at the height of FCP 7. It paid off in the end.

Do you have a source for that 50% datapoint? I don't think I've ever seen that stated.
3 minutes, 8 seconds in, and you can find similar statements every year for the last several years.
 
You mean, like Apple could for example include an AI chip in the design to run the AI tests of Geekbench much faster than any other PC which relies on AI support in an external graphics card?
Adding units that are useful for accelerating commonly done tasks is good. Adding units that would only work for the benchmark test would be bad. :) I vaguely remember someone had a compiler that cheated by recognizing the benchmark and then just substituting the result with an appropriate time delay. I cannot remember who it was and I could not find it in a quick search.
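The cheat in that anecdote boils down to special-casing a known input instead of doing the real work. A toy sketch in the same spirit (entirely hypothetical; `KNOWN_BENCHMARK_INPUT` and `cheating_sort` are made up for this example and don't refer to any real compiler or vendor):

```python
# Toy illustration of the "recognize the benchmark, skip the work" cheat.
KNOWN_BENCHMARK_INPUT = list(range(1000))  # the workload the cheat recognizes

def cheating_sort(data):
    """Sort a list, but special-case the benchmark input for an instant 'win'."""
    if data == KNOWN_BENCHMARK_INPUT:
        return data[:]       # recognized the benchmark: return a canned answer
    return sorted(data)      # honest path for every other input
```

The benchmark sees a blazing-fast result while real workloads get no benefit at all, which is what makes it cheating rather than optimization.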
 
Adding units that are useful for accelerating commonly done tasks is good. Adding units that would only work for the benchmark test would be bad. :) I vaguely remember someone had a compiler that cheated by recognizing the benchmark and then just substituting the result with an appropriate time delay. I cannot remember who it was and I could not find it in a quick search.
I wasn't actually saying that Apple "cheated" - adding an AI chip would be a bit much just to cheat in Geekbench ;)

Yet, I would take the M1 Geekbench score with a grain of salt, unless you heavily rely on Photoshop's AI filters.
 
Not sure why a lot of people sound so surprised by the M1's performance. Yes, it is FAST, but isn't that expected? I mean, we all know that the A14 can already beat the crap out of the i9-9980HK in the MacBook Pro 16 in single-core performance in Geekbench, and only loses in multi-core performance because it has just two big cores. And now, with the M1 having four big cores and a big notebook chassis instead of the comparatively super-tiny chassis of an iPhone, I'm actually a bit surprised that it doesn't score even higher compared to the A14.

I mean, compared to the i9-9980HK in the MacBook Pro 16, the single-core performance of 1700 vs. 1100 looks impressive indeed, but when compared to the A14, a notebook CPU score of 1700 vs. a phone CPU score of 1600 is just... not so groundbreaking... Really, compared to the A14 in the iPhone 12, it seems the M1 just slightly bumps up the clock rate and adds two more big cores, as it has the luxury of more than 10 times the chassis size.

I guess when compared to Intel CPUs the M1 looks like a miraculous revolution, but compared to Apple's own A14 in the iPhone 12, the M1 looks just like a natural evolution, and not really that much of a leap at that.
 
Far more - I ran an entire business just selling already set up Boot Camp installations - a LOT of people are coming over from Windows or want a Windows option on the Mac.
I’ve done a few of them for “insurance” and “peace of mind”. Whenever I checked back to see if they’ve had any problems with the Windows side, they all indicated that they didn’t have to boot into it. They just figured out how to do what they wanted to do on the Mac side.
 
How the heck is Apple so far ahead in performance? It's incredible how much of a lead they have; it's like alien technology.
Hmm, I think they've been ahead in Geekbench single-core performance since at least the A12, so they've been leading for at least two years already. As far as I can see, the only difference now with the M1 is that they finally decided to put it in a much bigger notebook chassis so they can crank up the clock rate and the number of big cores; no alien technology needed for that...
 
No, it is more like we have finally caught up. Apple was often criticized for its poor performance compared to its peer group in the desktop space. Intel dramatically influenced Apple's ability to compete in the Mac space through the release schedule of its x86 processors. Often, Intel would release a high-power version of a new chip before releasing the power-optimized version. Since Apple does not use the high-power devices, it was always left waiting and put behind the performance curve because of the wait.

With the new Apple Silicon, all of Apple's hardware design innovations can really shine, because they will compare and compete more directly without being handicapped by starting behind the performance curve. This is why the M1 is so game-changing for Apple.
Thanks for the explanation. Do you think that, without real competition, Intel hasn't been very innovative in the past, and that with the new M1 there will be a shift toward higher-performing chips from Intel in the near future?
 
Thanks for the explanation. Do you think that, without real competition, Intel hasn't been very innovative in the past, and that with the new M1 there will be a shift toward higher-performing chips from Intel in the near future?

How could there be? The only thing they could do immediately is use TSMC as their fab. And that can't be done super quickly: Intel uses very non-standard design techniques (last I heard, they were still using mils instead of microns) and they are not used to working with an external fab.

It will take them time to recover, if they can. And they will always have to cope with the x86 penalty.
 
How could there be? The only thing they could do immediately is use TSMC as their fab. And that can't be done super quickly: Intel uses very non-standard design techniques (last I heard, they were still using mils instead of microns) and they are not used to working with an external fab.

It will take them time to recover, if they can. And they will always have to cope with the x86 penalty.
I don't know much about what goes on behind the scenes, but I always thought they were slow so they could milk as much profit as possible from each iteration of their CPUs. Thank you for explaining the details.
 
Understood, but that won't provide you double the efficacy. I can render video on my iPad Pro with no choppiness that my Intel MacBook can only manage under severe duress, and it has 3 times the RAM of my iPad Pro.

That's only because the task you mention happens not to be memory-starved even with the relatively small iPad Pro RAM, but this is not true in general.

Trying to do a task with less RAM than it requires would slow it to a crawl no matter how powerful the CPU is.
 