This is your speculation. I have heard many other reasons why they have chosen to use soldered in RAM, etc., you claim none of those reasons are true/valid. Please provide some evidence that supports your position.
I didn't claim their reasons weren't valid. EDIT: Apple is a company. Their primary goal is to make money. Let's not be disingenuous here.
Curious about this. How much was the tuner board? How long did it take you to replace it? What size was the screen? How old was the set?
Roughly $50 for the tuner board, and about 15 minutes for the repair (back panel screws, board screw, plug and play). It was a 40" plasma, which I still have and which still works (but don't use...the kids aged out).

In some cases that will certainly be true. If adding connectors and discrete components makes a device more repairable at the expense of reliability, size or other things, it might be worse for the environment.
Agreed.
Yes, there are clear benefits to being able to add RAM or mass storage, but you do not seem to accept that there are downsides also in terms of size or reliability.
I never said I don't accept that. I point out it's a half-full, half-empty scenario, depending on which side of the benefit/drawback matrix your use-case sits upon.

I can say that my repairable and upgradeable Macs have kept me from buying new Apple machines in 10 years, which of course is bad for Apple but good for me. That will certainly change going forward.

I WILL have FAR, FAR more downtime and a more severe drain on my wallet when issues arise, and will have to pay full price when I inevitably outgrow my device. At the same time, I draw closer to an empty nest, and my needs are reduced by the day.
 
If I replace a circuit board in my TV, and throw away the old one, it's NOWHERE NEAR the same as throwing away the TV and buying a new one.
If you just throw away the TV, it's one TV. If you replace a circuit board in your TV, throw away the old circuit board, then eventually throw away the TV (it won't last forever, or you may just want a new one), the end impact for you personally is that 1 TV PLUS that 1 circuit board go to the landfill (it's not just "near" the same, it's more). Replace more parts? More parts to the landfill.

It's not a lot, but, again, if the person in question is one who will throw things away instead of recycling, then it's more in the landfill over the lifetime of that device than JUST the TV.

RAM and SSDs (and chargers, for that matter) are highly REUSABLE and RESELLABLE - I venture to say people upgrade these more often than they have to "repair" or replace them due to failure.
They ARE reusable and resellable when they're in working condition, absolutely. It's still questionable how many go through the effort, especially when swapping small capacity for larger capacity... how many people even want to pay enough to make it worth your while to sell it? Again, someone who recycles is likely the kind of person that will try to find a buyer. Someone who does NOT recycle is just glad they got the RAM they wanted, and the old stick goes into the trash. Different story if they fail, of course.

I just don’t think I’d say that replaceable parts is better for the environment.
 
If you just throw away the TV, it's one TV. If you replace a circuit board in your TV, throw away the old circuit board, then eventually throw away the TV (it won't last forever, or you may just want a new one), the end impact for you personally is that 1 TV PLUS that 1 circuit board go to the landfill (it's not just "near" the same, it's more). Replace more parts? More parts to the landfill.

It's not a lot, but, again, if the person in question is one who will throw things away instead of recycling, then it's more in the landfill over the lifetime of that device than JUST the TV.
If I throw away the first TV, buy the same TV, it breaks at the same time the first one did, and I then buy a third TV, then we have two full TVs in the landfill in the SAME amount of time as two circuit boards in the landfill, plus I've now paid for THREE TVs.

Again, don't agree with your logic, and never will.

Repairing devices is always greener than having to replace the entire thing, except for my wallet which would be very much less green.
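
To make the two tallies concrete, here's a toy model in Python - the masses are completely made up for illustration, not real figures:

Code:
# Toy landfill tally for the two scenarios argued above, over the same
# window of two failures. Both masses are made-up illustrative numbers.
TV_KG = 15.0     # assumed mass of a whole TV
BOARD_KG = 0.3   # assumed mass of one tuner/driver board

# Scenario A: replace the whole TV at each failure -> two dead TVs binned.
replace_whole = 2 * TV_KG

# Scenario B: swap the board at each failure, keep the set -> two boards
# binned now, plus the one TV whenever it is finally discarded.
repair_now = 2 * BOARD_KG
repair_eventually = repair_now + TV_KG

print(f"replace whole TV twice  : {replace_whole:5.1f} kg to landfill")
print(f"repair twice (so far)   : {repair_now:5.1f} kg to landfill")
print(f"repair, then bin the TV : {repair_eventually:5.1f} kg total")

At the matched point in time (two failures, one working set still in use), the repair path has binned 0.6 kg against 30 kg; even counting the repaired set's eventual disposal, it's about half.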
 
That's very strange. In my experience, every SSD and HDD I ever owned failed. That's probably somewhere around 50-100 drives over the past decades. And every single one of them broke, mostly after 4-8 years of usage.

MTBF for most spinning drives is 3-4 years and failures there are not rare. However, your statement that every single SSD you own has failed does not match my experience at all.
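
To put numbers on why our experiences differ so much: with a simple exponential-lifetime model and an assumed annualized failure rate (AFR), you can estimate how many failures to expect across a fleet of drives. A rough Python sketch - the AFR figures are illustrative guesses, not vendor specs:

Code:
import math

def expected_failures(n_drives, afr, years):
    """Expected failures among n_drives, each with annualized failure
    rate `afr`, after `years` of service (exponential lifetime model)."""
    p_fail = 1.0 - math.exp(-afr * years)
    return n_drives * p_fail

# A fleet of 75 drives, each run for ~5 years:
for afr in (0.02, 0.05, 0.15):  # 2%/yr (good drive) .. 15%/yr (bad batch)
    n = expected_failures(75, afr, 5)
    print(f"AFR {afr:4.0%}: expect ~{n:4.1f} of 75 drives to fail")

Even a pessimistic 15%/year rate only gets about half the fleet failing inside 5 years, which is why "every single one broke" is so far from my experience.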

It was never a problem however because I always replace them after 3-4 years. (Mostly upgrading them at the same time, because why not)

It's not clear what you mean: your SSDs broke, but it was not a problem because you had already replaced them?

May I ask you a question as well - do you make backups of your data? I've been making backups every single day for decades, and I never needed one. Ever. Except for the scheduled installation of new SSDs/HDDs or new computers.

I have regular backups of my systems. Again, our experiences are very different. I have needed them pretty often: sometimes to replace data on stolen/lost laptops, sometimes to restore data deleted by mistake; and back when I dealt with laptops with spinning drives, mechanical failure was not unusual.

Do you think we even need to make backups? In my experience it's very rare that a disaster strikes.

Yup, I find I need them pretty often. However, again it is a question of tradeoffs. I pay a few hundred dollars a year for each of my offsite backup options and they are completely automated. Were it to cost me hundreds of dollars a month or take many hours of my time, I might make different choices about what I saved.

Maybe we can turn this around and Apple should be specific about how long Macs may be used? Printing something like "Good till 02/04/2025. Do not use after 02/04/2025!" on the cover?

Apple stops supporting systems with software after a period and stops repairing the hardware after a period, and when they do, they announce it fairly far in advance. As a process, it works for me.

My oldest machine, btw, is 14 years old. It runs as a firewall behind the router. However, I did replace the fans, and I did replace the PSU (which, in my experience, is the first part to break in any desktop) and the HDD, of course.

I have lots of old machines that I fire up on occasion, from a 1978 MITS Altair 8800b to an original IBM PC, an XT, an AT, a Mac, a Mac SE, many NeXT systems (from 1988 through 1997), HP PA-RISC systems (running OpenSTEP), and probably still a PDP-11 that works. However, I have nothing in production that is older than 6 or 7 years (I would have to check).

My oldest Mac is 10 years old. Still running fine as well. I replaced HDDs twice, and it now runs with an SSD and serves for testing purposes.

That is great; when was the last time it received a software update? For what kind of testing do you use it?

Let me ask you another question, do you have any problem with old computers? Do you think they are supposed to break and everyone should be forced to buy a new one and throw away the old one immediately?
No problem with them at all, but I am glad that Apple does not support everything forever, as I would rather have resources spent on moving forward than on older products preventing that ("Cannot require 64-bit-clean code, this 20-year-old Mac will not run it.").

You have said how old your oldest Mac is, but I am much more interested in how old your newest system is. I am also still interested in what you do/for what you use your Macs.
 
There is NO BENEFIT in me not being able to upgrade or replace (in the event of failure) these myself.

I explained some of them before, but I will put them in here again. Soldered-in RAM is much more reliable than socketed RAM. RAM in the same package is even less likely to fail, allows shorter data paths, and allows for lower power (@cmaier is that true?). For portables, sockets and connectors also add size and weight, both bad things. None of these things may matter to you, but they are real benefits.
Does the simplified manufacturing yield a more reliable product?

Yes.

I mean, people had to deal with the GARBAGE butterfly keyboard for 4 very long years (and it wasn't the ONLY garbage we had to deal with during that time).

Nothing to do with the previous question. The new keyboard design and its pros/cons have nothing to do with socketed memory.
 
If I throw away the first TV,

If the first TV has every component tightly integrated directly onto the display driver and is therefore less likely to fail, it may be way more environmentally friendly than your system with lots of replaceable parts.
Repairing devices is always greener than having to replace the entire thing, except for my wallet which would be very much less green.

As I just pointed out above, that is not necessarily true.
 
I explained some of them before, but I will put them in here again. Soldered-in RAM is much more reliable than socketed RAM. RAM in the same package is even less likely to fail, allows shorter data paths, and allows for lower power (@cmaier is that true?). For portables, sockets and connectors also add size and weight, both bad things. None of these things may matter to you, but they are real benefits.


Yes.



Nothing to do with the previous question. The new keyboard design and its pros/cons have nothing to do with socketed memory.

Of course RAM in the same package as the SoC has numerous technical advantages. First, you can GREATLY reduce the power of the drivers (on both the RAM and the SoC) because they have far less capacitance to drive (capacitance is proportional to the distance between the CPU and the RAM). Though that undersells it, because motherboard traces are also much wider (and taller) than the in-package traces, which means it's really a much bigger improvement.

Additionally, latency is greatly reduced. It takes signals around 6ps per mm to travel within the package. Slightly slower on the PCB (depending on dielectric materials, etc.). But that’s a lot more millimeters to travel.

That also means that the penalty of a cache miss is much higher. So to keep performance reasonable, if memory is not inside the package, Apple would probably have had to increase the size of the L2 cache by quite a bit. Which would, of course, take more die area, burn more power, etc.

In fact, distance between the RAM and CPU is so important that I wrote a section in my PhD dissertation on a scheme to stack the RAM together with the CPU in order to minimize that distance.
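
To put rough numbers on the flight-time part, a back-of-envelope Python sketch - the 6 ps/mm figure is from above, but the PCB rate and both distances are illustrative assumptions, not measurements:

Code:
# Rough wire flight-time comparison for in-package vs. on-board DRAM.
# 6 ps/mm is the in-package figure quoted above; the PCB rate and both
# distances are illustrative assumptions.
PS_PER_MM_PKG = 6.0   # propagation delay inside the package
PS_PER_MM_PCB = 7.0   # assume slightly slower on the motherboard

def flight_ps(distance_mm, ps_per_mm):
    """One-way wire delay in picoseconds."""
    return distance_mm * ps_per_mm

for label, mm, rate in [("in-package", 10, PS_PER_MM_PKG),
                        ("on-board  ", 60, PS_PER_MM_PCB)]:
    t = flight_ps(mm, rate)
    print(f"{label}: {mm:3d} mm -> {t:4.0f} ps one-way, {2 * t:4.0f} ps round trip")

Flight time is only one slice of total DRAM latency (the array access itself dominates), but you pay it on every transaction, and the same shorter traces are what cut the drive capacitance and therefore the I/O power.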
 
However, your statement that every single SSD you own has failed does not match my experience at all.

I've had two failed SSDs (SSDs haven't been on the market long enough yet to have more data about when they usually fail).

One is an external SSD where the controller broke beyond repair after 2 years. It would be easy to continue to use this SSD just connected to a new controller. But someone, who probably thinks like you, decided to solder and glue everything together, so it wasn't possible to get the SSD out of the case without breaking the SSD. What were the benefits of soldered and glued external SSDs again? Never mind, I learned my lesson and I only buy cases with controllers separately from the SSDs wherever possible.

The second SSD reached end of life. It's my first SSD, 120 GB. The number of writable sectors goes down quickly. However, it's still fine for reading, so I connected it to our old Wii to just load a bunch of games from it. If it were in a new MacMini, the whole computer would be dead.

And whatever you may want to say, there is no reason at all to prevent replacement. You highly underestimate the skills of the Apple engineers if you think that's required for space or whatever. And it's beyond question that the ability to replace an SSD makes the whole machine MORE reliable in the long term, not less reliable.

But yes, upgrading SSDs was a big point as well, especially in the beginning, when the capacity of SSDs grew quickly every year. Who today would want a computer from just a couple of years ago with a non-replaceable 64GB SSD?

Yup, I find I need them pretty often. However, again it is a question of tradeoffs. I pay a few hundred dollars a year for each of my offsite backup options and they are completely automated. Were it to cost me hundreds of dollars a month or take many hours of my time, I might make different choices about what I saved.

So if you had to pay 10 times the price for the storage, and twice the price for the computer as well, would you still say it's a good tradeoff?

I could think of a lot of reasons why Apple might say, we block every other backup storage service and you MUST buy our cloud storage if you want to make backups. It's just so much more reliable if Apple does it, and the connection can be much more secure, because security could be done in hardware or whatever.

I have lots of old machines that I fire up on occasion, from a 1978 MITS Altair 8800b to an original IBM PC, an XT, an AT, a Mac, a Mac SE, many NeXT systems (from 1988 through 1997), HP PA-RISC systems (running OpenSTEP), and probably still a PDP-11 that works. However, I have nothing in production that is older than 6 or 7 years (I would have to check).

I don't have the space to keep all my old PCs. I mostly give them away to the neighbors' kids. Which doesn't mean I'm fine with it when they break some months later.

You have said how old your oldest Mac is, but I am much more interested in how old your newest system is. I am also still interested in what you do/for what you use your Macs.
My latest Mac is just a 2014 MacMini. I stopped buying Mac hardware after that because Apple doesn't have any hardware that fits my needs.

I need a fast graphics card for Autodesk Maya.
I need lots of memory and disk space for database analytics and for producing music.
I need a fast CPU for pretty much everything.

What I DON'T and will NEVER need is a monitor. I have a great ultra widescreen monitor and will never go back. Buying an iMac just to put this thing under the desk would be ridiculous.

I can get this, and even more than I need, as DIY for $1700, or for $13000 from Apple for the exact same experience. I love macOS and I hate Windows with a passion. But even I have limits. I would buy some sort of (mini) tower from Apple in the blink of an eye - but not for the price of a new car.
 

I've had two failed SSDs (SSDs haven't been on the market long enough yet to have more data about when they usually fail).

One is an external SSD where the controller broke beyond repair after 2 years. It would be easy to continue to use this SSD just connected to a new controller. But someone, who probably thinks like you, decided to solder and glue everything together, so it wasn't possible to get the SSD out of the case without breaking the SSD. What were the benefits of soldered and glued external SSDs again? Never mind, I learned my lesson and I only buy cases with controllers separately from the SSDs wherever possible.

The second SSD reached end of life. It's my first SSD, 120 GB. The number of writable sectors goes down quickly. However, it's still fine for reading, so I connected it to our old Wii to just load a bunch of games from it. If it were in a new MacMini, the whole computer would be dead.

And whatever you may want to say, there is no reason at all to prevent replacement. You highly underestimate the skills of the Apple engineers if you think that's required for space or whatever. And it's beyond question that the ability to replace an SSD makes the whole machine MORE reliable in the long term, not less reliable.

But yes, upgrading SSDs was a big point as well, especially in the beginning, when the capacity of SSDs grew quickly every year. Who today would want a computer from just a couple of years ago with a non-replaceable 64GB SSD?



So if you had to pay 10 times the price for the storage, and twice the price for the computer as well, would you still say it's a good tradeoff?

I could think of a lot of reasons why Apple might say, we block every other backup storage service and you MUST buy our cloud storage if you want to make backups. It's just so much more reliable if Apple does it, and the connection can be much more secure, because security could be done in hardware or whatever.



I don't have the space to keep all my old PCs. I mostly give them away to the neighbors' kids. Which doesn't mean I'm fine with it when they break some months later.


My latest Mac is just a 2014 MacMini. I stopped buying Mac hardware after that because Apple doesn't have any hardware that fits my needs.

I need a fast graphics card for Autodesk Maya.
I need lots of memory and disk space for database analytics and for producing music.
I need a fast CPU for pretty much everything.

What I DON'T and will NEVER need is a monitor. I have a great ultra widescreen monitor and will never go back. Buying an iMac just to put this thing under the desk would be ridiculous.

I can get this, and even more than I need, as DIY for $1700, or for $13000 from Apple for the exact same experience. I love macOS and I hate Windows with a passion. But even I have limits. I would buy some sort of (mini) tower from Apple in the blink of an eye - but not for the price of a new car.

You could go with a custom workstation running Linux for Maya. It would also support a wider range of graphics cards beyond AMD if you needed something else. Music, I'm not sure what you use, but Linux can support that as well. If you were really outside the Apple market, of course.
 
I explained some of them before, but I will put them in here again. Soldered-in RAM is much more reliable than socketed RAM. RAM in the same package is even less likely to fail, allows shorter data paths, and allows for lower power (@cmaier is that true?). For portables, sockets and connectors also add size and weight, both bad things. None of these things may matter to you, but they are real benefits.
When it comes to RAM, I can see the benefits, no question. I never denied it, even if you keep missing that part. I cannot accept outright, on your say-so, that it is more reliable, though. But that is not the question, is it? The question is: do those benefits outweigh the drawbacks? My point is: depending on who you ask, the glass is half full or half empty.
Again, so you say. Like I said, I'm on 9-year-old repairable, replaceable hardware, and reliability has been a non-issue with regard to RAM or SSDs, for that matter (because I replaced them as my needs grew, not due to failure).
Nothing to do with the previous question. The new keyboard design and its pros/cons have nothing to do with socketed memory.
It has EVERYTHING to do with repairability/upgradeability, and where you fall on the acceptable/unacceptable spectrum of the subject in question.
 
You could go with a custom workstation running Linux for Maya. It would also support a wider range of graphics cards beyond AMD if you needed something else. Music, I'm not sure what you use, but Linux can support that as well. If you were really outside the Apple market, of course.
I love macOS. Period. ;)

I use a lot of software, like Maya, Ableton Live, Cubase, Studio One, tons of audio plugins, financial & accounting stuff, brokerage stuff, Papyrus, JetBrains Toolbox, Adobe Cloud...

So going full Linux isn't an option anyway. However, I do have multiple Linux servers and use them remotely from my Mac, mostly for large databases and number crunching.

I'm really curious if Apple will ever produce a higher-end iMac without a monitor. Especially because it's nearly 2021, and I think it soon won't be just enthusiasts using ultra widescreen monitors. It's just brilliant to work with these. I'm curious if Apple will really go for iMacs with UW monitors or stick with 27". Who wants a 27" if you already have a UW on your desk? And who wants to pay a $2000 premium for an Apple with a UW monitor if you have one already?

P.S.: And who the hell would want to take an ultra widescreen iMac to a Genius Bar? I would have to rent a van and get someone to help me for that. ;)
 
The ability to upgrade means that you can buy a small configuration today to keep up-front costs low and then add as your needs change over time. I built a system a few months ago - i7-10700, 64 GB RAM (upgradeable to at least 256 GB), 3 TB SSD, fantastic thermals - for $1,700. I can get to 128 GB of RAM by just adding two 32 GB sticks. Or I can add storage. Or I can replace the motherboard and CPU with a 64-core Threadripper.

The ability to upgrade is like an option. And options have value.

No different from buying a home where you can do upgrades like remodeling, adding a garden, tool shed or an extension.

But a home is hundreds of thousands, and you live in it for decades. A computer is a few grand. I don't at all agree with the "I want to save money up front" argument. If you need something better later on, sell it and buy the better model, like I have to do with anything else I use.
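
To be fair to the "options have value" point, the tradeoff can be costed out. A quick Python sketch, where the starting price and the yearly decline rate are purely illustrative assumptions:

Code:
# Buy-big-now vs. add-it-later, assuming component prices fall over time.
# The price and decline rate below are illustrative assumptions.
def later_price(price_today, annual_decline, years):
    """Price of the same component after `years` of steady decline."""
    return price_today * (1.0 - annual_decline) ** years

ram_kit_today = 200.0   # assumed price of an extra 64 GB kit today
decline = 0.15          # assume prices fall ~15% per year

print(f"pay up front now : ${ram_kit_today:6.2f}")
print(f"add it in 3 years: ${later_price(ram_kit_today, decline, 3):6.2f}")

Under these made-up numbers the deferred upgrade costs about $123 instead of $200, and nothing at all if the need never materialises - though the argument above stands if resale value keeps pace with the price decline.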
 
I'm using my computers for work. Time is money. When the SSD in my PC breaks, I order a new SSD online, and it will be delivered to my door within 3 hours - about the time it takes to relax, grab some popcorn, and watch a movie on Netflix. I replace it, restore a backup, and am good to go within 4 hours max.

If this happens with an iMac, I need to take the huge iMac to my car, drive 10km through the traffic into town to the next Apple store, can't find a parking spot anywhere near, carry the heavy iMac 500m to the store while it's raining, go to the Genius Bar, wait until they figure out that indeed the SSD is broken, be told that it will take 3 weeks to repair, go back to my car to find a parking ticket, binge-watch Netflix series for 3 weeks while I don't earn money and lose customers, return again to the Genius Bar, get my iMac back, pay 3 times what I paid for my PC repair, get another parking ticket, spend another hour in traffic jams till I'm back home again, and swear to never buy an iMac ever again.

I mean, I'm sure you were doing that even when you could change the SSD in an iMac (which I used to do 10 years ago, to make the only RAID-0 SSD iMac in the world at the time).
 
If the first TV has every component tightly integrated directly onto the display driver and is therefore less likely to fail, it may be way more environmentally friendly than your system with lots of replaceable parts.
Yea, well, there's no way to tell that it's NOT going to fail, so the likelihood that it will is a wash. Word of (your) mouth doesn't count. And in this case, they're ALL circuit boards.
As I just pointed out above, that is not necessarily true.
All things being equal reliability-wise, repairable is better.

I don't, and never will, share your optimistic assumptions. End of story.
 
Storage is the one case where being able to upgrade makes some sense (in general I agree with you). The issue with storage is that, for those in the content creation business, storage needs increase over time, and it is not ideal to have to pay today's prices for the storage one will need in 18 months, 2 or 3 years.

This is fair, really - although with the speed of Thunderbolt 3, I've found it easy enough to add more storage to a desktop machine. I think after you've got 2TB of stuff, you probably need to start looking at what you've actually got that NEEDS to be on there; lots of it could probably be archived or backed up to a slower RAID array.
 
But a home is hundreds of thousands, and you live in it for decades. A computer is a few grand. I don't at all agree with the "I want to save money up front" argument.
A computer is something I live with for half a decade or more. I don't agree with your "I like to pay today for what I may or may not need a few years from now" argument (even if Apple gives me no choice... this is all academic).

If you need something better later on, sell it and buy the better model, like I have to do with anything else I use.
No. I refuse to live like you.

Ok, when it comes to Apple I do, but that's because I have no choice. Doesn't mean I have to like it.

At least we can agree that macOS, iPadOS, and iOS, etc., are worth it. Yes?
 
I dare say they have trustworthy stats telling them what percentage of Mac users actually use Windows on their Macs. And they probably judged that potentially losing them as clients is worth getting all the other perks.

I for one don't care for Windows, and I don't personally (in my friends circle) know any Mac users who do (and I know quite a few).

I agree with you about not caring much for Windows; I find it rather annoying in pretty much every aspect. However, I can tell you that there are many instances where current Macs are used to run a non-Mac OS. At my office we have been fortunate to be able to acquire 9 Macs (in a Windows-centric company) because the hardware perfectly fit the use case. However, of those 9, only 3 run macOS. The other 6 run the company-standard Windows image or some form of Linux.

I hope Apple does get an x86 emulation/translation solution of some kind in place for ASi Macs, because it really will impact buying choices in some very real environments.
 
Of course RAM in the same package as the SoC has numerous technical advantages. First, you can GREATLY reduce the power of the drivers (on both the RAM and the SoC) because they have far less capacitance to drive (capacitance is proportional to the distance between the CPU and the RAM). Though that undersells it, because motherboard traces are also much wider (and taller) than the in-package traces, which means it's really a much bigger improvement.

Additionally, latency is greatly reduced. It takes signals around 6ps per mm to travel within the package. Slightly slower on the PCB (depending on dielectric materials, etc.). But that’s a lot more millimeters to travel.

That also means that the penalty of a cache miss is much higher. So to keep performance reasonable, if memory is not inside the package, Apple would probably have had to increase the size of the L2 cache by quite a bit. Which would, of course, take more die area, burn more power, etc.

In fact, distance between the RAM and CPU is so important that I wrote a section in my PhD dissertation on a scheme to stack the RAM together with the CPU in order to minimize that distance.
Now see, this is all a great reason to make RAM non-upgradeable that I can get behind. The performance benefits far outweigh the drawbacks. However, this is only happening NOW with the M1.

Of course, I'm still waiting to see the real-world impact on the current and future machines that don't have soldered RAM yet.

I'm not as technical as you, but having the storage SSD soldered is a separate issue, no?

I'm having a hard time reconciling that this will be "better", even if, in this day and age, externals are plenty fast for my needs at least. I cannot speak for time-sensitive pros, though.
 
But a home is hundreds of thousands, and you live in it for decades. A computer is a few grand. I don't at all agree with the "I want to save money up front" argument. If you need something better later on, sell it and buy the better model, like I have to do with anything else I use.

I have a 2008 Dell XPS Studio that I was using up until a few months ago. Core i7-920, 48 GB of RAM, lots of SSD, and it was driving a pair of 4K monitors. I bought it refurbed back in 2008 for $580. It's been a Linux server, a development system, a Windows machine, and a Hackintosh.

I saved a ton of money up-front.
 
I love macOS. Period. ;)

I use a lot of software, like Maya, Ableton Live, Cubase, Studio One, tons of audio plugins, financial & accounting stuff, brokerage stuff, Papyrus, JetBrains Toolbox, Adobe Cloud...

So going full Linux isn't an option anyway. However, I do have multiple Linux servers and use them remotely from my Mac, mostly for large databases and number crunching.

I'm really curious if Apple will ever produce a higher-end iMac without a monitor. Especially because it's nearly 2021, and I think it soon won't be just enthusiasts using ultra widescreen monitors. It's just brilliant to work with these. I'm curious if Apple will really go for iMacs with UW monitors or stick with 27". Who wants a 27" if you already have a UW on your desk? And who wants to pay a $2000 premium for an Apple with a UW monitor if you have one already?

P.S.: And who the hell would want to take an ultra widescreen iMac to a Genius Bar? I would have to rent a van and get someone to help me for that. ;)
Mac Pro. Save up, my friend.

Apple, as a vendor, is not interested in your headless iMac use-case.
 
@cmaier

Can you please share some info on how ARM approaches backwards compatibility?
I always assumed they have more freedom in taking out old garbage and changing design since they are normally used in tightly integrated systems.
Like, for example, when Apple removed compatibility for 32-bit iOS apps, they could then take out any supporting bits from future chips as well and simplify the design.

Thanks!

The M1 doesn't really approach backwards compatibility; it's all done in software - by macOS.

Basically, at install time or first run, any app that has x86/x64 in its header is recompiled by Rosetta 2 into a native executable.

Now, because this is done "just in time" and without the source (instead it uses the x86 binary), it probably won't generate optimal code, but Apple has spent a lot of time getting it to work "good enough". Also, most applications are just a bunch of glue code making calls into the OS-provided libraries (Cocoa, Metal, etc.), which will be native and run at native speed.

So even if Rosetta itself doesn't generate optimal code, the old comp-sci optimisation rule of thumb that "90% of the time is spent in 10% of the code"*** likely applies - and that 10% of the code is likely the libraries used to do the "heavy lifting" (e.g., Metal, etc.) that are native to the OS. So long as the relevant 10% of the code is optimised (i.e., the system libraries), you'll get most of the speed benefit. Put another way, if the 90% of the code that is unoptimised runs 2x as slow, you'll only see a ~10% speed hit, because only 10% of the processor time is spent in that code.

And that will be vastly offset by the fact that when running native code, the M1 is roughly 3x the speed of, say, the Ice Lake i7 that was in the old MBA/MBP 13". You get 3x throughput 90% of the run-time, and say 1/2 throughput 10% of the run-time - 3x 90% of the time wins.
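
Spelling out that weighted-runtime arithmetic (the 90/10 split, the 2x translated-code penalty, and the 3x native speedup are the assumed figures above, not measurements):

Code:
# Weighted-runtime arithmetic for Rosetta 2 translated apps. All inputs
# are the assumed figures from the post, not measurements.
def total_time(frac_glue, glue_slowdown, native_speedup):
    """Runtime relative to the old all-native baseline (= 1.0). Library
    time scales with the native speedup; the translated glue runs at
    `glue_slowdown` times the baseline."""
    library = (1.0 - frac_glue) / native_speedup
    glue = frac_glue * glue_slowdown
    return library + glue

# Same chip: 10% of time in translated glue running 2x slower:
print(f"{total_time(0.10, 2.0, 1.0):.2f}")  # 1.10 -> only a ~10% hit

# M1 vs. Ice Lake: native libraries ~3x faster, glue at half throughput:
print(f"{total_time(0.10, 2.0, 3.0):.2f}")  # 0.50 -> ~2x faster overall

Same Amdahl-style reasoning as the profiling footnote below: what matters is the fraction of run-time, not the fraction of code.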

If/when Apple retires Rosetta 2 from macOS (as they did with the original Rosetta), the M1 will have absolutely no compatibility with Intel apps at all.


edit:
*** and this is why trying to optimise everything before profiling is stupid. Even if you optimise the 90% of your code that only runs 10% of the time down to ZERO, you're only winning 10% performance, max. And you're very unlikely to reduce its runtime to zero without removing it entirely :D

Far better to spend your performance optimisation work on the 10% of the code that actually matters, performance wise. Which you will learn by profiling your app.
 
If I throw away the first TV, buy the same TV, it breaks at the same time the first one did, and I then buy a third TV, then we have two full TVs in the landfill in the SAME amount of time as two circuit boards in the landfill, plus I've now paid for THREE TVs.
Wellll, as long as you continue to assume that the last TV you end up with will never go to the landfill, then I agree, you're good! I mean, if a person has already thrown two TVs in the landfill, they're not going to throw away a third, right? That TV will continue to be in use for many generations to come and never ever see the landfill! (Which is, of course, fallacious.) :)

Repairing devices is greener if the device you repaired NEVER goes to the landfill, that’s 100% true!
 
When you look at the trajectory since the switch from PowerPC to Intel, Apple's yearly Mac sales jumped for the first few years. But since 2012 or so, I think, sales have been flat year over year, at about 18 million units a year. This means there was limited benefit to be had from the switch to Intel. Yes, it brought over many people who wanted x86, but sales have been going nowhere since.

I look at that a bit differently.

What's the biggest Mac segment? Laptops. What happened with the x86 switch? Like double the battery life.

I believe THAT drove sales, not so much the x86 compatibility; I think that was a lesser factor. I think power was the big one.

What's happening now?

2x battery life again.

I think we'll see an uptick in sales purely based on performance and battery life. Legacy compatibility, in a world of largely cloud apps and people who are often mobile-first, is less important imho.
 
Mac Pro. Save up, my friend.

Apple, as a vendor, is not interested in your headless iMac use-case.

Yeah, you are absolutely right about Apple. I just hope it may change with the new lineup and ultra widescreens. I've even thought about getting a MacMini, taking it apart, and building my own mini (pun intended) tower with it, with some internal drive bays. Let's see if the SoC graphics will be sufficient for Maya, or if there will be eGPUs. And let's see how the real performance will be. It will take some time anyway until the software I want is ported to ARM (Apple said it'll take developers just a couple of days, if I remember correctly). I hope by then there will be 64GB MacMinis.

P.S.: And there is also still this "right to repair" bill, and a European Union which doesn't like where this is going either.
 