In my experience, macOS/OS X/Mac OS X requires SSDs to work well beginning with Lion. Back then I was using a base-spec Early 2009 Mac mini (2 GHz C2D, 2 GB RAM, 160 GB HDD) and Lion ran way slower than Snow Leopard, especially in boot times, application launching and multitasking. I even added more RAM (went to 8 GB, the max supported), but it wasn't until I replaced the drive with an SSD that the system began running smoothly again.
 
I definitely have to agree with dronecatcher here on just about everything, and the point he’s making.

From High Sierra 10.13 and up, HDDs are nearly unusable. There is a similar phenomenon with later versions of Windows 10 as well.

A couple of years ago, before I had my upgraded MP5,1, my main machine was a Mac Pro 3,1: dual 2.8GHz Xeons (“8-Core Early 2008”), 20GB RAM. At the time I was at my old job and I did not have much to spend on this stuff. My first experience since 2013 with “modern” macOS was on that Mac. I had figured out how to install 10.12 Sierra onto it, which was the latest at the time. It was pretty quick for the most part. I didn’t own an SSD for it yet. Eventually I upgraded it to 10.13, and it was clear it needed an SSD. All that memory, all those cores - but the OS is designed to run on an SSD, and performance just tanks if it isn’t on one.

A few years later I have a better job and I have acquired (hoarded?) a lot more computers. I have two 4,1/5,1s. When I got the second one, I upgraded it to Mojave 10.14, which was the latest at the time. At that point it was a stock 8-core 4,1 at 2.66GHz, I think, with 24GB RAM. My main MP5,1, which had already been upgraded with an SSD and a bunch of other stuff, was pretty much instantaneous running any app.
The 4,1, however, only had an HDD. Mojave was painfully slow on it. In fact, the aforementioned MP3,1 with its SSD was faster.

Same thing with Windows 10. I have a Sony Vaio laptop with an i7-2670QM and 16GB of RAM. I’ve had it for many years. When I finally took the plunge from Windows 8.1 to 10 on it, it got slower and slower as M$ pushed out those new releases every six months. Imagine booting up a random Dell Dimension in 2005 with Windows XP on it and waiting 15 minutes for Firefox to launch after it finally boots. That’s how my Vaio felt. By comparison, on 8.1 and 7 it was one of the fastest machines I’d ever used.
I finally tossed in an SSD and it was back to feeling like it used to, running Windows 10.

I’m not sure how developers are doing this, since neither Windows nor macOS has been changed fundamentally from the ground up. Windows 11 is still running on the NT base, and runs on NTFS. macOS is still Darwin UNIX. I didn’t run APFS on any of the Macs I mentioned until they had SSDs; they were still using HFS+.
 
OK, but how much of this slowdown is bloat, and how much is SSD optimization? If a good spinning platter gets you 180 MB/s, and an NVMe gets 5000 MB/s, then that’s just sheer horsepower making the difference, which can make up for years and years of unoptimized bloat. If you have the same machine crawling with a hard drive, but usable or even quick with an SSD, there are multiple factors at play.
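To put rough numbers on that gap, here's a back-of-envelope sketch. The 2 GB "data touched at launch" figure is purely illustrative, and real boots and app launches lean heavily on small random reads, where a spinner falls even further behind than these sequential numbers suggest:

```python
# Back-of-envelope comparison of sequential read times, using the throughput
# figures quoted above and a purely hypothetical 2 GB launch payload.
payload_gb = 2.0
drives = {"HDD (180 MB/s)": 180, "NVMe (5000 MB/s)": 5000}

for name, mb_per_s in drives.items():
    seconds = payload_gb * 1024 / mb_per_s
    print(f"{name}: {seconds:5.1f} s to read {payload_gb:.0f} GB sequentially")

# HDD (180 MB/s):   11.4 s
# NVMe (5000 MB/s):  0.4 s
```

So even on raw sequential throughput the SSD buys back an order of magnitude, before you account for seek times at all.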
 
I understand the natural inclination... but my main concern is whether the film is worth watching, and the resolution will never fix that - everything I cherish as landmark cinema was viewed on a fuzzy 576p television ;)
Fair, but in film school we used grainy 16mm black and white, and I liked it better than my SD TV.
 
OK, but how much of this slowdown is bloat, and how much is SSD optimization? If a good spinning platter gets you 180 MB/s, and an NVMe gets 5000 MB/s, then that’s just sheer horsepower making the difference, which can make up for years and years of unoptimized bloat. If you have the same machine crawling with a hard drive, but usable or even quick with an SSD, there are multiple factors at play.
I am very good at keeping Windows unbloated. Before I got the SSD, I decided to reformat that Vaio and re-install Windows just to be sure. Nope. Clean install, and it was still unbearably slow. At least for a 4C/8T i7 machine.

Same experience on my Mac Pro 3,1 in transitioning to an SSD. I cloned the HDD onto an SSD, and the difference was just insane.
 
Sure, people using G3s probably said the same thing about G4s and G5s. People using System 7 said the same about 8 and 9... etc. etc. To each their own. Happy owner of most of those anyway :)
I don't believe I am communicating my point clearly. There are tasks which older computers can easily handle if technology "improvements" hadn't rendered them obsolete.

Take MacRumors as an example. The goal of this site, at least as I understand it, is for people to communicate with one another. There's nothing overly complex about this forum that should relegate a G3 to obsolescence. I recall participating in forums on G3 (and older) systems without issue. Yet using such a system on this site is an exercise in futility. Why? Because that's the world we live in today. Something as simple in concept as this site requires computing resources which, as far as I can tell, are unnecessary.

Yes, there are tasks which require the latest computing power. I'm not advocating for the advancement to cease. But there are, IMO, too many applications where having the latest computing power is unnecessary.
 
Bloat bloat bloat. It’s just how the unholy marriage of capitalism and the internet works. Every new feature seems like it comes with more surveillance, more “targeted promotions,” less creativity. I’d argue that most innovations help companies, not end users. Still, I wouldn’t want to edit 4K videos on a G5. I definitely prefer 4K to SD video.
What about 4K compared to HD? IMO HD was a significant improvement over SD; I feel the jump from HD to 4K isn't remotely close.
 
What about 4K compared to HD? IMO HD was a significant improvement over SD; I feel the jump from HD to 4K isn't remotely close.
Yeah HD was a huge step, much bigger than 4K. The problem was, if you were doing prosumer video, most HD cams were just faking it. I used an HVX200 and it was really 720x540 but pixel shifted to give slightly more detail in 720 or 1080p mode. Moving up to a GH1 was a real major improvement, especially using vintage Nikon glass. 4K cameras for the most part now are actually 4K, so you can see the vast improvement in image quality in a way prosumer cameras couldn’t do in the 1080p space.
 
Yeah HD was a huge step, much bigger than 4K. The problem was, if you were doing prosumer video, most HD cams were just faking it. I used an HVX200 and it was really 720x540 but pixel shifted to give slightly more detail in 720 or 1080p mode. Moving up to a GH1 was a real major improvement, especially using vintage Nikon glass. 4K cameras for the most part now are actually 4K, so you can see the vast improvement in image quality in a way prosumer cameras couldn’t do in the 1080p space.
Funny enough, though, the displays were a different matter. Our only tv in the house is an 11 year old bottom of the line Panasonic 1080p plasma, and it still looks great. Yeah, I love the 4K oleds, but I can’t bring myself to buy one as long as this tv still works. It works like new, no problems ever, and it’s moved several times including across the country.
 
Funny enough, though, the displays were a different matter. Our only tv in the house is an 11 year old bottom of the line Panasonic 1080p plasma, and it still looks great. Yeah, I love the 4K oleds, but I can’t bring myself to buy one as long as this tv still works. It works like new, no problems ever, and it’s moved several times including across the country.
I am in the same situation. I have a nice 1080p TV and I can't see any benefit to replacing it with a 4K TV. IMO 4K doesn't offer the kind of substantially better experience over HD that HD did over SD. Yet we're in the era of 4K TV. More computing resources required for little benefit. Such is the way of computing these days.
 
I am in the same situation. I have a nice 1080p TV and I can't see any benefit to replacing it with a 4K TV. IMO 4K doesn't offer the kind of substantially better experience over HD that HD did over SD. Yet we're in the era of 4K TV. More computing resources required for little benefit. Such is the way of computing these days.
I think the main difference now is HDR. If you’re right up on a 55-inch display, you can see the difference with a 4K TV, but it’s the intense color reproduction that really shines on the new TVs. Watch Strange New Worlds on Paramount+, first on our workhorse 1080p plasmas, then on a 4K OLED. Does not compare.
 
@Dronecatcher, I recently bought a 2011 mini (2.3GHz i5, 8GB, 120GB SSD w/ 10.13.6) to add to my setup, and assigned it as my main driver. That is: Synergy BT keyboard/mouse host, sharing the home Wi-Fi to the Ethernet LAN in my office, and Time Machine host (USB HDDs) for the network.

My goal was to replace the regular use of a MacPro3,1 (8-core 3.2GHz, 16GB, multiple TBs of HDD storage, 480GB SSD w/ 10.11.6) which was previously in this role, to reduce power consumption while maintaining the relative comfort (speed / smoothness / convenience) I was used to. Also to spend as little as possible in the process.

After attempting an (existing) unused Late 2009 Mac mini (C2D 2.26GHz, 8GB, 240GB SSD w/ cloned El Cap from the MP3,1), I decided I needed a little more grunt. The NVIDIA 9400M wasn’t very happy driving a 27” HiDPI display and the C2D would bog down quickly under load.

So, the 2011 mini appeared on the ‘bay for less than A$100 and was already fitted with an SSD. I grabbed it and set it up and have been mostly impressed. It is notably noisier than my 2009 minis, but otherwise a good fit for my use.

I was considering putting (unsupported) Catalina on it, but might stick to HiSi as it has always been a solid release, and still supported by the majority of browsers. Plus I like to have a range of OS versions across my Macs for testing things.

Not the point of your observation I know, but I wanted to let you know the 2011 is a nice little Mac if you can put an SSD in it. :)

There are members on the forum who have even installed Snow Leopard on the 2011 mini. It would be interesting to set this up on the HDD and then run those comparisons again. It should fly :cool:
 
@Dronecatcher I understood your point. Going beyond macOS: long ago, I was bemused that I required 2GB on my then-new C2D Dell Vostro 1520 with Vista to do exactly the same things, with exactly the same versions of programs, as I had on my PIII HP Omnibook 6000 with 256MB and Win2K. The mind-boggling scale of the sheer bloat which had burgeoned across successive generations of the OS was undeniable at even a cursory glance.

This quote by F.W. van Wensveen is perhaps one of the most damning indictments of the ludicrous inefficiency of requiring ever more powerful hardware for computing tasks whose requirements are, at best, extremely modest and largely static:

Only 32 kilobytes of RAM in the Apollo capsules' computers was enough to put men on the moon and safely get them back to Earth. The Voyager deep space probes that sent us a wealth of images and scientific data from the outer reaches of the solar system (and still continue to do so from interstellar space) have on-board computers based on a 4-bit CPU. An 80C85 CPU with 176 kilobytes of ROM and 576 kilobytes of RAM was all that controlled the Sojourner robot that drove across the surface of Mars and delivered geological data and high-resolution images in full-color stereo. But when I have an 800MHz Pentium III with 256 Megabytes of RAM and 40 Gigabytes of disk space, and I try to type a letter to my grandmother using Windows XP and Office XP, the job will take me forever because my computer is underpowered!

My point is that High Sierra obviously leans on launching from an SSD to maintain reasonable speeds - something I wasn't expecting to be so pronounced.

My 2010 iMac has a 3.5" HDD. I'd have to check if it's 5400rpm or 7200rpm, but High Sierra isn't fun.

As I've mentioned before, my experience has been the complete opposite with High Sierra and a (7200rpm) spinner in my 2011 MBP. The biggest issue that I've faced has been sluggishness due to insufficient RAM: once I upgraded from 4GB to 12GB, everything would open and get underway very quickly.

Funny enough, though, the displays were a different matter. Our only tv in the house is an 11 year old bottom of the line Panasonic 1080p plasma, and it still looks great. Yeah, I love the 4K oleds, but I can’t bring myself to buy one as long as this tv still works. It works like new, no problems ever, and it’s moved several times including across the country.

I also have a Panasonic plasma. :) Unfortunately it's dying slowly, thanks to a green tinge that's progressively tainting the image, but for now it still produces a beautiful picture and I'll make the most of however many years remain till it becomes unwatchable - by which time we might even have entered the realm of 8K.
 
As I've mentioned before, my experience has been the complete opposite with High Sierra and a (7200rpm) spinner in my 2011 MBP.
That's interesting, maybe it's not just my spinner being 5400RPM but maybe it's defective - wouldn't surprise me with the seller who clearly swapped out the SSD when he didn't get the auction price he was expecting!
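For what it's worth, there's a quick way to check the rotational rate without opening the machine: System Information lists it for SATA drives, and you can pull it from the command line. A rough sketch (assuming the "Rotational Rate" field is present in this macOS version's system_profiler output - the field names vary a bit across releases):

```python
# Print the rotational rate / medium type of internal SATA drives on macOS
# by parsing system_profiler's SATA report. Field names can differ between
# macOS versions, so treat this as a rough sketch.
import subprocess

report = subprocess.run(
    ["system_profiler", "SPSerialATADataType"],
    capture_output=True, text=True, check=True,
).stdout

for line in report.splitlines():
    if "Rotational Rate" in line or "Medium Type" in line or "Model" in line:
        print(line.strip())
```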
 
The problem was, if you were doing prosumer video, most HD cams were just faking it.
I have an HDV camcorder which shoots at 1440×1080 using rectangular pixels. While it’s not quite full HD, it’s still miles better than SD.
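If anyone's curious how the rectangular pixels work out: HDV 1080 stores the frame anamorphically at 1440×1080 with a 4:3 pixel aspect ratio, so it stretches back out to the full 1920 width on playback. A quick sanity check:

```python
# HDV 1080 stores frames anamorphically: 1440x1080 with a 4:3 pixel aspect
# ratio, so the displayed frame works out to standard 16:9 full-HD width.
stored_w, stored_h = 1440, 1080
pixel_aspect_ratio = 4 / 3

display_w = round(stored_w * pixel_aspect_ratio)
print(f"Stored {stored_w}x{stored_h}, displayed as {display_w}x{stored_h}")  # 1920x1080
```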

There are members on the forum who have even installed Snow Leopard on the 2011 mini. It would be interesting to set this up on the HDD and then run those comparisons again. It should fly :cool:
I have a 13” 2011 MBP which is basically the exact same hardware. With an SSD, Snow Leopard flies, Mavericks and High Sierra are very usable, Mojave is a bit sluggish at times.

As I've mentioned before, my experience has been the complete opposite with High Sierra and a (7200rpm) spinner in my 2011 MBP.
My iMac has 8 GB RAM. Maybe its spinner is dying, but Snow Leopard is totally fine in terms of responsiveness.
 
If you want to talk about progress: reading this thread, I remembered a 2019 video in which a guy compared a Commodore 64 with a MacBook Pro at just booting from cold, opening a document and printing it. It was fun to watch and see how much progress was made in 35 years of technology. For the "rules" skip to 2:22, and the showdown is at 7:25.

Obviously we have increased the complexity of our tasks and can achieve much more than we ever imagined, but some of the tasks we do don't need to be bloated to infinity.

This could be a really long discussion, because the "under the hood improvements" sometimes aren't real improvements for users but for developers, to cut the time it takes to launch new products "with new features that nobody asked for" - lots of dynamic things that take much more time to process and render, because we now have much more processing power to waste.

Many things also aren't compiled and optimized like before, because the languages are designed for the speed of the development team and for collaboration among an ever larger number of participants, so code quality decreases, making all of those problems only worse over time.

So is this real progress? Yeah, but I would take much slower, more consistent progress any time over this "modern progress" that has been rushed for at least the past decade.
 
My 2010 iMac has a 3.5" HDD. I'd have to check if it's 5400rpm or 7200rpm, but High Sierra isn't fun.
I was gifted a 2010 iMac for free for this reason - the original owner assumed the machine was broken or at least ‘obsolete’. It‘s a top spec iMac, quad core i7 I think, and he’d paid I don’t know how much to have a 2TB HDD installed at purchase. But this OS made his machine unusable.

Ironically it’s now the fastest mac I own, purring away with a 256GB SSD. I couldn’t give away the 2TB HDD - it’s sitting in a drawer.
 
In a previous millennium, I worked at Adobe. It was apparent, despite all the complaints from our customers and from us (I was in tech support), that Photoshop was getting bigger and bigger, full of bloat and obsolete code that no one had bothered to clean out from version to version. When we complained to the higher-ups, they explained: no one pays for optimization or for removing old code. People pay for new features. High Sierra was a real departure because it was allegedly just an optimization and bug-fix update on a grand scale, if I remember correctly. I wish Apple would do that every year, with each OS and each app, but again, you sell more products with “all new” and “exclusive” than with “it does the same thing it did before, just much faster and more reliably.”
 
If you want to talk about progress: reading this thread, I remembered a 2019 video in which a guy compared a Commodore 64 with a MacBook Pro at just booting from cold, opening a document and printing it. It was fun to watch and see how much progress was made in 35 years of technology. For the "rules" skip to 2:22, and the showdown is at 7:25.

Obviously we have increased the complexity of our tasks and can achieve much more than we ever imagined, but some of the tasks we do don't need to be bloated to infinity.

This could be a really long discussion, because the "under the hood improvements" sometimes aren't real improvements for users but for developers, to cut the time it takes to launch new products "with new features that nobody asked for" - lots of dynamic things that take much more time to process and render, because we now have much more processing power to waste.

Many things also aren't compiled and optimized like before, because the languages are designed for the speed of the development team and for collaboration among an ever larger number of participants, so code quality decreases, making all of those problems only worse over time.

So is this real progress? Yeah, but I would take much slower, more consistent progress any time over this "modern progress" that has been rushed for at least the past decade.
Yeah it is funny remembering that you just flipped the switch on the side and boom! Flashing cursor. I share that nostalgia, I do. But I also love filmmaking, photography, music, etc. - stuff which has made immense progress since the 80s.

Even my Amiga 4000, which was my last non-Apple computer, can’t do things that my phone can do. 4096 colors in 320x480 is just so limiting. If I were more focused on games, I could probably justify trying to find an old Amiga; I miss the simple games that didn’t require major commitments to finish.

But having full-screen video, in HD and now on my new machine approaching HDR color, so simply that we take it for granted, that’s something I would miss terribly if I went back in time. Look at the demo scene on the Amiga and the c64, they were so creative! But their best work simply pales compared to what any 2000s machine can do. Also, it’s funny to think that the shift from immediate flashing cursor happened with the shift from 8 bit to 16 bit. All of a sudden, you went from immediate boot on an apple ][ or a commodore to wait….load…..wait……load…. Swap floppy….. wait…………..

All that is to say, I’m having a hard time imagining the next jump. VR? AI? Maybe I’m getting old but I’m not nearly as excited about that as some of the other major leaps.
 
I also have a Panasonic plasma. :) Unfortunately it's dying slowly, thanks to a green tinge that's progressively tainting the image but for now, it still produces a beautiful picture and I'll make the most of how many years remain till it becomes unwatchable - by which time we might even have entered the realm of 8K.

So I live in a couple of places (long, boring story), and I needed to get a TV for the other, part-time place. I was looking far and wide for another Panny plasma, since I loved ours so much. People were charging ridiculous money. I had hoped to pick up the grand champion, the ZT60 or a VT60, but those were over a thousand bucks, even now, 8 years later. I ended up with an LG A1, bottom-of-the-line OLED. It was open box, so less than half the cost of those top-of-the-line plasmas. I have to say, if my home TV started to show signs of green tinges or what have you, I’d be all over the sales looking for a new OLED. Sad thing is, I know damned well they won’t last a fraction of the time that my plasma has done. But for now, they are quite beautiful.
 
But having full-screen video, in HD and now on my new machine approaching HDR color, so simply that we take it for granted, that’s something I would miss terribly if I went back in time.
Maybe... maybe not. I rediscovered my ability to make music when circumstances reduced me to only having a Nintendo DS with a Korg Synth cartridge to write on - 4 tracks of audio with 2 instruments instead of the near-infinite number available on computers.

Going back to basics can be a liberation from the mission creep enhanced by the bewildering digital jewels available in any modern DAW.
 
Even my Amiga 4000, which was my last non-Apple computer, can’t do things that my phone can do. 4096 colors in 320x480 is just so limiting.
Compared to PCs with VGA cards that could do 256 colours at 320×200, it was liberating. And what could phones do in 1992 with their monochrome displays?

Look at the demo scene on the Amiga and the c64, they were so creative! But their best work simply pales compared to what any 2000s machine can do.
Of course any modern machine destroys any Amiga or C64, but that's not the point of the demo scene.
 
In a previous millennium, I worked at Adobe. It was apparent, despite all the complaints from our customers and from us (I was in tech support), that Photoshop was getting bigger and bigger, full of bloat and obsolete code that no one had bothered to clean out from version to version. When we complained to the higher-ups, they explained: no one pays for optimization or for removing old code. People pay for new features.

I'm old enough to remember when optimization was an important aspect of software because publishers were unable to get away with telling customers that their computers were too slow and should upgrade to a faster model or replace the CPU with a more powerful version.

High Sierra was a real departure because it was allegedly just an optimization and bug-fix update on a grand scale, if I remember correctly. I wish Apple would do that every year, with each OS and each app, but again, you sell more products with “all new” and “exclusive” than with “it does the same thing it did before, just much faster and more reliably.”

Even Windows 2000 was widely celebrated on this basis, due to the huge contrast in reliability and rock-solid performance compared to its kludgey GUI-atop-DOS Win9x predecessors. Apple could pitch its products to consumers with this marketing strategy, but that's no longer a priority for them, it appears.

Yeah it is funny remembering that you just flipped the switch on the side and boom! Flashing cursor. I share that nostalgia, I do.

Then enjoy these shots taken from some of my 80s machines. :D

[Three images attached.]

But I also love filmmaking, photography, music, etc. - stuff which has made immense progress since the 80s.

Even my Amiga 4000, which was my last non-Apple computer, can’t do things that my phone can do. 4096 colors in 320x480 is just so limiting.

The AGA chipset can achieve 262k colours in 640x400 with Hold & Modify 8. ;)

Look at the demo scene on the Amiga and the c64, they were so creative! But their best work simply pales compared to what any 2000s machine can do.

There's still incredible stuff being created for retro computers, and the accomplishments are doubly impressive because of their limitations compared to those of 2000s machines. :)

Compared to PCs with VGA cards that could do 256 colours at 320×200, it was liberating.

It was! In 1992 my Amiga had PCM stereo sound as standard, whereas all my dad's PCs possessed out of the box was a beep. There was no contest between playing Elite on my Amiga or on a 386 and listening to the Blue Danube rendition performed by the PAULA sound chip versus the beep-laden version from an internal speaker. :D

And what could phones do in 1992 with their monochrome displays?

If they even had what we'd consider today as a display. For example, this is a version of the cellular phone that my dad used around 1992:

[Photo attached.]


There was an area to input the phone number and very little else.

Of course any modern machine destroys any Amiga or C64, but that's not the point of the demo scene.

Precisely. :)

Also, it’s funny to think that the shift from immediate flashing cursor happened with the shift from 8 bit to 16 bit. All of a sudden, you went from immediate boot on an apple ][ or a commodore to wait….load…..wait……load…. Swap floppy….. wait…………..

Many/most 8-bit computers cold-booted to an immediate flashing cursor and/or the availability of BASIC, but there were exceptions. I have one that requires the insertion of a system disk before you can even reach a flashing cursor. The 16-bit Atari ST immediately boots to a GUI environment from ROM.

All that is to say, I’m having a hard time imagining the next jump. VR? AI? Maybe I’m getting old but I’m not nearly as excited about that as some of the other major leaps.

I know where you're coming from: I'm more excited about what's being achieved on the retro scene - particularly within this community. :)

I have to say, if my home TV started to show signs of green tinges or what have you, I’d be all over the sales looking for a new OLED. Sad thing is, I know damned well they won’t last a fraction of the time that my plasma has done. But for now, they are quite beautiful.

Unfortunately money is the factor here - we're currently experiencing a cost-of-living crisis in the UK and I really can't justify a new TV when I have basic financial priorities to cover. Also, I have other TVs - including an LED unit - which makes the case for buying a new one even more difficult.
 