I bought an iMac with 6 GB of RAM, 1 TB of storage, and 2 cores at 2.4 GHz for $2000.
It's remarkable how little Macs have improved over the last 14 years, because I bought that computer in 2007.
I think people give Intel too much crap for the limited improvement.
|  | 2007 iMac for $2000 | 2021 iMac for $2000 | % Change |
| --- | --- | --- | --- |
| Memory | 6 GB | 8 GB | 33% improvement |
| Storage | 1 TB | 512 GB | ~50% worse |
| Compute | 2 × 2.4 GHz = 4.8 | 6 × 3.3 GHz ≈ 20 | 316% improvement |
Moore's Law (a doubling every two years) says that today's iMac should be 2^(14/2) = 128x better. So yeah, by that metric, the compute hasn't improved anywhere near as much as it should have. But really? We want to blame Intel for how minor the improvements have been?
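To make the gap concrete, here's a quick sanity check of that arithmetic (a throwaway sketch; `moore_factor` is just my name for the doubling formula, and the compute proxy is the same cores × GHz figure from the table):

```python
# Back-of-the-envelope check of the Moore's Law claim above.
# All inputs come straight from the table; nothing here is measured.

def moore_factor(years: float, doubling_period: float = 2.0) -> float:
    """Expected improvement if performance doubled every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# Rough compute proxy from the table: cores x clock (GHz)
compute_2007 = 2 * 2.4   # 4.8
compute_2021 = 6 * 3.3   # 19.8, rounded to 20 in the table

print(f"Moore's Law expectation over 14 years: {moore_factor(14):.0f}x")  # 128x
print(f"Actual compute change: {compute_2021 / compute_2007:.1f}x")       # ~4.1x
```

128x expected, roughly 4x delivered. That's the gap everyone is mad about.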
Before I had the 2007 iMac, I had a 2002 eMac.
|  | 2002 eMac | 2007 iMac | x Change |
| --- | --- | --- | --- |
| Cost | $800 | $2000 | 2.5x |
| Memory | 256 MB | 6 GB | 24x |
| Storage | 60 GB | 1 TB | 17x |
| Compute | 1 × 800 MHz = 0.8 | 2 × 2.4 GHz = 4.8 | 6x |
I realize I went from a budget model to a mid-tier model, but holy crap: across the board, I got a computer that was many times better despite being only 5 years newer.
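Out of curiosity, here's the same doubling-every-two-years yardstick applied to that 5-year jump (again just a sketch using the table's numbers):

```python
def moore_factor(years: float, doubling_period: float = 2.0) -> float:
    return 2 ** (years / doubling_period)

compute_2002 = 1 * 0.8   # 1 core x 800 MHz = 0.8 GHz-cores
compute_2007 = 2 * 2.4   # 4.8

print(f"Moore's Law expectation over 5 years: {moore_factor(5):.1f}x")  # ~5.7x
print(f"Actual compute change: {compute_2007 / compute_2002:.0f}x")     # 6x
```

In other words, the 2002-to-2007 compute jump actually kept pace with Moore's Law. It's the stretch since then that fell off.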
Such improvements just haven't happened since. And it's not Intel's fault; the CPU is improving way more than the other components are. Memory and storage just aren't improving like they used to...
Or maybe they are. I can buy comparable parts in much smaller form factors at much lower prices than before. See, for example, my Raspberry Pi. Maybe I should compare a current top-end Raspberry Pi against the brand-new iMacs...