@victoria99 Speaking as someone deeply immersed in the technological world: good on you. It wasn't always this way, but some phones, satellites, and certain computers have long since stopped respecting their users' rights to freedom and privacy. Some forms of modern technology have become so invasive of their users' lives and identities that some people have decided to go ultra-low-tech altogether and abandon their smartphones, smart TVs, smart homes, social media, and so on...
I don't wish to write out another full paper on the subject, but a public figure by the name of Mr. Richard Stallman has pinpointed this exact issue and has actively fought against it for over 30 years. If you are able to watch it, here is a good video of him explaining this crisis to the general public in full:
-
I must ask you to excuse my colleagues' priorities and suggestions; we don't get many folks like you. Historically, most people who come around here in need of guidance with their old Mac simply don't have your patience and principles concerning the subject. Usually, all they want is for their apps and websites to 'just work' without putting more than two seconds of thought into it, and they cannot be bothered to spend any more time than that to get things working, let alone working well. They just do not care about the value of what even an older and slower computer can still offer them in the modern day, or what it might teach them about technology going forward. In many ways, it strikes me as a colder and more apathetic world than it might have been in the past, back when home computing was a new concept.
Personally, I believe that the rise of hyper-consumerism in recent eras has a lot to do with this, but I'm sure there's more to it than just that. It's made such a mark on society, though, that people in my generation have even been described as having the attention span of a goldfish. I fully believe it, given that everyone and their dog, mother, brother, sister, and father all have this little pocket device on their person 24/7, demanding their attention with the latest notification every five seconds of the day, so they're never even given a chance to focus on just one thing (or one person) for a whole five minutes. And what's sad is that people have grown up like this for as long as they can remember. They don't even have an alternative experience for reference.
-
I can't pinpoint my first computer. Right from the get-go, I was already immersed in an ecosystem of a handful of machines, so while I remember one machine from a certain point in time, I have evidence that I started using computers even earlier than that, and I'm not fully sure on that front.
The earliest computer I can still remember using was a Power Mac G3 Blue and White, around the 2004 timeframe, a machine originally released in 1999. More likely, though, I probably started with an iMac G3 somewhere around 2002 or 2003, a model released a year earlier, in 1998. For the longest time, I was never much impressed by them or by what they allowed you to do, because from my perspective they were simply always there. But they stuck around for a good while afterward, the iMac only falling out of use sometime after 2010 or 2011. Unfortunately, I do not have it anymore, as it stopped turning on and at the time I lacked the technical knowledge to properly diagnose and repair it.
I still have the Power Mac though. Although it was retired much earlier than the iMac was, it still runs like a champ (albeit a very temperamental one, but that's a different issue).
-
@Dronecatcher Of course there is a path of diminishing returns as you move up the ladder, but up to a certain point, I also speak from experience when saying that GPU upgrades do make a difference even in 2D tasks, relative to the cards they're replacing. As I perceived it, there was a noticeable improvement in performance going from a Rage 128 to a Radeon ME on a low-end G3, and there was also a respectable difference in desktop usage going from a GeForce 6600 to a Radeon X1900 GT on a high-end G5, from how quickly windows opened to how fast the Finder started. But I suppose user perception of performance improvement is ultimately a subjective matter.
Given that its pins would need to be precision-taped, though, I think I might end up siding with you on the GPU here. But not if it's running Leopard: I've tried a GeForce4 MX on Leopard with 1 GB of RAM, and it isn't a pleasant experience, to say the least.
And keep in mind that the G5 will suck a ton more power than the G4 does, and more than pretty much any other consumer desktop-class computer as well. Nor will it fix the website incompatibilities (if they've even encountered any at this point).
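To put "a ton more power" in rough perspective, here's a quick back-of-the-envelope sketch; every figure in it (the wattages, the hours of use, and the electricity price) is just an assumption of mine for illustration, not a measurement of any particular machine:

```python
# Rough estimate of the extra running cost of a hungrier desktop.
# All numbers below are assumptions for illustration only.
g4_watts = 100        # assumed average draw of a Power Mac G4 under light use
g5_watts = 250        # assumed average draw of a Power Mac G5 under light use
hours_per_day = 4     # assumed daily usage
price_per_kwh = 0.15  # assumed electricity price (USD per kWh)

extra_kwh_per_year = (g5_watts - g4_watts) / 1000 * hours_per_day * 365
extra_cost_per_year = extra_kwh_per_year * price_per_kwh

print(f"Extra energy per year: {extra_kwh_per_year:.0f} kWh")
print(f"Extra cost per year:   ${extra_cost_per_year:.2f}")
```

With those assumed numbers, the difference works out to roughly 219 kWh a year, or around $33; swap in your own figures to see what it would mean in your situation.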