Well, as I try my hand at programming on old systems, it’s becoming pretty clear how a few MHz or a few BYTES make such a difference in efficiency and freedom for a programmer.
The Apple II’s Applesoft BASIC has limited floating-point precision, which severely restricts its usefulness for serious or critical calculations; changing how decimals are handled would save a dozen lines of code that exist only to compensate. If your code produces an error, how easy is it to browse the code you’ve already entered and run, find the exact line, edit it, execute, and complete the task all over again? Add to that any code intended to send output to a printer for logging and checking. How much of those results can be stored in RAM? If you can’t hold the whole output in RAM, you’re stuck printing, and then the process is only as fast as your printer. I can’t imagine juggling the floppies it would take to work around that limitation. It’s frustrating that the number of characters displayed on screen can sometimes be exactly one line too short to make quick work of an edit or review.
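To show the sort of thing I mean by “code that compensates,” here’s a rough sketch in Python rather than Applesoft (the drift itself is general binary floating-point behaviour, not anything unique to the Apple II, and the ~9 significant digits you get from Applesoft’s 5-byte reals is from memory): decimal fractions like 0.07 aren’t exact in floating point, so a long running total drifts, and the usual workaround is to carry the sum in whole cents.

    # A minimal sketch (Python, not Applesoft) of why decimal handling costs extra code:
    # 0.07 has no exact binary floating-point representation, so a long running total
    # drifts, and the classic workaround is to keep the sum in whole cents instead.

    print(0.1 + 0.2)        # 0.30000000000000004 -- decimals aren't exact in binary floats

    total_dollars = 0.0     # naive float running total
    total_cents = 0         # workaround: exact integer cents
    for _ in range(100_000):
        total_dollars += 0.07
        total_cents += 7

    print(total_dollars)      # very close to, but almost certainly not exactly, 7000.0
    print(total_cents / 100)  # exactly 7000.0

With far fewer digits to work with than a modern machine, that drift shows up much sooner, which is where the dozen compensating lines come from.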
Seems to me that, in those early days, computers were intended to help people create problems to solve. Maybe I don’t need to know how many hands can be dealt in poker, or how many of those are winning hands, but sitting in my room without internet, with a machine built for exactly that kind of calculation, makes it tempting. I can imagine I’d have spent every waking hour making up problems to solve.
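That poker question is actually a nice example of the kind of thing these machines were made to chew through. Here’s a quick sketch, again in Python for brevity rather than BASIC, that counts the total number of 5-card hands and brute-forces one winning category (four of a kind) as a check against the textbook closed forms:

    # How many 5-card poker hands are there, and how many contain four of a kind?
    # Brute-force count as a sanity check against the closed forms
    # C(52,5) = 2,598,960 and 13 * 48 = 624.

    from itertools import combinations
    from math import comb
    from collections import Counter

    deck = [(rank, suit) for rank in range(13) for suit in range(4)]  # 52 (rank, suit) cards

    print(comb(52, 5))   # 2598960 possible 5-card hands

    quads = 0
    for hand in combinations(deck, 5):
        rank_counts = Counter(rank for rank, _ in hand)
        if 4 in rank_counts.values():
            quads += 1

    print(quads)   # 624: 13 ranks for the quads times 48 choices for the fifth card

Enumerating all ~2.6 million hands takes a modern interpreter seconds; on an Apple II I’d expect the closed-form arithmetic to be the only practical route, which is rather the point.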
It has been great to get all kinds of insight from different industries and hobbies. I generally have a different view of the early home computing days. The harder, more nuanced part is how 10 MHz in a PowerBook amounts to anything. Is a few seconds saved on image exports really revolutionary?
I was reminded of this thread because I just presented a Keynote to our company President. I'm a drafting manager. He was completely blown away because I've implemented a new workflow in our software that will immensely affect our productivity. It came down to a demonstration where I showed that a job that normally takes somebody an hour can be done in seconds. Seconds. He laughed out loud, looked at the senior managers, and said, "Well, why the hell didn't anybody else think of that?" Reminds me of a guy who was selling me a car: after he'd owned it for nine years, I touched a switch to test something and he said, "WOW, I never even knew my car could do that!"
RTFM lol.