Well, as I try programming on old systems, it’s becoming pretty clear how a few Hz or a few BYTES make such a difference in efficiency and freedom for a programmer.

The Apple II’s Applesoft BASIC floating-point precision severely limits its ability to do serious, critical calculations. Just changing how decimals are handled can save a dozen lines of code that try to compensate. If your code produces an error, how easy is it to browse the code you’ve already entered and executed, find the exact line, edit it, execute it, and complete the task all over again? Add to that any code that’s intended to send output to a printer for logging and checking. How much of those results can be stored in RAM? If you can’t store the whole output in RAM, you’re stuck printing, and that means the process is only as fast as your printer. I can’t imagine juggling the floppies it would take to work around that limitation. It’s frustrating that the limit of characters displayed on screen can sometimes be exactly one line too short to make quick work of an edit or review.
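As a rough illustration of the decimal-handling point above (a modern Python sketch of my own, not Applesoft code): accumulating values as binary floating point slowly drifts, and the usual fix is either to round after every step or to change the representation, e.g. keep everything in whole cents.

```python
# Hypothetical sketch of the rounding problem described above.
# Not Applesoft BASIC -- just the same idea in modern Python.

def sum_as_floats(cents_values):
    """Add amounts as binary floats; tiny errors accumulate."""
    total = 0.0
    for c in cents_values:
        total += c / 100.0          # 10 cents becomes 0.1, which is inexact in binary
    return total

def sum_as_integer_cents(cents_values):
    """Keep everything in whole cents; exact, no compensation code needed."""
    return sum(cents_values) / 100.0

values = [10] * 1000                 # one thousand 10-cent entries
print(sum_as_floats(values))         # prints roughly 99.9999999999986 (drifts)
print(sum_as_integer_cents(values))  # 100.0 (exact)
```

The second approach is the "change how decimals are handled" fix: one representation choice instead of a dozen lines of rounding compensation.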

Seems to me that, in those early days, computers were intended to help people create problems to solve. Maybe I don’t need to know how many hands can be dealt in poker, and of those how many are winning hands, but sitting in my room with no internet and a machine intended to do just such calculations makes it tempting. I can imagine I’d have spent every waking hour making up problems to solve.
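For what it’s worth, that poker question is exactly the kind of thing a few lines of code settle. A quick sketch (modern Python shorthand, not period BASIC) for the total number of 5-card hands and one category of winning hand:

```python
# Quick sketch of the poker question above (modern Python, not period BASIC).
import math

total_hands = math.comb(52, 5)        # every possible 5-card hand from a 52-card deck
four_of_a_kind = 13 * 48              # choose the rank, then any one of the 48 other cards

print(total_hands)                    # 2598960
print(four_of_a_kind)                 # 624
print(four_of_a_kind / total_hands)   # ~0.00024, about 1 in 4165
```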

It has been great to get all kinds of insight from different industries and hobbies. I generally have a different view of the early home computing days. The harder, more nuanced part is how an extra 10 MHz in a PowerBook amounts to anything. Are a few seconds saved on image exports really revolutionary?

I was reminded of this thread because I just presented a Keynote to our company President. I'm a drafting manager. He was completely blown away because I've implemented a new workflow in our software that will immensely affect our productivity. It came down to a demonstration where I showed that a job that normally takes somebody an hour can be done in seconds. Seconds. He laughed out loud, looked at the senior managers, and said "well why the hell didn't anybody else think of that?" It reminds me of a guy who was selling me a car: after he'd owned it for nine years, I touched a switch to test something and he said "WOW, I never even knew my car could do that!"

RTFM lol.
 
Yes. It had a cassette tape drive for loading and saving programs. "Hunt the Wumpus" was probably the first program I played on that thing.
OMG - I’d totally forgotten about “Hunt the Wumpus”! I even programmed that on my Acorn Atom.

I seem to remember a great little skiing game that made good use of the Commodore PET’s block graphics.
 
I read an article a year or so ago about some coders. They were creating some whiz-bang stuff that increased productivity, got things done quickly (as you've done) and took into account a lot of variables.

It also cost people their jobs - people these coders had established relationships and friendships with. A lot of people will shrug and say 'So what?' Well, when the coders also coded THEMSELVES out of a job, I guess things became a bit different.

After doing that once or twice to themselves these same coders started keeping their mouths shut.

I'm not making any sort of opinion or statement on that. Just relating what I read.
 
There are two things I was told about when I first started my career over 30 years ago:

1) What you’ll be doing will kick people out of jobs.
2) Don’t get used to the job because young kids will come in when you’re middle age and you’ll be unemployed.

Here’s what I found:

1) What I do enables companies to get more knowledge out of their data, allowing them to grow more strategically and hire more people.
2) I’m still working, young‘uns I have met have no interest in working the hours I used to when I was their age, and I have enough skills, knowledge and passion to run them around the block and back.
 
I seem to have memories of the 90's and an understanding of the 80's as being a battlefield of industries vying to prove their necessity while computers gently said "no...".

Care to share this article?

This is the second time I've started a managing position, found that people are doing things because "that's how we've always done it", and spent the time to un-F the system. In both cases, I didn't necessarily code anybody out of a job. I did take the last division from being behind to suddenly needing more reps to increase our customer base, because we went from accomplishing 50% of the workload per day to 150-200%. This time isn't as significant, because it isn't a complete rewrite so much as a change in the order of operations and a redistribution of workload. It should get us from the fire into the pan in no time, though.
 
2) I’m still working, young‘uns I have met have no interest in working the hours I used to when I was their age, and I have enough skills, knowledge and passion to run them around the block and back.
Yeah, at some point (with what I do) I realized that my boss at that time needed someone in their 20s. Someone willing to work almost 24/7, have no life and sleep in the office on a cot.

Not anything I was interested in doing as the sole supporter of a wife and two small kids. As with you, I have enough skills to keep employed at my age (51). I'm a graphic designer working on golf scorecards and yardage books. The type of stuff I do would put most young people looking for flash and notoriety to sleep. But I remain employed because I've got 20+ years to back my skills up.
 
I will see if I can find the article. It was on Flipboard at some point, I think an Atlantic article.
 
Found it.

It's behind Atlantic's paywall, so unless you have free articles left you can't see it via direct link.

Here is the Google cached version…

 
I'm grateful I got to read these throughout the work day.

So it seems a lot of people were using their computers at that time just for Usenet type message boards?



When I was repairing PowerBook G4s and teaching the owners how to work around Flash Player with FlashBlock and Flash alternatives, I was impressed that 3-year-old laptops were still going "strong" *cough wheeze*.

I'm currently on a 2012 MBA as my main machine. I bought it in 2013 for $500, and I'd say that even if I threw it in the trash today, I've gotten my money's worth. Actually, my other two portables were found in the trash. Spoiled rich kids just pitch these things every year when they get the new one.
The day I watched Steve Jobs unveil the iPad, the A4 chip made me drool. I equated it to a PowerMac G4...in your hand! Most impressively, it had no fan. That got my imagination turning as I dreamed that one day there would be a MacBook Air based on the Apple chip that would have no fan at all, the iPhone's Retina Display pixel density, and a magical port that would do it all so there was no need for ethernet, FireWire, etc. and it could be used for docking.
Well I waited and waited and now...damn it was worth it. I'm getting an M1 MBA as soon as the refresh causes them to depreciate.



What does a repair shop need an Apple III for?



And what made one better than the other as far as the user was concerned? I understand the lack of expandability, but was it just software support? Kinda reminds me of the Windows ME/2000 disaster - you just had to experience it to get it.



I like that little insight. Maybe I could relate since the "locked down" iOS has been threatening to bleed over into macOS with a similar attempt to make things "easy" by making them "stupid".



I'm curious how 68k machines were used for graphics at that time. How do a few hundred (in width) grayscale pixels add up to something print-worthy or useful?
You Bet!

Running a print shop, you had to use every tool in the Mac cradle.
Early docs were done on an Apple II Plus and an ImageWriter... miss that sound.

The best thing that happened was the Lisa, then upgrading to a full Mac 128, 512, Plus setup with a LaserWriter.
Now that was like floating on a cloud.

All of the iterations of software. The workhorses were MS Word, Excel, Aldus PageMaker, Adobe Illustrator.
Such fun to learn the new software, and everything was so easy.

Nowadays, it's just the same old grind but with bloated, overpriced software you don't own. You don't need it, but you need it...
mainly for new-machine compatibility.
 
I’m hoping to hear more perspective on erstwhile tech from people who were there in that era. I’d like to know more about what an 800x600 resolution meant to somebody at the time and how things changed. What happened the first time you got a whole whopping 1 MB of RAM?

Just trying to view old “junk” with a fairer eye.

Kids today probably laugh at the N64, but the first time I saw 3D Mario running around a corridor and lighting his butt on fire, it was like a dream coming true. The most advanced 3D effect I had experienced until then was the flying stages in Lawnmower Man on the SNES, so try to imagine how big a leap that was for a console kid.
The Mac 128K through the Mac Plus were basically the same machine and were "locked," not upgradable; eventually people found a way to add memory to the 128K.
The Mac Plus shipped with 1 MB of memory but could support 4 MB with a workaround.
1 MB chips were hundreds of dollars, and yes, to run programs you needed to load them into RAM just like today; on the Mac you could allocate memory to programs to speed things up, open more windows, etc.
Upgrading to 4 MB was paramount, and getting an external HD was a must-buy as well, or you would have to load the operating system disk before you could use the computer.

Later, when Power Macs were on the scene and Apple's multi-scan monitors became available, the 1024 on the 17" and the 1920 on the 20" were neat to look at, but in those days, like now, video cards for Apple computers were virtually non-existent.
 
Experiences include companies where:

1. A VP bought an IBM PC when they came out, charging $1,500 on his personal credit card.
2. The same VP later bought a 20 MB hard disk for it - almost as large as a current Mac Pro, but much noisier - for $5K.

3. Command-line ARPANET was used (its technologies, such as TCP/IP, enabled the internet).

4. Programming on a 512x512 plasma graphical terminal for education (PLATO). It was timesharing, but offered a graphical experience not duplicated for over another decade, until the Lisa. The main computer system's hard disk was so large and heavy that the concrete floor vibrated as it did reads/writes. Technologies PLATO introduced included forums, message boards, online testing, e-mail, chat rooms, picture languages, instant messaging, remote screen sharing, and multiplayer video games.

5. (GRiD Systems) produced:
a. The first portable computer
b. The first clamshell laptop
c. The first portable tablet

6. Worked for the company (Silicon Graphics, or SGI) that produced the 3D graphics for the first Jurassic Park. It revolutionized the film industry, providing the foundation for all of the Marvel and other CGI movies today. If the fan failed, some interior components got so hot they could melt. Still have the T-shirts: "We build a better dinosaur".
 
Aaahhh, the good ole days. It was Apple that got me interested in digital photography (only a hobby). I remember seeing an Apple QuickTake 150 digital camera at a computer fair. Because of its design I thought it was one of those Polaroid instant cameras, which I had been after at the time. When I looked at it more closely and noticed it was not one of those instant cameras, I wondered what the heck it was, as I'd never seen or even heard of digital cameras back then. My intrigue got the better of me and I purchased it, but I had no machine to use it on, so I bought a second-hand Apple PowerBook 520c which had no operating system on it. The only OS I could find at the time was an original OS install CD, but alas I did not have a CD player that would work with a PowerBook, so yep, you guessed it, out I went and bought a SCSI CD player.

The internet was not around as we know it today, which meant I had no way of knowing how everything worked or went together, so I took it all to a local Apple dealer who didn't do walk-in customer work, only business work, but was kind enough to help me out. They got everything working and I was ready to try out the digital camera. All I had ever used were traditional 35mm film cameras, so using this QuickTake digital camera was a complete eye-opener for me.

I cannot for the life of me find where I have put the 520c, so I use the camera with my PowerBook 3400 instead. It's still a neat-looking camera, even though I do get some weird looks from passers-by when I am out and about using it :)
 
Later, when Power Macs were on the scene and Apple's multi-scan monitors became available, the 1024 on the 17" and the 1920 on the 20" were neat to look at, but in those days, like now, video cards for Apple computers were virtually non-existent.

That's not really true. There was a lot of healthy competition in the Mac graphics card market, from Radius to SuperMac to RasterOps, and probably a couple of others I'm completely forgetting. The problem is, the good graphics cards that could drive a 19" monitor in 24-bit color with good performance were breathtakingly expensive. I splurged for a 17" monitor and a midrange Radius NuBus card for my IIci, and that was about all I could handle financially at the time, but it did pay off in my ability to work more productively.
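To put a rough number on why those cards cost so much (my own back-of-the-envelope figures, not anything from the post above): driving a big monitor in 24-bit color takes around 3 MB just for the framebuffer, at a time when, as noted earlier in the thread, 1 MB chips ran hundreds of dollars.

```python
# Rough back-of-the-envelope VRAM math (my assumptions, not figures from the thread).
# Assuming a typical large-monitor resolution of that era, e.g. 1152x870.

width, height = 1152, 870
bytes_per_pixel = 3                       # 24-bit color, 8 bits per channel

framebuffer_bytes = width * height * bytes_per_pixel
print(framebuffer_bytes)                  # 3006720 bytes
print(framebuffer_bytes / (1024 * 1024))  # ~2.87 MB just to hold one screen
```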
 
6. Worked for the company (Silicon Graphics, or SGI) that produced the 3D graphics for the first Jurassic Park. It revolutionized the film industry, providing the foundation for all of the Marvel and other CGI movies today. If the fan failed, some interior components got so hot they could melt. Still have the T-shirts: "We build a better dinosaur".

This whole post is the greatest thing I've read in a good long while. Totally underrated. Please write a book, seriously.
 
...

6. Worked for the company (Silicon Graphics, or SGI) that produced the 3D graphics for the first Jurassic Park. It revolutionized the film industry, providing the foundation for all of the Marvel and other CGI movies today. If the fan failed, some interior components got so hot they could melt. Still have the T-shirts: "We build a better dinosaur".
We used to use SGI workstations in the first neuroscience lab I worked at. They were fantastic for the time - much better than the Sun workstations we had used previously. Never melted an SGI down, though. :)
 
When I was in high school taking CS classes - in the late 90s - I remember several companies sending men in suits promising to pay for college if we would promise to work for them for X years managing their old COBOL / <insert xyz here> systems. This continued even in college.

I’m in my late 30s. I remember talking to several developers who were 2x my age - several of them expressed regret that their software projects earlier on resulted in job loss for people they knew.

I know several guys who were begged to come out of retirement to help manage old Unix servers that California cities still use today.

Now everything is moving to the "cloud" - we use Jira as our help desk and sprint management application. It's amazing how complicated these things are and what you can do with them - but they sure cost a pretty penny.
 
Like others, I started off with a TRS-80 Color Computer, and eventually got a version of Unix running on it, which is fantastic given its limitations: tiny RAM, dual floppy drives.
Yeah, me too! I started with a Tandy/Radio Shack Color Computer 2 with its cassette recorder and an Okidata dot matrix printer. I really only ran Microware's real-time OS-9 (the Unix-like OS) except when using the game cartridges. When the Color Computer 3 came along, I moved to it and picked up the Multi-Pak Interface and a single-sided, 35-track 5.25" floppy disk drive. The first big upgrade was getting it to 512 KB of RAM and replacing the floppy disk drive with two 40-track, double-sided drives. Eventually, I moved it to a mini tower and added a 100 MB SCSI hard drive and a keyboard adapter to use an old PC keyboard. It's still out in my shed. I wonder if it still boots.

I still occasionally fire up OS-9 on a Color Computer 3 emulator on my Mac just to remind me how far ahead of its time it was.

My first Mac was a Centris 610 with the 68LC040 processor (disabled math chip). I don't remember its specs other than 8 MB of RAM and an 80 MB hard drive.
 
Combining computers and vehicles, look at what something as simple as computer drafting has done for everyday industrial design.

Dead thread, but I just remembered this as I was looking for a new film camera.

CAD aided in creating the modern digital SLR as we know it.

Here are two Canon cameras that were likely designed with little to no computer modeling. The T70 was super advanced, as it had an 8-bit CPU and was the first with a display! It is the segmented display on the top beside the shutter release. Yet, for as advanced as it is, it doesn't compare in design to what came shortly after.

1971: Canon F-1 [image]

1984: Canon T70 [image]


Later, in collaboration with Luigi Colani (I have my opinions of Luigi, but they would be censored on this forum), Canon designed a camera that would shape ALL cameras to come. I believe this wouldn't have been possible without the aid of computer modeling and the use of those... compound curves. Oh yeah! Not my favorite design, but as far as film cameras go, it was revolutionary.

1986: [image]
 
It‘s funny, because I know the F1 predates CAD for consumer products, but looking at the prism and working your way down, it looks like a low-polygon rendering of a camera.
 
I purchased an Apple IIe in 1983 at the Base Exchange at Atsugi Air Station, Japan, ferried it home to NAS Agana on my EP-3, and kept it for 9 years. I mostly used it for word processing and some basic banking records; games were text-based. That computer setup cost a pretty penny, but I never looked back. In Japan you could find a "Japple," but I thought better of that and stuck with the authentic hardware. :)


'You Are Standing In An Open Field West Of A White House, With A Boarded Front Door; There Is A Mail Box Here'

I could find bootleg programs in Tokyo, Seoul, and Osan, South Korea. The Orient helped keep my expenses manageable. ;) I finally acquired a GUI interface in 1992 with the purchase of a Macintosh Performa 6300, and I’ve owned a Mac ever since, although it is now relegated to what I would describe as ”serious home computer work” and some photo manipulation.
Performa 6300 [image]

My first PC was acquired in about 1996 and over a period of about 10 years became my primary gaming platform. Oh yeah, I fought the good fight with Macs and gaming. In 1994 I upgraded to a Power PC. Marathon was pretty fantastic - that was my first real taste of rounding up a group of Macs and having at it in networked competition.

Better than Doom! :D [image]

I toted my Power PC to many a LAN party (1999ish); what a beautiful piece of engineering that was. I played a lot of Unreal and Unreal Tournament then.



My current PC, which I originally built in 2013 ($1,000) and upgraded and rebuilt in 2019, has more power than my 2016 MBP and is used for gaming; now I find myself delving into some serious Unreal Engine graphics work with it because it has the power I need. Windows has always been a maintenance challenge, while my MBP keeps chugging along, and if anyone asks, I prefer macOS to Windows, always. :D
 