
Altemose

macrumors G3
Mar 26, 2013
9,189
487
Elkton, Maryland
I owned one of these beasts. If memory serves, it had a 4.77 MHz processor, and I believe I maxed out the RAM at 640K.
BTW, that is an 8" floppy drive.
Lugged it around in a large sports-type bag with an over-the-shoulder strap.
This system had 2 full-size PC expansion slots. I actually installed a 20 MB "Plus HardCard" in one slot and got it to work, but it would not boot from it.
When I think back at what we paid for our hobby in those days it makes me shudder: $3-4K for computers, $500 for the 640K memory upgrade, and on it goes.
And to top it off, wages were probably half of what they are today.

We got way off track from the topic of this thread. But it goes to show that we never had any issues with poor or sloppy code being written on these machines. Because people didn't have spare RAM or hard drive space, code was written in nothing but the cleanest manner.
 

wobegong

Guest
May 29, 2012
418
1
We got way off track from the topic of this thread. But it goes to show that we never had any issues with poor or sloppy code being written on these machines. Because people didn't have spare RAM or hard drive space, code was written in nothing but the cleanest manner.


Yes, you're right - way off topic ;) but it did get me thinking: we had crap software back then too (Did anyone here use CompuServe??), and the programming wasn't necessarily tighter and better, it was just a whole lot more basic in its function.
To be honest, PCs (Macs/whatever) are a whole lot more reliable now than they were then. Crashes were very frequent and you had to accept it, even on Windows - when was the last time you saw a BSOD?
My PM, MBP, Linux work desktop and even (lol) my Win8 desktop hardly ever crash - it's now a big event if they do.
I believe the days of 'switch it off and on again' are behind us, or at least much rarer - we had to do that trick A LOT 'back in the day' :)
 

eyoungren

macrumors Penryn
Aug 31, 2011
28,769
26,835
Did anyone here use CompuServe??
No, I avoided it and AOL like the plague. Of course, at the time that they were becoming popular you had to have a subscription and since I was a teen I was spending my money elsewhere.

If you can recall Bulletin Board Systems (BBSs), that's really where I got my passion for being online. I was introduced to that whole scene in 1985 (I was 14) at our local Commodore Users Group. They actually had a BBS, so to connect I had to learn all about baud, modems, 40/80-character terminal apps, x-term, y-term, and on and on. By the time I graduated high school I was running a part-time BBS on my Commodore 128.

Then the internet came along some time later, and then forums. Today's forums are really just a natural extension of the BBS.
 

reco2011

macrumors 6502a
May 25, 2014
531
0
Yes, you're right - way off topic ;) but it did get me thinking: we had crap software back then too (Did anyone here use CompuServe??), and the programming wasn't necessarily tighter and better, it was just a whole lot more basic in its function.
To be honest, PCs (Macs/whatever) are a whole lot more reliable now than they were then. Crashes were very frequent and you had to accept it, even on Windows - when was the last time you saw a BSOD?
My PM, MBP, Linux work desktop and even (lol) my Win8 desktop hardly ever crash - it's now a big event if they do.
I believe the days of 'switch it off and on again' are behind us, or at least much rarer - we had to do that trick A LOT 'back in the day' :)

I can agree with this to some extent. In the beginning - the part of computer history I really enjoyed - personal computers were maturing. I looked forward to the next OS release, the next processor upgrade, the next new thing. Today it has pretty much settled down.

Computing power has reached a point where it is more than sufficient for a lot of what people do. It doesn't, or at least shouldn't, take a lot of computing power to browse the web, do e-mail, participate on forums, etc. But it does. I still go back to my web browsing example. Why do I need a 2.0 GHz dual-core system to browse the web? Because web developers keep adding a bunch of, IMO, useless stuff. When I look at my web surfing habits, I'm looking for information - such as what I find here, or read on a news site. Yet for some reason sites feel the need to add more and more stuff which gets in the way of the content.

Don't get me wrong...I'm not against moving forward. And there are always going to be people / tasks who / which need the most computing power available. But, IMO, a 500 MHz iBook G3 or equivalent should be more than adequate for browsing the web.

Now get off my lawn!
 

Altemose

macrumors G3
Mar 26, 2013
9,189
487
Elkton, Maryland
No, I avoided it and AOL like the plague. Of course, at the time that they were becoming popular you had to have a subscription and since I was a teen I was spending my money elsewhere.

If you can recall Bulletin Board Systems (BBSs), that's really where I got my passion for being online. I was introduced to that whole scene in 1985 (I was 14) at our local Commodore Users Group. They actually had a BBS, so to connect I had to learn all about baud, modems, 40/80-character terminal apps, x-term, y-term, and on and on. By the time I graduated high school I was running a part-time BBS on my Commodore 128.

Then the internet came along some time later, and then forums. Today's forums are really just a natural extension of the BBS.

And I thought I was cool when I was seven and learned most of the DOS commands so I could play games I found on floppies in the garage... This was on XP at that point, but my dad made me a DOS boot disk :D

Shows how far we have come today with the Internet and all...

I can agree with this to some extent. In the beginning - the part of computer history I really enjoyed - personal computers were maturing. I looked forward to the next OS release, the next processor upgrade, the next new thing. Today it has pretty much settled down.

Computing power has reached a point where it is more than sufficient for a lot of what people do. It doesn't, or at least shouldn't, take a lot of computing power to browse the web, do e-mail, participate on forums, etc. But it does. I still go back to my web browsing example. Why do I need a 2.0 GHz dual-core system to browse the web? Because web developers keep adding a bunch of, IMO, useless stuff. When I look at my web surfing habits, I'm looking for information - such as what I find here, or read on a news site. Yet for some reason sites feel the need to add more and more stuff which gets in the way of the content.

Don't get me wrong...I'm not against moving forward. And there are always going to be people / tasks who / which need the most computing power available. But, IMO, a 500 MHz iBook G3 or equivalent should be more than adequate for browsing the web.

Now get off my lawn!

But how could they sell Chromebooks then? :D
 

christo930

macrumors newbie
Original poster
Dec 11, 2013
16
1
Remember this? I still remember my computer teacher in 1st grade plugging the only SuperDisk drive in the school into her PC and XP telling her it was a regular floppy. The discontent when she couldn't get more than 5 students' work on there was hilarious...

The idea of a floppy went away in favor of CDs and DVDs for speed and reliability. The optical standards are all created equal, but you can get faster drives in the computers for existing disks. If the engineers wanted to make floppies faster they would have had to redesign the drive and the disks, whereas with CDs you stick one in whatever speed reader and it goes whatever speed it is capable of.

USB isn't going anywhere for a long time. But within 20 years I believe we will see the decline of USB drives and external storage, and a move fully to the cloud for many. Unfortunately, that means we will have a big problem using our PowerPC Macs then... :mad:

The move to 'cloud computing' is a GIANT step backwards and has been the goal of the industry since at least the mid-90s when the internet went mainstream. I will ALWAYS want my own computer with its own software and its own disk space. Cloud computing is just a return to dumb terminals with graphics, and all of these free storage services will turn into rental services and all of the free software will be rental. I am totally against the idea of a move back to the cloud. Luckily, I don't think Linux will be going away anytime soon, so we will still get to keep our client-end machines and not be forced to use dumb terminals.


As for floppy disks, speed could easily be improved. In fact, without changing the interleaving, just the higher density would speed it up. The problem with Iomega and so on was that they weren't open standards and the media was very expensive. I still maintain that CDs are far less reliable than a floppy; one tiny scratch on a CD renders it useless. I was talking about an open standard with modern speed and density. The fact that floppies aren't sealed and created in clean rooms will always limit their capacity, but I do think getting a 1 GB floppy at regular old floppy disk prices is technically feasible, though it probably wouldn't be backward compatible. It's just a matter of miniaturizing the head and the magnetic particles suspended in the layer on top of the Mylar. The world has probably moved on, and my hoping for a new floppy standard is, well, unlikely to say the least.

Wouldn't rebooting necessarily delete virtual memory files?

Today's hardware IS without question of lower quality than older hardware, especially hard disks. I have some really old hard drives that work fine (we're talking 10 MB MFM hard drives), and nearly every drive I own that is older than 10 years doesn't work. Compare early keyboards with the junk we get with our computers today. We went from nice heavy switch-style keyboards (the switches were ALWAYS part of a review of a PC back then - not features or looks, but the quality of the switches) to membrane-based keyboards that you are lucky to have survive a year or two, and which are hard on the fingers. I use old IBM-branded keyboards on my computers (except for my Macs, because they don't have PS/2 connectors). The cases are much, much thinner and cheaper, and so are the power supplies. Getting a good power supply today is very difficult, and they have pretty high failure rates compared to the old ones. CD/DVD drives are probably not that much different because they're so new and really only became standard in a computer after the great drop in quality in the middle to late 90s as home computers went mainstream. I got my first computer in 1981 and the drop in quality of manufacture has been very evident. There was also a period of several years in the late 90s and early 2000s where millions of badly designed capacitors plagued computers and were known to blow out. In fact, the only real quality improvement is VLSI being used to lower the chip count on a given board, thus lowering the number of parts that could fail. The downsides are that the boards aren't serviceable and are made much more cheaply than the older boards, even the VLSI chips. Modern boards use narrower etched circuitry and thinner layers of etched copper and are more prone to failure. The MTBF ratings of nearly every piece of hardware in a modern computer are lower than the MTBF ratings of, say, 1990.

I do agree that the hardware race is no longer as important, especially after the release of XP and the long time between it and Vista. This allowed the speed of hardware to catch up to the software. Most computers made after 2005, assuming they have enough RAM, are perfectly usable today for just about everything a typical home user would do - CAD, video editing, gaming, and other intense software being the exceptions.

AOL sucked (I actually used it when it was called Q-Link and was geared towards the 8-bit systems). I did like AOL for DOS, as it came with GEOS, an under-appreciated competitor to Windows. GEOS could actually run fairly well on a 5150, especially if you had 640K. I did use AOL for a while when I was still using Telix to connect to local BBSes to access files, and ARPANET (or ARCnet) and internet on some BBSes for email and newsgroups. But once a local ISP became available, the whole BBS scene collapsed.

I've always been a command-line junkie. It's so much more straightforward, at least the way my mind works. Long file names have had an effect on that for managing files, but luckily XP still displays short file names with /X, and it's easy to figure out the "short name" anyway.
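For anyone who hasn't played with it, it's just a switch on DIR - something like this at a command prompt (the paths here are only examples):

    dir /x C:\
    cd \progra~1

The /X column lists the auto-generated 8.3 name next to each long name, so "Program Files" shows up as something like PROGRA~1 and you can use that short name directly in commands.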

For those who would like to check out GEM, GEOS, CP/M, etc., I'd really recommend PCE. There's a 5150 emulator with (PC speaker) sound, VGA, an 80186 processor and, IIRC, 4 MB EMS, plus custom DOS drivers to load DOS high and use the UMA for TSRs. It comes in all kinds of configurations, from DOS 5 to CP/M and GEM, as well as an early compact Mac emulator (I think it's an SE that's being emulated), and I am pretty sure there are both OS X and PC versions of both. You can also use DOSBox to mount a PCE floppy image file and get software onto the virtual hard disk by floppy image. Just use the DOSBox "imgmount -t floppy" command; otherwise, it's hard to get files from the 'world' into the emulated machine. But you can download DOS games or whatever and then use DOSBox to mount the built-in floppy .img file to transfer files to the 'inside' of the emulated 5150 PC.
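If it helps anyone, the DOSBox side of that looks something like this (folder and image names are just examples):

    mount c c:\dosdownloads
    imgmount a pcefloppy.img -t floppy
    copy c:\*.* a:\

That mounts a host folder as C:, attaches the PCE floppy image as A:, and copies the downloaded files onto the image so the emulated 5150 can see them the next time it boots with that image attached.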

I had both a Commodore SX-64 and a Compaq Luggable (with a 9" green CGA-compatible screen, an 8086 which I replaced with an NEC V20, 640K RAM, a 10 MB MFM hard disk and a 360K 5.25" disk drive). There is a reason they called that thing "luggable" - damn, it was heavy. Of course, it was built like a tank.

Chris

----------

You are totally right about Flash being a pig. I personally despise it but it is the world we live in today...

Which eMac is this? The earlier ones only had an 802.11b AirPort Card, and if the Verizon router only broadcasts N (which the iPhones and iPads can see and use), it would be invisible to the eMac. Try setting it to mixed mode and seeing if that helps...

The problem with floppies is that they are slow! They are also easily damaged by the dust and debris which seemingly are always in the floppy drives of old PCs. Plus there is a lot of plastic used to make one, and they probably should be recycled. Ever install Windows 95 off of floppies? Then you know just how long an install takes. Another downside is that they can't be used as frisbees like CDs and DVDs when you are done with them.

The AirPort card came out of a 450 MHz G4; it didn't come with the eMac. The eMac I have is the one that was strictly for schools (CD-ROM, 700 MHz version) that I got on closeout from smalldog. Otherwise, these weren't available to the public to the best of my knowledge. I'll give that a try, but where do you go to turn on mixed mode?
 

jruschme

macrumors 6502
Dec 20, 2011
265
30
Brick, NJ
I use old IBM-branded keyboards on my computers (except for my Macs, because they don't have PS/2 connectors).

Belkin used to make a very nice PS/2->USB converter that was switchable between Mac and PC modes. I use one with a Model M on my Mac at home.
 

Altemose

macrumors G3
Mar 26, 2013
9,189
487
Elkton, Maryland
The move to 'cloud computing' is a GIANT step backwards and has been the goal of the industry since at least the mid-90s when the internet went mainstream. I will ALWAYS want my own computer with its own software and its own disk space. Cloud computing is just a return to dumb terminals with graphics, and all of these free storage services will turn into rental services and all of the free software will be rental. I am totally against the idea of a move back to the cloud. Luckily, I don't think Linux will be going away anytime soon, so we will still get to keep our client-end machines and not be forced to use dumb terminals.


As for floppy disks, speed could easily be improved. In fact, without changing the interleaving, just the higher density would speed it up. The problem with Iomega and so on was that they weren't open standards and the media was very expensive. I still maintain that CDs are far less reliable than a floppy; one tiny scratch on a CD renders it useless. I was talking about an open standard with modern speed and density. The fact that floppies aren't sealed and created in clean rooms will always limit their capacity, but I do think getting a 1 GB floppy at regular old floppy disk prices is technically feasible, though it probably wouldn't be backward compatible. It's just a matter of miniaturizing the head and the magnetic particles suspended in the layer on top of the Mylar. The world has probably moved on, and my hoping for a new floppy standard is, well, unlikely to say the least.

Wouldn't rebooting necessarily delete virtual memory files?

Today's hardware IS without question of lower quality than older hardware, especially hard disks. I have some really old hard drives that work fine (we're talking 10 MB MFM hard drives), and nearly every drive I own that is older than 10 years doesn't work. Compare early keyboards with the junk we get with our computers today. We went from nice heavy switch-style keyboards (the switches were ALWAYS part of a review of a PC back then - not features or looks, but the quality of the switches) to membrane-based keyboards that you are lucky to have survive a year or two, and which are hard on the fingers. I use old IBM-branded keyboards on my computers (except for my Macs, because they don't have PS/2 connectors). The cases are much, much thinner and cheaper, and so are the power supplies. Getting a good power supply today is very difficult, and they have pretty high failure rates compared to the old ones. CD/DVD drives are probably not that much different because they're so new and really only became standard in a computer after the great drop in quality in the middle to late 90s as home computers went mainstream. I got my first computer in 1981 and the drop in quality of manufacture has been very evident. There was also a period of several years in the late 90s and early 2000s where millions of badly designed capacitors plagued computers and were known to blow out. In fact, the only real quality improvement is VLSI being used to lower the chip count on a given board, thus lowering the number of parts that could fail. The downsides are that the boards aren't serviceable and are made much more cheaply than the older boards, even the VLSI chips. Modern boards use narrower etched circuitry and thinner layers of etched copper and are more prone to failure. The MTBF ratings of nearly every piece of hardware in a modern computer are lower than the MTBF ratings of, say, 1990.

I do agree that the hardware race is no longer as important, especially after the release of XP and the long time between it and Vista. This allowed the speed of hardware to catch up to the software. Most computers made after 2005, assuming they have enough RAM, are perfectly usable today for just about everything a typical home user would do - CAD, video editing, gaming, and other intense software being the exceptions.

AOL sucked (I actually used it when it was called Q-Link and was geared towards the 8-bit systems). I did like AOL for DOS, as it came with GEOS, an under-appreciated competitor to Windows. GEOS could actually run fairly well on a 5150, especially if you had 640K. I did use AOL for a while when I was still using Telix to connect to local BBSes to access files, and ARPANET (or ARCnet) and internet on some BBSes for email and newsgroups. But once a local ISP became available, the whole BBS scene collapsed.

I've always been a command-line junkie. It's so much more straightforward, at least the way my mind works. Long file names have had an effect on that for managing files, but luckily XP still displays short file names with /X, and it's easy to figure out the "short name" anyway.

For those who would like to check out GEM, GEOS, CP/M, etc., I'd really recommend PCE. There's a 5150 emulator with (PC speaker) sound, VGA, an 80186 processor and, IIRC, 4 MB EMS, plus custom DOS drivers to load DOS high and use the UMA for TSRs. It comes in all kinds of configurations, from DOS 5 to CP/M and GEM, as well as an early compact Mac emulator (I think it's an SE that's being emulated), and I am pretty sure there are both OS X and PC versions of both. You can also use DOSBox to mount a PCE floppy image file and get software onto the virtual hard disk by floppy image. Just use the DOSBox "imgmount -t floppy" command; otherwise, it's hard to get files from the 'world' into the emulated machine. But you can download DOS games or whatever and then use DOSBox to mount the built-in floppy .img file to transfer files to the 'inside' of the emulated 5150 PC.

I had both a Commodore SX-64 and a Compaq Luggable (with a 9" green CGA-compatible screen, an 8086 which I replaced with an NEC V20, 640K RAM, a 10 MB MFM hard disk and a 360K 5.25" disk drive). There is a reason they called that thing "luggable" - damn, it was heavy. Of course, it was built like a tank.

Chris

----------



The AirPort card came out of a 450 MHz G4; it didn't come with the eMac. The eMac I have is the one that was strictly for schools (CD-ROM, 700 MHz version) that I got on closeout from smalldog. Otherwise, these weren't available to the public to the best of my knowledge. I'll give that a try, but where do you go to turn on mixed mode?

In your router settings. Otherwise, if you have another router, you can set that to broadcast a separate 802.11b network.
 

seveej

macrumors 6502a
Dec 14, 2009
827
51
Helsinki, Finland
...time moves on, computers move on, the internet moves on. This is the story of life. To fight it is to deny the inevitable. But an eMac is not trash, even a 700 MHz model.

Sorry, I'll have to interject here. This is a common fallacy: assuming history is indicative of a natural law - the assumption that because things have "always" been a specific way, they will continue in that same way.

If the "founding fathers", the initiators of the french revolution or Steve Jobs had succumbed to this fallacy we would not have the United States, a modern (as in: not attic greek) version of democracy or Apple.

RGDS,
 

thefredelement

macrumors 65816
Apr 10, 2012
1,193
646
New York
I get this post, but I have to say that in some ways software developers are really pushing the bounds of what's available, and on some platforms (like mobile) they run really efficient code.

You also have to think there's a lot of high-level code where the developers don't necessarily interact too much with what the machine is doing.

Something like JavaScript, really, is just blocks of text that need to be run through by the browser, where it can interact with other web resources, on-page stuff or user input. As the web has evolved so has JS, and I kind of like it: it's functional, it's easy and it gets things done.

From a dev point of view, it's hard to reach the lowest common denominator. In profit situations, I usually look at my users' stats and make sure I'm covering the highest percentage of my user base. Sometimes you get to the point where newer technology provides a better competitive solution and you hope your users upgrade.

While Flash is a giant piece of crap, back when it started there was nothing else like it really available and EVERYONE wanted it. It exploded and evolved and it will take years, probably decades, before HTML5 can really replace it as the web's default video player.

In short, don't knock the devs, for the good stuff, people try to write great code that runs great and does something awesome. If you started a project today, wouldn't you want to make it awesome using today's stuff? Do you think Adobe thought they were infecting the web with crap when they created Flash?

I think if you had a decent machine from the year 2000, with media from the year 2000, you'd probably be set. Things back then were VERY different: there were way more constraints and just a lot less of everything, and the languages themselves didn't contain nearly as much as they do today. (Look at PHP - it's an entirely different animal today than it was in the 90s.)
 

eyoungren

macrumors Penryn
Aug 31, 2011
28,769
26,835
Do you think Adobe thought they were infecting the web with crap when they created Flash?
Just one minor point here. Adobe did not create Flash. They were offered it at one point in 1995 by FutureWave Software, but they turned it down. For years, it was known as Macromedia Flash because Macromedia acquired it when they bought out FutureWave Software.

Of course, when Adobe bought Macromedia they got Flash (as well as other programs they then proceeded to eliminate systematically).

I would argue that Flash became the POS it is once Adobe bought it. Adobe has to continually be making new software in order to continue to exist. So, it makes no sense for them to focus on supporting older machines. One of the first things they did was immediately give PowerPC the axe when it came to Flash.

So, while you may be right as far as devs and coders go (and I don't know if I agree), the companies that these coders work for also share some blame in how they allocate the resources (people, time, money) assigned to a project.
 

christo930

macrumors newbie
Original poster
Dec 11, 2013
16
1
Sorry, I'll have to interject here. This is a common fallacy: assuming history is indicative of a natural law - the assumption that because things have "always" been a specific way, they will continue in that same way.

If the "founding fathers", the initiators of the french revolution or Steve Jobs had succumbed to this fallacy we would not have the United States, a modern (as in: not attic greek) version of democracy or Apple.

RGDS,

There is an even better way of seeing this. We are all stuck in a normalcy bias, in an unprecedented period of human history that represents about 0.001% of human history.
Throughout most of human history, there was no progress to be expected. Things went on year in, year out for centuries and centuries without ever changing. This idea that we will have exponentially (meaning growth by some % of this year's consumption) more and better not only next year, but in perpetuity, or at least in 'your' (whoever you may be) lifetime, is really only a 20th- and 21st-century idea (and arguably a late-19th-century one, but that's just splitting hairs). It took 190 thousand years for the first truly life-altering breakthrough for humanity to come along: agriculture. The idea that you would set up a permanent camp and build sturdy buildings because you were going to be there a long time never even occurred to people before the invention of agriculture, which most scientists seem to think happened about 10k years ago and took a lot of time to spread.

I really don't take the view that this perpetual exponential growth is going to hold much longer. In some sense it has already stopped, but not only do I think it will stop completely in my lifetime, I think it will also begin reversing, where we will expect exponentially less each year (shrinking by some % of current consumption).

Chris
 

Imixmuan

Suspended
Dec 18, 2010
526
424
Sorry, I'll have to interject here. This is a common fallacy: assuming history is indicative of a natural law - the assumption that because things have "always" been a specific way, they will continue in that same way.

If the "founding fathers", the initiators of the french revolution or Steve Jobs had succumbed to this fallacy we would not have the United States, a modern (as in: not attic greek) version of democracy or Apple.

RGDS,

I take your point, but in computer terms, my point definitely applies. Try getting online today with a PowerBook 1400c running System 7. I would love it if someone would develop a modern browser for System 7. That is about as likely to happen as Steve Jobs rising from the dead. And...

History does repeat itself, and nations never seem to learn the lessons. The British, the Russians and now the Americans have all had to learn the painful lesson that Afghanistan, as a place, a concept, a country, really sucks.
 

thefredelement

macrumors 65816
Apr 10, 2012
1,193
646
New York
Just one minor point here. Adobe did not create Flash. They were offered it at one point in 1995 by FutureWave Software, but they turned it down. For years, it was known as Macromedia Flash because Macromedia acquired it when they bought out FutureWave Software.

Of course, when Adobe bought Macromedia they got Flash (as well as other programs they then proceeded to eliminate systematically).

I would argue that Flash became the POS it is once Adobe bought it. Adobe has to continually be making new software in order to continue to exist. So, it makes no sense for them to focus on supporting older machines. One of the first things they did was immediately give PowerPC the axe when it came to Flash.

So, while you may be right as far as devs and coders go (and I don't know if I agree), the companies that these coders work for also share some blame in how they allocate the resources (people, time, money) assigned to a project.

I totally forgot about macromedia, thanks for the reminder.

If you make programs for money, you make them so that the most people can use them. I have no idea what the stats are, but the PPC install base - a subset of the overall Mac install base - has to be tiny these days.

If they got word that Apple was moving on to Intel (which I think they did - I vaguely remember some keynote, or I may be confusing it with the transition to OS X overall, sorry), then it makes good business sense to support that platform. Also, just because you make a program that then creates media or games or navigation bars, you can't necessarily control what people make with it.

I just wouldn't blame the developers; there are too many different variables to deal with... cost-cutting companies, office politics, bad managers, scrapped projects, personal goals, platform changes, API changes, life in general, etc. It's also not exciting for a developer to go back and try to support every possible everything; sometimes it's a huge project just updating so things remain compatible with a new release.

My overall point, which may have gotten lost in my babbling, is that if you buy a computer from a certain era it will most likely work better with apps and media from the same era. The problem with the web is that it's updated every day.

Funnily enough, I was in a studio a month or two ago where a G5 was recording everything with Digital Performer from back then. In a closed system it has performed the same tasks over and over for all of these years with no changes.

Getting the most out of old hardware is great but getting down on developers because older hardware with older software doesn't run the latest and greatest seems a few steps off the mark to me.
 

ihuman:D

macrumors 6502a
Jul 11, 2012
925
1
Ireland
I take your point, but in computer terms, my point definitely applies. Try getting online today with a PowerBook 1400c running System 7. I would love it if someone would develop a modern browser for System 7. That is about as likely to happen as Steve Jobs rising from the dead. And...

History does repeat itself, and nations never seem to learn the lessons. The British, the Russians and now the Americans have all had to learn the painful lesson that Afghanistan, as a place, a concept, a country, really sucks.

That's not always true. One example is that the UK learned a lot from its earlier Plantation attempts, and you can really see this when you look at the colonisation of places like America. But in general, I suppose I'd have to agree.
 

reco2011

macrumors 6502a
May 25, 2014
531
0
I totally forgot about macromedia, thanks for the reminder.

If you make programs for money, you make them so that the most people can use them. I have no idea what the stats are, but the PPC install base - a subset of the overall Mac install base - has to be tiny these days.

If they got word that Apple was moving on to Intel (which I think they did - I vaguely remember some keynote, or I may be confusing it with the transition to OS X overall, sorry), then it makes good business sense to support that platform. Also, just because you make a program that then creates media or games or navigation bars, you can't necessarily control what people make with it.

I just wouldn't blame the developers; there are too many different variables to deal with... cost-cutting companies, office politics, bad managers, scrapped projects, personal goals, platform changes, API changes, life in general, etc. It's also not exciting for a developer to go back and try to support every possible everything; sometimes it's a huge project just updating so things remain compatible with a new release.

My overall point, which may have gotten lost in my babbling, is that if you buy a computer from a certain era it will most likely work better with apps and media from the same era. The problem with the web is that it's updated every day.

Funnily enough, I was in a studio a month or two ago where a G5 was recording everything with Digital Performer from back then. In a closed system it has performed the same tasks over and over for all of these years with no changes.

Getting the most out of old hardware is great but getting down on developers because older hardware with older software doesn't run the latest and greatest seems a few steps off the mark to me.
My problem with the web is that it's updated every day with things of questionable value.

Take the tale of two banking web sites. I have both a bank and a credit union. The bank, being a multi-billion dollar entity, has a flashy web site which uses all the latest and greatest. The credit union, being of modest means, has a, relatively speaking, basic web site. When I do online banking with my bank I have to deal with all the "cool" of their site...when all I really want to do is check my balance, pay some bills, or transfer money. The credit union is a different story. Their site is functional, and I spend less time doing my tasks there as I don't have to deal with all the "cool" of the web.

Another example: eBay and Craigslist. I go to each in order to sell / buy something. eBay is a lot of flash. Craigslist is very basic. Which do I prefer? Craigslist...it's easy and fast for me to list / buy something because it's function over form.

Yes, there are sites where the cool is necessary / welcome. But it's my opinion too many websites go for cool and it gets in the way of function. News sites can be some of the worst. I go to read an article and some audio / video ad starts to play without user intervention. Or some expanding ad expands and then, seconds later as I'm shooting for the close box, it begins to close and I inadvertently click on the ad...which brings up another window.
 

thefredelement

macrumors 65816
Apr 10, 2012
1,193
646
New York
My problem with the web is that it's updated every day with things of questionable value.

Take the tale of two banking web sites. I have both a bank and a credit union. The bank, being a multi-billion dollar entity, has a flashy web site which uses all the latest and greatest. The credit union, being of modest means, has a, relatively speaking, basic web site. When I do online banking with my bank I have to deal with all the "cool" of their site...when all I really want to do is check my balance, pay some bills, or transfer money. The credit union is a different story. Their site is functional, and I spend less time doing my tasks there as I don't have to deal with all the "cool" of the web.

Another example: eBay and Craigslist. I go to each in order to sell / buy something. eBay is a lot of flash. Craigslist is very basic. Which do I prefer? Craigslist...it's easy and fast for me to list / buy something because it's function over form.

Yes, there are sites where the cool is necessary / welcome. But it's my opinion too many websites go for cool and it gets in the way of function. News sites can be some of the worst. I go to read an article and some audio / video ad starts to play without user intervention. Or some expanding ad expands and then, seconds later as I'm shooting for the close box, it begins to close and I inadvertently click on the ad...which brings up another window.

I hear you on that 100%. It doesn't matter the platform or the horsepower, some parts of the web absolutely suck. The drive to earn the almighty marketing dollar is gross - I was in that industry for longer than I'd like to admit, and every part of it is just gross. Context-based advertising was cool and actually not bad in the mid-2000s; now it's just out of control.

As far as some of the busy designs go, boy do they suck, so I'm right there with you on that. I've worked with a lot of great designers over the last 10 years or so; most of the time they have a gun to their head, held by some mid-level so-and-so who thinks he or she knows what they are talking about.

I like some of the trends lately in web design, in a way we're coming full circle with minimalism taking center stage (as if we had a choice back in the 90s, when load times actually meant something).

I hope things shake out and good design is rewarded with patronage, thus forcing competitors to keep up with the changes. Too often people are lazy, including customers (by way of not complaining), which leads to people thinking they are doing something correctly, or doing what their customers like. Firsthand I can tell you that changing the design of a long-running site, even to something that's much cleaner and less intensive, is still a struggle; people just get used to stuff too.

As a developer or a designer, you get caught up in so much crap. You're usually not the start and stop of any project, and oftentimes you'll get overruled by someone higher up the ladder than yourself because they think they know better. Thankfully not every place is like this, and you see that, which is great, but you get a lot of situations where designers and developers want to make great stuff - they keep up with trends or learn the latest API changes or new APIs and really want to deliver - but for whatever reason out of their control, they just can't.

Sure, there are people who stick to what they know and don't learn new stuff, and shame on them in some ways; languages are always updated, dependent APIs as well, and it's important to keep up. But I've seen how and why it happens firsthand. You get someone who wants to make great things, gets a good job, and gets 'beaten down' time and time again, and they just become kind of sad sacks sitting there, churning out the same crap over and over again to keep the paycheck coming in. It sucks - usually you don't get into something like design or development unless you kind of love it, or at least a part of it, and spending at least 8 hours a day coding marketing pop-ups with flash videos and barely visible close buttons is probably not what that person envisioned their career being like.

People aren't like us, though - Apple fans in an Apple forum. If we were the majority, that crap wouldn't have stuck around when it started.
 