skunk said:
Why not have a small predictive keypad and a joystick, like on a cellphone? Takes up very little space, and, at least in Europe, everybody is very adept at texting.


I hate texting. It takes too long. Then again, there's no reason to learn: in the US, most text messages cost 10c apiece. That adds up to a lot of money for a single conversation.
 
Calebj14 said:
I thought it sounded familiar. Could this mean that a Cube G5 will be announced?

If you want a ridiculously hot, loud computer, then it's perhaps possible. Even a single 2.0GHz 970FX is 22-25 watts, five to six times the heat of the original Cube's processor alone. Add in the FSB jump from 100MHz to 1GHz, PC100 RAM to PC3200, an ATI Rage 128 Pro on AGP 2x to (presumably) an nVidia GeForce FX 5200 Ultra on AGP 8x, a 5400RPM ATA/66 drive to 7200RPM SATA... I mean, yeah.

I always wanted a fireball on my desktop! :rolleyes: :p
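Since we're throwing wattage figures around, here's a quick back-of-the-envelope check. The numbers are the ones quoted in this thread, not datasheet-verified:

```python
# Heat comparison quoted above: a 2.0GHz 970FX at 22-25W versus the
# original Cube's G4, which averaged roughly 4.6W (per the Motorola
# figures cited later in this thread).
g5_watts = (22, 25)       # single 970FX @ 2.0GHz, quoted range
cube_g4_watts = 4.6       # original Cube's 500MHz 7400, average load

for w in g5_watts:
    print(f"{w} W is about {w / cube_g4_watts:.1f}x the Cube's G4")
```

So the quoted range works out to roughly 4.8x-5.4x, consistent with "five to six times."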
 
Calebj14 said:
I hate texting. It takes too long. Then again, there's no reason to learn: in the US, most text messages cost 10c apiece. That adds up to a lot of money for a single conversation.

I send text messages all the time. However, comparing sending text messages to typing something on the keyboard is like comparing Morse Code to a telephone conversation. Sure, it gets the message to the other end but, as Calebj14 mentioned, it takes a long time.

Squire
 
Squire said:
I send text messages all the time. However, comparing sending text messages to typing something on the keyboard is like comparing Morse Code to a telephone conversation. Sure, it gets the message to the other end but, as Calebj14 mentioned, it takes a long time.

Squire

Zero-force typing and gestures replacing a traditional keyboard.
104 functions in a control that fits under one hand.

Sure, there's a learning curve, but there's one with writing, typing, and anything else we do to communicate, too.
 
thatwendigo said:
Zero-force typing and gestures replacing a traditional keyboard.
104 functions in a control that fits under one hand.

Sure, there's a learning curve, but there's one with writing, typing, and anything else we do to communicate, too.

The top one was still a keyboard. How would the bottom one work with word processing?

Perhaps we're overlooking the most obvious keyboard replacement of all: voice recognition. I don't know if it was in the early stages of this thread or another one, but there was some talk of Apple making leaps and bounds in this area.

So in 5-10 years, we'll have everybody carrying around iPod-sized computers that they talk to and gesture at in order to have functions performed. They will connect them (or, rather, have them detect) LCD displays located virtually everywhere. Or, for doing work alone, images will appear on quarter-inch sized screens embedded in one's eyeglasses.

And you say I'm not thinking different, thatwendigo? ;)

Squire
 
Squire said:
The top one was still a keyboard. How would the bottom one work with word processing?

I imagine it's down to key combinations and perhaps toggle modifiers, which would mean it's a lot like a stenographer's machine. You learn the patterns for symbols and then use them to get much, much faster input than you would from a full keyboard. There's already research being done into how to massively improve the human input rate beyond what people like me, who can almost inherently touch-type at rather high speed, can already achieve.

Now if I'd quit switching keyboards, I might get even faster. :D
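For anyone curious how stenographer-style input maps key combinations to symbols, here's a toy sketch in Python. The chord table is invented purely for illustration; real steno theories are far richer:

```python
# Toy chorded-input decoder: pressing several keys at once forms a
# chord, and each chord maps to a whole symbol or word, stenotype-style.
# The chord table below is purely illustrative.
CHORDS = {
    frozenset("as"): "the",
    frozenset("df"): "and",
    frozenset("asdf"): "keyboard",
}

def decode(pressed: str) -> str:
    """Look up the set of simultaneously pressed keys."""
    return CHORDS.get(frozenset(pressed), "?")

print(decode("sa"))    # key order within a chord doesn't matter
print(decode("asdf"))
```

The win over a regular keyboard is that one hand motion produces a whole word, which is how court stenographers hit 200+ words per minute.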

Perhaps we're overlooking the most obvious keyboard replacement of all: voice recognition. I don't know if it was in the early stages of this thread or another one, but there was some talk of Apple making leaps and bounds in this area.

So in 5-10 years, we'll have everybody carrying around iPod-sized computers that they talk to and gesture at in order to have functions performed. They will connect them (or, rather, have them detect) LCD displays located virtually everywhere. Or, for doing work alone, images will appear on quarter-inch sized screens embedded in one's eyeglasses.

And you say I'm not thinking different, thatwendigo? ;)

Well, at first you weren't. ;)

I don't know if voice input will ever fully replace manual, especially not in public situations where background noise could be a serious problem. That's one place that direct input is still a valuable tool, though I could see how a good dictation program would be even more useful to academics and professionals in many types of business.

Pervasive computing raises all kinds of security issues, though, and while it's starting to edge that way in certain areas - I used the free 802.11 network at SeaTac last year - it's not something I necessarily think would be in everyone's best interests. Instead, I envision more of a limited spread throughout a given environment, whether home, office, studio, or what have you. The ideal seems to be a multi-node processing cluster with storage, an access unit (perhaps even smaller than the OQO) that controls the interface and handles the user-side tasks, and tie-ins to massive parallelism in networking and computing power.

Some day, we might be able to rent out spare cycles on our machines to universities and businesses that need temporary power. Micropayment and encrypted, anonymous, double-blind transfer would allow this to be done without horrible breaches in privacy. Already, some proxy services use double-blind systems with limited record keeping to prevent the protected clients from being strongarmed if the servers are ever raided.

That assumes that the RIAA doesn't win, and that Microsoft's TCI doesn't make the general populace assume that DRM has to be an albatross around your neck. In other words, we can both keep dreaming... :cool:
 
thatwendigo said:
Imagine, if you will, the Apple home computer system in five to ten years. When I look, I see a device like the flexMac for personal use, with two or three processor and storage units serving to the client machines (which could be more traditional or even further jumps) so that resources are optimized as much as possible.

OK, let's look 10 years ahead. You probably have a digital life device, a piece of hardware the size of a wallet (or a dollar, who knows) that truly is the digital extension of yourself. It's obviously got phone capabilities, calendars, and all the things a PDA can do today.

Display-wise, maybe a flexi pull-out type. But it can also be folded. So if you require a massive screen, you just keep folding out. So if you originally had a 5x3" pull-out, after folding it out 5 times it's now 24x20", ideal for doing serious work on.

The processor isn't as powerful as a workstation's, but the device, as you move about, hooks up to all available networks in its range, using Xgrid or some related technology to utilise the processor power offered on the available network. No need for physical modules. External storage: .Mac.

Of course, you could go on with fantastic implementations and tech solutions, like how this device would be telepathic, so there would be no need for a keyboard or a mouse. It would instantly display the artist's visions, just based on thought. It would type an essay from extracted thoughts and, even better, it would be able to reply to a MacRumors post, effortlessly.

The OQO may, of course, be a step to some new level. It's just not here yet. I can see its potential with business users currently using Palms. Or as a portable video device. But nothing more.

On the video presentation they have on their site, it strikes me that all these people are always alone. Must be the cramped display. Also, at the end, the female computer voice goes: "...it is the only computer..." and you expect it to continue: "...that will talk to you." Similar to the classic mac one. Or is it just me?
 
thatwendigo said:
If you want a ridiculously hot, loud computer, then it's perhaps possible. Even a single 2.0GHz 970FX is 22-25 watts, five to six times the heat of the original Cube's processor alone. Add in the FSB jump from 100MHz to 1GHz, PC100 RAM to PC3200, an ATI Rage 128 Pro on AGP 2x to (presumably) an nVidia GeForce FX 5200 Ultra on AGP 8x, a 5400RPM ATA/66 drive to 7200RPM SATA... I mean, yeah.

I always wanted a fireball on my desktop! :rolleyes: :p

Hey, at least in the summer you can make s'mores and in the winter you wouldn't have to pay the outrageous natural gas prices! Innovation never sleeps! :D
 
thatwendigo said:
Simple. You get more power control if you are drawing less off the battery. When you need to conserve, you merely remove or shut off the modules you don't need, and that includes everything from drives to processors. When you need it again, slap it in and there it is. Also, this would allow the sale of massively upgradable chassis and product lines, which would only need the purchase of modules to increase their performance.

I must say that many of your ideas are fascinating and have caused me to look into some ideas I had never thought of before, but this one I still don't get. Wouldn't it be better to design a machine with two processors that can run off of one when that is all the processing it needs? I can understand the appeal of adding power, memory, storage, etc. I just still don't see why you would then take it out. You've already paid for it.

thatwendigo said:
I imagine it's down to key combinations and perhaps toggle modifiers, which would mean it's a lot like a stenographer's machine. You learn the patterns for symbols and then use them to get much, much faster input than you would from a full keyboard. There's already research being done into how to massively improve the human input rate beyond what people like me, who can almost inherently touch-type at rather high speed, can already achieve.

Now if I'd quit switching keyboards, I might get even faster. :D

I wonder if we will ever see a major change in the keyboard. I read a story recently about how the average typing speed has actually fallen in recent years. Because kids learn to type on their own at home rather than being trained in proper form and technique, companies that need high-speed touch typists are having trouble finding the words-per-minute numbers they could find ten years ago.

thatwendigo said:
Some day, we might be able to rent out spare cycles on our machines to universities and businesses that need temporary power. Micropayment and encrypted, anonymous, double-blind transfer would allow this to be done without horrible breaches in privacy. Already, some proxy services use double-blind systems with limited record keeping to prevent the protected clients from being strongarmed if the servers are ever raided.

Would this be a similar system to, say, the "folding" programs you see on the web lately? What I would think would be more interesting would be the ability to "rent" out extra cycles from someone else's machine. Say you want to render a movie in FCP. You could basically turn your internet connection into a render farm to help speed up the process.
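The render-farm idea can be sketched in a few lines of Python. Here the "workers" are just local threads standing in for remote machines, and all names are invented for illustration:

```python
# Toy version of the distributed-rendering idea: a coordinator splits a
# job into independent chunks and farms them out to a pool of workers.
# A real render farm would ship chunks to remote machines and add
# authentication, payment, and fault tolerance on top of this pattern.
from concurrent.futures import ThreadPoolExecutor

def render_frame(n: int) -> int:
    # Stand-in for an expensive per-frame computation.
    return sum(i * i for i in range(n * 1000))

frames = range(1, 9)
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(render_frame, frames))

print(len(results), "frames rendered")
```

The key property that makes rendering such a good fit is that every frame is independent, so the work splits cleanly across however many machines happen to be available.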

Not that I think the ideas you talk about will never be seen, but I think they are a lot more than ten years away. How much have computers really changed in the past ten years? Additionally, as the current model of computing continues to grow (new people buying computers), the model becomes more entrenched in the marketplace. Look at what has happened with digital television. The FCC has already had to move the date when all stations had to change over to digital broadcasting several times.

Even if Apple were to come out with concepts you would like to see right now, chances are it would be the next Newton (a product the world just wasn't ready for yet) or even worse the next Cube (an excellent product that ends up being so overpriced that it appeals to only a tiny niche of an already niche market).
 
thatwendigo said:
If you want a ridiculously hot, loud computer, then it's perhaps possible. Even a single 2.0GHz 970FX is 22-25 watts, five to six times the heat of the original Cube's processor alone.
...
I always wanted a fireball on my desktop! :rolleyes: :p

Does the original Cube's processor (PPC 7400?) really run at 500 MHz at only 5 watts? That's pretty friggin' efficient.
 
pjkelnhofer said:
Does the original Cube's processor (PPC 7400?) really run at 500 MHz at only 5 watts? That's pretty friggin' efficient.

The 500MHz G4 (7410) runs at between 5.3W and 11.9W (max). With an external power supply, which keeps your feet warm on a cool winter day, there's not much inside the enclosure that generates the amounts of heat we see in the G5s.

The 1GHz upgrade (the 7455, I believe) runs at between 15W and 22W and adds a fan.
 
Belly-laughs said:
Display-wise, maybe a flexi pull-out type. But it can also be folded. So if you require a massive screen, you just keep folding out. So if you originally had a 5x3" pull-out, after folding it out 5 times it's now 24x20", ideal for doing serious work on.

Projected screens (a holographic matrix?) would probably be easier than an electroreactive polymer that was also foldable, and I don't see conductive plastics technology leaping quite that far in the next ten years. There's less driving it than there is with computing, but it's still a fun idea to play with. With a projected screen, I'm not sure how you could keep a touch interface, but it would probably need some pretty good predictive and spatial heuristics, backed up by verbal commands.

The processor isn't as powerful as a workstation's, but the device, as you move about, hooks up to all available networks in its range, using Xgrid or some related technology to utilise the processor power offered on the available network. No need for physical modules. External storage: .Mac.

I don't think we're going to ever completely be rid of physical storage, at least not in the business world. Even encrypted transmissions can be intercepted, so internal storage will be important for a while yet, despite the fact that today's encryption methods (my key is triple layered 1024-2048-2048 DECS-ElGamal-DECS) are basically unbreakable within any kind of reasonable timeframe. The number of keypairs even for 128-bit encryption makes it unlikely that anyone can bruteforce it within reasonable time (I've heard packet-capture methods take a week just to break onto a wireless network, and that's with an active source that's feeding you a couple of million packets a day).
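To put a number on "unbreakable within any kind of reasonable timeframe," here's the brute-force arithmetic for a 128-bit key. The trial rate is an illustrative assumption, not a benchmark of any real hardware:

```python
# Expected brute-force time for a 128-bit key: on average you have to
# search half the keyspace before hitting the right key.
keyspace = 2 ** 128
rate = 10 ** 12                      # assumed: one trillion keys/second
seconds_per_year = 365.25 * 24 * 3600

years = (keyspace / 2) / rate / seconds_per_year
print(f"about {years:.1e} years on average")
```

Even at that absurd trial rate, the expected search time comes out in the quintillions of years, which dwarfs the age of the universe many times over.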

However, with the advent of clustering and perhaps even for-pay renting of spare cycle time, nothing prevents someone determined from temporarily devoting a lot of clock time to the idea of breaking encryption. Basically, it would be like pj asks - distributed computing, but without the limitations of current models. Professional coders could write the apps to take advantage of the model, and you might even find whole university labs that are sitting unused and ready to be bought for a few hours. What happens when the whole world has a potential BigMac on their desk?

Of course, you could go on with fantastic implementations and tech solutions, like how this device would be telepathic, so there would be no need for a keyboard or a mouse. It would instantly display the artist's visions, just based on thought. It would type an essay from extracted thoughts and, even better, it would be able to reply to a MacRumors post, effortlessly.

Once again, we're a very long way from anything that fanciful. Nothing that I've suggested could be done right now is even remotely this complex, and the human nervous system (which is my field of study, incidentally) isn't understood well enough to tap it that way yet. The Japanese are making incredible progress with motor systems, and a company over there is working on a robot-assisted walker unit that looks kind of like a smaller, lower half of the Powerloader from Aliens. The big deal is that it has an expert system that taps the nerve trunks between L5 and L8, or as far up as they can get without really screwing up the spinal cord, so that they can intercept the signals intended for the legs. The control box interprets the signals into machine code, feeds power and direction to the actuators, and they move the patient's legs. Yes, the mecha-crazed Japanese are making robot walkers for paraplegics. :D

The OQO may, of course, be a step to some new level. It's just not here yet. I can see its potential with business users currently using Palms. Or as a portable video device. But nothing more.

I think we're just going to have to disagree on this, then, because I can see more possibilities for it than I can possibly articulate.

On the video presentation they have on their site, it strikes me that all these people are always alone. Must be the cramped display. Also, at the end, the female computer voice goes: "...it is the only computer..." and you expect it to continue: "...that will talk to you." Similar to the classic mac one. Or is it just me?

At least one of the OQO's major designers worked for Apple on the Titanium PowerBooks, so he might either be doing homage to his alma mater, or he might just be trading off of research Apple had already done. Derivative art, and all that jazz...
 
pjkelnhofer said:
I must say that many of your ideas are fascinating and have caused me to look into some ideas I had never thought of before, but this one I still don't get. Wouldn't it be better to design a machine with two processors that can run off of one when that is all the processing it needs? I can understand the appeal of adding power, memory, storage, etc. I just still don't see why you would then take it out. You've already paid for it.

Ah, I think I see where the problem is. Let me try to break this down, then. Say that you were a business user looking at a new computer purchase. You do some looking around, decide what your needs are, and come up with the following list:

Apple Core Module - 4ghz G6, 1GB PC5200 RAM, 200GB SATA2 15000RPM
Apple PowerBook - Space: Core, 1 RAM module, 1 storage or I/O module
Apple PowerMac - Space: Core, 3 processor modules, 4 RAM modules, 4 storage or I/O modules​

When you need to be on the go, you could pull a RAM module and one of your drives from the basic station and slot them into your portable so that you don't have to pay for separate components for both.

Or, to run with my networked computing idea, the setup for a family like mine:

Apple flexMac x4 - 4ghz G6, 1GB PC5200 RAM, 200GB SATA2 15000RPM, 802.15.3c, IEEE1394.d, all that other stuff I was talking about (appropriately updated for current trends)
Apple Processing Unit - Space: 8 processor modules, 8 RAM modules, 4 storage modules, 802.20 base station (BTO option)
Apple Storage Unit - Space: 1 processor module, 2 RAM modules, 12 storage modules
Apple Processor Module II x5 - 6ghz G6, 8MB L2 cache
Apple Storage Module III x3 - 1TB drives
Apple RAM Module II x3 - 2GB PC5200 DDR3
Apple HD Studio Wall Display x2 - 50" widescreen, 802.15.3c and IEEE1394d​

Put simply, you'd have your less powerful machine that you carry around, and unlike the "personal units," the Processing and Storage Units would be very much like home-user Xserve solutions. You'd have hot-swappable components that are standardized and can be dropped in to expand as needed. Your storage is handled by a second computer, which is connected by an extremely fast backplane to your central processor. Using Xgrid or something like it, your portable flexMacs merely offload tasks to it, while working as graphically responsive tablets that you use to direct the larger screens at the cradles or on the HD monitors.

However, if you want something more "full featured," as Belly would have it, then you can buy the laptop shell and use modules in it to do so.

I wonder if we will ever see a major change in the keyboard. I read a story recently about how the average typing speed has actually fallen in recent years. Because kids learn to type on their own at home rather than being trained in proper form and technique, companies that need high-speed touch typists are having trouble finding the words-per-minute numbers they could find ten years ago.

I actually taught myself, but it seems that I largely learned the proper way, from everything that I've seen. Of course, I've been in front of a computer since I was three years old, and I quite literally cut my teeth on Apple. :D

There have been efforts to reform the QWERTY keyboard a number of times, but they've always died off. Dvorak was one, and it didn't get far, but I'm thinking of something more radical than just changing the order of the keys. Technology marches on, and the future might not wait for those who cling too hard to the ways of the past. Witness Intel and the dropping of the Pentium 4... ;)

What I would think would be more interesting would be the ability to "rent" out extra cycles from someone else's machine. Say you want to render a movie in FCP. You could basically turn your internet connection into a render farm to help speed up the process.

That's exactly what I'm talking about, going both directions.

Not that I think the ideas you talk about will never be seen, but I think they are a lot more than ten years away. How much have computers really changed in the past ten years? Additionally, as the current model of computing continues to grow (new people buying computers), the model becomes more entrenched in the marketplace. Look at what has happened with digital television. The FCC has already had to move the date when all stations had to change over to digital broadcasting several times.

Even if Apple were to come out with concepts you would like to see right now, chances are it would be the next Newton (a product the world just wasn't ready for yet) or even worse the next Cube (an excellent product that ends up being so overpriced that it appeals to only a tiny niche of an already niche market).

It's entirely possible that everything I've said will never happen, and I accept that. The thing is that I try to live up to the hype that Apple built some time ago, and I always endeavor to think just that shade more into the "different" than the next man does. It's served me well in many cases, and I'm a pretty good problem solver in real life, though I'm not a hardware engineer or a programmer. I'm just a scientist in training, and I know a lot of snippets of all kinds of things.

Perhaps these would never make it. Maybe my ideas would never sell, but they make sense and have a coherent drive to them, which is more than I can say for a lot of the things you see in the computing world.

pjkelnhofer said:
Does the original Cube's processor (PPC 7400?) really run at 500 MHz at only 5 watts? That's pretty friggin' efficient.

Yes, it does. I dug up a Motorola PDF on the consumption of the 7400, 7410, 7455, and 7457, and the average heat load for the 7400 was 4.6-7 watts.
 
thatwendigo said:
Ah, I think I see where the problem is. Let me try to break this down, then. Say that you were a business user looking at a new computer purchase. You do some looking around, decide what your needs are, and come up with the following list:

Apple Core Module - 4ghz G6, 1GB PC5200 RAM, 200GB SATA2 15000RPM
Apple PowerBook - Space: Core, 1 RAM module, 1 storage or I/O module
Apple PowerMac - Space: Core, 3 processor modules, 4 RAM modules, 4 storage or I/O modules

When you need to be on the go, you could pull a RAM module and one of your drives from the basic station and slot them into your portable so that you don't have to pay for separate components for both.

Or, to run with my networked computing idea, the setup for a family like mine:

Apple flexMac x4 - 4ghz G6, 1GB PC5200 RAM, 200GB SATA2 15000RPM, 802.15.3c, IEEE1394.d, all that other stuff I was talking about (appropriately updated for current trends)
Apple Processing Unit - Space: 8 processor modules, 8 RAM modules, 4 storage modules, 802.20 base station (BTO option)
Apple Storage Unit - Space: 1 processor module, 2 RAM modules, 12 storage modules
Apple Processor Module II x5 - 6ghz G6, 8MB L2 cache
Apple Storage Module III x3 - 1TB drives
Apple RAM Module II x3 - 2GB PC5200 DDR3
Apple HD Studio Wall Display x2 - 50" widescreen, 802.15.3c and IEEE1394d

Put simply, you'd have your less powerful machine that you carry around, and unlike the "personal units," the Processing and Storage Units would be very much like home-user Xserve solutions. You'd have hot-swappable components that are standardized and can be dropped in to expand as needed. Your storage is handled by a second computer, which is connected by an extremely fast backplane to your central processor. Using Xgrid or something like it, your portable flexMacs merely offload tasks to it, while working as graphically responsive tablets that you use to direct the larger screens at the cradles or on the HD monitors.

However, if you want something more "full featured," as Belly would have it, then you can buy the laptop shell and use modules in it to do so.

While this is a good idea for the corporate market, I hope you're not thinking this will work in the home. While you and I may understand it, the average computer user will not. It will be too confusing and lead them to buy a PC.
 
Calebj14 said:
While this is a good idea for the corporate market, I hope you're not thinking this will work in the home. While you and I may understand it, the average computer user will not. It will be too confusing and lead them to buy a PC.

Will it be more confusing than all the 'Intel Extreme,' 'DirectX,' 'Athlon XP,' 'high-speed' nonsense out there at the moment? No. The reason being that I'm not a marketing type and I'm not prettifying the concept, because this is a discussion between people who at least somewhat understand computers.

In fact, I think that making upgrades use modules that slot in through easily labeled ports would be a quantum leap forward in usability for the consumer. No more cracking the case to upgrade your RAM or HD. All you need is to buy a module and put it into the hole with the right symbol.
 
thatwendigo said:
Will it be more confusing than all the 'Intel Extreme,' 'DirectX,' 'Athlon XP,' 'high-speed' nonsense out there at the moment? No. The reason being that I'm not a marketing type and I'm not prettifying the concept, because this is a discussion between people who at least somewhat understand computers.

In fact, I think that making upgrades use modules that slot in through easily labeled ports would be a quantum leap forward in usability for the consumer. No more cracking the case to upgrade your RAM or HD. All you need is to buy a module and put it into the hole with the right symbol.

I am afraid that I would have to agree with Calebj14 on this one. As intriguing as your system may be, I think there is no way to dress it up for the average consumer. I am sure a good number of new PC sales go to consumers who could have been just as well served by adding RAM, getting a new video card, etc. Not to mention that the vast majority of computer sellers (CompUSA, Dell, Best Buy, even Apple) would prefer that customers continue to buy whole new machines.

Hot-swappable drives have been around for a long time, but they are not in consumer computers; it's not that they are confusing, it's mostly that people have no interest in them. I think the main thing holding up your plan is that people are simply not ready for that sort of modular computing in their home. They just wouldn't see the need for it.
 
pjkelnhofer said:
As intriguing as your system may be, I think there is no way to dress it up for the average consumer. I am sure a good number of new PC sales go to consumers who could have been just as well served by adding RAM, getting a new video card, etc.

Yes, but a large part of that is that the upgrade process is still too esoteric for most people to handle. Mac users hold onto their machines longer, and a lot of us keep using them even when a new one is bought. If that were done away with (aside from major changes in architecture), then why wouldn't consumers flock to the banner?

What's easier? Opening your case, knowing which slot RAM goes into, where to pull the processor and put the new one, how to attach the cooling system, and where to plug the molex connectors, or being able to match symbols and push something into a bay?

THINK DIFFERENT! The market is one way now, but that doesn't mean it has to stay that way.

Not to mention that the vast majority of computer sellers (CompUSA, Dell, Best Buy, even Apple) would prefer that customers continue to buy whole new machines.

The people who really need new machines will, and they're the ones who would shell out in the first place. Consumers, smart ones at least, will be looking at the upgrade path, and it's quite possible you could make money there by having an easy way for people who would otherwise be afraid to do it.

Hot-swappable drives have been around for a long time, but they are not in consumer computers; it's not that they are confusing, it's mostly that people have no interest in them.

Five years ago, people didn't have much of an interest in home digital video, wireless networking, fast peripherals that could draw power from the system and stream media very quickly (FireWire, which Apple invented), all-in-one computers, or digital music players (the iPod really opened the market). Ten years ago, hardly anybody thought that text messaging, the web, DVDs as storage media, or third-party gaming hardware would really do much. Fifteen years ago, home networking was a joke, screens were tiny, computers were more expensive for less capability, and GUI systems were still a niche market that was breaking into new ground. Twenty years ago, nobody used a mouse or a GUI with any regularity, and 5.25" disks and tape drives were the rule.

Do you really want to say that you know what will and won't be of interest tomorrow, what Apple will either create or adopt and show the world that it really does need?

I think the main thing holding up your plan is that people are simply not ready for that sort of modular computing in their home. They just wouldn't see the need for it.

Raise a generation on something, and they'll be used to it. I have reflexes my dad couldn't even dream of, and so does my sister, because we've augmented natural skill with eye-hand coordination skills that come from arbitrary, abstract association in video games. He can't play Halo, but I can decimate whole servers because I can lead targets, judge incoming projectiles, adapt, and do other things that have also helped me in playing real sports.
 
pjkelnhofer said:
I think the main thing holding up your plan is that people are simply not ready for that sort of modular computing in their home. They just wouldn't see the need for it.
That doesn't mean the need can't or won't eventually exist, maybe sooner rather than later.

One could claim that (too) many half-baked products/services show up prematurely on the market, with hype and frenzy, then quickly fade into some niche or die off. Later, an apparently similar but genuinely superior product/service arrives. Perceptions are still tainted by the first-generation attempt and failure, so consumers are reluctant to try something different, regardless of its quality and value. It seems to be a form of backwards thinking.

Actually, the "consumer mentality" toward technology is a big yawn.

I'm not gonna continue or refine my ideas since thatwendigo has already posted another dissertation that covers the more forward-thinking theme of where I wanted to go. And his last comments remind me of Douglas Rushkoff's "Playing the Future: What We Can Learn from Digital Kids" that I just started reading.
 
thatwendigo said:
Yes, but a large part of that is that the upgrade process is still too esoteric for most people to handle. Mac users hold onto their machines longer, and a lot of us keep using them even when a new one is bought. If that were done away with (aside from major changes in architecture), then why wouldn't consumers flock to the banner?

What's easier? Opening your case, knowing which slot RAM goes into, where to pull the processor and put the new one, how to attach the cooling system, and where to plug the molex connectors, or being able to match symbols and push something into a bay?

THINK DIFFERENT! The market is one way now, but that doesn't mean it has to stay that way.

Yes, but knowing which "bay" to put something into would be just as confusing. "Modules" could get lost, just like anything else, and would probably be so troublesome that consumers wouldn't like to use them at all.


The people who really need new machines will, and they're the ones who would shell out in the first place. Consumers, smart ones at least, will be looking at the upgrade path, and it's quite possible you could make money there by offering an easy way for people who would otherwise be afraid to do it.


The point is that consumers don't have the time to be "smart." The very word "upgrade" sends shivers up their spines, not to mention images of computer geeks crashing their computer. They would be afraid to do it. Consumers want a machine that works, right out of the box. Face it, you're not in that category, and neither are we. We want to be able to upgrade our machines, but they don't. They just want it to work.

Five years ago, people didn't have much of an interest in home digital video, wireless networking, fast peripherals that could draw power from the system and stream media very quickly (FireWire, which Apple invented), all-in-one computers, or digital music players (the iPod really opened the market). Ten years ago, hardly anybody thought that text messaging, the web, DVDs as storage media, or third-party gaming hardware would really do much. Fifteen years ago, home networking was a joke, screens were tiny, computers were more expensive for less capability, and GUI systems were still a niche market that was breaking into new ground. Twenty years ago, nobody used a mouse or a GUI with any regularity, and 5.25" disks and tape drives were the rule.

All of those things took time. The better accepted ideas (GUI, iPod, disks/diskettes) took less time than the less accepted ideas (mouse, tape drives). While this may come some day, it's not going to be in the next round of Apple products.

Do you really want to say that you know what will and won't be of interest tomorrow, what Apple will either create or adopt and show the world that it really does need?

Do you?


Raise a generation on something, and they'll be used to it. I have reflexes my dad couldn't even dream of, and so does my sister, because we've augmented natural skill with eye-hand coordination skills that come from arbitrary, abstract association in video games. He can't play Halo, but I can decimate whole servers because I can lead targets, judge incoming projectiles, adapt, and do other things that have also helped me in playing real sports.

Yes, I agree, but the world is not ready for the ideas you've suggested. Not now. Maybe in 5 or 10 years, but face it, would you want your PC using friends calling you because they can't find their Windows module? :D
 
thatwendigo said:
Will it be more confusing than all the 'Intel Extreme,' 'DirectX,' 'Athlon XP,' 'high-speed' nonsense out there at the moment? No. The reason being that I'm not a marketing type and I'm not prettifying the concept, because this is a discussion between people who at least somewhat understand computers.

In fact, I think that making upgrades into easy-to-use modules that slot into clearly labeled ports would be a quantum leap forward in usability for the consumer. No more cracking the case to upgrade your RAM or HD. All you need is to buy a module and put it into the hole with the right symbol.

A quantum leap that the consumer is not ready for. Give it time, it may happen.
 
sjk said:
Doesn't mean the need can't/won't eventually exist, maybe sooner than later.

I am sure that the need will exist. I just don't think that it is there right now, and if Apple introduced the technology at this year's WWDC there would be a lot of hoopla and very little revenue from it.

One could claim that (too) many half-baked products/services show up prematurely on the market, with hype and frenzy, then quickly fade into some niche or die off. Later, an apparently similar but genuinely superior product/service arrives. Perceptions are still tainted by the first-generation attempt and failure, so people are reluctant to try something different, regardless of its quality and value. It seems to be a form of backwards thinking.

Actually, the "consumer mentality" to technology is a big yawn.

That is the problem. Right now, rather than focus on pushing the technology envelope, I honestly think that Apple needs to focus on getting people to buy their computers. As amazing as the ideas may be, if the average consumer is not ready for it, then it is not going to sell, and Apple will have innovated itself into a Digital Music company.

The iMac wasn't revolutionary when it came out, but it was a very important computer for Apple because it got people buying Macs again. That is what Apple needs right now. Not a paradigm changing computer. Something powerful and affordable would do nicely.
 
Calebj14 said:
Yes, but knowing which "bay" to put something into would be just as confusing. "Modules" could get lost, just like anything else, and would probably be so troublesome that consumers wouldn't like to use them at all.

Is the difference between USB and CAT-5 hard? No? Then why would modules have to be?

You're just not thinking about this rationally, man. There is no reason at all that Apple couldn't make some system by which the modules are perfectly capable of slotting into one bay but not the others. It would be ridiculously simple, in fact.

The point is that consumers don't have the time to be "smart."

That's how we ended up in this current political mess... ;)

The very word "upgrade" sends shivers up their spines, not to mention images of computer geeks crashing their computer. They would be afraid to do it. Consumers want a machine that works, right out of the box. Face it, you're not in that category, and neither are we. We want to be able to upgrade our machines, but they don't. They just want it to work.

I grok this, but I think that you're wrong. Five years ago, the average consumer wouldn't have spent $400 on a portable music player, either, but they're doing it in droves right now. More and more people are learning about their computers, partly because of people like me who are trying to teach and not just fix when something goes wrong.

It may take time, but upgrades are going to become more important. Nobody wants to buy a whole new computer just because the hard drive is full.


I'm perfectly happy to speculate, dream, and brainstorm.

Yes, I agree, but the world is not ready for the ideas you've suggested. Not now. Maybe in 5 or 10 years, but face it, would you want your PC using friends calling you because they can't find their Windows module? :D

Sure, why not? They already call me when they can't find their network and that doesn't even usually have a hardware problem. :D

pjkelnhofer said:
I am sure that the need will exist. I just don't think that it is there right now, and if Apple introduced the technology at this year's WWDC there would be a lot of hoopla and very little revenue from it.

I disagree. Most of the less future-tech ideas I bring up are things that are going to be standard in PCs over the next year, and hopping on early is a good way to hook even more of the fence-sitters than Apple already has.

As I saw someone at Slashdot put it, Apple is the holy grail for people who really know computers right now. They're a fast, stable, secure *NIX with a pretty GUI and good usability features, while still retaining the command line. It's all on well-designed and implemented hardware, and it keeps getting faster with every release.

That is the problem. Right now, rather than focus on pushing the technology envelope, I honestly think that Apple needs to focus on getting people to buy their computers. As amazing as the ideas may be, if the average consumer is not ready for it, then it is not going to sell, and Apple will have innovated itself into a Digital Music company.

Apple needs to push the technology in order to sell computers, though. They're a one-company army facing the entirety of the world, with a very limited number of allies in their corner. Without innovation in both hardware and software, they're dead in the water and the cheaper, more expansive PC market swallows them up. You can't out-Dell Dell, and you can't out-Microsoft... Well, you get the idea.

An entrenched market is hard opposition to face, but that's the reality. Apple needs the wow factor, and there's no way around it.

The iMac wasn't revolutionary when it came out, but it was a very important computer for Apple because it got people buying Macs again. That is what Apple needs right now. Not a paradigm changing computer. Something powerful and affordable would do nicely.

Wrong. It was a three-cord internet computer with most of the performance of a tower at the time, combined with a perfectly good monitor and optical drive. It also adopted the emerging USB standard before anyone else, and later models were among the first production computers to use 802.11 wireless with any commercial success. Compared to the rat's nests that many PCs become, the iMac was a godsend for consumers, and it was pretty enough that you didn't feel you needed to hide it.
 
thatwendigo said:
No more cracking the case to upgrade your RAM or HD. All you need is to buy a module and put it into the hole with the right symbol.

Actually most cases just have screws or tabs which allow them to be opened without physically damaging it. If you're cracking your case, you should maybe take a step back and re-read the instructions that came with it. <boom tish>
 