Re: 25 GHz?

Originally posted by Ja Di ksw
Not to be rude, but what does someone need 25 GHz for? Honestly, that is just insane, especially if they are dual processors. I know people making music or animation or whatever would like them, but for your average Joe, do we really need that much?

With this kind of speed we could finally call a Japanese friend and talk to him in plain English: the software would translate in real time and speak it out in Japanese.
Everyday things like that will need some serious horsepower.

I'll bet there will be more. :cool:
 
Agreed

I remember when my brother bought his first computer and was told that no one would ever need more than 640k of memory.

We can't measure the future need for power by what is currently available, but rather by what we can imagine.

The example of real-time translation is a good one.

Does anyone remember Apple's vision for computing in their short 1987 video "The Knowledge Navigator"? It included an intelligent agent that created a more natural interface. I can't wait to get rid of the keyboard!
 
Good to see Marklar's still alive. It plays an important role in the checks and balances system going on.

As for the naysayers about 45 nm and 25GHz, I'm glad none of you work for Apple/IBM.
 
more power

Don't forget carbon nanotube interconnects. No, they aren't in production yet, but they show great promise for bypassing crosstalk at 45 nm. (EETimes.com?)
That kind of computing power could enable real-time translation, high-level AI, near-real-time universal database searches, and direct man/machine interconnection. (kevinwarwick.com)
 
Re: Re: Re: 25 GHz?

Originally posted by Jeff Harrell
AI is not a computationally bound problem. It's not like we have AI systems out there right now that can do the job but are too slow. AI is a theoretically bound problem: we don't even know if it's possible yet, much less if it's possible with existing technology, much less whether existing technology is fast enough.

Well, it's both. But your point is taken.
 
Originally posted by Yosh
There's an inconsistency in this article.

1) Steve says we will reach 3 GHz in the next 12 mos (say by Aug-Sept 2004)
2) Article says 3GHz will NOT be accomplished with the G5 which will top out at 2.6-2.8GHz.
3) Article says the PPC980 will start at 2.6-3 GHz and top out at 4.5-5 GHz.
4) Article says the 980 chips are targeted at the end of 2004 (which I assume to mean Nov-Dec 2004).

Thus, the article contradicts what Steve had said at the WWDC Keynote (3GHz by Aug-Sept 2004). Either we will have a 3 GHz G5 (970 chip) or the 3 GHz 980 will be available earlier than the "end of 2004". In either case, throw in the nonsense about 20-25 GHz by 2007-2008 and the article doesn't seem to be all that realistic.

Read it again, slowly and carefully, it makes perfect sense.
 
Judging by the 970 pipeline stages these speeds could be possible

How high a frequency a processor can achieve on a given process size has a lot to do with the number of pipeline stages the core has. The 970 has 16 stages for integer instructions, 21 stages for floating-point instructions, and up to 25 stages for AltiVec (vector) instructions. Compare that to the Pentium 4 (20 pipeline stages), which will peak at about 3.2 GHz on the same process size as the 970, and it looks like it could very well be possible for the 970 to hit 2.6-2.8 GHz. If the person giving this information to MacRumors is incorrect and just guessing, he or she is probably not off by much.

Look at it another way: the POWER4 has 12 integer pipeline stages and the 970 has 16. The fastest POWER4 chip now runs at 1.7 GHz. Since the 970 has 1/3 more integer pipeline stages than the POWER4, the 970 might be able to achieve 2.2+ GHz in the next few months. If this is in the realm of possibility, then I'd expect Apple to announce faster G5 speeds before the end of this year.
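As a rough sanity check of that estimate, here is the arithmetic as a toy model (assuming, unrealistically, that per-stage delay stays constant so clock scales linearly with pipeline depth):

```python
# Toy linear-scaling model: clock ~ pipeline depth. Real chips don't
# scale this cleanly, but it shows where the "2.2+ GHz" figure comes from.
power4_ghz = 1.7      # fastest shipping POWER4 at the time
power4_stages = 12    # POWER4 integer pipeline stages
ppc970_stages = 16    # PPC 970 integer pipeline stages

estimate = power4_ghz * ppc970_stages / power4_stages
print(f"Naive 970 estimate: {estimate:.2f} GHz")  # ~2.27 GHz
```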
 
i want... no, NEED more than 25GHz!!!

don't stop at just 25GHz, because we'll need even more than this. I'm not even talking about translation... even some high quality voice recognition could eat up a large portion of this processing power.

I'm not just talking about a new version of viavoice, but real-time recognition that can do full speed transcription [edit: even with accents, which hamper the functionality of viavoice for my parents], intelligent automated punctuation, and do so with the same hit to total cpu-load as a keyboard has on my imac g4/800.

add to that translation, another area where there can be orders of magnitude improvements. then, you have speech generation, from text, using preset voiceprints to automatically generate an appropriate rate, rhythm, and tonal inflection.

put together the speech recognition, translation, and voice speech generation and you have something which could potentially max-out a dual 25GHz.
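for what it's worth, that three-stage pipeline would look something like this -- every function here is a hypothetical stub standing in for a model that doesn't fit on a 2003 desktop:

```python
# Hypothetical real-time translation pipeline. Each function is a stub;
# the real versions are what would eat a dual 25 GHz machine alive.
def recognize_speech(audio_frame: bytes) -> str:
    """Speech-to-text (stub): would run a large acoustic model."""
    return "hello, how are you?"

def translate(text: str, target: str = "ja") -> str:
    """Machine translation (stub): English to the target language."""
    return "konnichiwa, ogenki desu ka?"

def synthesize(text: str) -> bytes:
    """Text-to-speech (stub): would apply a stored voiceprint to set
    rate, rhythm, and tonal inflection."""
    return text.encode("utf-8")

def pipeline(audio_frame: bytes) -> bytes:
    # All three stages must finish within one audio frame (~20 ms) to
    # stay real-time -- that's where the demand for raw speed comes in.
    return synthesize(translate(recognize_speech(audio_frame)))
```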


what if you could hum a tune and it would automatically go through your itunes library and find that song, and if not find it from the itunes music store?
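one well-known trick for query-by-humming is Parsons code: reduce the melody to whether each note goes up, down, or repeats, then match contours, which makes the search immune to how badly out of key you hum. a minimal sketch (the song library is made up):

```python
def parsons(pitches):
    """Reduce a note sequence to its melodic contour: U(p)/D(own)/R(epeat)."""
    return "".join(
        "U" if b > a else "D" if b < a else "R"
        for a, b in zip(pitches, pitches[1:])
    )

# Tiny made-up "library" of melodies as MIDI note numbers.
library = {
    "Song A": [60, 62, 64, 62, 60],
    "Song B": [67, 67, 65, 64, 62],
}

def match_hum(hummed_pitches):
    code = parsons(hummed_pitches)
    return [name for name, notes in library.items() if parsons(notes) == code]

# Humming a transposed version still matches, since only contour counts.
print(match_hum([48, 50, 52, 50, 48]))  # -> ['Song A']
```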

how about visual recognition? if you click on a picture of someone in a movie and it searches your iVideo (movie equivalent of itunes which stores your entire DVD collection) library to see that this star actor was the waiter extra in some movie 10 years before, which is where you recognize them from.

and some quality virtual-reality porn will require a quad GX (the "G ten") 42GHz with a 3d hologram projector... hopefully it will be ready by the time i find someone, get married, have kids, and get left by her. one can only hope...

afc
 
Re: Judging by the 970 pipeline stages these speeds could be possible

Originally posted by Phinius
How high a frequency a processor can achieve on a given process size has a lot to do with the number of pipeline stages the core has. The 970 has 16 stages for integer instructions, 21 stages for floating-point instructions, and up to 25 stages for AltiVec (vector) instructions. Compare that to the Pentium 4 (20 pipeline stages), which will peak at about 3.2 GHz on the same process size as the 970, and it looks like it could very well be possible for the 970 to hit 2.6-2.8 GHz. If the person giving this information to MacRumors is incorrect and just guessing, he or she is probably not off by much.

Look at it another way: the POWER4 has 12 integer pipeline stages and the 970 has 16. The fastest POWER4 chip now runs at 1.7 GHz. Since the 970 has 1/3 more integer pipeline stages than the POWER4, the 970 might be able to achieve 2.2+ GHz in the next few months. If this is in the realm of possibility, then I'd expect Apple to announce faster G5 speeds before the end of this year.

It isn't just pipeline depth that determines the clock rate of a processor. Buffering, caches, die size, power dissipation, parallelism, latencies, and the general architecture of the CPU all go into how high it can scale. Take the Pentium 4 as an example, since it was designed to scale to high clock rates. A relatively small number of ALUs, FPUs, and general decoders minimizes the buffering and caching (which lower clock speeds) needed to feed the processor. A tiny 8 KB data cache with a two-cycle latency, a low-latency L2 cache, and double-pumped ALUs that cut the latency of the ADD instruction keep latencies low. Strong ALUs and decoders keep the distance between "pulses" (the points in time when registers can be loaded with new values) short, and finally the 20-stage pipeline lets each stage do less per clock cycle and thus scale to much higher clock rates. That's not to say the PPC970 can't scale any higher -- I haven't looked over all the technical specifications yet -- but its design philosophy seems to be very different from the P4's, with its good number of ALUs/FPUs/decoders, large L1 cache, and general focus on executing multiple instructions every clock cycle.
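A quick toy model shows why deeper pipelines clock higher, and why the returns diminish: the total logic depth gets divided across stages, but every stage pays a fixed latch/clock-skew overhead. The numbers below are illustrative, not measurements of any real chip.

```python
# Toy cycle-time model: cycle = (logic depth / stages) + latch overhead.
# Frequency = 1 / cycle time. Illustrative numbers only.
LOGIC_DEPTH_PS = 4000   # total combinational delay of the pipeline (ps)
LATCH_OVERHEAD_PS = 90  # per-stage register + clock-skew overhead (ps)

def freq_ghz(stages: int) -> float:
    cycle_ps = LOGIC_DEPTH_PS / stages + LATCH_OVERHEAD_PS
    return 1000.0 / cycle_ps  # 1000 ps per ns -> GHz

# Deeper pipelines clock higher, but each extra stage buys less, because
# the fixed overhead comes to dominate the shrinking logic per stage.
for n in (7, 12, 16, 20):
    print(f"{n:2d} stages -> {freq_ghz(n):.2f} GHz")
```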
 
:(

I guess I'll have to wait 'til 2008 before getting a G5 with teeth -- I mean, why waste money on a 2 GHz model when a 20 GHz model is just around the corner, five years from now?
 
need for faster computers

I agree with both sides of this argument. We can already see today that in certain situations computers are fast enough. For a business machine running FileMaker Pro, a spreadsheet, a word processor, a browser, and an email client, something like a 1 GHz processor is plenty. The market recognizes this, and businesses are not upgrading as often as they used to.

On the other hand, new applications will arise that will take advantage of the much faster computers (and graphics chips). Many have been mentioned already.

AI will improve but personally I think they have a lot more theory to develop before they will benefit a lot from faster computers. Something key is missing at the core which is holding it back.

In many other areas, though, scaling up the processing power (and memory, graphics chips, storage, broadband speed, etc.) will help a lot. Voice recognition will benefit. All sorts of modeling and rendering will become much more common. It remains to be seen whether this is done with style (like Keynote) or in a tacky fashion (like PowerPoint).

We could forecast a revolution in computation, much as we had a revolution in electronics. Electronics replaced many mechanical things (springs, timers, indicators). Computation rides on electronics, but will replace the "analog" way of doing things.

For example, sensors in a room might be able to measure how a stereo sounds and modify the speakers/signal to produce really beautiful sound from relatively cheap speakers and amplifiers. Embedded computers in cars will continue to improve the efficiency, smoothness, and safety of driving. Embedded computers in displays will be able to model the behavior of nonlinear devices and adapt signals so that they produce undistorted results.

I have seen something like this already in the area of digital signal processors. In the old days we used analog amplifiers to shape and amplify pulses before they were digitized. Now the original signal is digitized and all of the pulse extraction, shaping and measurement is done numerically. As much as has been done it is just scratching the surface.
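As a tiny illustration of doing the shaping numerically: a moving-average (boxcar) filter, the building block of the trapezoidal shapers used in spectroscopy DSP, recovers a digitized pulse's amplitude in a few lines. The values here are illustrative only.

```python
# Digital pulse shaping instead of an analog shaping amplifier: a
# boxcar (moving-average) filter over a digitized step pulse.
def boxcar(samples, width):
    """Running mean over a sliding window of `width` samples."""
    out = []
    acc = 0.0
    for i, x in enumerate(samples):
        acc += x
        if i >= width:
            acc -= samples[i - width]  # drop the sample leaving the window
        out.append(acc / width)
    return out

# A noiseless digitized step: flat baseline, then a jump to amplitude 10.
signal = [0.0] * 8 + [10.0] * 16
shaped = boxcar(signal, 4)

# The flat top of the shaped output recovers the pulse amplitude (10.0),
# the measurement an analog shaper used to provide.
print(max(shaped))
```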

The hardware improvement will probably lead software development. Some of these are very hard problems. However, as a problem is solved it can be implemented very quickly and inexpensively.

As to how fast computer clocks can go, it is true that traditional semiconductor design seems to be coming up against physical limits. However, everyone is working on alternate technologies. I read somewhere that IBM forecast that around 2010 they would start introducing components using carbon nanotubes. These will be very fast.
 
Altivec Pipelines

Originally posted by cuneglasus
This article is rubbish. It can't even get simple facts about the 970 straight. The 970 has 4 AltiVec pipelines, not 2. (It has a simple integer, complex integer, floating-point, and permute unit, just like the G4.) Why do people want to publish such lies?

Yeah, you're right about the four AltiVec pipelines. Whoever wrote this got that confused with the dual floating-point units in the G5. I'm hoping whoever this source is, he's merely lousy at passing info along and not a confabulator.
 
Re: Re: Re: Re: Re: Future of the PPC? (970, 980, 990, 9900)

Originally posted by jouster
Hmmm. I dunno. I am embarrassed to admit that I don't know what serial or parallel ports are!!!!

Originally posted by XnavxeMiyyep
I dunno either. They're probably something already outdated. I think they still make PC's that use pre-USB ports to hook up the keyboard.

Don't be embarrassed, you are a better person for not knowing. :)

Yes, serial and parallel ports date back to the pre-USB days. They are antiquated, yet PC makers continue to put them on their machines. I think one of them is really big, by the way -- larger than an Apple ADC connector. It takes up more space on a laptop.
 
Originally posted by Rincewind42
Coding in assembly is going the way of Latin. Being able to read assembly however is still the hallmark of a good programmer on his platform. There will always be a need to optimize your code because people will always expect things to be faster. So, you can calculate pi to 10 million digits in 2 seconds? Why couldn't you do it in 1 second while decrypting a 10 GB file? Sure, it's great to be able to put an app together without writing more than a handful of code - but if you can do it so can thousands of others and your app really isn't all that distinctive in that sense then is it? A distinctive app requires a programmer to take full advantage of all of the tools available to him or her - including optimizing.

Yup, and being able to read Latin is the sign of a good scholar, but very few people actually speak it... My metaphor is complete.:)

Why not be able to calculate pi to 10 million digits in half the time? Because very few people need to... There are a small number of people that need very high computation speed for special problems, but most of what I'm reading in this thread is from people that don't.

If you're doing genetics research, or particle physics then optimize your code. If you're building office apps and web browsers you want applications that add functionality and are able to interact cleanly. Making Safari an embeddable technology must surely bloat the code a bit, but rewriting and optimizing it for every new application just slows down development with no actual benefit to the user.

I'd guess that only 10% or less of Photoshop is hand tuned.

Office is one of the most successful applications around, and it certainly isn't optimized! Thousands of others could certainly build a word processor, but Word and WordPerfect took the top spots by adding features quickly.

I'm not contradicting you (is that the patter of thousands of little feet I hear?!), but trying to broaden the perspective. The initial voice recognition apps will be optimized for speed, the java app you send over an IM client to sing Happy Birthday to your girlfriend won't be.
 
Originally posted by cuneglasus
This article is rubbish. It can't even get simple facts about the 970 straight. The 970 has 4 AltiVec pipelines, not 2. (It has a simple integer, complex integer, floating-point, and permute unit, just like the G4.) Why do people want to publish such lies?

4 execution units. 2 instruction queues.

http://www.arstechnica.com/cpu/02q2/ppc970/ppc970-2.html

Not saying the article isn't bogus, but that isn't the reason why...
 
Re: Agreed

Originally posted by solafide
Does anyone remember Apple's vision for computing in their short 1987 video "The Knowledge Navigator"? It included an intelligent agent that created a more natural interface. I can't wait to get rid of the keyboard!

Did anyone read "Snow Crash"? Screw google, I want that Librarian!

For anyone who thinks we don't need more computing power, sci-fi is a good place to start.
 
Re: i want... no, NEED more than 25GHz!!!

Originally posted by asim
what if you could hum a tune and it would automatically go through your itunes library and find that song, and if not find it from the itunes music store?

Now THAT is a kick ass idea! Here's something related that I've always thought would be incredibly cool. You know how there are some small parts of songs that sound really good to you, like they're hardwired to your soul? Such samples might come from a variety of music, and be unrelated to genre or band/composer. What if you could play those snippets, and your 50 THz machine would be able to search the "iTMS" to find all songs which have large portions of emotionally similar music?
 
Originally posted by Analog Kid


If you're doing genetics research, or particle physics then optimize your code.

What's great about future speed increases is that it will break down walls of what is possible for an average genetics or physics researcher. They will be able to write sloppy programs that aren't optimized and still get useful computation done. Right now too much of this is dependent on programmers who don't know the subject or researchers who can't code as well as the programmers. It'll bring supercomputer-level computation to the masses (if you count Ph.D. researchers as the masses).

Dros
 
A little clarification added to the article

20-25GHz speeds are targeted by 2010-2011.

arn
 
Originally posted by arn
A little clarification added to the article

20-25GHz speeds are targeted by 2010-2011.

arn

[begin sarcasm] OH! Well this isn't big news! I expected 25 GHz by 2011, but not by 2008! It would have been huge if they'd gotten 25 GHz chips out by 2008, but everyone will have 25 GHz by 2011. Stupid IBM. [end sarcasm]

Still good news, even if it is three years delayed. :)
 
Re: Re: Judging by the 970 pipeline stages these speeds could be possible

Originally posted by Cubeboy
It isn't just pipeline depth that determines the clock rate of a processor,

Yes, but if you increase the pipeline stages on a given design by 1/3, it should top out at a much higher frequency. The POWER4 has 12 integer stages and the 970 has 16, yet the POWER4 is at 1.7 GHz and the 970 is only at 2 GHz. IBM obviously increased the pipeline stages for the 970 to raise the top-end frequency. How high a frequency it can achieve is unknown at this time, but 2 GHz is very low for 16 pipeline stages. The AMD Athlon XP is at about 2 GHz with only 10-12 pipeline stages, and as I said, the 7457 G4 is expected to hit 1.8 GHz with a measly 7 pipeline stages.

Take the Pentium 4 as an example, since it was designed to scale to high clock rates.


The 970 is also designed to achieve high clock rates. That's why IBM gave it 16 integer stages and 21 floating-point stages. You don't honestly believe the IBM design is so horribly inefficient that it can't get significantly beyond 2 GHz on a .13-micron process, do you?

That's not to say the PPC970 can't scale any higher; however, its design philosophy seems to be very different from the P4's.


The POWER architecture is designed to execute many instructions every clock cycle, but with the added pipeline stages the 970 is also designed for much higher frequencies. That's obviously why IBM added them, and a top-end frequency of 2 GHz on a .13-micron process would be a pathetically poor return on all those extra pipeline stages.
 
Re: 25 GHz?

Originally posted by Ja Di ksw
Not to be rude, but what does someone need 25 GHz for? Honestly, that is just insane, especially if they are dual processors. I know people making music or animation or whatever would like them, but for your average Joe, do we really need that much? I guess that's what the iMacs and the like are for, though by then they will have 10 or 20 GHz or whatever.

Remember this? :)

"640K ought to be enough for anybody"
- Satan, um, I mean, Bill Gates.

3 truisms of computers:
1). You can never have enough memory.
2). You can never have too much disk space.
3). You can never ever have a system that is too fast.

One use would be folding. Imagine how many work units you could turn out with a system at 25 GHz! Instead of a render farm, you render on your home computer! Predicting weather and oil exploration will no longer be the private domain of supercomputers. The government will be changing its attitude when performance eclipses anything an Intel or compatible based PC can accomplish.

You get the idea. Too much speed is just not possible.
 