but

Who's to say that the new IBM development is for Apple? Doesn't IBM supply anyone else, and/or themselves? It's positive for IBM and the industry, but who says it's for Apple?
 
Originally posted by Abstract

I wish Jobs would stare straight into the eyes of the head honcho at Moto, push him over, and sit on his face with his pants down. Then he'll know the kind of crap Apple has put up with over the last few years. "Taste this, assmonkey!!!"

Remind me to never deliver anything late to you ... ;)
 
Originally posted by mvc
It looks like Steve is saying to himself, "Damn - his is bigger - I gotta get a wafer enlargement."

See, size IS important, Mr Moto! - bigger wafers with smaller dies = higher yield. (Yes, I know it's not quite that simple)

Although it looks that way, it's a very clever camera trick. If you compare each wafer proportionally to the hands and body, they're the same size. It's just that the camera is at the Wafer Man's waist looking up, while Steve's camera is at some distance away looking straight at him.

Which makes sense, because they're the same size wafer (the size wafer that Fishkill uses).
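The "bigger wafers with smaller dies = higher yield" point can be put in rough numbers with the textbook dies-per-wafer approximation (gross die count minus an edge-loss correction). A minimal sketch - the wafer and die sizes below are just illustrative, and defect density is ignored:

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Textbook approximation: usable wafer area divided by die area,
    minus a correction for partial dies lost around the wafer's edge."""
    r = wafer_diameter_mm / 2
    gross = math.pi * r**2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

# Illustrative comparison: a 100 mm^2 die on 200mm vs 300mm wafers.
small = dies_per_wafer(200, 100)  # -> 269
large = dies_per_wafer(300, 100)  # -> 640
print(small, large)
```

The 300mm wafer has 2.25x the area of the 200mm one but yields about 2.4x the dies, because proportionally less silicon is wasted as partial dies around the edge - and that's before counting the yield win from smaller dies being less likely to land on a defect.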
 
Originally posted by Phil Of Mac
Although it looks that way, it's a very clever camera trick. …Which makes sense, because they're the same size wafer (the size wafer that Fishkill uses).
Ah, I just assumed Steve was holding a wafer from Moto, which I vaguely recall reading are smaller 8-9? inch wafers.

Still, you must be right, if the wafer Steve was holding was from Moto, there would be big coffee stain rings & donut crumbs embedded in the wafer from being used as a coaster.

:D
 
Originally posted by Fender2112
WOW! Those are great articles. I had no idea such technology existed. You can't blame Apple for going with IBM. This stuff is amazing. :)

Yes, I quite agree. This is amazing stuff. And it's directly related to the area that I want to work in once I finish my PhD. I think that we do live in interesting times...
 
Re: but

Originally posted by new user
Who's to say that the new IBM development is for Apple? Doesn't IBM supply anyone else, and/or themselves? It's positive for IBM and the industry, but who says it's for Apple?

Uh... If they develop 65nm silicon technology, you can pretty well guess that they will apply it to most if not all of their products.

I mean, why would they try to use the 130nm process for the PPC970 when there's the perfectly good .27 micron process?

My point is that even if this technology isn't immediately applied to the PPC line for Apple products, I think we can be sure that it will be applied in the foreseeable future. Will it be applied to other products, too? Absolutely.
 
from the articles mentioned before:

"Nanotubes can be microns long, but are only 1.4 nanometer in diameter, inviting the mathematical approximation of one-dimensionality."

Woah.:eek:
 
Originally posted by punter
from the articles mentioned before:

"Nanotubes can be microns long, but are only 1.4 nanometer in diameter, inviting the mathematical approximation of one-dimensionality."

Woah.:eek:

Yes, I don't have a scientific brain cell in my body, but I grasped more or less what the technology was all about in the article. Couldn't they just use a magnet to attract the metallic carbon bits?

Good old carbon. If it's good enough for the front forks on my 2003 bicycle, I won't mind seeing it at the core of my 2010 10GHz PowerBook.

I also loved the way the article talked about how "the scientists" did this and "the scientists" did that. I know they are scientists, but it just sounded like I was reading about some special beings beyond the level of us mere human end users.
 
Originally posted by mvc
It looks like Steve is saying to himself, "Damn - his is bigger - I gotta get a wafer enlargement."

In the case of size, where does that leave Microsoft? (It's a joke everybody, thanks!)

Intel procs???? I'm happy we have IBM and the RISC processor on our team...
 
Originally posted by mim
Because they don't just make chips, we'll probably see most of the solutions for smaller & faster processors (or alternatives) come from them, rather than from Intel - especially in the next 3 or 4 years as standard (semiconductor-based) chip manufacturing processes hit their walls, so to speak.

Here's a very recent quote from an interview with Intel's VP Pat Gelsinger:

"Make no mistake, we're doing a bunch of hypey, nanotechnology, like our atomic layer self composition formation. So if you want to go hype we'll talk about a number of those just to keep it fun. IBM makes it sound like silicon is nearing its end, but since their silicon business is trivial and ours is huge, of course they would be motivated to present it that way."

Sounds like a man trying to put a positive spin on doing as little as possible, by spreading FUD over anyone who has a broader vision!
 
It's really good to see that there are plans to go much smaller and much faster with the future of Mac CPUs. I just hope they don't hit a snag and can live up to the expectations.

We don't need another Motorola issue.....

D :D
 
Originally posted by Snowy_River
It will be interesting to watch how this plays out. The farther into the future, and the smaller the process they try to utilize, the more likely it is that we'll actually see the whole shebang replaced with something entirely different. When you start getting into 45nm sizes, you are getting into a range where the technological possibilities of fabricating it become smaller and smaller (no pun intended). Meanwhile, there are several other contending technologies that would allow much smaller transistor sizes, etc. So, it will prove interesting to see which technology takes the lead on this...

What do you mean "something entirely different"? Different from the PPC line of CPUs? If it's different from the concept of producing 45nm chips NOW, that's almost guaranteed, 'cause if they knew how to efficiently manufacture 45nm CPUs now, they'd be doing it. Obviously they're just in the R&D phase now, so any concept of what's to come a few years down the road is extremely vague at best. And as you know, when 45nm CPUs start hitting the streets, they'll be working on 20nm and smaller.
 
Re: Re: but

Originally posted by Snowy_River
I mean, why would they try to use the 130nm process for the PPC970 when there's the perfectly good .27 micron process?

Because all chips are not created equal. Smaller design rules are usually used on custom ASICs and other relatively small, simple non-commodity chips first. For instance, IBM has been making ASICs in a 90nm process for years, yet their PPC stuff is still 130nm or larger.
 
Light emitting carbon nanotubes

Do you realize what this means? No one has mentioned it yet, so I guess not.

Conversion of electricity to light and back. Nano-transistors based on light, not electrons.

Does the term Optical Isolinear chip ring a bell to anyone?

The future is coming, fast.

Do you have any clue how much research and technology is based on science fiction? In fact, that's a favorite game among scientists: they pick their favorite sci-fi and work on trying to make it reality... (when they can...) Nano-transistors at first can be made in the same design as current transistors, so at first we will see great miniaturization. However, then they will realize they are not limited by the constraints of silicon, and that they can *craft* their own circuits with far greater freedom. Then circuit boards will start looking more and more like isolinear chips.

Just don't smudge the optics. *joking*

Jaedreth

(Of course the technology is 300 years or more off in full implementation, but so is most of the cool stuff they're working on...)
 
Computer Chips 101 question...

This is probably excruciatingly basic for those in the know, but I'm finding it a bit counter-intuitive:

Why is it that a chip made with a 90nm process runs cooler than a chip made with a 130nm process? My intuition says that narrower = greater resistance, so more heat. (I'm not questioning whether the thinner process is cooler -- I believe them when they say it -- I'm just wondering what I'm missing...)
 
Chip basics

Actually, this is kinda counter-intuitive.

Think of a transistor like a tiny bucket that has to be filled with water (charge) every time it switches.

If you shrink the bucket, less water has to move on every switch, so each switch takes less energy.

That's what a process shrink does: smaller transistors and shorter, thinner wires have less capacitance, so less charge gets pumped in and out on every clock tick. On top of that, smaller transistors can run at a lower supply voltage, and switching power scales with the *square* of the voltage, so even a modest voltage drop pays off in a big way.

(Your intuition isn't wrong, by the way - narrower wires really do have more resistance per unit length - but that effect is swamped by the savings from lower capacitance and lower voltage.)

It's a crude example, but the smaller the transistors and wires, the less charge moved and the lower the voltage needed to complete a circuit, the less power it uses, the cooler it runs.
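To put rough numbers on it, the usual rule of thumb is that dynamic (switching) power follows P ≈ a·C·V²·f: activity factor times switched capacitance times voltage squared times clock frequency. A minimal sketch - all of the capacitance and voltage figures here are made-up illustrative values, not real process data:

```python
def switching_power(cap_farads, volts, freq_hz, activity=0.1):
    """Dynamic (switching) power: P = a * C * V^2 * f."""
    return activity * cap_farads * volts**2 * freq_hz

# Hypothetical chip: suppose a shrink roughly halves the switched
# capacitance and allows a lower core voltage, at the same 1.5 GHz clock.
p_old = switching_power(100e-9, 1.6, 1.5e9)  # older, larger process
p_new = switching_power(50e-9, 1.3, 1.5e9)   # smaller process

print(round(p_old, 1), round(p_new, 1))  # 38.4 vs 12.7 (watts)
```

Same clock, but halving the capacitance and dropping the voltage from 1.6 V to 1.3 V cuts the power to about a third - which is why a shrink runs cooler even though the individual wires are more resistive.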

Make sense?

Jaedreth
 
Originally posted by DrGonzo
What do you mean "something entirely different"? Different from the PPC line of CPUs? If it's different from the concept of producing 45nm chips NOW, that's almost guaranteed, 'cause if they knew how to efficiently manufacture 45nm CPUs now, they'd be doing it. Obviously they're just in the R&D phase now, so any concept of what's to come a few years down the road is extremely vague at best. And as you know, when 45nm CPUs start hitting the streets, they'll be working on 20nm and smaller.

If you read my post, then you know that what I meant by "something entirely different" was the likely move away from silicon-based electronics toward something else, such as nanotube-based electronics (though this is only one possible new direction). This will revolutionize electronics in general, which has been based on silicon for decades. This migration will likely be nothing short of the migration from vacuum tubes to silicon transistors...
 
Re: Re: Re: but

Originally posted by RalphNumbers
Because all chips are not created equal. Smaller design rules are usually used on custom ASICs and other relatively small, simple non-commodity chips first. For instance, IBM has been making ASICs in a 90nm process for years, yet their PPC stuff is still 130nm or larger.

My point still stands. Even if this development isn't implemented in the PPC line immediately, it will likely migrate there eventually.
 
Re: Light emitting carbon nanotubes

Originally posted by jaedreth
Do you realize what this means? No one has mentioned it yet, so I guess not.

Conversion of electricity to light and back. Nano-transistors based on light, not electrons.

Optoelectronics has been in development for a long time. Probably as much as thirty years, or more, if you look back to the foundation work (think fiber-optics in the communications industry). It has only been within the last five years or so that we have started to see real development of this technology.

Does the term Optical Isolinear chip ring a bell to anyone?

The future is coming, fast.

Do you have any clue how much research and technology is based on science fiction?

Uh... I think you've got that backwards. So much of science fiction is based on research and technology. It's only natural that technology develops in the direction of science fiction because science fiction writers tend to look at where technology is going and guess what it will look like farther down the road.

In fact, that's a favorite game among scientists: they pick their favorite sci-fi and work on trying to make it reality... (when they can...)

What scientists do you know that take this approach? None of them that I know do (myself included), at least not directly.

Nano-transistors at first can be made in the same design as current transistors, so at first we will see great miniaturization. However, then they will realize they are not limited by the constraints of silicon, and that they can *craft* their own circuits with far greater freedom. Then circuit boards will start looking more and more like isolinear chips.

I think that the scientists involved with this already 'have a clue' as to the possibilities of incorporating optoelectronics at the microchip level.

Also, the true next step beyond transistor-type electronics (optoelectronics will most likely exist side by side with nanotube transistors, not replace them) will be quantum electronics. Nanotubes, when used in the proper way, can act as electron gates (the quantum equivalent of a transistor). The prospective speed capabilities of a quantum computer put even an optoelectronic chip to shame.

(Of course the technology is 300 years or more off in full implementation, but so is most of the cool stuff they're working on...)

I wouldn't count on it taking that long...
 
"I wouldn't count on it taking that long..."

Then to quote one comedian, "Where are my flying cars?"

An article I read recently explains exactly why the technology we create far outpaces the technology we see in everyday life.

The US is the most technologically advanced country in the world (unless Japan beat us recently)...

The US economy is based on Capitalism, which is *great* for dealing with scarcity, but *horrible* for dealing with abundance.

There are a lot of great technologies out there we won't ever see, or at least not for a hundred years or so, not because they aren't possible, but because our *economy* cannot cope with an abundant supply of a cheap product. In order for Capitalism to run properly, any overabundance must be controlled and made scarce enough that it is worth something. Otherwise there won't ever be a "return on the investment".

And there have been some fantastic technologies (which I intentionally won't go into, even if asked) that if released, would cause too much turmoil in our economy for these simple facts.

It's why the Internet Age has come and gone, and people are *still* using dialup. I used to dream of days when optical networking would run all over the US, and everyone would have high-speed access. But it was only a pipe dream. It could have been done, and running the networks wouldn't have been nearly as expensive as one might think. However, whenever supply exceeds demand, capitalism doesn't work, so companies have to create scarcities in order to charge prices that will keep them in business, which is one of the causes of inflation.

I agree the technology will come far sooner than I suggested in my last post. But I don't expect to see such technology in common use for quite some time.

Jaedreth
 
Originally posted by DrGonzo
Why not? They've been making mainframes, CPUs, and other such hardware for years and years, man. They have the experience, the knowledge, and the capability to make very nice products (which, as I said before, they've been doing).

I know the G4 is sooooo high tech and it is sooooo much better than anything by IBM, Intel, AMD, or any other microprocessor manufacturing company.

:rolleyes: :rolleyes:

scem0
 
Originally posted by scem0
I know the G4 is sooooo high tech and it is sooooo much better than anything by IBM, Intel, AMD, or any other microprocessor manufacturing company.

:rolleyes: :rolleyes:

scem0
Hmmm .... He was talking about IBM, not Moto.
 
Re: "I wouldn't count on it taking that long..."

Originally posted by jaedreth
There are a lot of great technologies out there we won't ever see, or at least not for a hundred years or so, not because they aren't possible, but because our *economy* cannot cope with an abundant supply of a cheap product. In order for Capitalism to run properly, any overabundance must be controlled and made scarce enough that it is worth something. Otherwise there won't ever be a "return on the investment".

This is a bit of a gross oversimplification. All economies depend on "return on investment". It has always been true, in any culture with any economy, that the level of technology exceeds the level that is commonly available, but that has more to do with the fact that implementing new technology is very expensive, whereas a technology that has matured a bit and become "cheap" can much more easily be implemented.

Any technology that can be implemented as an abundant, cheap product is incredibly attractive. But, most of the time, the higher the tech needed, the more expensive it will be. For example, how scarce are plastic forks?

And there have been some fantastic technologies (which I intentionally won't go into, even if asked) that if released, would cause too much turmoil in our economy for these simple facts.

This sounds like it's coming from 'urban legends' to me.

It's why the Internet Age has come and gone, and people are *still* using dialup. I used to dream of days when optical networking would run all over the US, and everyone would have high-speed access.

I don't know what you're talking about. The number of people who are getting high speed connections is still increasing. For that matter, the number of people that are only just now starting to use the internet at all is still increasing. This is simply a matter of economics.

Would you argue that under Communism we'd all have high speed connections? How many people had home computers in the USSR before it dissolved, let alone internet access?

I agree the technology will come far sooner than I suggested in my last post. But I don't expect to see such technology in common use for quite some time.

That was my only point. I would say that it will likely be a decade (more or less), at least, before we see nanotube-based computers, let alone a quantum implementation of them. But I don't think it will be measured in centuries (barring some world-wide disaster that brings a new technological dark age).
 
Communism

Are you referring to the American connotation of Communism (Leninist-Stalinist Marxism) or the actual pure ideal of Communism (which is as impossible as a pure Democracy)?

Either way, no, it wouldn't have happened then either.

Communism (not Leninist-Stalinist Marxism, but the true Communism that Marx wrote about) is a pure ideal where everyone *shares*: each according to their need, those with more giving to those in need. In this "pure ideal", ownership is not even a concept. Capitalism is based on the concept of ownership, so the two are mutually exclusive. Furthermore, Communism requires a specific mature behavior from its participants: cooperation.

Democracy is another ideal system, where everyone gets an equal say in what the group will do. Then they all discuss it together and vote on possible courses of action. What we have is a representative democracy and a republic, which, according to the pure ideal of Democracy, is not really Democracy, because some people (Congress, the Senate, the President) get more say than others (ordinary citizens and homeless people).

Democracy also requires that same mature behavior from its participants: cooperation. However, Democracy also requires something far more difficult to get from a large group of people: an educated public. The participants of the democracy have to be educated enough that everyone is capable of understanding and taking part in all aspects of the decisions made (a.k.a. the government).

Again, it's a pure ideal.

Our representative democratic republic works well, because, well, it's still running. And it helps keep Capitalism alive and well. Leninist-Stalinist Socialism was doomed to failure from the get-go; the system was flawed. Not to say that Socialism is flawed, or even inherently bad. But that form was.

I'm merely describing how economic as well as social factors tend to hamper or dampen the deployment of technological progress. Not development, but deployment.

Oh, and new on the press:

http://www.theregister.co.uk/content/3/32232.html

Power5 Woot!

(By the way, I may not be doing a very good job of making my points; I'm rather sleep-deprived. I read an article yesterday that really hit home. I tried to summarize it, but forgot the URL and did a poor job on the summary...)


Jaedreth
 