Ouch!

I don't even want to think how hot these things will get. I just hope they come up with a good way to dissipate the heat...I mean other than liquid cooling!
 
notjustjay said:
If 90nm isn't working out, why would 45 be any better?

Isn't that kinda like saying "Well, we've been having trouble fitting 9 clowns into this Volkswagen, but don't worry, we're working on getting 15 in there now"?

I'm with you. They should fix the 90nm process first. I believe IBM will get a 65nm process working before the 45nm process. It may sound silly to be thinking about 45nm when 90nm isn't working yet, but they have to keep their plans going for the future. It may not happen in 2007, but they have to start preparing anyway.
 
Dippo said:
Doesn't 65nm come after 90nm?

Yeah, it does. They won't be able to pull this 45nm thing off. They might pull the 65nm thing off by 2007, but if they're having this much trouble with 90nm, then I wouldn't bet on 45nm by 2007. Two years isn't THAT long.

Oh, and why does it need to go from 90nm to 65nm? Why can't they go for dimensions in between, say 72nm? :confused:
 
manthas said:
I don't even want to think how hot these things will get. I just hope they come up with a good way to dissipate the heat...I mean other than liquid cooling!

(off topic) Have there been any issues with the liquid cooling? I mean, if it's more effective and even quieter, isn't that a good thing?

(on topic) For those with some background in basic physics, this announcement is rather astounding. The wavelengths used in the photolithography process must be approaching the ultra-ultra-UV, near-X-ray range! Perhaps they have found a more efficient way to work with "ultra-ultra UV" and are thus able to skip an intermediary step?
 
I'm not a technical whiz, but do smaller chip sizes mean that more of them will have to be used? Until 90nm matures, 130nm chips are at least reliable technology. I'm curious because the Xbox 360 uses multiple chips in lieu of one large one.

Apple software already spreads the workload between the CPU and the graphics card; could that be taken further?
 
This thread is misleading people into believing in a chain of events that is not necessarily newsworthy for impatient Macintosh fans (myself included).

First, IBM is not jumping over the 65nm process. The industry is going to embrace 65nm for the next 2-4 years. The advanced fabs are just ramping up on 65nm and there remain a number of systematic problems to be worked out. The 90nm process is currently the most advanced *volume* production process, and hence 65nm is considered N+1 technology while 45nm is N+2. At any given time the semiconductor industry is working on the N+1 and N+2 generations.

At this time, 45nm is still in the early R&D stage. New materials (such as low-k dielectrics for interlevel oxides and high-k for gate oxides) are being developed and tested, and even new transistor designs such as the double-gated FinFET are being studied. Historically, R&D costs for each subsequent technology node have doubled. With 45nm, the R&D cost may be prohibitive for any one company to shoulder, so the semiconductor industry is pooling resources through IMEC, a consortium based in Belgium, with the idea of sharing R&D costs starting at 45nm.

This announcement from IBM highlights one of the earliest and potentially most expensive and thorny problems with 45nm, namely immersion lithography. It works like this:

1. The wavelength of light used to expose the reticle is still 193nm. Several years ago, feature sizes (such as metal line widths and spacings) were 0.25 microns wide (250nm). This is safely above the stepper wavelength (193nm) and allows the pattern to be printed or exposed on the wafer surface quite easily.

2. Since the 180nm technology node, the feature size has fallen BELOW the stepper wavelength. How can a 193nm wavelength of light expose gaps and widths that are 180nm wide? The laws of optics tell us that in order to resolve or "see" a gap of X nm in width, we must use a wavelength of light that is itself LESS than X nm in width. Today's feature sizes are down to 65nm and are still being printed with 193nm light! This seeming violation of the laws of physics and optics is being achieved by very clever techniques generally known as RET or Resolution Enhancement Techniques. Since the 180nm technology node, RET has been growing in cost and complexity from simple OPC (optical proximity correction) to PSM (phase shift mask) to the combination of OPC plus PSM, and now on to SRAF (sub-resolution assist features), which is ushering in a new category of RET called X-RET or Extreme-RET. The industry could have reduced the stepper wavelength from 193nm to 157nm, but a detailed analysis showed that simply shortening the stepper wavelength would be cost-prohibitive! Instead, use of 193nm has been extended to the 45nm technology node, but the gap between 193nm and 45nm is quite large and cannot be completely bridged even by the most advanced RET.

3. Fortunately, something called Immersion Lithography has been introduced. It has been tried before with mixed results, but the need for it has never been as urgent as it is now. By immersing the wafer in water, one can increase the effective numerical aperture (NA), allowing 193nm light to act as if it were a shorter wavelength. The wafer now has to be immersed in water, however, and this creates new challenges for new types of resist and topcoat materials that can withstand the effects of water contamination. Today, standard dry resist materials are being tested with wet immersion lithography, and this is leading to problems such as resist bubbles. While this problem can be controlled, it requires slowing down the stepper, which is hardly an acceptable solution for high-volume production. Hence, new resist materials are being developed, and it seems to me that IBM's partnership with Toppan is specifically aimed at the development of new photomask and photoresist materials (wet photoresist and topcoat, for example).
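A rough numerical sketch of points 1-3, using the Rayleigh criterion (resolution ~ k1 x wavelength / NA). The NA and k1 values below are illustrative assumptions, not actual stepper specifications:

```python
# Rayleigh criterion: smallest printable half-pitch R = k1 * lambda / NA.
# NA and k1 values here are illustrative assumptions, not real stepper specs.

def resolution_nm(wavelength_nm, na, k1):
    """Smallest resolvable half-pitch per the Rayleigh criterion."""
    return k1 * wavelength_nm / na

# Dry 193nm lithography: NA is capped below 1.0 in air.
dry = resolution_nm(193, na=0.93, k1=0.35)

# Immersion: water (refractive index ~1.44 at 193nm) raises the achievable
# NA above 1.0, so 193nm light behaves like a shorter effective wavelength.
wet = resolution_nm(193, na=1.2, k1=0.35)

print(f"dry 193nm stepper: ~{dry:.0f} nm half-pitch")
print(f"193nm immersion:   ~{wet:.0f} nm half-pitch")
```

With these assumed numbers, dry 193nm tooling bottoms out around the 65nm node, while immersion (plus aggressive RET pushing k1 lower) gets within reach of 45nm features.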

Hence, this announcement is not especially newsworthy to Macintosh fans. It does not say anything about a new PowerPC chip on 45nm, only that IBM -- like everyone else -- is working actively on 45nm process development. Will Intel transition its manufacturing line to 45nm in the 2007 timeframe? Sure. Will AMD? You bet. Will Freescale? Yup.
 
ksz said:
This thread is misleading people into believing in a chain of events that is not necessarily newsworthy for impatient Macintosh fans (myself included). [...] Hence, this announcement is not especially newsworthy to Macintosh fans.
I hardly think 45nm processors will be used in computers.
 
Not so sure

I'm with the other posters on this one. The jump to 90nm effectively halted the breakneck pace at which processor speeds were climbing. Intel's 90nm Prescotts are monsters of heat. The 90nm G5 chips took forever and a day to arrive. How is moving to an even smaller 45nm process going to help things?

The only benefit I can think of is that by going to a smaller process, they might be able to get to dual cores faster. But if the heat problems get worse, we may never see dual-core Macs.
 
Lacero said:
I hardly think 45nm processors will be used in computers.
Why?

Intel's Montecito (the new Itanium 2, which may be or has been cancelled) is a dual-core design with 12MB of on-chip level-3 cache per core and lots of other wonderful wizardry, for a total transistor count of ... 1.7 billion!

By comparison, the Pentium D dual-core contains about 350 million.

Intel's plans called for Montecito on 65nm along with Yonah (the dual-core Pentium M), but it's quite reasonable for Montecito to come down to 45nm. Intel is not alone in breaking the 1-billion-transistors-per-chip mark. Xilinx, I think, has already done it with their Virtex-4 FPGA. Isn't it likely that billion-transistor chips will trend down to 45nm?
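Back-of-the-envelope math on why a shrink matters for a 1.7-billion-transistor chip like Montecito (assuming an ideal linear shrink, which real processes never quite achieve):

```python
# Ideal die-area scaling for a linear feature-size shrink.
# Real shrinks achieve less than this ideal, so treat it as an upper bound.

def area_scale(old_node_nm, new_node_nm):
    """Fraction of original die area after an ideal linear shrink."""
    return (new_node_nm / old_node_nm) ** 2

scale = area_scale(65, 45)
print(f"65nm -> 45nm: die area shrinks to ~{scale:.0%} of the original")
print(f"the same die area holds ~{1 / scale:.1f}x the transistors")
```

So roughly half the silicon for the same design, or roughly twice the transistors in the same area, which is exactly what makes monster transistor budgets economical.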
 
It gives off more heat than any heat sink or water-cooled block can quickly draw away, especially if you reach 3+ GHz speeds with hundreds of millions of transistors. They only go to 45nm to save money on wafers.
 
Lacero said:
I hardly think 45nm processors will be used in computers.

President Rutherford Hayes said of the telephone, "A wonderful invention, but who would ever want to use one?"
 
~Shard~ said:
Sounds like we'll be seeing a G6 PowerMac in 2007... ;)

Easy now! The article states "early production by mid 2007". That certainly doesn't lead to having these on power macs until.. 2008, 2009??
 
Lacero said:
It gives off too much heat that no heat sink or watercooled sink can quickly draw away. Especially if you reach +3Ghz speeds and hundreds of millions of transistors. They only go to 45nm to save money on wafers.



There are more techniques for cooling processors, Lacero,
Than are dreamt of in your philosophy. Some are still trade secrets and are unreleased, some are known but there is apparently trouble mass producing them, and others will be dreamt of in the next two years.
 
Lacero said:
It gives off too much heat that no heat sink or watercooled sink can quickly draw away. Especially if you reach +3Ghz speeds and hundreds of millions of transistors. They only go to 45nm to save money on wafers.
If all else remained the same and we only scaled geometry down from 65nm to 45nm, we would increase heat density. Any savings from the use of lower voltages could be masked by higher packing densities. To reduce power dissipation, new materials are required (particularly high-k gate oxides and low-k interlevel oxides), along with new processing techniques such as strained silicon and new component designs such as FinFETs (including germanium FinFETs), carbon nanotubes, quantum dots, and other exotic structures.

Power dissipation from CMOS is peaking now at around 13-14 W per square cm (compared to about 5 W/cm2 for a steam iron). While active power (the power needed to switch the transistors) has been rising at a rate of 1.3-1.4x per generation, idle power (leakage) has been rising at a rate of 4x and is no longer sustainable. Eventually the industry will develop new materials and transistor topologies that will take us away, once and for all, from CMOS, but this will be a gradual process.
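Plugging those growth rates into a toy model (the starting wattages are arbitrary assumptions; only the trend matters) shows how quickly idle/leakage power catches up with active power:

```python
# Toy projection using the growth rates quoted above: active power rising
# ~1.35x per generation, idle (leakage) power rising ~4x per generation.
# The starting wattages are arbitrary; the point is the trend.

active, idle = 10.0, 0.1  # watts for a hypothetical starting chip
for node in ["130nm", "90nm", "65nm", "45nm"]:
    share = idle / (active + idle)
    print(f"{node}: active={active:5.1f} W  idle={idle:5.1f} W  "
          f"leakage share={share:.0%}")
    active *= 1.35
    idle *= 4.0
```

Leakage starts out as round-off error and, within a few generations, becomes a major slice of the power budget, which is why it is called unsustainable.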
 
Last time I checked, I thought I read that as the size of the production process decreases, you get less heat given off by the same unit running at the same speed. The main restriction on speed used to be the heat given off (which is why you can run an Intel 486 chip at 2GHz if you cool it with dry ice, and why the 800MHz G4 in my iBook runs a LOT cooler than the first one that was put in the Power Macs).

So going down a process means a 2.7GHz G5 based on the 45nm process would give off far less heat than the current one.

The reason they run so hot now is because all the CPU makers are running them faster than they should in order to compete with each other.

I am probably wrong about a lot of this; could someone in the know correct me here?
 
ksz said:
This thread is misleading people into believing in a chain of events that is not necessarily newsworthy for impatient Macintosh fans (myself included). [...] Hence, this announcement is not especially newsworthy to Macintosh fans.


To put it simply, IBM has developed a new process that may allow them to develop smaller and faster processors. The jump from 90nm to 45nm is irrelevant. This is an entirely new process. Even if they never get the current process to work at 65nm, they obviously feel confident that the new process will work at 45nm.

While there is no mention of PPC chips in this article, I would think that IBM would definitely steer toward PPC with this technology, as their fastest servers are PPC-based.
 
Windowlicker said:
Easy now! The article states "early production by mid 2007". That certainly doesn't lead to having these on power macs until.. 2008, 2009??

Ah yes, true true, I'm just jumping the gun. ;) Hopefully by 2008/2009 the PowerMacs will be at 3.0 GHz... :p
 
I'm not reading too much into this right now. I believe IBM is doing a very good job of not succumbing to the "feast or famine" effect: in the good business years a company invests in research; in the bad years it wants to pull out. But research is a steady-state thing; you can't just turn it on and off.

Right now things are slow and dull on the G5 front. But I still have hope that something great is just around the corner. I believe that much of this current dullness is a result of IBM and Apple putting more focus into better-designed and more powerful versions of the G5. But don't get the impression that I'm implying these processors are on the verge of being released. I believe Apple is working very closely with IBM right now to produce the best processors possible, especially since they were shamed by the lack of progress and production capacity of the first and current versions of the G5.
 
I'm with Lacero on this one.

Yes, there are many ways to cool electronic components, including R-134a compressed refrigerant, Peltier devices, and other means of bringing temps below room temperature. But how cost-effective are these, and how much impact will they have on keeping computers small, light, and quiet?

Granted, more techniques are in the works and new ones are invented every day, but I doubt we will see processors like this in our PCs in the next two years at 3+ GHz.

Just my thoughts, nothing more.
 
pontecorvo said:
Let me be the FIRST to say, this is great news for fans of the 45-nanometer chip making process!
Someone's trying to increase their post count. Naughty, naughty. ;)

Here's hoping they can fix the 90nm issues, or better yet perfect 65nm, first. They used to say the 970 was a good test for the POWER5; I'm hoping IBM does a little more with it. I realize Intel is also having some issues, but we were all hoping the G5 would have taken us a little further by now. Sure hope Steve's getting a good discount on those 2.7s and decides to be nice and pass those savings on to us soon (yeah, right). Maybe we're just waiting for the G6. Of course, that's what we've been saying about the G4, and look how that's turned out.

Now I'm just rambling. I don't think this means anything yet. But hey, at least they seem to be thinking about the future.
 
shooterlv said:
To put it simply, IBM has developed a new process that may allow them to develop smaller and faster processors. The jump from 90nm to 45nm is irrelevant. This is an entirely new process.
Huh? A new process? Where are you getting this?

Even if they never get the current process to work at 65nm, they obviously feel confident that the new process will work at 45.
Are you suggesting that IBM developed a process for 65nm, tried to make it work, decided to give up, and are now focusing that process on 45nm feature sizes? If something could not work on 65nm, why would it work on 45nm?

While there is no mention of PPC chips in this article, I would think that IBM would definitely steer towards PPC with this technology as their fastest servers are PPC based.
Well of course, so would everyone else. That's my point. IBM could just as easily announce that they are working on the 30nm process, which actually they probably are, but so what? Once the new process is stable and ready for daylight, they will move whatever designs onto that process that are justified by the economics of the process and the design. This is simply common sense.

If 45nm requires entirely new (or substantially new and different) design rules, it will require an expensive re-synthesis of the chip design. The general trend is toward fewer new design starts at smaller technology nodes because reusable cell libraries are liberally employed in today's designs. Designers do not create new devices entirely from scratch. Instead, like reusable software components, they assemble new designs by reusing preexisting cell libraries and developing any necessary new logic. If the reused cell libraries are too far removed from the design-rule requirements of the current process technology, the old design may need to be discarded in favor of a ground-up redesign or re-synthesis. This becomes very expensive.

Hence, the economics of the device (its lifetime, expected production volume, and selling price) may or may not justify a redesign. If the expense of a redesign cannot be justified, the part will continue to be produced on the N or N-1 technology nodes. Eventually and gradually, cell libraries will migrate to new design rules, and the vendors who make and sell these libraries (and the internal design departments that do the same) will update them to fit the DFM (design for manufacturing) requirements of the latest process.
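A toy sketch of the design-rule problem described above: a naive linear shrink of an old cell library can violate new-node rules, because some minimum dimensions (set by lithography and reliability limits) do not scale linearly. All geometry and rule numbers here are invented for illustration:

```python
# Invented cell geometry and design rules, purely to illustrate why a
# linear shrink of an old library can fail design-rule checks at a new node.

def shrink(geometry, factor):
    """Naive linear shrink of every dimension in a cell."""
    return {name: width * factor for name, width in geometry.items()}

cell_90nm = {"poly_width": 90, "metal1_spacing": 140, "contact_size": 110}
cell_45nm = shrink(cell_90nm, 45 / 90)

# Hypothetical 45nm minimums -- note they do NOT all scale by the same 0.5x.
rules_45nm = {"poly_width": 45, "metal1_spacing": 90, "contact_size": 60}

for name, width in cell_45nm.items():
    verdict = "OK" if width >= rules_45nm[name] else "VIOLATION: redesign needed"
    print(f"{name}: shrunk to {width:.0f} nm (rule min {rules_45nm[name]} nm) -> {verdict}")
```

In this made-up example the transistor gates shrink cleanly but the wiring and contacts do not, which is exactly the situation that forces the expensive re-synthesis described above.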
 
Intel in the news as well

Seems IBM may have just been trying to stay in the news. This is from http://www.forbes.com/markets/feeds/afx/2005/05/26/afx2060330.html


"Intel Corp today announced the launch of Intel's new dual-core processor Pentium D for use in desk-top PCs.

Pentium D follows the release of the Pentium processor Extreme Edition -- its first-ever multi-core processor unveiled in April.

Prices of Pentium D processors will range from 57,180 yen to 87,930 yen per unit.

'While the Extreme Edition is a high-end product, with the launch of Pentium D, we are making a major foray into the volume zone, as we prepare ourselves for the transition to the multi-core processor era,' Kazaumasa Yoshida, the president of Intel KK, the Japanese unit of Intel Corp, said."
 
hcorf said:
last time i checked i thought i read that as the size of the production process decreases, you then get less heat given off by the same unit running at the same speed. [...] I am probably wrong about a lot of this, could someone in know correct me here?

You are right, up to a point. As dimensions became smaller, the chips became more efficient: they used fewer electrons to store one bit or to change that bit's logic state. However, at around the 90nm level a new effect started to gain importance. For various reasons it became more difficult to isolate the various signals in the chip. What should be perfect insulators were now leaky insulators. Current flowing through a resistor generates heat, and any imperfection in the way materials are formed is magnified into a current leak.

Fantastic levels of genius and technology are being applied to this basket of problems. If they can come up with a breakthrough to solve this leakage problem, we'll see some spectacular products.
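Much of that leakage is quantum tunneling through the ever-thinner gate oxide, which grows roughly exponentially as the oxide shrinks. A toy model (the decay constant here is an invented illustrative value, not a measured device parameter) shows why even small thickness reductions blow up leakage, and why high-k materials, which allow a physically thicker gate dielectric, are so attractive:

```python
import math

# Toy exponential model of gate-oxide tunneling leakage. The decay constant
# is an invented illustrative value, not a measured device parameter.

def relative_leakage(t_ox_nm, t_ref_nm=2.0, decay_nm=0.25):
    """Leakage current relative to a reference oxide thickness."""
    return math.exp((t_ref_nm - t_ox_nm) / decay_nm)

for t_ox in (2.0, 1.5, 1.2, 1.0):
    print(f"t_ox = {t_ox:.1f} nm -> leakage ~{relative_leakage(t_ox):.0f}x reference")
```

Halving the oxide thickness in this model multiplies leakage by orders of magnitude, which matches the qualitative picture above of insulators turning leaky as dimensions shrink.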
 