~Shard~ said:
Yeah, you're right - I'll probably be in the same boat as you. Merom with Leopard pre-installed sounds good enough to me! :cool:

You remember that I said I doubted that Blu-Ray burners would be available in the MacBook in Jan 07? I've revised my opinion on that. The first Blu-Ray burners are already coming out, and it shouldn't take more than a year to get them into a slot-loading laptop format. For you, since you want a desktop, it is even less of a problem, but for me this is excellent news. :D
 
Cinch said:
I think the width of a typical atom (C or H) is measured in angstroms (10^-10 m), whereas a picometer is 10^-12 m. Can't have a transistor smaller than an atom.

Cinch:D
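For scale, here's a quick back-of-the-envelope sketch comparing those lengths (the atomic diameter is an approximate textbook value, not a figure from this thread):

```python
# Rough length scales (meters); the atomic diameter is an approximate textbook value.
angstrom = 1e-10   # 1 A
picometer = 1e-12  # 1 pm

carbon_atom_width = 1.4 * angstrom  # ~140 pm diameter, approximate
feature_45nm = 45e-9                # Intel's 45nm process node

# How many carbon atoms span one 45nm feature?
atoms_per_feature = feature_45nm / carbon_atom_width
print(f"1 A = {angstrom / picometer:.0f} pm")
print(f"~{atoms_per_feature:.0f} carbon atoms across a 45nm feature")
```

So even a 45nm feature is still a few hundred atoms wide; the hard wall Cinch describes is a good way off, but not infinitely far.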

What about quark transistors then? :p ;)
 
~Shard~ said:
What about quark transistors then? :p ;)

I guess we can have an atom in superposition (~entanglement), and do its calculation. It won't be a transistor in a traditional sense, but we'll have one superfast calculator. :D



Cinch
 
Diatribe said:
You remember that I said I doubted that Blu-Ray burners would be available in the MacBook in Jan 07? I've revised my opinion on that. The first Blu-Ray burners are already coming out, and it shouldn't take more than a year to get them into a slot-loading laptop format. For you, since you want a desktop, it is even less of a problem, but for me this is excellent news. :D

I heard that AOpen was starting to release commercial Blu-Ray burners already, with Panasonic to follow in March, so this would make sense. From all the Blu-Ray stuff I saw at CES in Vegas a few weeks ago, it definitely looks very cool! They even had a Blu-Ray movie playing on a PS3, which was neat to see. :cool:

Hopefully they'll hit laptops soon after desktops for your sake! :)

cinch said:
I guess we can have an atom in superposition (~entanglement), and do its calculation. It won't be a transistor in a traditional sense, but we'll have one superfast calculator. :D

Bring on quantum computers! :D :cool:
 
~Shard~ said:
But then by the time 2008 rolls around, there'll be something cool coming in 2009 to wait for, then 2010, then it'll be OS 11 in 2011, and then.... :p :D Nah, if you play the waiting game, all you'll do is wait. I have made a commitment to myself to buy a new Mac sometime in 2007, and I won't regret that decision no matter what. ;) Buy what you need, when you need it, and you'll never be disappointed. :cool:

Good point. But this is also either good or bad news for Mac buyers. Now that we are in the Intel camp, the desire to upgrade will become stronger as Intel ups the chip ante each year. IMO, the speed boosts between models in the G4 and G5 era have been modest, for me at least. Is it possible that we will see quarterly updates to each line as new Intel chips come out?
 
bugfaceuk said:
Leakage is only part of the problem; the bigger problem is yield loss caused by errors in the lithography process. There are now whole ranges of software from companies like Mentor Graphics (who I believe led the charge with the Calibre range, but I'm not certain) that take IC designs and adjust them so that they are more manufacturable. In addition, whole other areas of DFM (design for manufacture) tools have been tackling these problems. Each reduction exaggerates old problems and introduces new ones, but I think the IC industry is now saying they can pretty much see their way to 10nm; beyond that, who knows?

It's all impossibly small to me!
You're referring to a class of solutions called RET, or resolution enhancement techniques. Although we're discussing Intel's recent accomplishments in 45nm prototyping, we discussed IBM's efforts here:

https://forums.macrumors.com/threads/128849/

More interesting is this announcement from Intel nearly 2 years ago on a self-proclaimed breakthrough in high-k gate dielectrics.

http://www.intel.com/pressroom/archive/releases/20031105tech.htm

Finally, I described RET briefly in post 32 of the IBM thread:

This thread is misleading people into believing in a chain of events that is not necessarily newsworthy for impatient Macintosh fans (myself included). First, IBM is not jumping over the 65nm process. The industry is going to embrace 65nm for the next 2-4 years. The advanced fabs are just ramping up on 65nm and there remain a number of systematic problems to be worked out. The 90nm process is currently the most advanced *volume* production process and hence 65nm is considered to be N+1 technology while 45nm is N+2. At any given time the semiconductor industry is working on the N+1 and N+2 generations. At this time, 45nm is still in early R&D stage. New materials (such as low-k dielectrics for interlevel oxides and high-k for gate oxides) are being developed and tested, and even new transistor designs such as the double-gated FinFET are being studied. Historically, R&D costs for each subsequent technology node have doubled. With 45nm, the R&D cost may be prohibitive for any one company to shoulder, and hence the semiconductor industry has formed a consortium called IMEC that is based in Belgium. The idea is to share R&D costs starting with 45nm.

This announcement from IBM highlights one of the earliest and potentially most expensive and thorny problems with 45nm, namely immersion lithography. It works like this:

1. The wavelength of light used to expose the reticle is still 193nm. Several years ago, feature sizes (such as metal line widths and spacings) were 0.25 microns wide (250nm). This is safely above the stepper wavelength (193nm) and allows the pattern to be printed or exposed on the wafer surface quite easily.

2. Since the 180nm technology node, the feature size has fallen BELOW the stepper wavelength. How can a 193nm wavelength of light expose gaps and widths that are 180nm wide? The laws of optics tell us that in order to resolve or "see" a gap of X nm in width, we must use a wavelength of light that is itself LESS than X nm. Today's feature sizes are down to 65nm and are still being printed with 193nm light! This seeming violation of the laws of physics and optics is being achieved by very clever techniques generally known as RET or Resolution Enhancement Techniques. Since the 180nm technology node, RET has been growing in cost and complexity from simple OPC (optical proximity correction) to PSM (phase shift mask) to the combination of OPC plus PSM, and now on to SRAF (sub-resolution assist features), which is ushering in a new category of RET called X-RET or Extreme RET. The industry could have reduced the stepper wavelength from 193nm to 157nm, but a detailed analysis showed that simply shortening the stepper wavelength would be cost-prohibitive! Instead, use of 193nm has been extended to the 45nm technology node, but the gap between 193nm and 45nm is quite large and cannot be completely bridged even by the most advanced RET.

3. Fortunately, something called Immersion Lithography has been introduced. It has been tried before with mixed results, but the need for it has never been as urgent as it is now. By immersing the wafer in water, one can increase the effective numerical aperture (NA), allowing 193nm light to act as if it were a shorter wavelength. The wafer now has to be immersed in water, however, and this creates a need for new types of resist and topcoat materials that can withstand the effects of water contamination. Today, standard dry resist materials are being tested with wet immersion lithography, and this is leading to problems such as resist bubbles. While this problem can be controlled, it requires slowing down the stepper, which is hardly an acceptable solution for high-volume production. Hence, new resist materials are being developed, and it seems to me that IBM's partnership with Toppan is specifically aimed at the development of new materials (wet photoresist and topcoat, for example).
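For the curious, the numbers above can be tied together with the Rayleigh criterion, resolution ≈ k1 × λ / NA, where RET lowers k1 and immersion raises NA. A rough sketch (the k1 and NA values are illustrative assumptions on my part, not figures from any announcement):

```python
# Rayleigh criterion: minimum printable half-pitch ~ k1 * wavelength / NA.
# RET (OPC, PSM, SRAF...) pushes k1 down toward its theoretical floor of 0.25;
# immersion in water (refractive index ~1.44) lets the effective NA exceed 1.0.
def min_feature_nm(wavelength_nm: float, k1: float, na: float) -> float:
    return k1 * wavelength_nm / na

WAVELENGTH = 193.0  # nm, ArF excimer laser

# Dry 193nm lithography with aggressive RET (illustrative k1 and NA):
dry = min_feature_nm(WAVELENGTH, k1=0.35, na=0.93)
# Water immersion at the same k1 (illustrative NA of 1.35):
wet = min_feature_nm(WAVELENGTH, k1=0.35, na=1.35)

print(f"dry: ~{dry:.0f} nm, immersion: ~{wet:.0f} nm")  # dry: ~73 nm, immersion: ~50 nm
```

This is why 193nm light plus immersion (and heavy RET) can keep printing features far below the nominal wavelength.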

Hence, this announcement is not especially newsworthy to Macintosh fans. It does not say anything about a new PowerPC chip on 45nm, only that IBM -- like everyone else -- is working actively on 45nm process development. Will Intel transition its manufacturing line to 45nm in the 2007 timeframe? Sure. Will AMD? You bet. Will Freescale? Yup.
 
arn said:
The whole overclocking talk is silly. If it comes from the manufacturer to run at that speed, it's not overclocked.

arn
From what I've gathered over the years from reading up on the Mac and CPU development (in general), my understanding is that off a typical wafer of silicon they get different... "abilitied" chips (did I just coin a phrase, making me as smart as all those tech pundits?... nah!).

What I mean is that if they're benchmarking (aiming) for a mean/average of 2GHz CPUs, most of the chips will fall in that ballpark, while a few "won't be up to snuff" and will be down-clocked to, say, 1.8GHz, and a few of the chips are "born wunderkinds" and can successfully be clocked/rated at, say, 2.2GHz... They add the clock (quartz crystal, or whatever technical thingamabob), or set it, once they've rated each chip, as far as I can fathom.

So, yes, from what I've read, the factory sets the clock speed of each CPU. Also, I'm guessing that 3rd-party accelerator card manufacturers then try to "stretch" that number in order to boast of having faster upgrades - I think that the 1GHz G3 for the old Pismos (like mine) from... Newer Tech(?) was an example of an overclocked CPU.
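The sorting process described above is usually called speed binning, and it can be sketched in a few lines (the grades and test results below are invented for illustration):

```python
# Toy model of speed binning: each die is tested for its maximum stable
# frequency, then labeled with the highest standard grade it clears.
# Grades and test results here are invented for illustration.
GRADES_GHZ = [2.2, 2.0, 1.8]  # sell as the fastest grade the die passes

def bin_die(max_stable_ghz):
    for grade in GRADES_GHZ:
        if max_stable_ghz >= grade:
            return grade
    return None  # fails even the lowest bin; discarded

tested = [2.31, 2.05, 1.97, 1.85, 2.24, 1.72]
bins = [bin_die(f) for f in tested]
print(bins)  # [2.2, 2.0, 1.8, 1.8, 2.2, None]
```

A die that clears 2.2GHz could of course also be sold as a 2.0 or 1.8 part, which is why demand for the slower grades sometimes gets filled with "wunderkind" silicon.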

TheMasin9 said:
This can only go so far: 90, 65, 45... When we start approaching 20-5 nanometers, things are going to get really scary, because then you are dealing with elements on the atomic scale, and they start acting very differently than they would in these kinds of chip fabs.
I think it was in 2004 that Wired magazine had a cover story about artificial diamonds. Supposedly they were near-indistinguishable from the real thing - which has to have De Beers crapping their 90%-monopoly pants.

But what I found most interesting is that these lab-grown diamonds can (theoretically or actually) be grown in any shape the makers choose, and that one of the most practical applications (not counting the latest boulder on J.Lo's finger) is to replace silicon in the manufacture of computer chips.

Think about it; they've added copper interconnects, strained silicon-on-insulator, truly exotic & expensive materials like gallium arsenide... But they're fast approaching the technical limitations of silicon. That's part of the reason you'll always see "leakage".

Diamond, however, promises to take semiconductors forward on their next great leap. So, with two competing companies & processes mentioned in the Wired article, here's hoping they can successfully AND reasonably commercialize the process in the next few years... Not to mention some atom-scale molecular-lithography process to etch those sub-nanometre highways into thumbnail-sized supercomputers.

maya said:
MWSF 2008, Steve Jobs put an end to the MacMini and introduces the MacNano. ;) :)


This news is great; however, people are wondering where chip technology will head from here. And the answer to that question is photonic processors along with photonic data links. Finally, a technology that was in development back in 2001-2002 will make it into the consumer marketplace around 2010-2012. :D

Cannot be bothered to look for a link on photon processors or data links. :p
The next dot-com bubble will occur around 2015, just as all the photonic startups are starting to cash in their stock options. That bubble will burst when the scientists in Japan (who were able to show evidence that neutrinos have mass) announce that they've successfully inserted sub-atomic data streams into neutrinos and that they will soon be able to redirect the paths of individual neutrinos... Overnight, the telco industry stocks are delisted from the world's stock exchanges... Microsoft tries to buy Norse Son's brain so they can claim a prior patent on the intellectual property involved in the evolutionary breakthrough...

By the way, in the time it took me to write this, several trillions of neutrinos passed quietly & uninterrupted through my body... No wonder I feel queasy.............

dashiel said:
http://www.yellowtab.com/

this is what BeOS 5.1 source was.

while superior in many ways to NeXT, there would be no apple today if they had bought Be.
Exactly! Imagine Jean-Louis Gassée trying to troll his way through a "One More Thing..."

Plus, what I saw of the BeOS, back when the talk was about Apple buying it... I don't know, I just didn't care for the interface... Kinda looked like "Pop Art meets Coloring Book"...

Then again, it's highly unlikely we would have had the iMac, iPod, iTunes, Mac OS X - it would have been something staid & "plaid", like MacOS 10...
 
Just a reminder that it is considered impolite to post a response to every post in a thread. There may be times when it seems appropriate, but try to limit it to two at a time. Take it from someone who has needed a flame-retardant suit from time to time here on MR. :D
 
Chip NoVaMac said:
Good point. But this is also either good or bad news for Mac buyers. Now that we are in the Intel camp, the desire to upgrade will become stronger as Intel ups the chip ante each year. IMO, the speed boosts between models in the G4 and G5 era have been modest, for me at least. Is it possible that we will see quarterly updates to each line as new Intel chips come out?

To keep pace with the PC industry, this is a definite possibility - and not necessarily a bad thing...
 
sporadic, not quarterly

Chip NoVaMac said:
Is it possible that we will see quarterly updates to each line as new Intel chips come out?
Updates will probably happen on Intel's timetable, which is not rigidly tied to any particular calendar cycle or Apple Event schedule.

When Intel introduces something, it means that they've already been producing it for a while, and they've been sending chips to the OEM vendors to build into systems. At the introduction, the other vendors are onstage showing off new systems with the new chips.

You saw this at CES with Yonah: the PC vendors showed Yonah systems there, while Apple, which wasn't at CES and didn't have any MacIntels ready during it, announced theirs at MWSF a few days later.

For minor bumps, it's very low key. If/when Intel has enough 2.26 GHz Yonahs to start selling them, there will be a press release (and maybe coordinated releases from Dell, HP, Lenovo,...) saying "new 2.26 available". What if this happens in mid April - does Apple stick with chips slower than everyone else? Does Apple "obsolete" the current iMacIntel and crow about "new and faster"?

I've said several times that Apple should follow the lead of Dell and the other Intel vendors and make chip speed simply one of the BTO menu options. This makes it pretty simple to start shipping the new chip - on the day of Intel's introduction, you'd see a new menu option under "CPU:" in the store menu. (Why does Apple force you to buy a faster CPU if you only need more video RAM? Why does Apple force you to pay for extra video RAM and a bigger screen if you only need a faster CPU?)

What if Merom starts shipping late July? Can Apple afford to wait until Paris? Does The Steve get onstage with Michael Dell and show off the MacBook SuperPro64™ alongside the 64-bit Latitudes?

It's a new world for Apple - one in which Apple is a passenger, not the driver.

(P.S: note the distinction between "announced" and "introduced" for Intel. Merom, Conroe and lots of other things (like the 45nm circuits) have been announced, but only Yonah has been introduced.)
 
Overclocked..

Of course the G5 chips were overclocked. They needed water cooling to hit that 2.5 GHz speed. Any PC processor will do the same thing. "Unlocked" Extreme and FX chips routinely run well above their stock frequency with more exotic cooling systems in place.

Dell actually demoed a future system running at 4.26 GHz. The fact that Apple had to resort to factory overclocking their entire line of PowerMacs should give you guys a sign as to why they switched.

Now you can argue all you like that they aren't overclocked because they were sold from Apple but we all know that Apple would have gone with the much cheaper aircooling solution if possible. Those chips NEEDED to be watercooled to hit those speeds. That's overclocking.

Pete
 
GuyClinch said:
Of course the G5 chips were overclocked... Now you can argue all you like that they aren't overclocked because they were sold from Apple but we all know that Apple would have gone with the much cheaper aircooling solution if possible. Those chips NEEDED to be watercooled to hit those speeds. That's overclocking.

Pete

Unfortunately, clock speed is not the only factor that determines the need for cooling. Your argument only holds true if the watercooling was used because fans were not capable; Apple's reasoning has always been that it was used to keep noise down, which makes sense. The majority of PCs have fans located directly above the CPUs; the G5 blows air across them, so the actual surface area available to blow air on, compared to the amount of air being pushed, is very small. Both the 2.7 and Quad use watercooling... by your argument both must therefore be overclocked :rolleyes:

The use of 2.3s without watercooling obviously indicates that much above that speed the heat dissipation is not acceptable, but the difference of 200MHz or even 400MHz (2.7) is not enough of an increase to require watercooling even if they were overclocked. The G5's cooling system with fans is clever but not efficient in the CPU bay because of their location. In order to cool faster you need a much larger volume of air to pass over the chip, which increases noise. The 2.7s were also watercooled, but there has never been any suggestion of these having been overclocked. If Apple had wanted to use fans they would have done so, simple as that. Even with watercooling, the temperatures of the G5 (2.5, 2.7, dual-core 2.5) are high, yet the power consumption never reaches Pentium levels.

A redesign of the CPU cooling system would allow all models to be air-cooled, but at the price of noise, which is the one thing Apple have been criticised for in the past. Once again, more conjecture but no proof.
 
300 watts is about a factor of two too high

psycho bob said:
The fastest Pentium dual cores require over 300 watts flat out...
The Thermal Design Power of the dual-core 3.46 GHz Extreme Edition is 130 watts. This is the highest sustained ("flat out") power use to be expected. (page 76 of http://download.intel.com/design/PentiumXE/datashts/31030602.pdf)

The absolute instantaneous peak is 167 watts (page 20 of the same document).
 
AidenShaw said:
The Thermal Design Power of the dual-core 3.46 GHz Extreme Edition is 130 watts. This is the highest sustained ("flat out") power use to be expected. (page 76 of http://download.intel.com/design/PentiumXE/datashts/31030602.pdf)

The absolute instantaneous peak is 167 watts (page 20 of the same document).
Dual core processors also require less power than their single core predecessors overall. These days it isn't about speed as much, it's performance per watt and multi core technology is the way to achieve this.

We'll see when Conroe arrives as to how the MacTower Pro will develop and perform. Quad core is just around the corner too.
 
AidenShaw said:
The Thermal Design Power of the dual-core 3.46 GHz Extreme Edition is 130 watts. This is the highest sustained ("flat out") power use to be expected. (page 76 of http://download.intel.com/design/PentiumXE/datashts/31030602.pdf)

The absolute instantaneous peak is 167 watts (page 20 of the same document).

My bad - the 300+ watts is the overall system figure, based on 2 hard drives, 1 graphics card, the motherboard and a DVD drive.
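A back-of-the-envelope budget along those lines (every figure below is a rough guess for a high-end dual-core desktop of the era, not a measured value):

```python
# Rough whole-system power budget (watts). All figures are illustrative
# estimates for a high-end dual-core desktop of the era, not measurements.
budget = {
    "CPU (dual-core, TDP)": 130,
    "Graphics card":         90,
    "Motherboard/chipset":   40,
    "Hard drive x2":         20,
    "DVD drive":             15,
    "RAM":                   10,
}
total = sum(budget.values())
print(f"Estimated system draw: {total} W")  # Estimated system draw: 305 W
```

That lands in the 300+ watt ballpark without the CPU alone ever drawing anywhere near 300 watts.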
 
GuyClinch said:
Of course the G5 chips were overclocked. They needed water cooling to hit that 2.5 GHz speed. Any PC processor will do the same thing. "Unlocked" Extreme and FX chips routinely run well above their stock frequency with more exotic cooling systems in place.

Dell actually demoed a future system running at 4.26 GHz. The fact that Apple had to resort to factory overclocking their entire line of PowerMacs should give you guys a sign as to why they switched.

Now you can argue all you like that they aren't overclocked because they were sold from Apple but we all know that Apple would have gone with the much cheaper aircooling solution if possible. Those chips NEEDED to be watercooled to hit those speeds. That's overclocking.

Pete

Well, the test for overclocking has nothing to do with anything Apple do/say/sell, as they are consumers of the chips. Were they running at the speeds that IBM sold them to run at? If they were, then it's NOT overclocking (as they are running on-clock, even if they have water-cooling).

Water cooling can provide other benefits, such as increased reliability from reduced heat stress (the temperature rises more slowly and dissipates more slowly as well).
 
psycho bob said:
We hear this a lot but does anybody have proof, not just conjecture, that the 2.5GHz G5's were overclocked?

Some say the 2.5s were clocked up from 2.2. Some will think the 2.7s were further examples of overclocking, but can anybody prove it? :rolleyes:

The thing is that if IBM says it's a 2.7 GHz processor, then it's not really "overclocked", per se. However, when manufacturers like AMD sell a processor at 2.8 GHz, they set that rating based on what the chip can do using conventional aircooling. If they knew the proc was only going to be used in a water-cooled environment, they could sell them as 3.0 GHz parts. IBM knew Apple's cooling strategy, so they sold the chips as 2.7 GHz parts, even though the chips could never run that fast with conventional cooling. For that reason, one could say that while the chips weren't technically overclocked, they were overclocked for all practical purposes.
 
psycho bob said:
Unfortunately clock speed is not the only factor that determines the need for cooling. Your argumant only holds true if the watercooling was used for because fans were not capable, Apple's reasoning has always been that it was used to keep noise down this makes sense.

Your argument is countered by the fact that the aircooled dual-core G5s are widely reported to be quieter than their water-cooled counterparts. Indeed, logically, there is no reason watercooling should necessarily decrease noise. It's possible for watercooling to decrease noise, if that allows you to use a larger heatsink with larger, but slower-running fans, but the watercooled G5 doesn't do that --- their radiator and fan setup isn't any larger than the heatsink on the aircooled models. Indeed, under such conditions, watercooling increases noise, since a large fraction of the overall noise signature comes from turbulent airflow through the radiator/heatsink, and radiators, being more constricting than heatsinks, tend to cause more turbulence noise.

To tell the truth, the cooling system of the G5s never really made sense. My 2.3 GHz 970MP runs quite hot according to the temperature probes, but has a fairly over-the-top (and relatively loud) cooling system. The thing is, in theory, the CPU only dissipates around 90W. There is no way a 90W CPU should run at 55°C when the cooling system has four fans, a huge heatsink, and is sitting in a chamber that sucks in cold air directly from the outside. I'd guess IBM's watt ratings are like the ones Intel uses for the Pentium 4 --- sketchy "average use" ratings instead of the absolute-max ratings AMD uses for their chips.
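For reference, the steady-state relationship behind that hunch is T_cpu ≈ T_inlet + P × R_th, where R_th is the thermal resistance of the whole cooling path. Plugging in the post's numbers (the inlet temperature is my assumption):

```python
# Steady state: die temperature ~= inlet air temperature + power * R_thermal.
power_w = 90.0      # claimed CPU dissipation (from the post)
t_cpu_c = 55.0      # observed die temperature (from the post)
t_inlet_c = 22.0    # assumed room/inlet air temperature

# Implied thermal resistance of the entire cooling path:
r_th = (t_cpu_c - t_inlet_c) / power_w
print(f"Implied R_th: {r_th:.2f} C/W")  # Implied R_th: 0.37 C/W
```

If the real dissipation were higher than the rated 90W, the implied R_th would come out lower (i.e. the cooler would look better), which is consistent with the suspicion that the rating is an optimistic average rather than a maximum.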
 