arn said:
The whole overclocking talk is silly. If it comes from the manufacturer to run at that speed, it's not overclocked.
From what I've gathered over the years reading up on the Mac and CPU development in general, my understanding is that a typical wafer of silicon yields differently-"abilitied" chips (did I just coin a phrase, making me as smart as all those tech pundits?... nah!).
What I mean is that if they're aiming (benchmarking) for an average of 2GHz CPUs, most of the chips will fall in that ballpark, while a few won't be up to snuff and get down-clocked to, say, 1.8GHz, and a few of the chips are "born wunderkinds" that can successfully be rated at, say, 2.2GHz... As far as I can fathom, they add or set the clock (quartz, or whatever technical thingamabob) once they've rated each chip.
So, yes, from what I've read, the factory sets the clock speed of each CPU. Also, I'm guessing that third-party accelerator-card manufacturers then try to "stretch" that number in order to boast of having faster upgrades - I think the 1GHz G3 for the old Pismos (like mine) from... Newer Tech(?) was an example of an overclocked CPU.
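If it helps, here's a rough sketch in Python of that speed-binning idea. Everything in it (the target frequency, the spread, the bins, the yield) is a made-up illustration of mine, not actual fab data:

import random

TARGET_GHZ = 2.0          # the speed grade the fab is aiming for
BINS = [1.8, 2.0, 2.2]    # down-clocked, nominal, and "wunderkind" grades

def measure_max_stable_ghz():
    # Stand-in for the fab's test step: pretend each die has some maximum
    # frequency at which it still passes validation.
    return random.gauss(TARGET_GHZ, 0.15)

def bin_chip(max_stable_ghz):
    # Rate the chip at the highest bin it can safely run at;
    # None means it can't even hit the lowest bin (a reject).
    rated = None
    for speed in BINS:
        if max_stable_ghz >= speed:
            rated = speed
    return rated

wafer = [measure_max_stable_ghz() for _ in range(100)]
ratings = [bin_chip(f) for f in wafer]
for speed in BINS:
    print(speed, "GHz parts:", ratings.count(speed))
print("rejects:", ratings.count(None))

The upshot being: the rating is assigned per chip after testing, which is why running a chip above that rating counts as "overclocking" even when the silicon happens to have headroom to spare.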
TheMasin9 said:
This can only go so far (90, 65, 45 nm); when we start approaching 20-5 nanometers, things are going to get really scary, because then you are dealing with elements on the atomic scale, and they start acting very differently than they do in these kinds of chip fabs.
I think it was in 2004 that Wired magazine had a cover story about artificial diamonds. Supposedly they were near-indistinguishable from the real thing - which has to have DeBeers crapping their 90%-monopoly-pants.
But what I found most interesting is that these lab-grown diamonds can (theoretically, or maybe already in practice) be grown in any shape they choose, and that one of the most practical applications (not counting the latest boulder on J.Lo's finger) would be to replace silicon in the manufacture of computer chips.
Think about it: they've added copper interconnects, strained silicon-on-insulator, truly exotic & expensive materials like gallium arsenide... But they're fast approaching the technical limitations of silicon. That's part of the reason you'll always see "leakage".
Diamond, however, promises to take semiconductors forward on their next great leap. So, with two competing companies & processes mentioned in the Wired article, here's hoping they can successfully AND reasonably commercialize the process in the next few years... Not to mention some atom-scale molecular-lithography process to etch those sub-nanometre highways into thumbnail-sized supercomputers.
maya said:
MWSF 2008: Steve Jobs puts an end to the MacMini and introduces the MacNano.
This news is great; however, people are wondering where chip technology will head from here. And the answer to that question is photonic processors along with photonic data links. A technology that was still in development back in 2001-2002 will finally make it into the consumer marketplace around 2010-2012.
Can't be bothered to look for a link on photonic processors or data links.
The next dot-com bubble will occur around 2015, just as all the photonic startups are starting to cash in their stock options. That bubble will burst when the scientists in Japan (who were able to show evidence that neutrinos have mass) announce that they've successfully inserted sub-atomic data streams into neutrinos and that they will soon be able to redirect the paths of individual neutrinos... Overnight, the telco industry's stocks are delisted from the world's stock exchanges... Microsoft tries to buy Norse Son's brain so they can claim a prior patent on the intellectual property involved in the evolutionary breakthrough...
By the way, in the time it took me to write this, many trillions of neutrinos passed quietly & uninterrupted through my body... No wonder I feel queasy...
dashiel said:
http://www.yellowtab.com/
This is what the BeOS 5.1 source became.
While it was superior in many ways to NeXT, there would be no Apple today if they had bought Be.
Exactly! Imagine Jean-Louis Gassée trying to troll his way through a "One More Thing..."
Plus, from what I saw of the BeOS back when the talk was about Apple buying it... I don't know, I just didn't care for the interface... Kinda looked like "Pop Art meets Coloring Book"...
Then again, it's highly unlikely we would have had the iMac, iPod, iTunes, MacOS X - it would have been something staid & "plaid", like MacOS 10...