
Supercomputers Break Petaflop Barrier, Transforming Science


Prof.
Nov 19, 2008, 06:30 PM
http://blog.wired.com/wiredscience/images/2008/11/17/jaguar1.jpg

A new crop of supercomputers is breaking down the petaflop speed barrier, pushing high-performance computing into a new realm that could change science more profoundly than at any time since Galileo, leading researchers say.

When the Top 500 list of the world's fastest supercomputers was announced at the international supercomputing conference in Austin, Texas, on Monday, IBM had barely managed to cling to the top spot, fending off a challenge from Cray. But both competitors broke petaflop speeds, performing 1.105 and 1.059 quadrillion floating-point calculations per second, the first two computers to do so.

These computers aren't just faster than the machines they pushed further down the list; they enable a new class of science that wasn't possible before. As recently described in Wired magazine, these massive number crunchers will push simulation to the forefront of science.

LINK (http://blog.wired.com/wiredscience/2008/11/supercomputers.html)
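For a sense of the scale here, a rough back-of-envelope sketch in Python (the desktop speed and the workload size are illustrative assumptions; only the 1.059-petaflop figure comes from the list above):

JAGUAR_FLOPS = 1.059e15           # Cray Jaguar's measured 1.059 petaflops
DESKTOP_FLOPS = 50.0e9            # assumed ~50 GFLOPS for a fast 2008 desktop
workload_ops = 1.0e20             # hypothetical simulation needing 10^20 operations

jaguar_hours = workload_ops / JAGUAR_FLOPS / 3600
desktop_years = workload_ops / DESKTOP_FLOPS / (3600 * 24 * 365)

print(f"Petaflop machine: {jaguar_hours:.0f} hours")    # ~26 hours
print(f"2008 desktop:     {desktop_years:.0f} years")    # ~63 years

The same job that would tie up a desktop for decades finishes on one of these machines in about a day, which is what makes whole new classes of simulation practical.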

TuffLuffJimmy
Nov 19, 2008, 06:33 PM
What sort of calculations are these computers used for?

Hosting myspace? ;)

Vivid.Inferno
Nov 19, 2008, 10:01 PM
What sort of calculations are these computers used for?

Hosting myspace? ;)
WoW actually lol

Cave Man
Nov 19, 2008, 10:21 PM
I'm sure the People for the Ethical Treatment of Animals (flop) will have a fit using a Tiger in this clear violation of animal rights. :cool:

notjustjay
Nov 19, 2008, 10:33 PM
... but isn't that a jaguar? :P

Big-TDI-Guy
Nov 19, 2008, 10:38 PM
I thought Folding at Home smoked that barrier over a year ago with the PS3. ;)

Awww yeah.

http://fah-web.stanford.edu/cgi-bin/main.py?qtype=osstats

iMacmatician
Nov 20, 2008, 03:49 PM
What sort of calculations are these computers used for?
Running Windows Vista.

:D

I see the whole thing of simulation coming to the fore as quite a big change. I agree with the article that many more things (extrapolation, prediction) would become more feasible.

ReanimationLP
Nov 20, 2008, 06:34 PM
Maybe Blizzard should gobble one of these up so there's no more server lag with WoW. :D

Killyp
Nov 25, 2008, 03:24 PM
What sort of calculations are these computers used for?


Well I'm guessing they're connected to the internet so there's only one thing they're good for...

NT1440
Nov 25, 2008, 03:31 PM
Funny that with all that power we still can't do LARGE scale weather simulations.

Man, that takes a lot of power for some reason.


Edit: Can it run Crysis?
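On the large-scale weather point, a hand-wavy sketch of why it eats so much compute (all the constants below are illustrative guesses, not real model numbers): the work scales roughly with horizontal columns x vertical levels x time steps, and the time step has to shrink as the grid gets finer, so halving the grid spacing costs roughly 8-16x more.

def relative_cost(grid_km, levels=60, hours=168):
    """Very rough relative work for a one-week global forecast."""
    earth_area_km2 = 510e6                      # Earth's surface area
    columns = earth_area_km2 / grid_km ** 2     # horizontal grid columns
    steps = hours * 3600 / (grid_km * 6)        # CFL-style: finer grid, shorter step
    return columns * levels * steps

base = relative_cost(50)                        # ~50 km global model
for km in (50, 25, 10):
    print(f"{km} km grid: {relative_cost(km) / base:.0f}x the work of the 50 km run")

Going from a 50 km grid to a 10 km grid is roughly 125x the work in this toy model, before you add ensembles or better physics, which is part of why petaflop machines matter for it.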

Killyp
Nov 25, 2008, 03:57 PM
Edit: Can it run Crysis?

Probably not.

Schtumple
Nov 25, 2008, 04:41 PM
Funny that with all that power we still can't do LARGE scale weather simulations.

Man, that takes a lot of power for some reason.


Edit: Can it run Crysis?

Argh, beat me to it :p

Wonder if Safari is snappy on it...

TuffLuffJimmy
Nov 25, 2008, 06:35 PM
That's pretty amazing that they're only at the jaguar stage. It'll be crazy fast once they get it up to tiger.

MagnusVonMagnum
Nov 26, 2008, 01:23 AM
That's pretty amazing that they're only at the jaguar stage. It'll be crazy fast once they get it up to tiger.

I thought the Jaguar was discontinued by Atari many many moons ago? Does that mean we'll be seeing a new version of Pole Position soon? :p

All that talk about climate models.... Geeze. Sticking my finger out the window is a more reliable measure of weather than weather forecasts these days. People like Dick Goddard in Cleveland used to be pretty good at a 2-week forecast. But ever since they started using computer models for weather forecasting, things seem to be wrong more often than not. I've taken trips to Cedar Point where it said it was supposed to storm and there was not a cloud in the sky, and other trips when it said sunny and clear and a thunderstorm came in and ruined the day. I have NO FAITH in weather forecasting.

Now if they can't produce an ACCURATE weather forecast for a lousy week ahead of time (forget those 21-day forecasts; they're not even REMOTELY accurate), how the heck do they expect me to believe in this global warming business? The Climatologists claim the world is on this carbon-based trip to temperature oblivion, but it's been COLD as heck in the Midwest this month, MUCH lower than normal for November. Last winter was one of the snowiest and coldest on record. I haven't seen it reach 100 degrees in Ohio since the 1980s. The past two summers have been EXTREMELY MILD here. Where is this Global Warming supposed to be?

The funny thing is if you look at Astronomy instead of Climatology you'll find the real answer. We've had higher than normal sunspot activity for the past 25 years or so, which correlates DIRECTLY with the supposed Global Warming, whereas if you look at the carbon dioxide charts, there's this massive DIP in temperatures from the late sixties into the late '70s, for about 10 years or so, where Climatologists thought we might be entering a new ice age (Global Cooling). That makes NO SENSE given a constant INCREASE in carbon emissions over the past 100 years. There was no drop in carbon PERIOD to explain that drop in temperature during that period. There WAS, however, a drop in sunspot activity during that same period! BINGO! There IS NO "Global Warming". Sunspot activity is expected to be down for the next 11 years or so and I fully expect colder temperatures for that same period of time. If the Climatologists are right, temperatures will keep going up. If not, we'll know they're all idiots. Frankly, if you run the simulations we have now with data from the start of the century projected to today, they say the average temperature SHOULD have increased 10 degrees. Measurements show an increase closer to 1 degree. Something is VERY wrong with their models (their weather models too).

And that's the problem with this whole Super Computer line of thinking, that having multiple Petaflops is somehow going to change the world. All it's really going to do is give idiots out there new reasons to SCREAM that the sky is falling over some simulation's outcome, when the problem isn't the computing power but the simulation itself being an inaccurate model of whatever it is they think they're simulating. The computer will only calculate the data it's given. Garbage In = Garbage Out.

dextertangocci
Nov 28, 2008, 02:38 AM
There IS NO "Global Warming". [...] The computer will only calculate the data it's given. Garbage In = Garbage Out.


Thank you! Finally someone with sense! Global warming doesn't exist.