NPR covers academic use of cluster "supercomputers" made from PS3s

Discussion in 'Current Events' started by mkrishnan, Feb 22, 2009.

  1. Moderator emeritus

    mkrishnan

    Joined:
    Jan 9, 2004
    Location:
    Grand Rapids, MI, USA
    #1
    Something everyone here has heard about already, but it was interesting to hear that these things are actually being used at some universities for real academic computing.

    http://www.npr.org/templates/story/story.php?storyId=100969805
     
  2. macrumors 68020

    SactoGuy18

    Joined:
    Sep 11, 2006
    Location:
    Sacramento, CA USA
    #2
    Sounds interesting, but wouldn't they be better off installing a bunch of blade servers (let's say a couple of thousand of them) and syncing them all together using the Beowulf clustering software that is available for Linux?
     
  3. thread starter Moderator emeritus

    mkrishnan

    Joined:
    Jan 9, 2004
    Location:
    Grand Rapids, MI, USA
    #3
    I'm thinking that people who want to build a $5000 computer out of networked PS3s probably cannot afford a "couple of thousand of" blade servers. ;)
     
  4. macrumors 68020

    SactoGuy18

    Joined:
    Sep 11, 2006
    Location:
    Sacramento, CA USA
    #4
    While using a bunch of PS3s sounds like a good idea, if you need true 24/7 reliability you have to go with a real blade server setup. A lot of clustered supercomputer setups are built this way, usually with a couple of hundred blade servers installed in racks in a dedicated room.
     
  5. thread starter Moderator emeritus

    mkrishnan

    Joined:
    Jan 9, 2004
    Location:
    Grand Rapids, MI, USA
    #5
    Yes, but once again, not something one can do for $5000. I'm not saying it's practical; the PS3 is not exactly designed, from an energy/heat standpoint, to participate in cluster computing. I'm just saying it's interesting. The PS3 has economies of scale that no dedicated supercomputer/cluster hardware can match, which bring its price down dramatically, and as a result the variant of the Cell it uses can be put to some fairly hefty processing at very low prices, which is impressive.

    On a similar note, there have been reports of Mac mini clusters too, e.g. being used as render farms.

    Plus you have to appreciate the ingenuity, I think, of the community involved.
     
  6. Guest

    garybUK

    Joined:
    Jun 3, 2002
    #6
    Do you mean PS3? The main advantage the PS3 has is its Cell processor. The cluster WILL be running Linux/Beowulf; you can easily get YDL (Yellow Dog Linux) for the PS3.
     
  7. thread starter Moderator emeritus

    mkrishnan

    Joined:
    Jan 9, 2004
    Location:
    Grand Rapids, MI, USA
    #7
    Yeah, sorry, PS3. And yes, that's essentially exactly what I was saying, save that my point was that, were the Cell not in the PS3 or another similar high volume device, it would not have reached economies of scale that would allow this solution to be so inexpensive....
     
  8. macrumors Penryn

    Abstract

    Joined:
    Dec 27, 2002
    Location:
    Location Location
    #8
    Can't you just take a bunch of Intel quad-core desktop processors, link them together, and hook them up to a node computer? For $5000, you can get a lot of processing power, desktop cases, and RAM.
     
  9. macrumors 6502

    Joined:
    Aug 15, 2003
    #9
    The processing power of an average Cell processor or GPU is much higher than that of the best CPU.

    It would certainly not be better to have thousands of blade servers. It would be a better idea to use that money on a farm of servers, each connected to an NVIDIA Tesla.

    Intel simply doesn't make a CPU that can match the speed and parallelism of a GPU, so even the fastest multi-core, multi-CPU Xeon machine money can buy will be slower than a single graphics processor at this kind of work. The main problem with CPUs is their limited number of cores; a GPU or Cell processor can run hundreds or thousands of parallel threads.

    Scientific computing on a CPU is inefficient nowadays.
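    To make the parallelism point concrete, here is a minimal sketch, in plain Python purely for illustration (none of this is GPU code), of the kind of data-parallel kernel a GPU or the Cell's SPEs excel at. Every output element is independent of the others, so on real hardware each one can be handed to its own lightweight thread.

```python
# SAXPY (y = a*x + y), the classic data-parallel kernel.
# On a GPU, each output element would be computed by its own thread;
# here, each loop iteration stands in for one such thread.
def saxpy(a, x, y):
    # No iteration reads another iteration's result -- that independence
    # is exactly what lets thousands of GPU threads run this at once,
    # and what a handful of CPU cores cannot exploit as widely.
    return [a * xi + yi for xi, yi in zip(x, y)]

print(saxpy(2.0, [1.0, 2.0, 3.0], [10.0, 20.0, 30.0]))  # [12.0, 24.0, 36.0]
```

    A CPU with 4 cores can split this loop 4 ways; a GPU can split it as many ways as there are elements, which is why embarrassingly parallel scientific workloads map so well onto that hardware.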
     
  10. macrumors 68030

    .Andy

    Joined:
    Jul 18, 2004
    Location:
    The Mergui Archipelago
    #10
    This sounds like a ridiculous idea. Every time they boot up for an experiment they'll be forced by Sony to undergo a software update. They'll hardly get any work done... It's hard enough just to play a new game on one.
     
  11. macrumors 65816

    jodelli

    Joined:
    Jan 6, 2008
    Location:
    Windsor, ON, Canada
    #11
    Think of being able to address the GPU's power directly, without going through the standard graphics programming interface. A programmable GPU. They already handle more calculations on the fly than a typical CPU.
     
  12. thread starter Moderator emeritus

    mkrishnan

    Joined:
    Jan 9, 2004
    Location:
    Grand Rapids, MI, USA
    #12
    Or else the conversation might go,

    "Come on, man, I need to run a simulation."

    "Dude, I can finish this level. Just give me five minutes. I can totally beat it."
     
