What Would You Do With a Million CPUs?
29 January, 2008 - 2 min read
There’s a new podcast on Futures in Biotech with Dr. Pande from Folding@Home. Macresearch summarized it well:
- How a bunch of Sony PS3s have become the largest component of the world's fastest computer
- The challenges of distributed computing, and in particular how data storage and CPU usage can actually complement each other
- After the hype in the 80s around computational modeling of protein structure, the computational power available today could finally make that hype a reality
- How to take a non-parallel task and transform it into a series of computational chunks (a.k.a. how to make a baby in 1 day with 270 women)
- How modeling of protein structure will be able to get more into the dynamics of protein conformational changes
- What would you do if you had 250,000 CPUs?

I really like the final point, "What would you do with 250,000 CPUs?", because it's an important question. Petascale computing has arrived, but most applications aren't ready to scale to thousands or millions of cores. Folding@Home is as much a distributed computing project as it is a biomedical one. What they've been able to do is treat simulations as data and use Bayesian data-mining techniques to put together the whole picture with surprising efficiency. It's a clever workaround for the limits of Folding@Home's "supercomputer", which is severely constrained by network latency and by individual agents whose hardware is slow compared to 'real' supercomputers. Finally, he reports that PS3s and GPUs are achieving 20-30x acceleration. Exciting stuff!

Image taken from Flickr, CC license
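To make the "non-parallel task into computational chunks" and "simulations as data" ideas concrete, here's a toy sketch (my own illustration, not Folding@Home's actual code): many short, independent simulations of a made-up two-state folding model are run separately, as a distributed client would, and the pooled trajectories are then mined for transition statistics. All names and probabilities below are invented for the example.

```python
import random

# Made-up transition probabilities for a two-state system:
# "U" = unfolded, "F" = folded.
P = {"U": {"U": 0.9, "F": 0.1}, "F": {"U": 0.05, "F": 0.95}}

def short_trajectory(start, steps, rng):
    """One independent 'work unit': a short simulation from a given state."""
    state, traj = start, [start]
    for _ in range(steps):
        state = "F" if rng.random() < P[state]["F"] else "U"
        traj.append(state)
    return traj

def estimate_transitions(trajectories):
    """Treat the pooled trajectories as data: count observed transitions
    and normalize them into estimated transition probabilities."""
    counts = {s: {"U": 0, "F": 0} for s in "UF"}
    for traj in trajectories:
        for a, b in zip(traj, traj[1:]):
            counts[a][b] += 1
    est = {}
    for s, row in counts.items():
        total = sum(row.values())
        est[s] = {t: (c / total if total else 0.0) for t, c in row.items()}
    return est

# Each trajectory could run on a different machine with no communication;
# only the results need to come back to be aggregated.
rng = random.Random(0)
trajs = [short_trajectory("U", 50, rng) for _ in range(1000)]
estimated = estimate_transitions(trajs)
```

The point of the sketch is that no single long simulation is needed: thousands of short, embarrassingly parallel runs, stitched together statistically, recover the long-timescale behavior.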