HPC and cell biology research


Biologists at the Ohio State University have been using supercomputers to answer one of the most fundamental questions in biology – how cells know what to become as they grow – according to an article from the Ohio Supercomputer Center.

The researchers were studying Arabidopsis thaliana, a plant species widely used in plant biology because of its short life cycle (much as fruit flies and mice serve in animal studies). They used supercomputers to run a massive guess-and-check experiment aimed at working out how proteins interact within Arabidopsis to determine the fate of its cells. According to the article, “The mathematical model Siegal-Gaskins constructed consists of seven differential equations and twelve unknown factors. For his preliminary studies, he turned to OSC to choose random values for the unknowns and solve the equations for millions of different random value sets.”

After millions of such guess-and-check runs, the supercomputer returned the parameter values most consistent with known protein function. “The center’s flagship supercomputer system, nicknamed the Glenn Cluster, features 9,500 cores, 24 terabytes of memory and a peak computational capability of 75 teraflops – which translates to 75 trillion calculations per second.”
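The guess-and-check approach described above can be sketched in a few lines of code. The model below is a deliberately toy two-equation system, not the actual seven-equation Arabidopsis model, and the rate constants, target values, and function names are all illustrative assumptions: the point is only the pattern of drawing random parameter sets, solving the equations for each, and keeping the set that best matches the data.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(k, t_end=5.0, dt=0.01):
    """Toy gene-regulation model (hypothetical, NOT the published model):
    protein A activates B, B represses A. Integrated by simple Euler steps.
        dA/dt = k0/(1 + B) - k1*A
        dB/dt = k2*A - k3*B
    Returns the final (A, B) concentrations."""
    a, b = 1.0, 0.0
    for _ in range(int(t_end / dt)):
        da = k[0] / (1.0 + b) - k[1] * a
        db = k[2] * a - k[3] * b
        a, b = a + dt * da, b + dt * db
    return a, b

# Made-up "observed" steady state the model should reproduce.
target = np.array([0.5, 1.0])

# Guess-and-check: draw random parameter sets, score each against the
# target, and keep the best-scoring set.
best_k, best_err = None, np.inf
for _ in range(2000):
    k = rng.uniform(0.0, 2.0, size=4)  # four unknown rate constants
    err = np.linalg.norm(np.array(simulate(k)) - target)
    if err < best_err:
        best_k, best_err = k, err

print("best parameters:", best_k)
print("residual error:", best_err)
```

On a supercomputer the same idea simply runs with many more equations, many more unknowns, and millions of parameter draws distributed across cores; each draw is independent, which is what makes the problem parallelize so well.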

In a related article on www.genomeweb.com, Matthew Dublin outlines the future of supercomputing technology and what it means for biological research.

The fastest high-performance computing system in the world is the National Center for Computational Sciences’ “Jaguar,” a Cray XT5 supercomputer with a peak performance of 2.33 petaflops (roughly 2,330 trillion calculations per second) that came online in 2009. Given that the petascale barrier has only recently been broken, it may come as a surprise that the next jaw-dropping level of HPC is already receiving serious attention. Exascale computing, at 10^18 (a quintillion) floating-point operations per second, is a scale of computing that most in the HPC community have only recently fantasized about (and been terrified by).
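The scales quoted above are easy to mix up, so here is the arithmetic spelled out, using only the figures from this article:

```python
# Flops-scale arithmetic from the figures quoted above.
petaflop = 10**15  # floating-point operations per second
exaflop = 10**18

jaguar = 2.33 * petaflop        # Jaguar's peak performance, in flops
print(jaguar / 10**12)          # in "trillion calculations per second"
print(exaflop / jaguar)         # ≈ 429×: the jump from Jaguar to exascale
```

So an exascale machine would be several hundred times faster than Jaguar, the fastest system in the world at the time of writing.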

Exascale computers, which may be operational and ready for scientific use by 2019, promise a new vista of possibilities for scientific endeavor. According to Rick Stevens, Associate Director for Computing, Environment and Life Sciences at Argonne National Laboratory,

Right now, molecular models don’t take into account the molecular physiology of the organisms — they’re abstractions — and as we get more observational data, and as genomes fill out in terms of annotations, and protein function fills out and we have more structural data, we’ll start to move from abstract models towards real 3D computational models of cellular processes and I think that will happen over the next five to 10 years.

For scientists, more computing power means less abstraction in their models, and thus results that are truer to biological life. Perhaps the most exciting possibility for exascale computing is modeling the evolution of microorganisms, an immensely demanding computation that could deepen our understanding of the evolutionary process itself.

With the help of supercomputing technology, the future of biological inquiry looks very promising.
