GPUs in the news - medicine

NVIDIA, a maker of GPUs, reports the launch of AMBER 11, a new version of a program frequently used by scientists to aid in the discovery of new medicines.

This version of the program is optimized for GPUs, allowing scientists to run the same computational workload on their own workstations instead of on shared supercomputers.

From the article:

“The supercomputing resource we use is constantly over-subscribed, forcing us to wait a day or more to run a simulation, adding weeks to our research projects,” said Dr. Ross Walker, research professor at the San Diego Supercomputer Center at the University of California, San Diego, and a principal AMBER contributor.

AMBER 11 is designed to take advantage of NVIDIA Tesla™ 20-series GPUs, which utilize the massively parallel CUDA™ architecture for the specific needs of high performance computing applications. In early trials with the AMBER user community, Dr. Walker received over a dozen reports of speedups of more than 30 times on a range of bio-molecular simulations.

“With GPUs, we can now do most of our work at the desktop and that changes everything. Any research department looking to invest in computing resources to run AMBER should start by equipping every researcher with GPU-enabled workstations,” Walker added.
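
To get a feel for why a bio-molecular code like AMBER maps so well onto GPUs, note that the expensive part of each simulation step is computing the interactions between large numbers of atoms, and the force on each atom can be accumulated independently of the others. The toy CUDA kernel below is my own sketch, not AMBER code: the atom count, the unit-parameter Lennard-Jones formula, and the atoms-on-a-line starting positions are all made up for the demo, and a real molecular dynamics engine would add cutoffs, neighbour lists, and much more careful numerics. It simply gives each atom its own GPU thread:

// Toy sketch only (not AMBER): pairwise forces with one CUDA thread per atom.
#include <cstdio>
#include <cuda_runtime.h>

#define N 1024  // hypothetical atom count for this demo

__global__ void pairwiseForces(const float3 *pos, float3 *force)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= N) return;

    float3 pi = pos[i];
    float3 f = make_float3(0.0f, 0.0f, 0.0f);

    // Each thread accumulates the force on its own atom from every other atom.
    for (int j = 0; j < N; ++j) {
        if (j == i) continue;
        float dx = pi.x - pos[j].x;
        float dy = pi.y - pos[j].y;
        float dz = pi.z - pos[j].z;
        float r2 = dx * dx + dy * dy + dz * dz + 1e-6f;  // guard against divide-by-zero
        float inv_r2 = 1.0f / r2;
        float inv_r6 = inv_r2 * inv_r2 * inv_r2;
        // Simplified Lennard-Jones force magnitude with unit parameters.
        float scale = 24.0f * inv_r2 * inv_r6 * (2.0f * inv_r6 - 1.0f);
        f.x += scale * dx;
        f.y += scale * dy;
        f.z += scale * dz;
    }
    force[i] = f;
}

int main()
{
    float3 *pos, *force;
    cudaMallocManaged(&pos, N * sizeof(float3));
    cudaMallocManaged(&force, N * sizeof(float3));

    // Deterministic demo input: atoms spaced one unit apart along the x axis.
    for (int i = 0; i < N; ++i)
        pos[i] = make_float3((float)i, 0.0f, 0.0f);

    int threads = 256;
    int blocks = (N + threads - 1) / threads;
    pairwiseForces<<<blocks, threads>>>(pos, force);
    cudaDeviceSynchronize();

    printf("force on atom 0: (%f, %f, %f)\n", force[0].x, force[0].y, force[0].z);

    cudaFree(pos);
    cudaFree(force);
    return 0;
}

With thousands of such threads in flight at once, the per-atom loops run concurrently rather than one after another, which is roughly where speedups like the 30x figures quoted above come from.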

My dad recently recounted to me his university experiences in the 1970s, when, as a graduate student in Russia, he was allotted a one-hour weekly time slot on a shared “supercomputer” (which had less power than today's cell phones) to run his programs. It seems the shared public supercomputing situation has not changed much since the 1970s, though the computers themselves have become more powerful.

Maybe we will always need ever more computing power, and the expensive supercomputers at research institutions will always be rented out to scientists whose models, ever growing in complexity, cannot be run on private workstations. Maybe, but maybe not.

Just as today's personal computers dwarf the supercomputer my dad used in the 1970s, GPUs are helping make today's super-technology seem like yesterday's calculators.
