Access and Feeds

Cloud Computing: Scientists Interested, but Find the Cloud Isn't Ready for 'Heavy' Scientific Computing

By Dick Weisinger

Two government laboratories, Argonne and Lawrence Berkeley National Laboratory (LBNL), recently decided to kick the public cloud's tires to see whether they could use it to handle some of their scientific computing.  The National Energy Research Scientific Computing Center (NERSC) at LBNL is home to the second most powerful computer in the US, a 153,408-processor machine capable of running at petaflop speeds.  The computations and simulations that researchers and scientists at these laboratories perform are demanding, to say the least.

In 2009, the labs set up a special project to begin looking into cloud computing and how it could best be used to help solve scientific problems.  The Department of Energy (DOE) set aside $32 million for a project called Magellan to run the investigation.  Pete Beckman, director of the Argonne Leadership Computing Facility and leader of the ALCF Magellan team, said that "The question the Department of Energy has is pretty straightforward: what kind of science can be done on clouds, and are there specializations or customizations that we can do on the software to get more science out of clouds?"

After three years, the Magellan project released its answers in a final report in December 2011.

The report found that the public cloud isn't yet competitive with the existing computing infrastructure at the labs.  It says that it's hard to beat the efficiencies already built into the government laboratory computing facilities, where utilization levels are over 85% and Power Usage Effectiveness (PUE) ratings are in the range of 1.2 to 1.5.
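For readers unfamiliar with the metric: PUE is simply total facility energy divided by the energy delivered to the IT equipment itself, so lower is better and 1.0 is the ideal. A minimal sketch of the calculation (the wattage figures below are illustrative assumptions, not numbers from the report):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.

    A PUE of 1.0 would mean every watt drawn by the facility reaches the
    computing hardware; the excess goes to cooling, power conversion, etc.
    """
    return total_facility_kw / it_equipment_kw

# Hypothetical data center drawing 1,200 kW total, 1,000 kW of it for servers:
print(pue(1200, 1000))  # 1.2, the efficient end of the range the report cites
```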

“Many of the cost benefits from clouds result from increased consolidation and higher average utilization. Because existing DOE centers are already consolidated and typically have high average utilization, they are usually cost effective when compared with public clouds. Our analysis shows that DOE centers can range from 2-13x less expensive than typical commercial offerings. These cost factors include only the basic, standard services provided by commercial cloud computing, and do not take into consideration the additional services such as user support and training that are provided at supercomputing centers today. These services are essential for scientific users who deal with complex software stacks and dependencies and require help with optimizing their codes to achieve high performance and scalability.”

But the report goes on to say that the labs' computing facilities have much that they can learn from cloud computing, particularly the cloud computing 'business model'.  “Users with applications that have more dynamic or interactive needs could benefit from on-demand, self-service environments and rapid elasticity through the use of virtualization technology, and the MapReduce programming model to manage loosely coupled application runs.”
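The MapReduce model the report mentions can be summed up in a few lines: a map phase turns each input record into (key, value) pairs independently, which is what makes the runs "loosely coupled", and a reduce phase aggregates all values sharing a key. A toy in-process sketch (the function names and word-count task are illustrative, not from the report):

```python
from collections import defaultdict

def map_phase(records, mapper):
    """Apply the mapper to each record independently, emitting (key, value) pairs."""
    for record in records:
        yield from mapper(record)

def reduce_phase(pairs, reducer):
    """Group pairs by key, then reduce each group's values to one result."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return {key: reducer(key, values) for key, values in groups.items()}

# Classic word count: each line is processed with no coordination between records.
lines = ["cloud computing", "scientific computing"]
pairs = map_phase(lines, lambda line: [(word, 1) for word in line.split()])
counts = reduce_phase(pairs, lambda key, values: sum(values))
print(counts)  # {'cloud': 1, 'computing': 2, 'scientific': 1}
```

In a real framework the map calls run on many machines at once, which is why the model suits the embarrassingly parallel workloads the report has in mind.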

The report recommends that the labs continue to run frequent comparisons of their environment against the current state of the public cloud.  The public cloud is evolving at a rapid pace and it is likely that the gap between the two will begin to close.

