Future Trends in HPC, part 2



This is a continuation of our look at future trends in high performance computing. In part 1 we covered the first five of the top ten trends. In this installment we’ll wrap up with the remaining five.

6. Memory usage growing with multi-core

Efficient support for a shared memory model is increasingly important as we approach the exascale era of multi-core processors. While programmers demand the performance and convenience of coherent shared memory, rising core counts place an additional burden on memory systems, and greater on-chip distances increase interconnect delays and memory access latencies. Various approaches have emerged, and continue to emerge, to offset the resource overhead created by this increased demand on memory. These range from traditional techniques, such as directory-based cache coherence and remote access, to more novel methods such as execution migration and library cache coherence.
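Of these, directory-based coherence is the most established: a directory entry per cache line records which cores hold a copy, so a write can invalidate exactly those sharers rather than broadcasting to every core. The toy sketch below illustrates the idea for a single cache line in an MSI-style protocol; the class and method names are my own simplifications, not a real protocol implementation.

```python
class Directory:
    """Toy directory-based coherence tracking for one cache line (MSI-style)."""

    def __init__(self):
        self.sharers = set()   # cores holding a read-only (Shared) copy
        self.owner = None      # core holding the writable (Modified) copy, if any

    def read(self, core):
        # A read while another core holds a modified copy forces that copy
        # back to memory; the former owner keeps a downgraded Shared copy.
        if self.owner is not None and self.owner != core:
            self.sharers.add(self.owner)
            self.owner = None
        self.sharers.add(core)

    def write(self, core):
        # A write invalidates every other copy, so only the writer's
        # cache holds the line afterwards. Returns the cores that must
        # drop their copies (the invalidation traffic the text describes).
        others = self.sharers | ({self.owner} if self.owner is not None else set())
        invalidations = others - {core}
        self.sharers = set()
        self.owner = core
        return invalidations


d = Directory()
d.read(0)
d.read(1)
print(d.write(2))  # cores 0 and 1 must invalidate their copies
```

The sketch also shows why core count stresses the memory system: the size of the invalidation set, and thus the coherence traffic per write, grows with the number of sharers.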

7. Memory per core is remaining constant

While memory usage per node and per processor is growing, memory per core has remained relatively constant over the past five years. As core counts increase, so will memory requirements, impacting system design and cost:

Modern High Performance Computing (HPC) systems feature increasingly complex node architectures with a rising number of compute cores per node, while the total amount of memory per node remains constant. Under such scenarios, flat programming models such as pure MPI will fail. We must provide programmers with multiple levels of concurrency, and the most common approach is a combination of MPI for cross-node communication with OpenMP for intra-node threading. We study this setup on the Hera cluster at LLNL, a Linux cluster with 864 nodes.

– Scaling Algebraic Multigrid Solvers: On the Road to Exascale [pg. 8]
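The arithmetic behind this trend is straightforward: if memory per node stays fixed while cores per node grow, the memory available to each core shrinks, which is exactly the pressure pushing programmers toward hybrid MPI + OpenMP designs. A minimal illustration, using hypothetical node configurations (the GB and core figures are assumptions for the sketch, not measurements):

```python
def memory_per_core(mem_per_node_gb, cores_per_node):
    """Memory available to each core when a node's memory is shared evenly."""
    return mem_per_node_gb / cores_per_node

# Hypothetical node generations: memory per node held at 32 GB
# while the core count per node doubles each generation.
generations = [(32, 4), (32, 8), (32, 16), (32, 32)]
for mem_gb, cores in generations:
    print(f"{cores:2d} cores/node -> {memory_per_core(mem_gb, cores):.1f} GB/core")
```

With a flat one-MPI-rank-per-core model, each rank's working set must fit in that shrinking slice; sharing a node's memory among OpenMP threads within one rank relaxes the constraint.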

8. Cloud is still only a small part of enterprise spending, but will grow enormously in coming years

This year Gartner, Inc. estimated that while public cloud spending reached $74 billion in 2010, it represented only 3% of enterprise IT spending. The shift away from traditional IT acquisition models to public cloud services is still young, though Gartner expects it to grow five times faster than overall enterprise IT spending in the near future – 19% annually through 2015.

“What supply chain models did to manufacturing is what cloud computing is doing to in-house data centers. It is allowing people to optimize around where they have differentiated capabilities,” Mr. Sondergaard said.

– Gartner, Inc.: Worldwide Enterprise IT Spending to Reach $2.7 Trillion in 2012

With the massive market opportunity for enterprise collaboration in the cloud, we are seeing large companies (HP, Dell, and many others) looking for ways to get ahead of the game.

9. Microsoft’s share price has flattened after years of growth

While shares of Google and Apple have seen high double- and triple-digit growth in recent years, Microsoft’s stock is now worth less than it was five years ago.

Don’t misunderstand: Microsoft remains a powerful company. It continues to grow its top-line revenue each year, simply at a slower pace than before. What that means, however, is that Microsoft may be on the cusp of transitioning from a growth company to a value company. It remains the dominant force in laptop and desktop operating systems. But, owing to its late entry into the smartphone and tablet realm, it has only a minor presence in the mobile operating system world, an area of increasing importance in IT.

Moreover, as its operating margins slowly begin to decline, Microsoft will feel the squeeze to stay competitive in the marketplace. It isn’t complacent about its future direction. With the release of the Windows 8 developer beta earlier in the year and the Metro UI’s clear applicability to tablet computing, Microsoft hopes to break into the industry. Yet, if the status of Windows Phone 7 is any indication, Microsoft has an uphill battle. Windows Phone 7 is not a bad product. While some have heavily decried its tile-based interface (which Windows 8 Metro replicates) and its lack of applications, it serves a somewhat different purpose, and serves it very well. By all accounts, it is a solid piece of technology.

But is it relevant in a world of Android and iOS growth? Much like the company itself, that is something Microsoft must figure out.

So what does this all have to do with HPC? The cloud represents the direction Microsoft must take to remain relevant. Microsoft sees the potential for applying cloud technology to its consumer markets with applications like Office 365 (MS Office in the cloud). But with free alternatives such as Google Docs, it is doubtful that Microsoft can rely on this as its bread and butter. Windows Azure perhaps represents its front-line movement in this direction, as it attempts to make utility and cloud computing a more central (if not the central) aspect of its business. This is most critical to the goal of retaining its enterprise base.

In that sense at least, HPC has become more of a first-class citizen at Microsoft. But the HPC business itself, now under the direction of Ryan Waite, the general manager for High Performance Computing at Microsoft, has been folded into the Server and Cloud Division, which itself falls under the purview of Satya Nadella’s Server and Tools Business. The integration of HPC into the server-cloud orbit reflects the company’s overarching strategy to use the Windows Azure cloud platform as the basis for its enterprise business.

10. Big Data represents a major opportunity for HPC vendors to grow

Big data is not just a buzzword. It represents one of the most significant real opportunities both within and beyond HPC.

So what does it mean exactly?

We can say that big data represents a growing set of applications touching large enterprises, research institutions, SMBs, and large datacenters. Big data is:

  • made possible by the creation and availability of data
  • continually fueled as that data keeps growing

Many organizations are seeking to bridge the gap between having better data and making better decisions. These application areas include:

  • enterprise analytics
  • research analytics
  • real-time analytics
  • complex event processing
  • data mining
  • visualization

The growth of these application areas has created a market opportunity for providers of HPC technology to come in and serve traditionally unserved customers. This is the central component of the “missing middle,” which can be expanded to include any industry where HPC would be beneficial but where only high-end enterprises currently employ such technology.

It’s important to remember that “big” is relative to where an organization has come from previously. In keeping with that, even 50 TB could conceivably be big for a given organization.

Summary

Moving forward into 2012, one of the most important aspects of HPC must be an emphasis on awareness and education. Smaller-scale organizations that know little about the benefits of HPC must be guided toward the solutions best suited to them. When we attended the HPC360 conference, it was very encouraging to meet so many people from different backgrounds (corporate technology, small business, academia, independent research firms) who were passionate about moving HPC forward in this way.

We look forward to the future of computing possibilities with HPC. Happy New Year!


Other Top Technology Trend Articles You Might Be Interested In

TechNewsWorld – “Top Technology Trends for 2012” – http://www.technewsworld.com/rsstory/73881.html

PR Newswire – “Deloitte Predicts the Top Technology Trends for 2012” – http://www.prnewswire.com/news-releases/deloitte-predicts-the-top-10-technology-trends-for-2012-135248878.html

CNN – “The Top 10 tech trends for 2012” – http://www.cnn.com/2011/12/19/tech/innovation/top-tech-trends-2012/index.html

CIO Blogs – “Peering into the Crystal Ball: Top IT Trends for 2012” – http://blogs.cio.com/consumer-it/16717/peering-crystal-ball-top-it-trends-2012

Information Management – “Managing Big Data and Mobile BI in 2012” – http://www.information-management.com/news/big-data-mobile-BI-IT-consumerization-ISACA-10021690-1.html

Comments

  1. MySchizoBuddy says:

In terms of complexity, are GPU-based clusters for HPC more difficult to deploy than CPU-based clusters? Can you do an article on how to create a basic GPU-based cluster, especially the software part? Can the OS (Red Hat) allow me to access all my GPUs for compute?

    • Colin Cronin says:

Not quite sure what you mean by difficult to deploy. You have to be familiar with whatever compilation procedures, protocols, and programming models are needed to execute your tasks, just as in a CPU-only environment. Red Hat is only the operating system; you would need to install the drivers and development toolkit that support the units you are using. For example, if you are using an NVIDIA GPU, you would use the CUDA development toolkit to compile and run your applications, via Visual Studio or whatever other development environment is called for at the time.
