(Image: NASA Goddard Space Flight Centre/Flickr)

Obama Pledges US Will Build ‘Exascale’ Supercomputers Within 10 Years

30 JUL 2015

The White House announced the establishment of its National Strategic Computing Initiative (NSCI) this week, designed to keep the US at the forefront of high-performance computing well into the coming decades. The key to keeping up will be the development of giant exascale supercomputers, which the US government says it will build within the next 10 years to perform incredibly fast and complex operations.

The announcement comes as scientists and technologists grapple with technological bottlenecks, with the demands of computational research in the 21st century escalating dramatically in complexity. The problem is compounded when working with big data - such as storing and processing the massive amounts of genomic information involved in gene sequencing - which requires extremely fast computers with extremely capacious data banks.

So what’s an exascale computer? Well, supercomputers are measured in terms of how fast they are - specifically, how many floating-point operations per second (aka ‘flops’) they can perform. The kinds of computers the US government intends to build will be capable of one exaflop - that’s 1 billion billion (10^18) operations per second. By comparison, most computers in the world today are measured in terms of gigaflops (10^9 operations per second).
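To get a feel for the gulf between those two scales, here’s a back-of-the-envelope calculation. The figures are illustrative assumptions (a nominal 1-gigaflop machine), not benchmarks:

```python
# Comparing supercomputer speed scales; the 1-gigaflop baseline is an
# illustrative assumption, not a measured benchmark.
GIGAFLOPS = 10**9    # ~1 billion operations per second
EXAFLOPS = 10**18    # exascale target: 1 billion billion operations per second

speedup = EXAFLOPS // GIGAFLOPS
print(f"An exascale machine is {speedup:,}x faster than a 1-gigaflop machine.")

# How long would a 1-gigaflop machine take to do one second of exascale work?
seconds = EXAFLOPS / GIGAFLOPS
years = seconds / (60 * 60 * 24 * 365)
print(f"Roughly {years:,.0f} years of computation at 1 gigaflop.")
```

In other words, one second of exascale work would keep a gigaflop-class machine busy for about three decades.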

But it’s not just about speed. As concluded by the President’s Council of Advisors on Science and Technology, the big data age means high-performance computing “must now assume a broader meaning, encompassing not only flops, but also the ability, for example, to efficiently manipulate vast and rapidly increasing quantities of both numerical and non-numerical data”. What this means is that exascale supercomputers will not only need to be extremely fast, they’ll also have to be capable of managing and analysing massive data sets up to one exabyte (10^18 bytes) in size.
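An exabyte is hard to picture, so here’s a quick sense of the scale, using decimal (SI) units and an assumed figure of roughly 3 GB for an uncompressed human genome:

```python
# Putting an exabyte (10^18 bytes, SI decimal units) into perspective.
EXABYTE = 10**18
TERABYTE = 10**12

drives = EXABYTE // TERABYTE
print(f"One exabyte = {drives:,} one-terabyte drives.")

# Assumed figure: ~3 GB for a naively stored human genome.
GENOME_BYTES = 3 * 10**9
genomes = EXABYTE // GENOME_BYTES
print(f"...or about {genomes:,} uncompressed human genomes.")
```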

The benefits of such amazing supercomputional (if that’s a word) power will be felt in all sorts of fields. The White House lists a few examples, including new ways of modelling turbulence and dynamic flight conditions in aviation simulations, better ways to compute health and genome data, and smarter methods of teaching artificial intelligence systems how to learn by giving them a far greater database of examples to draw upon.

If Google’s DeepDream is what today’s computers are capable of coming up with, we can’t wait to see what tomorrow’s exascale supercomputers will be getting up to.