Massively Parallel Computing and Massive Impacts for ChE

9/12 in the series We Are ChE: Entering a Golden Age
Chemical engineers saw the value of computers for their work early, as described in the previous post. For ChEs today, computing power has taken two directions, toward powerful simulations on one hand and powerful use of data on the other. In this post, I'm writing about simulation and how supercomputing is making valuable predictions possible.

Simulation and predictions


Sharon Glotzer

"Simulation" for ChEs usually means either predicting properties, such as enthalpies or crystal shapes, or predicting system behavior, such as with a process design model. Simulation tools are being used in ChE around the world, according to a 2009 study led by Sharon Glotzer, Churchill Professor of Chemical Engineering and Professor of Materials Science and Engineering at the University of Michigan. Her blue-ribbon panel visited 52 academic and industrial sites in Europe, Japan, and China, asking how "simulation-based engineering and science" were being used and developed. The responses were fascinating, and the visions of the future even more so. A few examples:

  • Custom-performance materials are being developed, including tougher polymers and new catalysts.
  • Storm impacts are being predicted for tide levels and the stability of refinery facilities and supply chains.
  • Oil fields are being mapped and extraction strategies are being planned.
  • Possible pollutant-emission paths are being projected through the atmosphere and water.
  • Controlled drug release is being refined, and personalized medicines are becoming possible.
  • Not only are new detergents being designed, but so are their breakage-resistant bottles.
  • Time and costs are being slashed for developing new jet engines by combining detailed chemical kinetics from ChEs with computational fluid dynamics from mechanical engineers.

These findings helped lead to the current U.S. Materials Genome Initiative and Advanced Manufacturing Partnership. An earlier study, focused on applying molecular modeling and simulation, led to university-industry initiatives in Germany and the UK.

How big are today's supercomputers?

Ideally, the bigger and faster the computer, the bigger, faster, and more complete the predictions. Supercomputers have moved toward astonishing levels of performance. One simulation milestone was last year's crystalline-silicon simulation on China's Tianhe-1A supercomputer, the fastest computer in the world in the winter of 2010-11 at 1.87 petaflops - roughly 2 × 10¹⁵ floating-point (real-number) arithmetic operations per second. In the key initial calculation, ChEs from the Institute of Process Engineering at the Chinese Academy of Sciences simulated the molecular dynamics of 110 billion atoms for 0.16 nanoseconds, reportedly the largest such calculation up to that time. As of this month, the Titan supercomputer at Oak Ridge National Laboratory is rated fastest in the world. It uses about 300,000 computing cores in parallel to operate at 17.56 petaflops, with a peak of 27 petaflops. It edged out Lawrence Livermore's Sequoia supercomputer at 16.3 petaflops, Japan's Riken K computer at 10.5 petaflops, and Argonne's Mira at 8.2 petaflops. In only two years, Tianhe-1A has fallen to #8!


The Titan supercomputer at Oak Ridge National Laboratory

In another measure of capacity, the U.S. Department of Energy has granted 5.7 billion CPU-hours on Titan and Mira for 2013 to applicants from around the world through its INCITE program. A typical project size might be 100 million processor-hours. Such numbers are hard to comprehend. Consider: 100 million hours is 11,000 years; that long ago was when the last Ice Age was receding and agricultural societies were just beginning. The larger number, 5.7 billion CPU-hours, equals 650,000 CPU-years. That is hundreds of thousands of years longer than the existence of modern Homo sapiens!
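
To make those conversions concrete, here is a minimal back-of-the-envelope sketch (my own illustrative Python, not anything from the INCITE program; the 8,766-hour average year is my rounding assumption):

    # Rough conversions for the allocation numbers quoted above.
    HOURS_PER_YEAR = 24 * 365.25            # about 8,766 hours in an average year

    typical_project_hours = 100e6           # a typical project: 100 million processor-hours
    total_allocation_hours = 5.7e9          # total 2013 allocation on Titan and Mira

    print(typical_project_hours / HOURS_PER_YEAR)   # roughly 11,000 years
    print(total_allocation_hours / HOURS_PER_YEAR)  # roughly 650,000 CPU-years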

We're the solution and the challenge

There's a catch. All this speed depends on making the predictions with parallel computing - breaking the problem into many threads of calculation running at the same time. The good news is that ChEs' ability to use almost any computer application has never been better. What about the ability to develop applications? That's tougher.

Computer programming was once a base skill in the ChE curriculum, but it has fallen out of favor over the past 25 years. The main reason it had been emphasized was that we thought we would have to do it ourselves. Instead, we now see that's seldom the case. Spreadsheets take care of many simple calculations, and it seems like there's an app for everything else. What was lost wasn't the chance for everyone to become a coding whiz but a grounding in simple computer-science concepts. A = A + 1 isn't an unsolvable algebraic problem; it's an instruction to modify a number and store it. Repeating a calculation can be handled by an iterative loop instead of tedious manual repetition. More subtly, the traditional solution methods we learn in ChE are rooted in graphical or stepwise calculations. The shift toward massively parallel codes changes things, demanding that we recast our algorithms to take advantage of parallelism. For most ChEs, that calls for new understanding of computer architectures and computer science. The application codes may well be written by software engineers, but the algorithms must come from us.
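
As a small illustration of those concepts (a minimal Python sketch of my own, not code from any of the applications mentioned here), the same summation is written below first as a serial iterative loop and then recast so the work is split across parallel processes:

    from multiprocessing import Pool

    def serial_sum(n):
        # A = A + x repeated in an iterative loop: the machine does the tedious repetition
        A = 0
        for x in range(n):
            A = A + x
        return A

    def partial_sum(bounds):
        # Sum one independent chunk of the range
        lo, hi = bounds
        return sum(range(lo, hi))

    def parallel_sum(n, workers=4):
        # Recast the algorithm: split the range into independent chunks,
        # sum each chunk on a separate core, then combine the partial results
        step = n // workers
        chunks = [(i * step, n if i == workers - 1 else (i + 1) * step)
                  for i in range(workers)]
        with Pool(workers) as pool:
            return sum(pool.map(partial_sum, chunks))

    if __name__ == "__main__":
        n = 10_000_000
        print(serial_sum(n), parallel_sum(n))

The toy problem doesn't matter; what matters is that deciding how to decompose the work into independent pieces, and how to recombine the partial results, is the algorithmic step that has to come from the engineer who understands the problem.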

But what about real data?

For these simulations to be most useful, the data they produce must be tested against experimental data, complete with quantified experimental uncertainties. In the next post, I'll take up the "Internet of Things," Big Data, analytics, and how that all ties into "Smart Manufacturing."

Comments

I was at the AIChE Sustainability Gala last night (http://www.aiche.org/resources/conferences/events/gala/2012-11-28). High-performance computing wasn't mentioned, and yet it was implicit in so many topics, like trying to resolve the question and sources of climate change. More obvious was ubiquitous computing: PowerPoint-generated slides and computer-generated videos using Ken Burns Effect stills and background vocals; smartphones popping out to settle questions, check on the best subway, send contact information.