College of Engineering News • Iowa State University

When it rains, it pours: HPC@ISU powers advanced agronomy research

This story, written by Ken Strandberg, was originally published in Scientific Computing

The American Midwest has recently seen significant precipitation and two major floods, in 1993 and 2008, driven by extraordinary rainfall across the Great Plains. What is causing this dramatic change in weather patterns? Is it the warming planet? Are the crops themselves influencing the dramatic weather changes of the last couple of decades? These are questions researchers and scientists in Agronomy at Iowa State University are asking, and high-performance computing clusters at ISU are being used to help discover answers.

“Persistent spring and early summer rains have been the hallmark of our weather over the last 10 to 12 years in Iowa,” says Gene Takle, holder of the Pioneer Hi-Bred Professorship in Agronomy at Iowa State University. “The rains are leaving the soybeans in standing water, and they’re drowning crops. The rains are also delaying when we can get crops into the ground, and they’re preventing some plantings completely. We had over 700,000 unplanted acres in Iowa alone in 2013 because of the rains.” Dr. Takle and his team are using ISU’s HPC systems to understand what these rains can be attributed to and whether they will persist into the future.

First human impacts on Midwest land use

Before the American migration into the Great Plains and the development of crops, the northern Midwest — Iowa, the Dakotas, Minnesota and Illinois — was pocked with depressions left behind by glaciers from the last ice age. These depressions held water, making the land too wet for cropping. “With the American migration and development of farming across Iowa, farmers have laid many miles of drain tiles under nine million acres across the state,” comments Takle. “We essentially dried out the soil. Now with the heavier rains, we’re laying even more miles of drainage.” The opposite happened in the drier regions of Nebraska, Kansas and eastern Colorado, where pivot irrigation has turned these more arid lands into humid environments. “Have these land use practices across the Midwest influenced the weather changes we’re seeing? That’s also one of the things we’re trying to figure out,” states Takle.

“Corn plants use an enormous amount of water, and they give that water back to the atmosphere,” says Ray Arritt, Professor of Agronomy. “Being in the middle of an Iowa corn field in summer feels like standing in a tropical jungle. So, our idea is that, because we’ve changed the kinds of crops from, say, wheat and oats, to corn and soybeans and other plants that use a lot of water, the increase in the water going back into the atmosphere could actually have an effect on climate.”

Iowans are used to a quarter-inch of rain in a day. But three to five inches in a day cause significant soil erosion, affecting the livelihoods of many people who work in agriculture. And then there are the effects of nutrient runoff. “There’s a very large area in the Gulf of Mexico, called the hypoxic zone, where huge algae blooms occur, caused by nutrient runoff, such as phosphorus and nitrogen from fertilizers,” explains Arritt. “The algae dies and, as it decays, it takes up a lot of oxygen, depleting it for other life. This dead zone has been a big environmental problem for some time. If we can understand what is causing the heavy downpours, and whether they’re likely to persist into the future, we can also say something about how it affects the hypoxic zone.”

Dealing with a very big problem

Their research into climatic changes across the Midwest involves very large and sophisticated four-dimensional problems. “Weather systems are very complex,” comments Takle. “There are many physical processes going on, a lot of heat exchange, turbulence, a lot of motion, and so on. So, it becomes a very complicated process to model and simulate them.”

While the weather issue at hand is in the central U.S., Midwest rains are influenced by the Rocky Mountains, the Great Plains themselves, and the Gulf of Mexico. So, the climate domain is massive. It extends east-west beyond both coastlines of the North American continent, north-south from the Arctic to southern Mexico, and vertically from a few meters into the soil to about eight miles into the atmosphere. “We map out a 3-D space across that domain with horizontal grid points 10 to 50 kilometers apart,” states Takle. “We slice the altitudes into 30 sections. At the surface of the earth, we’re presented with some special problems, because we grow crops there and we have to account for rainfall, where it goes, the exchange of moisture, heat, and so on between the earth and atmosphere.”
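To get a feel for the scale of that grid, here is a rough, back-of-the-envelope sketch in Python. The east-west and north-south extents below are illustrative assumptions, not figures from the ISU team; only the 10 to 50 kilometer spacings and the 30 vertical levels come from the article.

```python
# Rough estimate of how many grid cells a continental-scale climate
# domain contains. The domain extents are illustrative assumptions;
# the grid spacings and 30 vertical levels are the values quoted above.

def grid_cells(ew_km, ns_km, spacing_km, levels):
    """Approximate number of 3-D cells for a regular grid."""
    nx = ew_km // spacing_km      # east-west grid points
    ny = ns_km // spacing_km      # north-south grid points
    return nx * ny * levels

EW_KM, NS_KM, LEVELS = 7000, 6000, 30   # assumed extents, 30 vertical levels

for spacing in (50, 20, 10):            # km between horizontal grid points
    cells = grid_cells(EW_KM, NS_KM, spacing, LEVELS)
    print(f"{spacing:>2} km spacing: about {cells:,} cells")
```

Under those assumed extents, even the coarsest spacing yields roughly half a million cells, and refining to 10 kilometers pushes the count past ten million, before any physics is computed at each one.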

Clouds are the biggest challenge for simulation. They come in all shapes and sizes, with differing cloud water amounts and droplet sizes. “That’s bad enough in itself,” adds Takle, “but the real computational challenges reveal themselves when sunshine impinges along the tops of clouds or obliquely through clouds.” Cloud water droplet sizes and densities affect the absorption of solar radiation, the scattering and bending of light rays as they pass through the cloud, and how much sunlight is transmitted or reflected. “That’s energy we have to account for minute by minute. Our simulations have to include the energy that comes from the sun, how it’s processed by these clouds and, ultimately, how much of it gets down to the surface of the earth. It’s an enormous source of computational expense.”

Simulations step forward at short time intervals whose length depends on the grid resolution, from about three minutes for 20-kilometer grid cells to as short as about one minute for 10-kilometer cells. “Just increasing our resolution by two across all four dimensions means an increase in the computational burden by a factor of 16,” comments Takle. “We’ve done some focused studies on short periods to optimize our processes, but our ultimate goal is to simulate multiple 100-year periods of climate and agricultural change to understand the factors involved. With large HPC clusters, it becomes feasible to complete these long simulations.”
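Takle’s factor of 16 is straightforward dimensional bookkeeping: doubling the resolution in each of the two horizontal directions, the vertical, and time doubles the work four times over. The sketch below reproduces that arithmetic and, using the time-step figures quoted above as a rough guide, shows how many steps a century-long run implies.

```python
# Why finer resolution is so expensive: each refined dimension
# multiplies the work. Doubling resolution in x, y, z and time
# gives 2**4 = 16, the factor Takle cites.

def cost_multiplier(refinement, dims=4):
    """Work multiplier when every dimension is refined by the same factor."""
    return refinement ** dims

print("Doubling all four dimensions:", cost_multiplier(2), "x the work")

# Rough step counts for a 100-year simulation, using the intervals
# quoted above (about 3 minutes at 20 km, about 1 minute at 10 km).
MINUTES_PER_CENTURY = 60 * 24 * 365 * 100
for spacing_km, dt_minutes in {20: 3, 10: 1}.items():
    steps = MINUTES_PER_CENTURY // dt_minutes
    print(f"{spacing_km} km grid: roughly {steps:,} time steps per simulated century")
```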

Running climate simulations with the Weather Research and Forecasting model

Professor Arritt and Taleena Sines, a Ph.D. student in Agricultural Meteorology at ISU, are running some of the simulations on one of the ISU HPC clusters, called CyEnce. Sines runs the climate simulations using the Weather Research and Forecasting model, or WRF. “It’s a widely used standard toolset — actually a cousin of the model the National Weather Service uses for day-to-day forecasting — designed to be run on HPC clusters,” says Sines. WRF has two main solvers; Sines and Arritt use the Advanced Research WRF (ARW), developed by the Mesoscale and Microscale Meteorology laboratory at NCAR. “It has a lot of options to do cloud physics. We are interested in rainfall, so we use some of the 17 different options. Someone interested in hurricanes would use different modules in the ARW,” states Sines.

According to Professor Arritt, their simulations depend on two interacting climate domains. One includes the large continental domain. The other is more focused on the Midwest. But the crop contributions are also significant. “We need to understand what the land use has been across the study area, the crops grown, and how they have changed from decade to decade,” says Arritt.  “One of our collaborators has gone back through all the agricultural records for every county across the Midwest to the 1940s.” This compilation becomes part of the data that feeds ARW.

ARW is mature and portable across multiple platforms, so running it on CyEnce at ISU did not require special optimizations, according to Sines. The vast majority of her work with the model has been selecting and then testing the combination of physics options that best addresses their particular problems. “We started out on a departmental cluster with 156 processors,” comments Sines. “We could do about one year of simulation in 12 clock days. Then we moved to the CyEnce cluster, and we have the ability to use over 400 processors. The same one-year simulation completes in three clock days.” That gives Arritt and Sines a 4X speedup with only 2.56X more processors. “Once we get our processes fine-tuned, we can launch simulations for longer time periods.”
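Those timings can be checked with simple arithmetic; the short sketch below just reproduces the figures Sines quotes.

```python
# Reproducing the speedup figures quoted above.
old_procs, old_days = 156, 12    # departmental cluster: one simulated year
new_procs, new_days = 400, 3     # CyEnce: the same simulated year

speedup  = old_days / new_days          # 4.0x faster in wall-clock time
procs    = new_procs / old_procs        # ~2.56x more processors
per_proc = speedup / procs              # >1 means better than linear scaling

print(f"{speedup:.1f}x speedup with {procs:.2f}x the processors "
      f"(a {per_proc:.2f}x per-processor gain)")
```

A per-processor gain above 1 typically reflects faster individual nodes, more memory bandwidth, or a quicker interconnect rather than parallel scaling alone, which is consistent with moving from an older departmental machine to a newer cluster.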

“Eventually, they will run many, many 100-year simulations with different scenarios,” says Takle, “to take into account the changes over the years and the potential changes in the future. Only with large HPC machines, like the new Condo cluster, can we begin to attempt such comprehensive simulations in a reasonable time period.”

Condo, shortened from Condominium, is a recent purchase with 158 nodes (expandable to 324), 2,568 Intel Xeon processor cores, a Lustre parallel file system, and Intel True Scale Fabric architecture. “Over the years, we’ve grown our HPC resources through specific grants for research in a department, such as genomics,” says Arun Somani, Associate Dean for Research, College of Engineering at ISU. “Each department was responsible for their own machines and, eventually, research always outgrew the machines. They were never big enough.”

With CyEnce (a National Science Foundation-funded project through Major Research Instrumentation and CISE Research Infrastructure grants) and Condo, Somani and his colleagues took a different approach. The HPC@ISU team showed the University the value of treating HPC the same way it treats buildings: make it part of the overall infrastructure. “Everyone wanted a big machine, but nobody could afford to have one. We polled the community, and realized that people would want to share their cycles.” That was the beginning of a change in how ISU does HPC. Somani structured a purchase in which University facilities cover space, power, cooling and management costs. Researchers in different departments could then buy the number of machines they needed to populate the cluster without having to take care of the system, much as condominium properties are managed today. Under the shared model, more cycles can become available to a group at times than its own purchase would provide. “Today, it is understood around campus that if you want to do big HPC, this is the way to do it,” he said.

The beginning of insights

Back in the lab, Taleena Sines is excited. “We’ve just finished the first year of simulations. We’ll have the first 10 years done in about a month.”

“Climate varies from year to year,” says Arritt, “so, to say anything intelligent about climate at all you need a good number of years — at least 30. Soon, we’ll be able to start looking at real results, and we can start seeing what the trends are. And that’s very exciting after an enormous amount of work and all the meetings to get the input data.”

“We expect to refresh our HPC technologies every couple years,” says Somani. “We’ll continue taking the approach of shared resources, so we can give any researcher the compute resources she needs. That way, more research will get done at ISU, leading to more discoveries and insight.”

Ken Strandberg is a technical storyteller covering technology areas that include software, HPC, industrial technologies, design automation, networking, medical technologies, semiconductors and telecom.
