Dr Qin Jim Chen explains how hurricane modelling enables the prediction of natural phenomena that could endanger lives
Louisiana State University’s hurricane modelling team is on the frontline of protecting communities from monumental natural disasters using big data. Dr Qin Jim Chen explains how this enables them to predict natural phenomena that could endanger lives.
Storm surges are the result of water being pushed toward the shoreline by winds from cyclonic storms. The size of the storm surge depends on storm intensity, forward speed, size, direction of approach toward the coast, central pressure, the properties of the coastal features, and the width and slope of the continental shelf. The combination of storm tides (water level rise due to the combination of storm surge and the astronomical tide), waves and currents can erode beaches and coastlines, severely damage buildings, marinas and boats, kill vegetation, and often results in loss of life.
It is estimated that up to 60% of people living along the Louisiana coastline currently live at or below sea level; this, coupled with natural land erosion, can see swathes of territory overcome by flood waters at an alarming rate. For the protection of our state and its citizens, it is paramount that we can better predict the effects of wind and storm surges on our communities. As such, the LSU Center for Computation & Technology (CCT) and Louisiana Sea Grant (LSG) have spearheaded the Coastal Emergency Risks Assessment (CERA) group, a coastal modelling research and development effort providing operational advisory services related to impending hurricanes in the United States. CERA is part of the Advanced Surge Guidance System (ASGS). Based on the open source Advanced Circulation and Storm Surge model (ADCIRC), coupled with the SWAN wave model, the ASGS generates real-time storm surge guidance to update emergency response groups during a tropical storm or hurricane event. Information and maps related to the official hurricane advisories issued by the National Hurricane Center are available every six hours during a storm.
ADCIRC and SWAN are both very complex, data-intensive applications, and the complexity is further compounded when we combine additional simulations with these models. We are looking at a combination of models and trying to understand how they work together, how they interact with each other, and the likely impact they will have on our communities. Wind simulations, atmospheric conditions, storm surge simulations and rainfall predictions are combined and analysed in near real time to help the federal government, the emergency services and residents prepare and respond appropriately. Beyond the immediate threats of storms, waves, surges and river flooding, we also have to prepare for the future. In the aftermath of Hurricane Katrina in 2005, it was quite apparent that the state’s complex system of levees and floodwalls, intended to keep the city of New Orleans and its residents safe, had failed. According to a report by the American Society of Civil Engineers, the defences were breached in 50 separate locations, allowing billions of gallons of water to surge into the city.
It is estimated that up to 60% of people living along the Louisiana coastline currently live at or below sea level
Since Katrina, billions of dollars have been invested in Louisiana’s new Hurricane and Storm Damage Risk Reduction System (HSDRRS). Designing this system and future flood defence systems requires drawing on hundreds of years’ worth of cyclone data on surge heights, wave fields, wind patterns and atmospheric conditions. And it requires thousands of simulations, which in turn demand reliable and significant HPC resources to complete in near real time. We need very high-fidelity models that combine multiple physics; the data density and computation involved are very demanding. We need forecast and predictive models available for use within a 1-2 hour window of a storm approaching to enable the relevant parties to prepare. A typical run of a hurricane or storm surge model produces 20MB to 100MB of output, while wave models typically produce around 200MB. Input files for each range anywhere from 200MB to 500MB. The size of the input and output files is correlated with the resolution of the simulation: the higher the resolution, the larger the files. On average, each simulation run requires about 2GB of space, not including scratch space, which is much larger. Each individual output might not be huge in absolute terms, but ensemble runs comprising tens if not hundreds of simulations are needed continuously for emergency preparedness, and hundreds more for engineering design, which increases the data management complexity. Both models (ADCIRC and SWAN) are computationally intensive and I/O-bound.
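Using the per-run figure quoted above (about 2GB per simulation, excluding scratch), the storage footprint of an ensemble can be sketched in a few lines of Python. The ensemble sizes of 50 and 500 runs below are illustrative assumptions, not figures from the article:

```python
# Back-of-envelope storage estimate for ensembles of surge/wave runs.
# PER_RUN_GB comes from the text (~2 GB per run, excluding scratch space);
# the ensemble sizes are assumptions chosen for illustration.

PER_RUN_GB = 2.0

def ensemble_storage_gb(n_runs: int, per_run_gb: float = PER_RUN_GB) -> float:
    """Total storage for an ensemble, excluding scratch space."""
    return n_runs * per_run_gb

# Tens of runs for emergency preparedness, hundreds for engineering design:
print(f"forecast ensemble (50 runs): ~{ensemble_storage_gb(50):.0f} GB")
print(f"design study (500 runs):     ~{ensemble_storage_gb(500):.0f} GB")
```

Even at these modest per-run sizes, a continuous stream of ensembles quickly reaches terabytes, which is where the data management complexity mentioned above comes from.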
[caption id="attachment_53481" align="alignnone" width="450"] Louisiana’s new Hurricane and Storm Damage Risk Reduction System (HSDRRS) cost $14bn.[/caption]
As with all HPC infrastructures, compute can only run at optimum speed if it is coupled with proven high-performance storage technology. Avoiding workload bottlenecks between storage and compute is a key consideration for the typical modelling applications we run. Because the weather simulations tend to be I/O-bound, the runtimes/wall-clock times of these codes are extremely sensitive to I/O bottlenecks. To ensure timely execution and predictable turnaround times for our codes, we rely on high-performance parallel file systems based on Lustre. The effective setup of the SuperMike-II compute and storage ecosystems enables scientists like myself to focus on the core hydrodynamics involved rather than worry about data management across the ecosystem. Our high-performance storage system is based on DDN’s 12K platform, which provisions extreme scalability of the Lustre file system with extreme performance delivered by its Storage Fusion Architecture. Coupled with the compute ecosystem, it enables us to execute highly intricate and time-sensitive workflows in the shortest time possible, which in turn enables delivery of finished products to key stakeholders (researchers, state DHS and emergency preparedness officials).
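To see why I/O-bound codes are so sensitive to storage performance, consider a simple additive wall-clock model: total time is compute time plus data volume divided by effective storage bandwidth. All of the numbers below are illustrative assumptions, not measurements from SuperMike-II or the DDN system:

```python
# Illustrative (not measured) wall-clock model for an I/O-bound simulation:
# total time = compute time + data volume / effective storage bandwidth.
# Every number here is an assumption chosen to show the sensitivity.

def wall_clock_s(compute_s: float, io_gb: float, bandwidth_gb_s: float) -> float:
    """Simple additive model: compute plus serialized I/O time."""
    return compute_s + io_gb / bandwidth_gb_s

compute_s = 3000.0  # 50 minutes of pure computation (assumed)
io_gb = 500.0       # checkpoint and output traffic over the run (assumed)

slow = wall_clock_s(compute_s, io_gb, 0.5)   # ~0.5 GB/s single-server storage
fast = wall_clock_s(compute_s, io_gb, 20.0)  # ~20 GB/s striped parallel storage

print(f"slow storage: {slow / 60:.0f} min, fast storage: {fast / 60:.0f} min")
```

Under these assumed numbers, the same computation takes roughly 67 minutes on slow storage and about 50 on fast storage; for workflows that must deliver guidance within a 1-2 hour window, that difference is decisive.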
We need forecast and predictive models available for use within a 1-2 hour window of a storm approaching to enable the relevant parties to prepare.
A central storage resource delivers benefits beyond our immediate hurricane modelling team. Ensuring the entire region can prepare for storm surge effectively is a responsibility that extends beyond LSU. The LSU CERA group works very closely with a number of universities in the US, including the University of Notre Dame, the University of North Carolina at Chapel Hill and the University of Texas at Austin. We have to share a lot of data, and high-performance storage facilitates effective data sharing both with those organisations outside LSU and interdepartmentally within the university. LSU has in excess of 200 researchers looking at the interactions of physical, ecological, economic and social systems along the coast. Many of them want access to our HPC system and to the huge volumes of data generated from field observations and numerical simulations, and they want to be able to run their own models and queries against existing data sets. Not all of our researchers are HPC experts or understand how to run HPC workflows, so we needed to lower the barrier to entry. We developed the SIMULOCEAN gateway, a web-based portal that enables students to access our HPC resources: users prepare their input files on their desktops, upload them to the HPC system, run their models, then download and visualise the results. Running multiple workloads concurrently requires highly available compute and storage.
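The gateway workflow described above (prepare inputs locally, upload, run, then download and visualise) can be sketched as a sequence of job-state transitions. This is a hypothetical illustration only: SIMULOCEAN's actual API is not described in this article, and every name below is invented for the sketch:

```python
# Hypothetical sketch of a portal-style HPC workflow. None of these names
# come from SIMULOCEAN; the real gateway handles uploads, queueing and
# downloads over the web, which are modelled here as in-memory stand-ins.

from dataclasses import dataclass, field

@dataclass
class Job:
    model: str                                   # e.g. "ADCIRC" or "SWAN"
    inputs: dict = field(default_factory=dict)   # staged input files
    results: dict = field(default_factory=dict)  # produced output files
    state: str = "prepared"

def upload(job: Job, files: dict) -> Job:
    """Stage input files with the gateway (stand-in for an HTTP upload)."""
    job.inputs.update(files)
    job.state = "uploaded"
    return job

def run(job: Job) -> Job:
    """Submit to the HPC queue; here we fake an instantly completed run."""
    job.results = {name: f"output-for-{name}" for name in job.inputs}
    job.state = "done"
    return job

def download(job: Job) -> dict:
    """Fetch results for local visualisation."""
    assert job.state == "done", "job has not finished"
    return job.results

job = run(upload(Job("ADCIRC"), {"mesh.grd": "...", "wind.22": "..."}))
print(sorted(download(job)))  # ['mesh.grd', 'wind.22']
```

The point of this shape is that a researcher only ever touches the three verbs, while the gateway hides queue submission, file staging and storage layout behind them.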
[caption id="attachment_53484" align="alignnone" width="450"] The damage caused by Hurricane Katrina was estimated at $108bn.[/caption]
Climate is not a static phenomenon: it is constantly changing, and most evidence and commentators suggest that, with climate change, it is going to become even more unpredictable. According to a National Climate Assessment Report released by The White House in 2014, the cost of coastal disasters in Louisiana and other Gulf State coastal parishes and counties averages about $14 billion a year for losses from hurricane winds, land subsidence and sea level rise. Another report, from Entergy Corporation and the America’s Wetland Foundation in 2010, suggests that by 2030, just 14 years away, those costs could be between $18 billion and $23 billion, and that half of that increase is related to climate change. Climate change for Louisiana is a threat multiplier: more intense hurricanes, heat waves, floods, droughts and extreme weather. The State of Louisiana is investing $1 billion each year on coastal restoration and protection for the next 50 years. The services of our research teams at LSU will be called upon more and more going forward, and the demands for more simulations and higher fidelity models will continue to increase.
The internal HPC resource dedicated to furthering our understanding and handling of extreme and regular weather events is called SuperMike-II. Named after LSU’s original large Linux cluster, SuperMike, launched in 2002, it is 10 times faster than its immediate predecessor, Tezpur. SuperMike-II is a 440-node compute cluster with a peak performance of 146 TFlops, running the Red Hat Enterprise Linux 6 operating system. Each node contains two eight-core Sandy Bridge Xeon 64-bit processors operating at a core frequency of 2.6 GHz. Fifty of the compute nodes also have two NVIDIA M2090 GPUs, which provide an additional 66 TFlops of total peak performance.
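As a sanity check, the quoted peak figures follow from the node counts and clock speed. The per-cycle and per-GPU throughputs below are assumptions not stated in the article: Sandy Bridge cores retire 8 double-precision FLOPs per cycle with AVX, and each NVIDIA M2090 peaks at roughly 665 GFlops in double precision:

```python
# Sanity check on SuperMike-II's quoted peak performance.
# Assumptions (not in the article): 8 DP FLOPs/cycle/core for Sandy Bridge
# AVX, and ~0.665 TFlops DP peak per NVIDIA M2090.

nodes, sockets, cores_per_socket, ghz, flops_per_cycle = 440, 2, 8, 2.6, 8
cpu_peak_tflops = nodes * sockets * cores_per_socket * ghz * flops_per_cycle / 1000

gpu_nodes, gpus_per_node, tflops_per_gpu = 50, 2, 0.665
gpu_peak_tflops = gpu_nodes * gpus_per_node * tflops_per_gpu

print(f"CPU peak: {cpu_peak_tflops:.1f} TFlops")  # ~146.4, matching 146 TFlops
print(f"GPU peak: {gpu_peak_tflops:.1f} TFlops")  # ~66.5, matching 66 TFlops
```

Both results line up with the figures quoted for the cluster, which gives some confidence that the published specifications are internally consistent.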
Author: Dr. Qin Jim Chen is CSRS Distinguished Professor in Coastal Engineering and Professor of Civil and Environmental Engineering at Louisiana State University (LSU).