If a person standing outdoors is asked to report how many clouds passed overhead in the last hour, they could only provide a rough estimate of the cloud coverage at that particular location. The count would not be fully accurate because of subjective human judgment (is this a cloud?), nor would it be representative of the coverage in the larger surrounding area.
This analogy highlights two challenges that atmospheric sensors face: location bias and sensitivity bias. The type of sensor used to study clouds (such as radar or lidar), the location of the instrument, and its mode of operation (for example, pointing direction) all influence cloud measurements. For instance, a profiling radar that samples only the atmospheric column passing over its location will give a very different answer from a scanning radar that samples a larger area. At the same time, the distance of the clouds from the radar determines whether they are detected at all, because radar sensitivity decreases with distance.
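The range dependence of sensitivity can be made concrete with a short sketch. Because a radar's returned power falls off with the square of distance, the minimum reflectivity it can detect (on the logarithmic dBZ scale) rises by 20 log10(r/r0); the reference sensitivity used below is an illustrative assumption, not the specification of any particular radar.

```python
import math

def min_detectable_dbz(range_km, ref_dbz=-40.0, ref_km=1.0):
    """Minimum detectable reflectivity (dBZ) at a given range.

    Sensitivity degrades with the square of distance, which on the
    logarithmic dBZ scale adds 20*log10(r/r0) to the detection
    threshold. ref_dbz (the sensitivity at ref_km) is an assumed,
    illustrative value.
    """
    return ref_dbz + 20.0 * math.log10(range_km / ref_km)

# A thin cloud of, say, -25 dBZ clears the threshold at 5 km but
# falls below it at 10 km for this hypothetical radar:
for r in (1, 5, 10, 20):
    print(f"{r:>3} km: threshold {min_detectable_dbz(r):6.1f} dBZ")
```

The same cloud can therefore appear or vanish from the record depending only on how far it is from the instrument, which is exactly the sensitivity bias a simulator must reproduce.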
To evaluate whether observational systems are capturing a skewed or accurate picture of the atmosphere, researchers began developing instrument simulators that replicate the technical characteristics of the sensors (such as sensitivity and resolution), their interaction with clouds (for example, scattering and absorption), and their sampling approach and placement in space.
Understanding cloud processes
Clouds reflect sunlight back to space, trap thermal radiation emitted by the Earth’s surface, produce precipitation, and transport moisture, so they have a substantial impact on climate and weather. But the behavior of clouds is still not fully understood. Research conducted through the U.S. Department of Energy’s (DOE) Atmospheric System Research (ASR) program and at DOE’s Atmospheric Radiation Measurement (ARM) Climate Research Facility concentrates on improving knowledge of major cloud processes and interactions that affect Earth’s energy balance (incoming sunlight and outgoing thermal radiation) and water cycle.
Merging observation and modeling efforts is essential to advancing this understanding, ultimately leading to better climate model predictions and weather forecasts. However, combining observations and models is not straightforward because of sensitivity and sampling biases, among other issues.
“Comparing synthetic instrument observations to the real observations helps us determine if we are effectively sampling critical processes such as the convective motions of deep precipitating clouds,” said Pavlos Kollias, who leads DOE’s ASR Radar Science Group. Kollias is also an Atmospheric Scientist in the Environmental and Climate Sciences Department at DOE’s Brookhaven National Laboratory, a Professor at Stony Brook University’s School of Marine and Atmospheric Sciences, and an Adjunct Professor in McGill University’s Department of Atmospheric and Oceanic Sciences.
Simulating radar observations
A few years ago, Kollias and his Research Assistant Aleksandra Tatarevic at McGill University developed the preliminary version of the Cloud Resolving Model Radar SIMulator (CR-SIM), which produces a virtual (synthetic) view of what a radar would see if it were observing the clouds generated by a cloud-resolving atmospheric model.
“Though models can simulate clouds with very high resolution, the problem is how to validate the simulation—how well does it represent reality?” said Postdoctoral Research Associate Mariko Oue of Stony Brook University’s School of Marine and Atmospheric Sciences, who recently updated CR-SIM to be compatible with more sensors, including lidars, which probe the atmosphere with lasers.
The CR-SIM software creates virtual observations—that account for all sensor limitations—from virtual clouds, enabling us to fairly compare real observations with model output. In addition, we can use CR-SIM to investigate the optimum setup of our atmospheric observatories in terms of the type, number, and configuration of sensors needed to get a true picture of the atmosphere.
Mariko Oue, Postdoctoral Research Associate, School of Marine and Atmospheric Sciences, Stony Brook University
To produce the virtual observations, CR-SIM uses atmospheric variables — such as cloud water content, humidity, temperature, and wind speed — simulated by large-eddy simulations (LES) or high-resolution cloud-resolving models (CRMs). These models divide the atmosphere into a 3D computational grid, with each grid cell holding numerical values of the atmospheric conditions at a particular moment in time. Computer algorithms then solve the governing physics equations describing how these values evolve over time. If the grid cells are large (coarse spatial resolution), these equations have to be simplified, or parameterized, to approximate the effects of cloud-relevant processes that cannot be explicitly resolved.
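As a rough illustration of the forward-simulation idea (CR-SIM itself performs full scattering and attenuation calculations; the power-law coefficients and detection threshold below are assumed, illustrative values), a toy operator can map a model grid of cloud water content to synthetic reflectivity and mask what the virtual radar could not detect:

```python
import numpy as np

def synthetic_reflectivity(lwc, min_dbz=-35.0):
    """Toy forward operator: 3D liquid water content (g/m^3) -> dBZ.

    Uses an assumed Z = a * LWC^b power law (a=0.048, b=2 here are
    illustrative, not CR-SIM's actual microphysics), then masks cells
    below the radar's detection threshold so the "virtual observation"
    carries the sensor's limits.
    """
    z_linear = 0.048 * np.power(np.clip(lwc, 1e-6, None), 2.0)
    dbz = 10.0 * np.log10(z_linear)
    return np.where(dbz >= min_dbz, dbz, np.nan)  # NaN = not detected

grid = np.zeros((4, 4, 4))       # tiny stand-in for an LES grid
grid[1:3, 1:3, 1:3] = 0.5        # a small "cloud" of 0.5 g/m^3
obs = synthetic_reflectivity(grid)
print(np.isfinite(obs).sum(), "of", obs.size, "cells detected")
```

Only the cloudy cells survive the masking step, so the synthetic field can be compared cell for cell against what a real radar with the same threshold would have recorded.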
Global climate models, whose grid cells can measure up to hundreds of kilometers on a side, can resolve large-scale atmospheric phenomena driven by circulating air, such as jet streams and trade winds. CRMs, whose grid cells are approximately 10 km, can model thunderstorms and other regional-scale dynamics. With cells as small as 20 m, LES can capture fine-scale atmospheric motions, including cloud updrafts and downdrafts.
The high resolution of LES makes the simulations extremely computationally intensive, so researchers have usually run them only for idealized conditions. But building a large database of realistic simulations would provide the statistics required to improve the accuracy of the parameterizations.
“We are trying to run more realistic LES because they provide detailed information about how clouds form, grow, and produce precipitation,” explained Andrew Vogelmann, an Atmospheric Scientist in Brookhaven’s Environmental and Climate Sciences Department and Technical Co-Manager of the Cloud Processes Group. Vogelmann is also a Co-Principal Investigator for LASSO (LES ARM Symbiotic Simulation and Observation), one of four projects in which researchers from Brookhaven Lab and other DOE national labs are using CR-SIM to assess model performance and understand observational capabilities.
Improving climate models
LASSO’s goal is to enable routine LES modeling and provide a statistical library of data bundles that combine the simulations with measurements collected at ARM’s fixed atmospheric observatories. These capabilities will help scientists improve the reliability of parameterizations in climate models such as DOE’s Accelerated Climate Modeling for Energy (ACME). Powered by next-generation supercomputers, ACME will provide an ultrahigh-resolution modeling capability for predicting the future global climate.
The other three projects supported by CR-SIM are all Climate Model Development and Validation (CMDV) efforts, which aim to improve the representation of parameterizations in ACME across a wide range of model resolutions. Specifically, the projects seek to enhance deep convective and shallow cloud representations in regional and global simulations.
The approach to advancing ACME is to develop an understanding of cloud properties and processes through observations and high-resolution modeling and use that understanding to improve the parameterizations that go into the model.
Andrew Vogelmann, an Atmospheric Scientist in Brookhaven’s Environmental and Climate Sciences Department and Technical Co-Manager of the Cloud Processes Group
However, CR-SIM was not initially designed to handle the high computational load involved in routine simulations at high resolution. Consequently, it ran too slowly for practical application to CRM and LES output.
For example, the LASSO project currently has 192 LES simulations and will add another 544 in the next data release. “There are a lot of computational grid points for each LES variable, and performing computations at every single one of these points would cause a computational bottleneck,” explained Vogelmann. “It could take 18 hours to compute instrument-equivalent output from the LES variables for one case—about the same amount of time as the LES simulation itself.”
Overcoming the computational bottleneck
According to Nicholas D’Imperio, Chair of Brookhaven Lab’s Computational Science Laboratory, the CR-SIM code ran slowly because it was input/output (I/O) bound: “The time it took to complete a computation was largely determined by the time spent receiving and sending data. Input and output are very slow processes in a computer—up to 1000 times slower than processing or memory access.”
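The effect D’Imperio describes can be reproduced with a toy comparison (this is not CR-SIM’s code): summing the same binary file of values with one small disk read per value versus a single bulk read into memory.

```python
import os
import struct
import tempfile
import time

# Write a small binary file of doubles to stand in for model output.
n = 200_000
path = os.path.join(tempfile.mkdtemp(), "grid.bin")
with open(path, "wb") as f:
    f.write(struct.pack(f"{n}d", *range(n)))

# I/O-bound pattern: one small read per value.
t0 = time.perf_counter()
total = 0.0
with open(path, "rb") as f:
    for _ in range(n):
        total += struct.unpack("d", f.read(8))[0]
io_time = time.perf_counter() - t0

# In-memory pattern: load the data once, then compute.
t0 = time.perf_counter()
with open(path, "rb") as f:
    data = struct.unpack(f"{n}d", f.read())
total2 = sum(data)
mem_time = time.perf_counter() - t0

assert total == total2  # same answer either way
print(f"per-value reads: {io_time:.3f}s  bulk read: {mem_time:.3f}s")
```

Both loops produce the same sum; only where the data live during the computation changes, which is the essence of the rewrite described below.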
To eliminate this holdup, Kwangmin Yu, an Advanced Technology Engineer in the Computational Science Laboratory, rewrote the I/O-bound portion of the code so that the data are held in memory.
With this rewrite, Yu boosted the code’s speed by 57 times. Using a parallel programming paradigm known as OpenMP (Open Multi-Processing), he accelerated the code by an additional factor of three, for a total speedup of 168 times over the original.
“A run that previously took 18 hours now completes in 18 minutes without parallelization and 6 minutes with parallelization,” said Yu, who is also working on automating CR-SIM’s settings to make the software more easily portable to a wide range of computers.
Extending CR-SIM worldwide
Following the success achieved by the Computational Science Laboratory, Kollias expects the CR-SIM user community to grow considerably. Colleagues at the Max Planck Institute for Meteorology (MPI-M) in Germany have already expressed interest in using the software with ICON, a modeling system for weather prediction and climate research co-developed by MPI-M and the German Weather Service. Other interested countries include China, Brazil, and South Korea.
In contrast to most other simulators, CR-SIM is actively maintained and free to download.
Our philosophy is to widely distribute this simulator package to the research community, providing not only software documentation and regular updates but also support personnel who can help users set up the software and continue to interact with them. Brookhaven Lab has both the expertise and coding center resources for nurturing such a software package.
Pavlos Kollias, Leader, DOE’s ASR Radar Science Group
Looking ahead, Kollias’ team is considering ways to make CR-SIM more user-friendly and interactive, such as through graphical user interfaces and other visualization tools. There is also the possibility of further increasing the simulator code’s speed to support real-time computations within the models.
In the meantime, CR-SIM will begin supporting activities of Brookhaven’s new Center for Multiscale Applied Sensing, which Kollias directs. At this center, researchers will build systems for observing and predicting weather and environmental conditions around energy hot spots, including coastal and urban locations and renewable energy facilities.
The ARM Climate Research Facility is a DOE Office of Science User Facility.