NWC REU 2014

May 21 - July 30

Projects

Rebuilding Decisions in Central Oklahoma

Nadajalah Bennett — UT Arlington
Mentors: Dr. Mark Shafer, Alek Krautman, and Putnam Reiter

A large tornado struck the central Oklahoma communities of Newcastle, Oklahoma City, and Moore on May 20th, 2013. In June 2013, a door-to-door survey of homeowners in Moore and Oklahoma City was conducted to understand how residents may have incorporated mitigation techniques and emergency preparedness options since the tornado. The survey was broken into four categories: damage done to homes, factors that led homeowners to implement mitigation strategies, costs of implementing mitigation applications, and emergency preparedness strategies homeowners use for severe weather. Most homeowners had installed, or were considering installing, a storm shelter in their home to gain a better sense of safety. Many homeowners were unaware of other mitigation techniques that could help protect their homes from wind damage and other severe weather. A common reason homeowners did not implement mitigation strategies was the additional cost: most did not have a personal budget for such out-of-pocket expenses.

Full Manuscript

Basis For This Study

  • Investments in mitigation applications can greatly reduce or eliminate costs of recovery after a storm.
  • Government and city programs are offered to communities to help them become more prepared for severe weather.
  • Moore adopted twelve new building codes requiring newly built homes to better withstand severe weather.

What This Study Adds

  • Motivation for why some homeowners forgo certain mitigation options.
  • The methods homeowners use for emergency preparedness.
  • Evidence that families are becoming more proactive about safety and more aware of mitigation measures they can implement in their homes.

Forecast Sensitivity of Lake-Effect Snow to Choice of Boundary Layer Parameterization Scheme

Robert Conrick — Indiana University
Mentor: Dr. Heather Reeves

This study assesses the forecast sensitivity of lake-effect snow to various boundary layer parameterization schemes using the WRF-ARW model. Six boundary layer schemes are tested on a case study of lake-effect snow over Lake Erie in December 2009. The experiments reveal significant precipitation differences (as much as 20 mm over 6 h) between the schemes. Consideration of the heat and moisture fluxes shows that schemes producing more precipitation have higher fluxes over the lake. Forcing all schemes to use the same over-water heat and moisture fluxes brings the precipitation forecasts into closer agreement. The heat and moisture fluxes are found to be strongly dependent on the similarity-stability functions for heat, momentum, and moisture (Phi_H, Phi_M, and Phi_Q). When the over-water values of Phi_H, Phi_M, and Phi_Q are set to be the same in all schemes, precipitation forecasts are similar in all experiments, indicating that the parameterizations used to determine Phi_H, Phi_M, and Phi_Q have profound impacts on forecasts of this type of weather. Comparison of the forecast accumulated precipitation to observations shows that most schemes overpredict the precipitation. The scheme in closest agreement is the Mellor-Yamada-Nakanishi-Niino scheme.
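
To see why the similarity-stability functions matter, note that surface-layer schemes convert them into exchange coefficients for bulk flux formulas. The sketch below is a generic illustration under assumed values (the constants, exchange coefficients, and lake/air states are mine, not the paper's): a scheme whose stability functions imply a larger exchange coefficient yields larger over-water fluxes, and per the study, more precipitation.

```python
import numpy as np

# Bulk aerodynamic estimates of over-water sensible and latent heat fluxes.
# The exchange coefficients C_H and C_Q are derived from the similarity-
# stability functions (Phi_H, Phi_M, Phi_Q); different schemes use different
# functional forms, which is the sensitivity the study isolates.

RHO = 1.25      # air density (kg m^-3)
CP = 1004.0     # specific heat of dry air (J kg^-1 K^-1)
LV = 2.5e6      # latent heat of vaporization (J kg^-1)

def bulk_fluxes(u10, t_sfc, t_air, q_sfc, q_air, c_h, c_q):
    """Sensible and latent heat fluxes (W m^-2) from bulk formulas."""
    sh = RHO * CP * c_h * u10 * (t_sfc - t_air)
    lh = RHO * LV * c_q * u10 * (q_sfc - q_air)
    return sh, lh

# Unstable conditions over a warm lake (illustrative values only): a larger
# exchange coefficient directly produces larger fluxes.
for c in (1.0e-3, 1.5e-3):
    sh, lh = bulk_fluxes(u10=12.0, t_sfc=278.0, t_air=268.0,
                         q_sfc=5.0e-3, q_air=1.5e-3, c_h=c, c_q=c)
    print(f"C_H = {c:.1e}: SH = {sh:6.1f} W/m^2, LH = {lh:6.1f} W/m^2")
```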

Full Manuscript

Basis For This Study

  • Relatively little work has been done to understand the constraints and limitations of high-resolution numerical weather prediction of lake-effect snow.
  • Given that lake-effect snow cloud systems are completely contained within the boundary layer, it is reasonable to suspect some sensitivities to choice of PBL scheme may exist.

What This Study Adds

  • Six boundary layer parameterization schemes are tested on a case study of lake-effect snow to assess forecast sensitivity.
  • The experiments reveal significant precipitation (as much as 20 mm over 6 h) and heat/moisture flux differences between the schemes.
  • Forcing all schemes to use the same over-water heat and moisture fluxes causes the precipitation forecasts to be in closer agreement.
  • The heat and moisture fluxes are found to be strongly dependent on a turbulence parameterization called the similarity-stability functions.
  • When the similarity-stability functions are set constant, precipitation forecasts are similar in all schemes; these functions therefore have a significant impact on lake-effect snow forecasts.
  • Comparison of the forecast accumulated precipitation to observations shows that most schemes overpredict the precipitation; however, the closest agreement occurs with the Mellor-Yamada-Nakanishi-Niino scheme.

Verification of Earth Networks' Dangerous Thunderstorm Alerts and National Weather Service Warnings

Rebecca DiLuzio — Millersville University
Mentors: Tiffany Meyer, Dr. Kristin Calhoun, and Matthew Elliott

Earth Networks Incorporated (ENI) claims that its Dangerous Thunderstorm Alerts (DTAs) can increase lead time by an additional nine minutes over current National Weather Service (NWS) tornado warnings while maintaining a similar probability of detection (POD) and false alarm ratio (FAR). These automated, storm-based alerts combine lightning-based storm tracking with total lightning flash rate thresholds to designate regions with an increased potential for severe and hazardous weather. ENI produces alert polygons at three levels: (1) basic thunderstorm, (2) significant thunderstorm, and (3) dangerous thunderstorm. Verification statistics (POD, FAR, and lead time) were calculated for ENI's level 3 DTAs and for NWS severe thunderstorm and tornado warnings over a year of data, March 2013 through February 2014, and a more in-depth case study was done for 20 May 2013. The goal of this comparison is to evaluate how well DTAs perform relative to NWS warnings and whether their use in operational meteorology will improve warnings.
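
The comparison rests on standard contingency-table verification. As a minimal sketch (the helper names and counts are illustrative, not the study's data), POD, FAR, and mean lead time can be computed as:

```python
def verification_stats(hits, misses, false_alarms):
    """2x2 contingency statistics used to compare DTAs with NWS warnings."""
    pod = hits / (hits + misses) if (hits + misses) else float("nan")
    far = false_alarms / (hits + false_alarms) if (hits + false_alarms) else float("nan")
    return pod, far

def mean_lead_time(event_times, alert_times):
    """Average minutes between alert issuance and event start (hits only)."""
    leads = [e - a for e, a in zip(event_times, alert_times) if e >= a]
    return sum(leads) / len(leads) if leads else float("nan")

# Illustrative counts and times only -- not the study's results.
pod, far = verification_stats(hits=40, misses=10, false_alarms=25)
print(f"POD = {pod:.2f}, FAR = {far:.2f}")

events = [30.0, 75.0]   # event start times (minutes)
alerts = [21.0, 70.0]   # matching alert issuance times
print(f"mean lead time = {mean_lead_time(events, alerts):.1f} min")
```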

Full Manuscript

Basis For This Study

  • Increases in total lightning activity have been known to precede severe weather events.
  • Earth Networks Inc. issues Dangerous Thunderstorm Alerts (DTAs) when its Total Lightning Network observes lightning flash rates that exceed a certain threshold.
  • DTAs may be a new aid in operational forecasting of severe weather but must be verified first.

What This Study Adds

  • When the DTAs' probability of detection (POD) was highest, so was their false alarm ratio (FAR).
  • DTAs performed best during the months with the most convective activity.
  • DTAs performed similarly to NWS tornado warnings over the course of a year, but performed poorly when compared to NWS severe thunderstorm warnings.

Warming Southeast Australian Climate: The Effects of Sea Surface Temperatures (SSTs)

Kwanshae Flenory — Langston University
Mentors: Dr. Michael Richman and Dr. Lance Leslie

In the past few decades, the Australian climate has been warming at an alarming rate compared to historical variations. Associated with that warming, extended heat events lasting weeks to months have plagued the country. Climate model projections suggest that such events will occur more frequently and intensify in the future. The extreme temperatures have damaged ecosystems through drought and fire and have resulted in the loss of human life.

This study examines how the combination of sea surface temperatures (SSTs) and climate drivers predicts summer mean maximum temperature at selected locations in SE Australia. Ninety-one ocean grid boxes of SST surrounding Australia were used for simultaneous and lag-1 relations, along with 42 climate drivers, creating a suite of 224 potential predictors. Variable reduction using 5-fold cross-validated linear regression and bagging resulted in a ~90% reduction in the number of variables passed to the final prediction equations. Multiple linear and nonlinear kernel regression methods were applied to predict the January anomalies of maximum temperature using this reduced set of predictors. For the nonlinear regressions, two kernels were evaluated: polynomial and radial basis function. The polynomial degree and radial basis function kernel width were optimized for sea surface temperatures and climate drivers by maximizing their 10-fold cross-validated correlations with the air temperatures at the various locations in SE Australia. The key findings were that (1) climate drivers influenced prediction accuracy as much as SSTs and (2) the combination of the reduced sets of SSTs and climate drivers often accounted for 40-60% of the January mean maximum temperature variance. Such a large percentage of predictable variance is expected to lead to more effective monthly temperature predictions.
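
A compact way to see the kernel optimization step is with off-the-shelf kernel ridge regression, a close cousin of the kernel regression methods named above. The sketch below uses synthetic data and scikit-learn's 10-fold grid search over the RBF width and polynomial degree; the predictor counts and parameter grids are illustrative assumptions, and the paper's exact estimator and correlation-based objective may differ:

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import GridSearchCV

# Synthetic stand-ins for the reduced predictor set (SSTs + climate drivers)
# and January mean maximum temperature anomalies at one site.
rng = np.random.default_rng(0)
X = rng.standard_normal((60, 20))          # ~60 Januaries x 20 predictors
y = X[:, 0] - 0.5 * X[:, 1] + 0.3 * rng.standard_normal(60)

# Tune the RBF kernel width (gamma) and the polynomial degree by 10-fold CV,
# mirroring the paper's optimization of the kernel hyperparameters.
rbf = GridSearchCV(KernelRidge(kernel="rbf"),
                   {"alpha": [0.1, 1.0], "gamma": [0.01, 0.1, 1.0]}, cv=10)
poly = GridSearchCV(KernelRidge(kernel="poly"),
                    {"alpha": [0.1, 1.0], "degree": [2, 3]}, cv=10)
for name, model in (("RBF", rbf), ("polynomial", poly)):
    model.fit(X, y)
    print(name, model.best_params_, f"CV R^2 = {model.best_score_:.2f}")
```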

Full Manuscript

Basis For This Study

  • The Australian climate is warming at an alarming rate, with a higher frequency of, and more intense, hot months.
  • Impacts have been widespread and severe including habitat reduction, crop damage and human loss.
  • Past work has used linear regression techniques with limited climate driver attributes to establish predictive relationships to warming.

What This Study Adds

  • A combination of linear regression and bagging trees was used to select from a large pool of sea surface temperature and climate driver predictors.
  • Predictions of January mean maximum temperatures were made at four sites using kernel regression techniques.
  • At the four sites, the correlation between predicted and observed air temperature ranged from 0.67 to 0.84, and the mean absolute error from 0.65 to 1.18°C.
  • In comparison to previous studies, these predictions were more accurate.

Sensitivity of Simulated Supercell Thunderstorms to Horizontal Grid Resolution

Montgomery Flora — Ball State University
Mentor: Dr. Corey Potvin

The effects of horizontal grid spacing on idealized supercell simulations are investigated. Motivation for the study largely stems from the NOAA Warn-on-Forecast program, which envisions a paradigm shift from "warn-on-detection", where convective hazard warning decisions are primarily based on current observations, to a new paradigm where storm-scale numerical weather models play a greater role in generating forecasts and warnings. Unlike most previous grid spacing sensitivity studies, we focus on impacts to operationally significant features of supercells. Using the WRF-ARW model, idealized supercell simulations are run for 2 hours using three different environments and horizontal grid spacings of 333 m and 1, 2, 3, and 4 km. Given that forecasts under the Warn-on-Forecast paradigm will be initialized after several radar data assimilation cycles, we initialize our coarser simulations with filtered versions of the 333 m "truth" simulation valid at t = 30 min. To isolate differences in storm evolution arising from finer-scale processes being unrepresented in the coarser simulations, the latter are compared to appropriately filtered versions of the truth simulations.

Results show that operationally significant errors in supercell evolution arise as the grid spacing is increased. Furthermore, the grid spacing sensitivity is strongly influenced by the environment. The 4 km grid spacing is too coarse to even qualitatively reproduce the supercell evolution, with the storm dying before the end of the simulation in one of the three environments. The improvement as grid spacing decreases from 2 to 1 km is greater than that from 3 to 2 km. Implications of this and other findings for Warn-on-Forecast are discussed.
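
Comparing coarse runs against "appropriately filtered" truth fields can be illustrated with a simple smooth-and-subsample operation. The sketch below is one plausible stand-in (the Gaussian filter and its width are my choices; the filtering used in the paper may differ):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def coarsen(field, dx_fine, dx_coarse):
    """Low-pass filter a high-resolution 'truth' field, then sample it on a
    coarser grid, giving a filtered reference field at the coarse run's
    effective resolution."""
    ratio = int(round(dx_coarse / dx_fine))
    # Gaussian width chosen so scales near the coarse grid's Nyquist limit
    # are strongly damped before subsampling.
    smoothed = gaussian_filter(field, sigma=ratio / 2.0)
    return smoothed[::ratio, ::ratio]

# Stand-in for a 2-D slice of the 333 m truth simulation.
truth = np.random.default_rng(1).standard_normal((300, 300))
for dx in (1000.0, 2000.0, 4000.0):
    print(f"dx = {dx/1000:.0f} km -> shape {coarsen(truth, 333.0, dx).shape}")
```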

Full Manuscript

Basis For This Study

  • Supercell thunderstorms produce the majority of significant severe weather (1"+ diameter hail and EF2+ tornadoes) in the United States and are difficult to forecast numerically.
  • Sensitivity of supercell development, particularly operationally significant impacts, to grid spacing has not been thoroughly explored.
  • A major challenge confronting Warn-on-Forecast is the tradeoff between increased model resolution and increased computational cost.
  • How well do horizontal grid spacings of 1, 2, 3 and 4 km capture operationally significant features of supercells?

What This Study Adds

  • A 4 km grid spacing appears too coarse to adequately represent supercell dynamics and evolution.
  • Grid spacing sensitivity varies strongly with storm environment.
  • In general, more benefit is gained by moving from 2 km to 1 km spacing than from 3 km to 2 km.

Verification of the Bragg Scatter Method on the WSR-88D

Joshua Gebauer — California University of Pennsylvania
Mentors: Dr. Jeffrey Cunningham, Dr. W. David Zittel, and Robert R. Lee

For the purpose of radar quantitative precipitation estimates, differential reflectivity (ZDR) plays a crucial role and must be accurately calibrated. Currently, some WSR-88Ds in the Next Generation Weather Radar (NEXRAD) fleet may have systematic ZDR biases due to errors in the measurement of the H and V channels. The Radar Operations Center (ROC) monitors these systematic ZDR biases by measuring returns from external targets that should produce, or can be adjusted to, zero decibels (dB). One such target with an intrinsic ZDR = 0 dB is Bragg scatter, a clear-air return caused by turbulent mixing in refractive index gradients. The ROC implemented a method developed by the National Severe Storms Laboratory to detect Bragg scatter on the WSR-88D. This study uses atmospheric sounding data as truth to verify the radar-based Bragg scatter detection method from January to June 2014 (11,521 radar/sounding pairs). Measurements of refractivity gradients and Richardson number from the 00Z sounding (indicators of conditions conducive to Bragg scatter) are compared to radar-based method detections between 00Z and 02Z. Sounding analyses reveal that the potential for Bragg scatter occurs in 95.43% of radar/sounding pairs at vertical layers below 5 km in the continental United States (CONUS). However, due to the method's strict data filters and volume coverage pattern (VCP) requirements, the method detects Bragg scatter only 4.03% of the time (464 radar/sounding pairs). Of the 464 pairs, Bragg scatter detection is verified 84.7% of the time at the same height indicated by the sounding. Climate region characteristics influence the variability of the verification statistics. We expect that improvements to the data filters for Bragg scatter detection, better use of available VCPs, and improved scanning techniques will increase the frequency of Bragg scatter detection.
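
For reference, the bulk Richardson number used to screen the soundings can be computed layer by layer from a standard profile. The sketch below is a minimal illustration; the profile values and the Ri < 0.25 turbulence rule of thumb are common conventions I am assuming, not thresholds quoted from the paper:

```python
import numpy as np

G = 9.81  # gravitational acceleration (m s^-2)

def bulk_richardson(theta, u, v, z):
    """Layer-by-layer bulk Richardson number from a sounding profile.
    Low Ri (roughly < 0.25) flags turbulent mixing, which together with a
    refractivity gradient marks layers conducive to Bragg scatter."""
    dth = np.diff(theta)
    dz = np.diff(z)
    du2 = np.diff(u) ** 2 + np.diff(v) ** 2
    theta_mean = 0.5 * (theta[:-1] + theta[1:])
    return (G / theta_mean) * dth * dz / np.where(du2 > 0, du2, np.nan)

# Illustrative 00Z profile below 5 km (values are made up for the example).
z = np.array([100., 500., 1000., 2000., 3000., 4000., 5000.])    # m
theta = np.array([290., 291., 292., 295., 299., 303., 307.])     # K
u = np.array([2., 6., 10., 14., 15., 16., 17.])                  # m/s
v = np.array([0., 1., 3., 5., 5., 6., 6.])                       # m/s
print(np.round(bulk_richardson(theta, u, v, z), 2))
```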

Full Manuscript

Basis For This Study

  • The ROC has implemented a Bragg scatter method to estimate ZDR bias on the WSR-88D radars.
  • The climatology of Bragg scatter detection in the CONUS has not been studied.
  • The Bragg scatter method needs to be verified to ensure ZDR bias estimates are accurately calculated for all CONUS regions.

What This Study Adds

  • The mechanisms for creating conditions conducive to Bragg scatter vary by climate region.
  • There is potential to detect more cases of Bragg scatter with improvements to the Bragg scatter method.
  • The radar-based method for identifying Bragg scatter verifies on average 85% of the time, but the skill varies by climate region.

Determining the Likelihood of Observing the Tornado Debris Signature at Different Geographic Locations throughout the United States

Shawn Handler — Plymouth State University
Mentors: Dr. Valliappa Lakshmanan, Dr. Terry Schuur, and Dr. Matthew Van Den Broeke

With the upgrade of the National Weather Service network of weather radars to dual-polarization, it has become possible to use the new radar moments to detect tornado debris. This study investigates the likelihood of observing the tornado debris signature (TDS) at different geographic locations throughout the United States given that a tornado is ongoing. The likelihood of observing a TDS varies according to radar geometry and the presence of materials that can be lofted by a tornado. To estimate this likelihood at different geographic locations, we employed datasets of range from the nearest radar, lowest unblocked height of the radar beam, population density, and the normalized difference vegetation index (NDVI). We also modeled the relationship between tornado intensity and the vertical extent of the debris signature. Maps for three distinct seasons in 2012 (spring, summer, fall) were generated identifying areas where TDS detection would or would not be likely for tornadoes of EF0-EF2 and EF3+ intensities.

The study indicates that a tornado is likely to be depicted by a TDS on radar if it occurs close to the radar site, in regions of high population density or rich vegetation, and if the tornado itself is strong. The signature is less likely to be seen for weak tornadoes, in rural areas that have little vegetation, and in regions that experience beam blockage. Tornadoes of EF0 or EF1 intensity are unlikely to exhibit a TDS, and in some areas, like the Gulf Coast, the TDS may only be observed for tornadoes of EF3+ intensity. The range of TDS detection was also found to be limited in tornado-prone areas, including portions of the Central Plains, Midwest, and Mississippi Valley.
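
The radar-geometry part of that likelihood comes down to how high the lowest beam sits above the ground at a given range. Below is a minimal sketch using the standard 4/3 effective-earth-radius beam-height model; the function name and the printed ranges are illustrative choices, not values from the paper:

```python
import numpy as np

R_EARTH = 6.371e6   # mean earth radius (m)
K = 4.0 / 3.0       # effective earth-radius factor

def beam_height(r, elev_deg, h_radar=0.0):
    """Height (m, relative to the radar) of the beam centerline at slant
    range r (m) under the 4/3 effective-earth-radius model. Beam height is
    one geometric input to the TDS likelihood maps: debris lofted only a
    few km cannot be sampled where the lowest beam is already above it."""
    ke_re = K * R_EARTH
    elev = np.deg2rad(elev_deg)
    return np.sqrt(r**2 + ke_re**2 + 2 * r * ke_re * np.sin(elev)) - ke_re + h_radar

# Lowest tilt (0.5 deg): the beam is ~1.5 km AGL at 100 km and ~4 km at
# 200 km, which is why TDS detection falls off with range for weak tornadoes.
for range_km in (50, 100, 150, 200):
    print(f"{range_km:3d} km: {beam_height(range_km * 1000.0, 0.5)/1000:.2f} km")
```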

Full Manuscript

Basis For This Study

  • With the upgrade to dual-polarization radar, it has become possible to use new radar moments to detect tornado debris, specifically with the use of the tornado debris signature (TDS).
  • Criteria used for detecting the tornado debris signature on radar have been researched extensively.
  • Little research has looked into observing the tornado debris signature at different geographic locations throughout the United States based on different radar, environmental, and societal variables.

What This Study Adds

  • Given that there is an ongoing tornado, overall likelihood maps were created for three seasons to best represent the likelihood of observing a TDS.
  • Results show that a TDS is most likely to be observed for tornadoes of EF2 or greater intensity; however, the lowest unblocked height of the radar beam dictates whether the TDS will be seen.
  • Forecasters could apply these maps to determine whether a TDS depicted on radar is in a likely or unlikely area for detection, which would be helpful for verifying tornado warnings.

Spatial and Temporal Variability of Albedo From Enhanced Radiation Measurements in Oklahoma

Nathan Kelly — Valparaiso University
Mentor: Brad Illston

In 1999, the Oklahoma Atmospheric Surface-layer Instrumentation System (OASIS) project placed instrumentation focused on observing the surface energy budget at 89 Oklahoma Mesonet stations. At any given time, 10 stations (designated "super sites") were outfitted with additional instrumentation, including a four-component net radiometer capable of observing incoming and outgoing shortwave (solar) and longwave radiation. Data are available from the beginning of 2000 until October 2008. These data were filtered to remove observations non-representative of the day's albedo (e.g., sunrise and sunset periods, cloudy days, and erroneous instrument readings), and monthly averages were computed for each of the super sites in order to develop a better understanding of the spatial and temporal variability of albedo in Oklahoma.
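
Albedo itself is just the ratio of outgoing to incoming shortwave radiation, screened as described above. The sketch below shows one plausible way to compute filtered daily means with pandas; the column names, the 50 W m^-2 low-sun threshold, and the demo values are my assumptions, not the study's exact filters:

```python
import pandas as pd

def daily_albedo(df, min_sw_in=50.0):
    """Mean daytime albedo per day from 4-component radiometer data.
    The screening loosely mimics the study's filtering: drop low-sun
    (sunrise/sunset) samples and physically impossible ratios."""
    day = df[df["sw_in"] > min_sw_in]           # drop low-sun periods
    alb = day["sw_out"] / day["sw_in"]          # instantaneous albedo
    alb = alb[(alb > 0.0) & (alb < 1.0)]        # drop erroneous readings
    return alb.resample("D").mean()             # daily means

# df carries a DatetimeIndex with incoming/outgoing shortwave (W m^-2);
# monthly station averages follow from daily_albedo(df).resample("MS").mean().
idx = pd.date_range("2005-06-01", periods=96, freq="30min")
demo = pd.DataFrame({"sw_in": 400.0, "sw_out": 80.0}, index=idx)
print(daily_albedo(demo))
```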

Full Manuscript

Basis For This Study

  • Albedo is an important factor in boundary layer and surface energy budget parameterizations.
  • Current estimates of potential evapotranspiration in Oklahoma, based on the American Society of Civil Engineers standardized equation, rely on an albedo value of 0.23 for all locations and seasons.
  • Is the albedo in Oklahoma as measured at 11 Oklahoma Mesonet sites significantly different from 0.23, and how does albedo vary with the seasons?

What This Study Adds

  • Albedo is lowest in summer and highest in winter, varying by about 0.04 depending on station.
  • On average, albedo is lower than 0.23 in all seasons.
  • 0.23 is not a suitable estimate for albedo in Oklahoma across all seasons and locations.

Examining Polarimetric Characteristics of Electronic Interference in Weather Radar Data

Thong Phan — East Central University
Mentors: Dr. Valliappa Lakshmanan and John Krause

For many decades, meteorologists have used weather radars to examine what kinds of precipitation are occurring in the atmosphere. With the recent upgrade of the WSR-88D (Weather Surveillance Radar, 1988 Doppler) to dual-polarization (dual-pol), meteorologists can now examine the atmosphere with dual-polarization products: velocity (V), reflectivity (Z), differential phase on propagation (PhiDP), correlation coefficient (RhoHV), differential reflectivity (Zdr), and spectrum width (SPW). Though these products are very useful in determining what type of precipitation is in the atmosphere, how large the precipitation event is, and how severe it can be, the radar also picks up many non-meteorological echoes. Electronic interference is a type of non-meteorological echo that has high reflectivity values and is mistakenly identified as precipitation by automated systems. This study looks at the reflectivity, differential reflectivity, and correlation coefficient of electronic interference and precipitation to find objective criteria for distinguishing between them. The findings are meant to help make the current quality control algorithm more effective for operational use.
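
A minimal sketch of how the study's reported thresholds could be combined into a gate-level flag follows. Joining the criteria with logical operators this way is my simplification, and the function name and sample gate values are illustrative:

```python
import numpy as np

def flag_interference(z_dbz, zdr_db, rhohv):
    """Flag gates whose dual-pol values match the electronic-interference
    characteristics reported in the study: reflectivity at or above 20 dBZ
    with RhoHV above 1 (impossible for weather) or extreme Zdr."""
    suspicious_rho = rhohv > 1.0
    extreme_zdr = (zdr_db < -2.0) | (zdr_db > 6.0)
    return (z_dbz >= 20.0) & (suspicious_rho | extreme_zdr)

# Illustrative gate values: only the last gate looks like interference.
z = np.array([35.0, 18.0, 25.0])
zdr = np.array([1.2, 0.5, 7.5])
rho = np.array([0.98, 0.99, 1.05])
print(flag_interference(z, zdr, rho))   # [False False  True]
```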

Full Manuscript

Basis For This Study

  • The National Weather Service network of radars has been recently upgraded to dual-polarization radars, allowing quality control algorithms to be implemented.
  • Quality Control algorithms can aid in the removal of non-meteorological echoes.
  • Automated systems mistakenly identify electronic interference as precipitation due to its high reflectivity values.

What This Study Adds

  • Reflectivity values at or above 20 dBZ combined with RhoHV values greater than 1 could indicate a false echo.
  • Zdr values of electronic interference occur well below -2 dB and above 6 dB.
  • Dual-polarization product comparisons beyond Z, Zdr, and RhoHV must be assessed to find more objective criteria to aid the quality control algorithms.

Motivators and Important Factors Influencing Decisions Made During the Oklahoma Tornadoes of May 2013

Julia Ross — Olivet Nazarene University
Mentors: Dr. James Correia, Jr., and Dr. Daphne LaDue

There were three deadly tornado events in central Oklahoma in a two-week span in May 2013. A mass exodus of drivers occurred during the third event, clogging multiple interstates upwards of 60 miles away from the main storms. Scientists needed to understand what motivated people to act the way they did so they could better anticipate people's actions and better communicate with the public in the future. To gain a reliable understanding of this, surveys about what people did during the events were created, distributed, and collected. Factors correlated with driving were income below $30,000 or between $70,000 and $100,000, younger age (20-39 years old), and some higher education (a complete or incomplete bachelor's degree). Past direct experience with tornadoes was correlated with people staying at home, yet 33% of respondents did not feel safe at home. Of the 77 surveys collected, 27 respondents (35%) had never heard of mitigation, the strengthening of their homes. Fear was commonly expressed (44% of respondents), with an undercurrent of feeling that self and home were vulnerable. Through these findings, scientists will be better able to anticipate Oklahomans' responses to tornadic events and the reasons behind them.

Full Manuscript

Basis For This Study

  • Three deadly tornado events occurred in central Oklahoma in a two-week span in May 2013 (National Weather Service Weather Forecast Office: Norman, OK 2014).
  • During the third tornado event, a mass exodus of people tried to drive away and ended up gridlocked on the interstates as the storms barreled through; Garfield (2014) calculated that hundreds of drivers could have died.
  • Vehicles do not make safe shelters (Marshall et al. 2008). Scientists wanted to know what motivated people to drive away and put themselves in danger.

What This Study Adds

  • It helps us understand why people drove. Respondents with past direct experience with tornadoes were more likely to stay home, while respondents with lower incomes, those who were younger (20-39 years old), or those who had some higher education (a complete or incomplete bachelor's degree) were more likely to drive away.
  • 44% of respondents expressed fear during the events. Vulnerability of both self and home was expressed in survey responses.
  • 33% of respondents did not feel safe in their homes. 35% did not know that they could strengthen their homes (mitigation), but 50% would be willing to spend, or had already spent, money to mitigate.

An Evaluation of Applying Ensemble Data Assimilation to an Antarctic Mesoscale Model

Lori Wachowicz — Michigan State University
Mentors: Dr. Steven Cavallo and Dr. Dave Parsons

Knowledge of Antarctic weather and climate processes relies heavily on models due to the lack of observations over the continent. The Antarctic Mesoscale Prediction System (AMPS) is a numerical model capable of resolving finer-scale weather phenomena. The Antarctic's unique geography, with a large ocean surrounding a circular continent containing complex terrain, makes fine-scale processes potentially very important for poleward moisture transport and the mass balance of Antarctica's ice sheets. AMPS currently uses the 3DVAR method to produce atmospheric analyses (AMPS-3DVAR), which may not be well suited for data-sparse regions like the Antarctic and Southern Ocean. To optimally account for the flow-dependence and data sparseness unique to this region, we test the application of an ensemble adjustment Kalman filter (EAKF) within the framework of the Data Assimilation Research Testbed (DART) and the AMPS model (A-DART). We test the hypothesis that A-DART improves the AMPS-3DVAR estimate of the atmosphere. We perform a test over a one-month period from 21 September to 21 October 2010 and find results comparable to both AMPS-3DVAR and the GFS. In particular, we find a strong cold model bias near the surface and a warm model bias at upper levels. Investigation of the surface bias reveals strongly biased land-surface observations, while the warm bias at upper levels is likely a circulation bias from the model warming too rapidly aloft over the continent. Increasing quality control of surface observations and assimilating polar-orbiting satellite data are expected to alleviate these issues in future tests.
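
To make the assimilation machinery concrete, here is a minimal sketch of the ensemble adjustment Kalman filter update for a single scalar observation, in the spirit of the EAKF that DART implements (Anderson 2003). All names and values are illustrative, and a real A-DART cycle adds localization, inflation, and observation quality control:

```python
import numpy as np

def eakf_scalar_update(x_ens, y_ens, y_obs, obs_var):
    """One-observation EAKF update.
    x_ens: state ensemble (n_members, n_state);
    y_ens: prior observation-space ensemble (n_members,)."""
    y_mean, y_var = y_ens.mean(), y_ens.var(ddof=1)
    # Gaussian product gives the posterior mean and variance in obs space.
    post_var = 1.0 / (1.0 / y_var + 1.0 / obs_var)
    post_mean = post_var * (y_mean / y_var + y_obs / obs_var)
    # Deterministic shift-and-contract of the obs-space ensemble.
    y_post = post_mean + np.sqrt(post_var / y_var) * (y_ens - y_mean)
    dy = y_post - y_ens
    # Regress obs-space increments onto each state variable.
    x_anom = x_ens - x_ens.mean(axis=0)
    cov = (x_anom * (y_ens - y_mean)[:, None]).sum(axis=0) / (len(y_ens) - 1)
    return x_ens + dy[:, None] * (cov / y_var)

rng = np.random.default_rng(2)
x = rng.standard_normal((40, 3)) + np.array([270.0, 5.0, 0.0])  # T, u, v
y = x[:, 0]                        # observing the first state variable
print(eakf_scalar_update(x, y, y_obs=271.0, obs_var=1.0).mean(axis=0))
```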

Full Manuscript

Basis For This Study

  • The lack of observations in the Antarctic makes it difficult to observe the changing mass balance of the ice sheets.
  • High-resolution numerical models provide insight into finer-scale processes that may be affecting the Antarctic ice sheets.
  • Ensemble data assimilation is well suited to data-sparse regions because it provides uncertainty estimates that can be used to improve the model.

What This Study Adds

  • Using the analysis increment provides a diagnostic for determining model bias.
  • Surface observations may be biased and not well-represented by the ensemble.
  • An upper-level warm bias is leading to a potential circulation bias in the ensemble model.

Wind Farm Layout Optimization Problem by Modified Genetic Algorithm

Grant Williams — Oklahoma State University
Mentors: April Taylor and Dr. Renee McPherson

Wind energy is a rapidly growing source of energy for the United States, but there are still technical problems to resolve before it can become the major source of energy production. One of the biggest problems with land-based wind farms is minimizing wake-turbine interactions within a constrained space and thus maximizing power. When wind blows through a turbine's blades, a choppy, turbulent wake is created that interferes with the ability of nearby turbines to produce power. Research has already been done on finding ways to model wind farms and place the turbines in a way that minimizes wake-turbine interactions, but current methods are either computationally intensive or require proprietary software. I present a modified genetic algorithm that is able to produce optimized results in a relatively short amount of computation time. The algorithm makes use of the computational power of graphics processing units and multiple processors and thereby produces results much more quickly than an algorithm run sequentially on a single processor.
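
As a rough illustration of the approach, the sketch below pairs a selection-plus-mutation genetic algorithm with a Jensen wake model for a single wind direction. It is a hedged reconstruction, not the paper's code: the grid size, turbine constants, and mutation-only operator set are my assumptions, and the actual study also uses crossover-style operators and GPU/multiprocessor evaluation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy wind-farm layout GA: one wind direction (+x), Jensen wake model.
N_CELLS, N_TURBINES = 10, 8                   # 10x10 grid, 8 turbines
CELL = 200.0                                  # grid spacing (m)
R0, CT, K_WAKE, U0 = 20.0, 0.88, 0.075, 12.0  # rotor radius, thrust, decay, wind

def positions(genome):
    idx = np.flatnonzero(genome)
    return np.column_stack((idx // N_CELLS, idx % N_CELLS)) * CELL

def farm_power(genome):
    """Sum of u^3 at each turbine after Jensen wake deficits (wind along +x)."""
    pos = positions(genome)
    power = 0.0
    for i, (xi, yi) in enumerate(pos):
        deficit2 = 0.0
        for j, (xj, yj) in enumerate(pos):
            dx = xi - xj
            if j != i and dx > 0:                 # j is upstream of i
                rw = R0 + K_WAKE * dx             # wake radius at turbine i
                if abs(yi - yj) < rw:             # i sits inside j's wake
                    d = (1 - np.sqrt(1 - CT)) / (1 + K_WAKE * dx / R0) ** 2
                    deficit2 += d * d             # sum-of-squares combination
        power += (U0 * (1 - np.sqrt(deficit2))) ** 3
    return power

def random_genome():
    g = np.zeros(N_CELLS * N_CELLS, dtype=bool)
    g[rng.choice(g.size, N_TURBINES, replace=False)] = True
    return g

def mutate(g):
    child = g.copy()
    on, off = np.flatnonzero(child), np.flatnonzero(~child)
    child[rng.choice(on)] = False                 # move one turbine
    child[rng.choice(off)] = True
    return child

pop = [random_genome() for _ in range(30)]
for gen in range(100):                            # elitist selection + mutation
    pop.sort(key=farm_power, reverse=True)
    pop = pop[:10] + [mutate(pop[rng.integers(10)]) for _ in range(20)]
best = max(pop, key=farm_power)
print(f"best relative power: {farm_power(best):.0f}")
```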

Full Manuscript

Basis For This Study

  • Wind energy is the fastest-growing sector of the energy field, but it has many technical complications that need to be worked out.
  • The wake-turbine interaction is a very difficult computational problem that can only be modeled heuristically.
  • Minimizing turbine-wake interactions is a difficult but necessary part of planning a wind farm, and a model that does so efficiently is needed.

What This Study Adds

  • A modified genetic algorithm is presented that is capable of modeling a large number of turbines relatively quickly.
  • The model produces relatively optimal solutions to the wind farm layout optimization problem.
  • The model presented produced very promising results and is robust enough to incorporate more complex wake models or other parameters.

Copyright © 2014 - Board of Regents of the University of Oklahoma