
NWC REU 2016

May 23 - July 29


Projects

REU Final Paper: Analysis and Improvement of a Six-Axis Robotic Scanner for Millimeter-Wave Active Phased Array Antenna Calibration

Robert Baines —  Rice University
Mentors: Dr. Jorge Salazar and Rodrigo Lebron

Herein, a novel proof-of-concept experimental setup proposed by Dr. Salazar at the Radar Innovations Laboratory was constructed and optimized to assess the effects of temperature variations on phased array radar calibration. The system is fully automated, programmable through a graphical user interface, and allows surface, thermal, and radio frequency characterization. The controllable hardware consists of an environmental chamber and a robot arm carrying an infrared camera, high-definition camera, laser, and radio frequency scanner that together perform automated analysis of a chosen antenna under test. In addition to describing the experimental setup in detail, this paper catalogues the analysis and improvement of system parameters.

Link to Final Paper [PDF]

What is already known:

  • Phased array radars are the next logical step in both weather and military radars.
  • Current work is optimizing phased array radar performance to match that of conventional radar.

What this study adds:

  • This study provides a novel experimental setup that enables comprehensive characterization of phased array radar calibration under varying temperature conditions.
  • This laboratory study is an important step toward full-scale phased array radar implementation in realistic conditions.

UAV-Based Calibration for Polarimetric Phased Array Radar

Christian Boyer — Millersville University
Mentor: Dr. Caleb Fulton

Calibrating dual polarization in phased array radars is an important aspect of risk mitigation in moving towards a nationwide multifunctional phased array radar (MPAR) system for weather surveillance and aircraft tracking. Since dual polarization was implemented on the WSR-88D, the new products have been vital for hydrometeor classification. The calibration of scan-dependent polarization in phased arrays is a primary goal in achieving the same products provided by traditional dish-based systems. There are many challenges to the calibration process, including isolating the horizontal and vertical polarizations that are sent by the radar and making sure that their amplitudes are identical. In the project described herein, the focus is on the calibration of the radar’s receive patterns, the first step in the overall calibration process. An unmanned aerial vehicle (UAV) has been developed to facilitate scan-dependent calibration of a fixed phased array, and the focus of this part of the project is on the so-called “Twitching Eye of Horus” circuit. It provides a means for transmission of calibrated horizontal (H) and vertical (V) electric fields towards the radar in a controlled manner from the UAV, but it requires its own processing and calibration procedures. Removal of the frequency offset between the circuit and the radar is a primary challenge. This study examines the process of calibrating the radar’s receiver using a UAV and the Twitching Eye of Horus, and presents initial results.

Link to Final Paper [PDF]

What is already known:

  • Previous studies have detailed the challenges in calibrating polarimetric phased array radars to provide a high degree of polarization purity.
  • Dual polarization phased array radars offer many advantages and are candidates for a future multifunctional phased array radar (MPAR) system that combines weather radar and air surveillance/traffic monitoring.

What this study adds:

  • A circuit, called the “Twitching Eye of Horus,” transmits horizontally- and vertically-polarized fields for calibration of the radar receiver.
  • This study created a technique for extracting estimates of a radar receiver’s scan-dependent polarimetric pattern from raw radar data.
  • This study demonstrates the use of a sensor-equipped unmanned aerial vehicle (UAV) for scan-dependent calibration of a fixed phased array radar.

Ensemble Forecasts and Verification of the May 2015 Multi-Hazard Severe Weather Event in Oklahoma

Austin Coleman —  Valparaiso University
Mentor: Dr. Nusrat Yussouf

Dual threat severe weather events in which both tornadoes and flash floods affect the same area within a short time frame pose a complex problem since the life-saving actions for these two events are contradictory. One such event is the 6-7 May 2015 tornado and flash flood event over Oklahoma. This study explores the capability of a rapidly-updating 3-km horizontal grid spacing convective-scale ensemble data assimilation and prediction system, developed as part of the Warn-on-Forecast initiative, to forecast features of this dual threat severe weather event. Results indicate that the 0-1 h probabilistic forecasts of reflectivity verify reasonably well against the observations. However, beyond the 1 hour forecast period, the forecast accuracy is degraded, including biases in storm motion as well as spurious cell generation. The ensemble probability matched mean quantitative precipitation forecasts capture the placement of the most intense areas of precipitation very well, but underestimate the amount of accumulated precipitation. These quantitative precipitation forecasts are also found to outperform the deterministic quantitative precipitation forecasts of the operational High-Resolution Rapid Refresh model. Additional ensemble forecast experiments from simple downscaling to 1-km grid spacing from the 3-km ensemble do not significantly reduce the storm motion bias found in the original results and introduce more spurious cells.
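
The probability matched mean referenced above combines the smooth spatial pattern of the ensemble mean with the amplitude distribution of the individual members. A minimal sketch of the standard construction follows (illustrative only, not the study's code; the array layout is an assumption):

```python
import numpy as np

def probability_matched_mean(members):
    """Probability matched mean (PMM) of an ensemble of 2-D QPF fields.

    members: array of shape (n_members, ny, nx).
    Returns a (ny, nx) field with the spatial pattern of the ensemble
    mean but the amplitude distribution of the pooled member values.
    """
    n_members, ny, nx = members.shape
    mean_field = members.mean(axis=0)

    # Rank the grid points of the ensemble mean field.
    order = np.argsort(mean_field.ravel())

    # Pool all member values, sort them, and keep every n-th value so
    # that exactly one value remains per grid point.
    pooled = np.sort(members.ravel())[::n_members]

    # Reassign the pooled values to grid points in rank order.
    pmm = np.empty(ny * nx)
    pmm[order] = pooled
    return pmm.reshape(ny, nx)
```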

Link to Final Paper [PDF]

What is already known:

  • The current prototype Warn-on-Forecast (WoF) system is a storm-scale ensemble at 3-km horizontal grid spacing that has the potential to extend probabilistic low level mesocyclone forecast lead times for severe convective events.
  • The science and technology being developed to achieve WoF goals can also be used to improve 0-3 hour extreme rainfall forecasts for convective systems.
  • The 3-km horizontal grid spacing of the current prototype WoF system is too coarse to resolve tornadic circulations.

What this study adds:

  • The 0-3 hour heavy rainfall forecasts from the prototype system verify better against NCEP’s Stage IV analyses and Mesonet observations in terms of location than those from the operational HRRR forecasts for this case study.
  • The ensemble forecasts systematically underestimate precipitation amounts, which indicates that further investigation of microphysics scheme and grid-spacing sensitivities is needed.
  • Ensemble forecasts at 1-km horizontal grid spacing downscaled from the 3-km prototype system introduce many spurious cells with embedded spurious mesocyclones.

Assessing Future Projections of Climate Extremes Over the South Central USA

Dana Gillson —  Mount Holyoke College
Mentors: Dr. Esther Mullens and Dr. Derek Rosendahl

Climate extremes (heavy precipitation, drought, heat waves, storms, etc.) adversely affect numerous socioeconomic systems including infrastructure, the economy, agriculture, and ecosystems. Understanding observed extreme events in the past, and determining how well climate models capture them, will help planning and adaptation to climate stressors. The Expert Team on Climate Change Detection and Indices (ETCCDI) has defined and developed a list of 27 core climate extreme indices that measure temperature and precipitation. Previous studies have compared the reliability of these extremes in a variety of regions, but very few have done so with a focus on the south-central United States. This study uses 11 of the climate extreme indices to analyze climate extremes from historical observation-based reanalyses (ERA40, ERA-Interim, NCEP1, NCEP2) as well as historical and future projections of 31 global climate models (GCMs) from the Coupled Model Intercomparison Project Phase 5 (CMIP5). We split the south-central region into three sub-regions (west-central, south-central, and east-central). Results indicate that observation-based reanalyses can differ significantly from one another and therefore yield varying model biases depending on which reanalysis is used. Model performance depends on region, season, and extreme index, and therefore no single model was found to be best for all situations. Models from the same institutions tend to contain similar biases within and across regions. This study also provides future projections that show a possible differentiation between the best and worst performing models.
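
As an illustration of how such indices are computed, here is a minimal sketch of one of the 27 ETCCDI indices, Rx5day (the maximum consecutive 5-day precipitation total); the study's own index calculations may differ in detail:

```python
import numpy as np

def rx5day(daily_precip_mm):
    """ETCCDI Rx5day: maximum consecutive 5-day precipitation total.

    daily_precip_mm: 1-D array of daily precipitation totals (mm)
    for one year or season.
    """
    # Rolling 5-day sums via convolution with a length-5 window.
    five_day_sums = np.convolve(daily_precip_mm, np.ones(5), mode="valid")
    return float(five_day_sums.max())
```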

Link to Final Paper [PDF]

What is already known:

  • Extreme weather events (heavy precipitation, drought, heat waves, and storms) impact multiple sectors of life including infrastructure, people, ecosystems, economy, and agriculture.
  • The Expert Team on Climate Change Detection & Indices (ETCCDI) has defined 27 core extreme indices that can be calculated in Global Climate Models (GCM) such as the Coupled Model Intercomparison Project Phase 5 (CMIP5).
  • Previous studies have compared the reliability of global reanalyses in a variety of regions but very few (if any) have been done in the south central USA.

What this study adds:

  • Observation-based reanalyses can be significantly different from one another and therefore result in varying model biases depending on which is used.
  • Model performance is dependent on region, season, and extreme index, and therefore no single model was found to be best for all situations.
  • Similar models from the same institution tend to contain similar biases.
  • This study provides future projections that show a possible differentiation between the best and worst performing models.

Arctic Weather and Abrupt Sea Ice Loss

Uriel Gutierrez — Texas A&M University
Mentors: Dr. Steven Cavallo and Nicholas Szapiro

September Arctic sea ice extent is decreasing rapidly, especially over the past few decades. While the mechanisms contributing to this climate trend are relatively well understood, the year-to-year variability is not. This study examines two-day decreases in summer sea ice extent to quantify the year-to-year variability that is due to synoptic time-scale processes and to isolate its possible source. It is hypothesized that the abrupt reductions in sea ice are a consequence of synoptic-scale cyclones, and in particular the anomalously strong surface winds over the periphery of the cyclones resulting from a strong pressure gradient.

A spectral analysis of two-day changes in sea ice extent is performed to determine whether events at synoptic time scales contribute significantly to sea ice loss relative to red noise. Several significant periods are found at synoptic time scales: 5, 6, 8, 10, and 16 days. A Butterworth high-pass filter is then applied to retain periods shorter than 18 days, isolating the abrupt sea ice loss events corresponding to these high frequencies and compiling a set of significant events. Taking the top 1% of the high-pass-filtered two-day decreases in sea ice extent, two annual maxima are found, in July and December; only summer cases (June-August) are retained for the present study. Composite sea level pressure of the 25 cases reveals the presence of a 998 hPa mean surface cyclone, which varies in strength from 999 to 978 hPa across cases. While there is always a cyclone, there is often, but not always, a nearby anticyclone that can further enhance the pressure gradient over the sea ice loss region.
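
A minimal sketch of the filtering step, assuming a daily sea ice extent series and a standard SciPy Butterworth design (the filter order and implementation details here are assumptions, not the study's exact configuration):

```python
import numpy as np
from scipy.signal import butter, filtfilt

def highpass_two_day_changes(extent, cutoff_days=18.0, order=4):
    """High-pass filter a daily sea ice extent series so that only
    variability at periods shorter than `cutoff_days` remains, then
    compute two-day changes of the filtered series.
    """
    fs = 1.0                # sampling frequency: one sample per day
    fc = 1.0 / cutoff_days  # cutoff frequency in cycles per day
    b, a = butter(order, fc / (0.5 * fs), btype="highpass")
    # filtfilt runs the filter forward and backward (zero phase lag).
    filtered = filtfilt(b, a, np.asarray(extent, dtype=float))
    two_day_change = filtered[2:] - filtered[:-2]
    return filtered, two_day_change
```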

Link to Final Paper [PDF]

What is already known:

  • The long term climate trend for Arctic sea ice is well established with the ice-albedo effect as a main driver.
  • Wind patterns from inter-seasonal oscillations (such as the Arctic Oscillation) affect sea ice motion and extent.
  • Abrupt sea ice loss events have been observed on time scales of days and coincide with surface cyclones.
  • A better understanding of year-to-year variability is important for improving future predictions of sea ice.

What this study adds:

  • Oscillations in change of sea ice extent (1979-2014) at synoptic time scales were shown to be statistically significant.
  • Synoptic time scale reductions in sea ice extent occur most frequently in July and December.
  • A composite of the top 1% of abrupt sea ice extent loss events revealed strong winds over the loss area.
  • These conditions always occurred with a nearby surface cyclone; enhancement from anticyclones could sometimes also occur.

Impact of Rain Gauge Location Errors on Verification of Radar-Based Precipitation Estimates

Sebastian Harkema — Central Michigan University
Mentors: Dr. Heather Grams and Steve Martinaitis

Flash flooding can cause hundreds of deaths and billions of dollars worth of damage each year. In 2015, there were 176 flash flood fatalities in the United States, Puerto Rico, Guam, and the Virgin Islands, roughly five times the number caused by tornadoes. The Multi-Radar Multi-Sensor (MRMS) system, which generates a 1-km grid of quantitative precipitation estimates (QPE), can provide insight to forecasters when issuing flash flood warnings. The most accurate data are needed for the high spatial resolution of MRMS. Rain gauges are treated as ground truth and can provide the most accurate verification of QPE. The most well-known gauge network is the 838 rain gauges from Automated Surface Observing System (ASOS) stations. It is standard to accept that QPE values can vary from collocated observed gauge values; however, location errors of the rain gauges can have an impact on the verification of MRMS QPE. Using Google Earth, ASOS location errors were determined to vary from less than 3 m to 80,163 m. The location errors resulted in 79.31% of ASOS stations in the CONUS being in a different MRMS QPE grid box. Of those stations, 19.44% were found more than 1 km away from their expected locations. QPE values for the new and old locations were compared to observed precipitation data, with the correlation increasing from 0.777 to 0.810. This comparison highlights the need to update rain gauge metadata to improve the verification of radar-based QPE and other hydrometeorological products.
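
To see why small location errors matter at this resolution, consider a sketch of the grid box test; the grid origin and spacing below are illustrative placeholders, not the actual MRMS grid specification:

```python
# Assumed regular lat/lon grid with ~0.01 degree (~1 km) spacing.
LAT0, LON0 = 55.0, -130.0   # assumed northwest corner of the domain
DLAT, DLON = 0.01, 0.01     # assumed grid spacing in degrees

def grid_box(lat, lon):
    """Row/column of the grid box containing a point (assumed grid)."""
    row = int((LAT0 - lat) / DLAT)
    col = int((lon - LON0) / DLON)
    return row, col

def same_box(metadata_coords, surveyed_coords):
    """True if the metadata and surveyed gauge locations fall in the
    same grid box; False means QPE verification would compare the
    gauge against the wrong grid cell."""
    return grid_box(*metadata_coords) == grid_box(*surveyed_coords)
```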

Link to Final Paper [PDF]

What is already known:

  • The Multi-Radar Multi-Sensor (MRMS) system, which generates a 1-km grid of quantitative precipitation estimates (QPE) products, can provide insight to forecasters when issuing flash flood warnings.
  • Rain gauges are treated as ground truth and can provide the most accurate verification of radar-based QPE.
  • Verification and data accuracy using gauges is only as good as the instrumentation and observations. Even with the advancement of technology, inaccuracies of rain gauges persist and are well documented.

What this study adds:

  • It is standard to accept that radar-based QPE values can vary from collocated observed gauge values; however, location errors with rain gauges can have an impact on the verification of MRMS QPE.
  • A majority of Automated Surface Observing System (ASOS) stations in the Continental United States (CONUS) were found to be in a different MRMS grid box than the one indicated by the latitude and longitude in their original metadata.
  • New QPE values based on the new latitude and longitude coordinates had better correlation with the observed precipitation than those QPE values based on the original metadata locations.

Quantifying the Carbon Footprint on Paris with Remote Sensing Observations

Briana Lynch — University of Massachusetts Lowell
Mentor: Dr. Sean Crowell

Understanding the complex temporally and spatially varying carbon dioxide (CO2) emissions in urbanized areas is crucial to identifying causes of climate change and how they can be addressed. While many studies have been conducted to better quantify urban CO2 (and other pollutant) fluxes, there are still many open questions about the interpretation of these measurements and how emissions scale with population, as well as how to attribute concentrations measured in urban environments to anthropogenic and natural sources and sinks. After a six-month deployment (November 2015-April 2016) in Paris, France, data from the Greenhouse gas Laser Imaging Tomography Experiment (GreenLITE™), an observing system that combines laser-based differential absorption spectroscopy measurements with tomographic techniques to create a two-dimensional map of CO2 concentrations, were interpreted and analyzed. An evolution equation for CO2 mixing ratio within the Planetary Boundary Layer (PBL) was applied to identify and separate sources and sinks of CO2. Results from this analysis directly verify the impacts that wind speed and direction have on CO2, namely dilution and enhancement. Preliminary analysis characterized the relationships between CO2 and nitrogen dioxide (NO2) and ozone (O3). GreenLITE™ proves to be an accurate measuring tool for CO2, but further interpretation and analysis of the data are necessary to estimate the emissions of Paris.
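
The abstract references an evolution equation for the CO2 mixing ratio in the PBL without reproducing it. A common box-model form of such a budget, shown here only as an illustration of the kind of balance involved (not necessarily the equation used in the paper), is:

```latex
% Scalar budget for a well-mixed boundary layer (illustrative form)
\frac{\partial C}{\partial t} =
    \underbrace{\frac{F_s}{\rho\,h}}_{\text{surface flux}}
  - \underbrace{u\,\frac{\partial C}{\partial x}}_{\text{advection}}
  + \underbrace{\frac{w_e\,(C_{\mathrm{ft}} - C)}{h}}_{\text{entrainment}}
```

where C is the CO2 mixing ratio, F_s the surface emission flux, ρ the air density, h the PBL depth, u the wind speed, w_e the entrainment velocity, and C_ft the free-tropospheric mixing ratio.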

Link to Final Paper [PDF]

What is already known:

  • Previous studies have shown that urban areas have a larger amplitude seasonal cycle of carbon dioxide concentrations.
  • The basic interaction between atmospheric dynamics and pollutants in the planetary boundary layer has been studied, but its contribution to urban environments is not well understood.
  • Near surface horizontal lidar instruments quantify carbon dioxide concentrations for the boundary layer more effectively than airborne and space-borne measurements, since the column average is less sensitive to the boundary layer.
  • Disaggregation of natural and anthropogenic sources of CO2 is difficult and relies on existing inventories to interpret measurements.

What this study adds:

  • This study used a new observational technique to determine carbon dioxide concentration sources and sinks.
  • This work confirmed that the Greenhouse gas Laser Imaging Tomography Experiment (GreenLITE) made science-quality measurements for six months in an urban environment.
  • A preliminary result shows that nitrogen dioxide is positively correlated with carbon dioxide, while ozone seems to be anti-correlated with carbon dioxide, though the mechanism is still not understood.

Modeling the Physical, Dynamical, and Chemical Characteristics of Extreme Extratropical Convection in the Upper Troposphere and Lower Stratosphere

Russell Manser — St. Cloud State University
Mentors: Dr. Cameron Homeyer and Daniel Phoenix

Stratosphere-troposphere exchange via extreme extratropical convection has implications for climate change. We test the ability of the ARW-WRF model to simulate the physical aspects of a real case of extreme extratropical convection that injected cloud particles into the stratosphere. We find that the model resolves storm structure sufficiently, and proceed to examine the representation of trace gas transport within the same case of convection. Additionally, distributions of trace gas concentrations across the nested model domain are considered in diagnosing irreversible transport. Trace gas transport is seen in model output within the cloud, but little evidence exists for out of cloud transport.

Link to Final Paper [PDF]

What is already known:

  • Convection that penetrates the tropopause transports gases into the stratosphere.
  • Numerical models can resolve gravity wave breaking and lofting of cirrus clouds in convection, which is a mechanism for transporting gases into the stratosphere.
  • Stratosphere-troposphere exchange impacts the chemistry of the upper troposphere and lower stratosphere, the radiation budget through modification of greenhouse gases and, in turn, climate.

What this study adds:

  • The ARW-WRF model coupled with chemistry is capable of simulating a real case of convection that penetrates the tropopause and has an above-anvil cirrus plume.
  • Numerical models can resolve the irreversible transport of trace gases into the stratosphere. While water vapor is enhanced at all levels the cloud reaches in the stratosphere, transport of carbon monoxide (a tropospheric pollutant) is limited to the maximum height of the simulated radar reflectivity echo top, which typically lies 2 km below the cloud top.

Climate Change Hazards: Extreme Precipitation Events & Flooding in Oklahoma’s Tribal Nations

Kristina Mazur —  Rutgers University
Mentors: April Taylor, Dr. Esther Mullens, and Dr. Derek Rosendahl

Extreme weather hazards are important because they can result in the loss of life, destruction of property, and damage to the environment. It is therefore important to understand how the frequency and intensity of these hazards may change in the future due to climate change. For this study, we address extreme precipitation across Oklahoma, specifically the area encompassed by the Citizen Potawatomi Nation, Choctaw Nation of Oklahoma, and Chickasaw Nation. Based on interviews with tribal emergency managers of these three nations, flooding was indicated as a particularly impactful hazard. As a result, this project develops nation-specific projections of heavy precipitation, using various metrics to quantify extremes. High-resolution (~6 km) statistically-downscaled climate model projections from the Multivariate Adaptive Constructed Analogues (MACA) project were used to assess future changes in heavy precipitation. The results of the future projections show that the average daily precipitation rate does not change at mid-century (2021-2051) or late century (2060-2090); however, heavy precipitation at higher thresholds is likely to increase over these tribal nations in Oklahoma for the same time periods. This was evident in the trends identified in the frequencies of daily totals exceeding 2 inches and 4 inches, and of 5-day accumulations exceeding 8 inches. Overall, this research will assist tribal emergency managers in planning for and mitigating potential impacts of floods.
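
A minimal sketch of how such threshold frequencies might be tallied from daily model output (the thresholds come from the abstract; the function name and unit handling are illustrative):

```python
import numpy as np

IN_TO_MM = 25.4  # millimeters per inch

def heavy_precip_counts(daily_precip_mm):
    """Count heavy-precipitation events in a series of daily totals (mm):
    days exceeding 2 in and 4 in, and 5-day windows exceeding 8 in."""
    p = np.asarray(daily_precip_mm, dtype=float)
    days_over_2in = int((p > 2 * IN_TO_MM).sum())
    days_over_4in = int((p > 4 * IN_TO_MM).sum())
    five_day = np.convolve(p, np.ones(5), mode="valid")
    windows_over_8in = int((five_day > 8 * IN_TO_MM).sum())
    return days_over_2in, days_over_4in, windows_over_8in
```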

Link to Final Paper [PDF]

What is already known:

  • Flooding caused by extreme precipitation events is one of the top hazards affecting the Citizen Potawatomi Nation, Choctaw Nation of Oklahoma, and Chickasaw Nation.
  • Climate change can lead to more extreme weather events such as extreme precipitation and flooding.
  • Patterns such as the intensity, frequency, timing, and duration of extreme weather events are changing as a result of climate change.

What this study adds:

  • This study provides tribal nations with future climate projections to help them prepare for and mitigate damages caused by extreme precipitation events.
  • Tribal nations in the future should expect an increase in extreme rainfall events, rather than an increase in the average daily precipitation rate.
  • There is reasonable confidence in extreme precipitation changes across this region; however, the 15 models displayed a range of possible outcomes.

The Impact of a Violent Tornado in Norman, Oklahoma

Karen Michelle Montes Berríos — University of Puerto Rico, Rio Piedras Campus
Mentors: Dr. Ashton Robinson Cook, Amber Cannon, Somer Erickson, and Dr. Mark Shafer

Several previous studies have estimated impacts from significant tornadoes in large metropolitan areas like Chicago and Dallas-Fort Worth. Such a study has not been completed for the Oklahoma City/Norman, Oklahoma, area, despite its residence in one of the most tornado-prone areas in the world. Norman has had several close calls with violent tornadoes in recent years, including the May 24, 2011 Chickasha-Newcastle and Dibble-Goldsby EF4 tornadoes, the May 10, 2010 Little Axe tornado, the May 19, 2013 Shawnee/Bethel Acres EF4 tornado, and the May 20, 2013 Moore EF5 tornado. Norman has been rather fortunate with regard to significant tornadoes, which have largely avoided the most densely populated areas of the city.

The current study investigates the potential of a violent tornado impacting the most densely populated areas of the city of Norman. In order to evaluate this impact, a simulated tornado track was created by transposing the May 24, 2011 Chickasha-Newcastle EF4 tornado track into the most populated areas of Norman using ArcGIS software. GIS datasets provided by state and local governments, including the locations of buildings within Norman, were analyzed to assess specific impacts on critical infrastructure, commercial, and private residences.
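
A minimal sketch of the track-transposition idea, using shapely in place of ArcGIS (the study used ArcGIS; the offsets and data structures here are illustrative assumptions):

```python
from shapely.affinity import translate

def transpose_track(track_polygon, dx_deg, dy_deg):
    """Shift a tornado damage-path polygon by (dx, dy) in degrees."""
    return translate(track_polygon, xoff=dx_deg, yoff=dy_deg)

def impacted_buildings(track_polygon, building_footprints):
    """Building footprints that intersect the transposed track."""
    return [b for b in building_footprints if track_polygon.intersects(b)]
```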

Results from this study indicate that structures impacted directly by this tornado have a cumulative value of approximately $800 million. This figure does not incorporate other peripheral losses (e.g., vehicles, power poles/street markers, or the contents of homes), nor does it incorporate damage to businesses/commercial infrastructure. Additionally, five city government buildings were directly impacted, as well as nine schools and the main hospital in the city. Several major highways in Norman are also included in the damage path, with likely traffic jams on these roads similar to those in past tornado events in the region. A total of 8,186 buildings were affected by the simulated tornado in this study, nearly double the 4,253 buildings damaged in the May 20, 2013 Moore, Oklahoma tornado. It is conceivable that losses in Norman could easily exceed the $2 billion in damages that occurred in the Moore tornado, suggesting a potential worst-case scenario for the region.

Link to Final Paper [PDF]

What is already known:

  • Previous studies with geographic information systems (GIS) have estimated natural hazards’ impacts in metropolitan areas, like in Chicago and Dallas-Fort Worth.
  • Oklahoma City and Moore have a history of tornadoes in their metropolitan areas, but some neighboring communities do not.
  • GIS analyses can help emergency management officials, meteorologists, and sustainability researchers to effectively prepare for and respond to a disaster because they incorporate infrastructure and land use information.

What this study adds:

  • This study analyzes the impact of a 1.07-mile-wide EF4 tornado path through central Norman, Oklahoma.
  • Damage to residential areas was calculated using Z Estimates, and showed a potential cost of over $1B.
  • Key locations throughout the tornado track were evaluated for their sustainability and vulnerability according to their position.

Verification of Automated Hail Forecasts From the 2016 Hazardous Weather Testbed Spring Experiment

Joseph Nardi — Carleton College
Mentors: Dr. Amy McGovern, Dr. Nate Snook, and Dr. David John Gagne II

Every spring, the Storm Prediction Center (SPC) and the National Severe Storms Laboratory (NSSL) run an experiment in the Hazardous Weather Testbed to improve the prediction of severe weather. One of the major goals of the experiment is to forecast individual hazards, such as hail. These hail forecasts are run on the Center for Analysis and Prediction of Storms (CAPS) mixed-physics ensemble, which uses the Advanced Research Weather Research and Forecasting (WRF-ARW) numerical weather prediction model with 9 ensemble members and 3-km horizontal grid spacing. Automated hail forecasts are produced for a 24-hour period using three different methods: HAILCAST, the Thompson Hail Size Method, and the Gagne Machine Learning Method.

To verify the three hail forecasting methods, neighborhood ensemble probabilities are calculated over a 24-hour period for both 25 mm and 50 mm hail. The forecasts are verified against data from the NSSL Multi-Radar Multi-Sensor (MRMS) radar mosaic using the Maximum Expected Size of Hail (MESH) product. Relative Operating Characteristic (ROC) curves and attributes diagrams were created, and the ROC area under the curve (ROC AUC) and Brier Skill Score were calculated. A case study of May 26, 2016 was performed; on this day a large complex of storms moved over Nebraska, Kansas, Oklahoma, and Texas, producing 204 reports of severe hail, 183 reports of severe wind, and 21 tornado reports.
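
A minimal sketch of two of the verification quantities named above, under assumed array layouts (the experiment's actual implementation may differ):

```python
import numpy as np
from scipy.ndimage import maximum_filter

def neighborhood_ensemble_probability(members, threshold, radius_px):
    """Fraction of members exceeding `threshold` anywhere within a
    square neighborhood of half-width `radius_px` grid points.

    members: array of shape (n_members, ny, nx) of forecast hail sizes.
    """
    size = 2 * radius_px + 1
    exceed = np.stack([maximum_filter(m, size=size) >= threshold
                       for m in members])
    return exceed.mean(axis=0)

def brier_skill_score(forecast_prob, observed, base_rate):
    """Brier Skill Score relative to a constant-climatology reference.

    forecast_prob: forecast probabilities in [0, 1].
    observed:      binary outcomes (1 = verifying MESH exceeded threshold).
    base_rate:     reference probability, e.g., the sample base rate.
    """
    bs = np.mean((forecast_prob - observed) ** 2)
    bs_ref = np.mean((base_rate - observed) ** 2)
    return 1.0 - bs / bs_ref
```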

Overall, the Gagne Machine Learning Method has greater skill, in terms of the Brier Skill Score, than the other two hail forecasting methods. The Gagne Machine Learning Method also exhibits better discrimination for 25 mm hail in terms of the ROC AUC score. Lastly, the Gagne Machine Learning Method consistently performs well across all microphysics schemes because it is calibrated on each scheme. For the May 26, 2016 case study, the Gagne Machine Learning Method exhibited greater capability to predict hail exceeding 25 mm in diameter while producing relatively few false alarms.

Link to Final Paper [PDF]

What is already known:

  • It is challenging to forecast severe hail due to uncertainties in numerical weather prediction models and observations.
  • HAILCAST, a popular hail prediction model, shows considerable skill in its ability to forecast hail size.
  • Machine learning approaches show some advantages over physics-based hail forecasts.

What this study adds:

  • The Gagne Machine Learning Method has slightly higher skill and discrimination in both the forecasts of 25 mm and 50 mm hail than HAILCAST or the Thompson Hail Size Method.
  • HAILCAST performed better at forecasting hail greater than 50 mm in the case study; however, it also has a greater false alarm rate.
  • The Gagne Machine Learning Method is more consistent over all the microphysics schemes as the model is calibrated to each microphysics scheme.

Analysis of Anti-Ice Coatings on Field Operational Anemometers

Jamin Rader — University of Washington
Mentor: Brad Illston

Ice accumulation on anemometers, a side effect of freezing precipitation, makes reliable wind measurements nearly impossible to collect during winter conditions. Over the last decade, the Oklahoma Mesonet has lost more than 26 days’ worth of wind measurements at its Norman, Oklahoma, USA location as a result of this freezing precipitation. This study tested the reliability of two anemometers with anti-ice technologies during icing conditions: an R. M. Young Wind Monitor coated in NeverWet™, a superhydrophobic coating, and an R. M. Young Alpine Wind Monitor. Wind measurements collected between 19 Nov. 2013 and 30 Nov. 2015 showed little difference between the performance of the anemometers with anti-ice technologies and an unaltered R. M. Young Wind Monitor during six periods of freezing precipitation. At best, the Alpine anemometer remained iced for 40 fewer minutes than the uncoated anemometer (0.7% of the length of the freezing precipitation event) and the coated anemometer remained iced for 80 fewer minutes (5.1% of the length of the freezing precipitation event). In these six events, the anti-ice technologies did not prove to be more reliable alternatives to the R. M. Young Wind Monitor during freezing precipitation, and their implementation would not provide sufficient benefit for operational use in the Oklahoma Mesonet.

Link to Final Paper [PDF]

What is already known:

  • Norman, Oklahoma, has lost more than 26 days of wind data over the last decade due to ice accumulation on anemometers—a product of freezing precipitation.
  • Many have researched the success of superhydrophobic coatings as anti-ice technologies, though no effective solutions have been found.

What this study adds:

  • An R. M. Young Alpine Wind Monitor, made for winter conditions, and an R. M. Young Wind Monitor coated in NeverWet were not successful as anti-ice technologies during six freezing precipitation case studies in Norman, Oklahoma.
  • Neither of these technologies was deemed beneficial for operational use in the Oklahoma Mesonet.


Copyright © 2016 - Board of Regents of the University of Oklahoma