Forecaster Thoughts – Bill Martin (2010 Week 4 – CASA)

I spent last week in Norman at NSSL and the Hazardous Weather Testbed helping to evaluate how the CASA radar network can be used in operations.  In addition to the radar people in Norman, I got to work with systems engineers from the Univ. of Virginia who are studying how forecasters make use of information and software tools.

As you may recall, the CASA radar network consists of four relatively low-powered radars in southwest Oklahoma designed to work collaboratively.  A much larger network has been envisioned; I was told that a national network would require 10,000 such radars.  The close spacing of the radars allows them to see the lower levels of the atmosphere much better than 88Ds can, and allows them to be closer to targets and, thus, have considerably better resolution than a typical 88D (though an 88D has better resolution for targets close to it).  The collaborative aspects of the network include things like dual-Doppler analysis.  A large network would be able to give 2-D wind vector fields, instead of just toward-and-away wind values.  This would reduce the intellectual load of radar interpretation quite a bit.  Disadvantages of the network include attenuation and data-quality problems (which would be mitigated by a larger network), and cost.  Each of the prototypes cost around $250K plus maintenance, though this would presumably come down with mass production, if it ever came to that.
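The dual-Doppler idea mentioned above is just geometry: each radar measures only the component of the wind along its own beam, so two radars viewing the same point from different azimuths give two equations in the two unknowns (u, v).  A minimal sketch of that retrieval (a hypothetical helper, not CASA's actual analysis code) might look like:

```python
import math

def dual_doppler_wind(vr1, az1_deg, vr2, az2_deg):
    """Recover the 2-D horizontal wind (u, v) from radial velocities
    measured at the same point by two radar beams at different azimuths.

    Each beam sees vr_i = u*sin(az_i) + v*cos(az_i), with azimuth
    measured clockwise from north (meteorological convention).
    Illustrative only -- real retrievals must also handle geometry,
    timing, and data-quality issues.
    """
    a1, a2 = math.radians(az1_deg), math.radians(az2_deg)
    # Solve the 2x2 linear system by Cramer's rule.
    det = math.sin(a1) * math.cos(a2) - math.cos(a1) * math.sin(a2)
    if abs(det) < 1e-6:
        # Beams nearly parallel: the cross-beam wind component
        # is unobservable, which is why radar spacing matters.
        raise ValueError("viewing angles too close for dual-Doppler")
    u = (vr1 * math.cos(a2) - vr2 * math.cos(a1)) / det
    v = (math.sin(a1) * vr2 - math.sin(a2) * vr1) / det
    return u, v
```

For example, a pure 10 m/s westerly wind viewed by one beam pointing east (azimuth 90°) and one pointing north (azimuth 0°) gives radial velocities of 10 and 0 m/s, and the solver recovers u = 10, v = 0.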

CASA radars are sometimes considered as gap-filling radars, and they could certainly fill this role.  However, gap-filling radars have been available from vendors for some time, and the CASA project was designed to be more than that through its collaborative properties.  Core funding for CASA has been from the NSF and has another 2 years to run.  After this, progress may come more slowly as funding for different aspects of CASA becomes diffuse, unless a source of new funding is identified.  I was involved in CASA from the beginning, having attended the NSF review that originally funded the project (as a graduate student).

As there was no active weather the week I was there, most of the time in the Testbed was spent playing back archived cases and issuing experimental warnings based on the CASA data, in addition to the usual data.

Some of the interesting issues that came up:

–The systems engineering people were fascinated by the fact that all the forecasters they had evaluated used the available information differently.  I’m not sure if that is good or bad.  On the one hand, it is good to have variety so that new ideas can come to light; on the other hand, for some things there are probably “best” ways to proceed.

–WDSS II versus AWIPS.  The WDSS II software was used to visualize the data.  It was much more sluggish and difficult to use than D2D.  The FSI plug-in we use in D2D is a subset of WDSS II.  For operations, we need fast and highly responsive access to data, so I recommended WDSS II be redesigned to be more efficient.  They had recently gotten D2D to work with real-time CASA data, and it was good to have both available so I could show them that software for looking at radar data can actually be zippy.

–Having high-resolution data routinely available allows tornadoes to be discriminated based on reflectivity signatures.  I believe this would be a relatively new concept in operations.  The reflectivity “donut” associated with tornadoes that is seen in high-res research radars has been recognized for some years as verification of a tornado.  “Donuts” or similar features were seen in all tornado cases available with CASA; such features are rarely seen in 88Ds due to their typically lower resolution.  With super-res data in the 88Ds, though, I suspect tornado reflectivity features are now seen more often in 88Ds.  The TVS algorithm we currently use relies only on velocity information, and many forecasters do likewise; however, it is becoming clear that greatly improved detection can be achieved by considering both velocity and reflectivity signatures.
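The combination of cues described above could be sketched as a toy decision rule.  This is purely illustrative: the function name, inputs, and thresholds are placeholders of my own, not operational values or any actual algorithm.

```python
def combined_tornado_flag(delta_v, donut_depth_dbz,
                          v_thresh=40.0, donut_thresh=10.0):
    """Toy rule combining a velocity cue (gate-to-gate velocity
    difference across a couplet, m/s) with a reflectivity cue
    (depth of the 'donut' -- the dBZ drop at the center of a
    reflectivity ring).  Thresholds are illustrative placeholders.
    """
    velocity_cue = delta_v >= v_thresh
    reflectivity_cue = donut_depth_dbz >= donut_thresh
    # A velocity-only rule (like the current TVS algorithm) would
    # return velocity_cue alone.  Requiring both cues trades some
    # detections for fewer false alarms; an either-cue rule would
    # do the opposite.
    return velocity_cue and reflectivity_cue
```

The point is simply that once high-resolution reflectivity features are routinely available, they become a second, independent line of evidence that can be fused with the velocity signature rather than ignored.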

–Data overload.  CASA radars give a volume scan every minute, there are 4 CASA radars to look at, and there are 2-D wind analyses and short-term forecasts to look at as well, in addition to all the usual things.  It is very difficult to keep up with all these data sources and simultaneously make warning decisions.  The data overload problem is recognized as an issue with many new data streams.  Possible solutions include greatly improved algorithms to handle some, or most, of the analysis, and putting all the data from different sources into some sort of combined 4-D space that can be perused (similar to the FAA’s 4-D cube).  With a 4-D cube concept, a short-term forecast can be combined with the data in the same 4-D space to show an extrapolation (similar to the warn-on-forecast concept).

–Using CASA radars did help quite a bit in issuing warnings because of improved resolution of features, because of seeing closer to the ground, and because of better time resolution.  Having a dense network of CASA radars (with good software tools for analysis) would be quite an advance.  Of course, doubling the density of the 88D network might achieve many of the same goals, and it is really a question of cost-effectiveness.

A couple other things I learned on the trip:

–The MPAR (Multi-function Phased Array Radar) is scheduled for a large increase in funding next year.  This is mostly to prove the concept of dual-pol phased array, which hasn’t been done before.  A phased-array radar network is envisioned as a potential replacement for the 88D network.  This one network would be used by multiple agencies, including the NWS, the FAA for air traffic control, and DHS.  For this concept to be palatable to the NWS, the replacement for the 88D network should be at least close in performance to the current 88D network, and this includes dual-pol.

–NOAA is developing a roadmap for radar that extends through 2025.  I suspect this is fairly fluid, but ideas include MPAR, gap-filling radars, and integrating private-sector radars (TV stations), as well as assimilating radar data for warn-on-forecast.  The only thing really firm is the dual-pol deployment over the next 3 years.

Bill Martin (Science and Operations Officer, NWS Glasgow MT – 2010 Week 4 Evaluator)
