Forecaster Thoughts – Jonathan Howell (2008 Week 4)

The Hazardous Weather Testbed (HWT) provided an excellent opportunity for forecasters to work side by side with researchers to evaluate radar technology and warning strategies developed for future use.  While participating in the experiment, I had the opportunity to evaluate the Phased Array Radar (PAR) and the Collaborative Adaptive Sensing of the Atmosphere (CASA) radars in both real-time and archived environments.  Our group also tested a new experimental process which integrated short-term severe threat probabilities into operational severe thunderstorm and tornado warnings.  The outcomes of the experiments demonstrated potentially significant advantages and challenges if these systems are implemented in an operational weather forecast office (WFO) setting.

My experiences with the PAR equipment were very positive.  This technology will likely exceed the capabilities of the current WSR-88D radar network.  The primary advantage of the PAR technology is the rapid update of base radar data.  During the experiment, I was able to rapidly identify important convective structural features, which led to quicker warning decisions and longer warning lead times.  Another important advantage of the rapidly updating PAR data was the ability to identify storm features in the 1-minute PAR data that would not have been sampled by the 5-minute WSR-88D scans.  In addition, the 1-minute update PAR data better portrayed storm evolution.  This was particularly advantageous during an archived, unorganized pulse severe thunderstorm event during which we used the PAR to conduct warning operations.  As a warning decision maker, I was able to observe the initiation of collapsing cores aloft and issue warnings with enhanced lead time.  The current scan limitations of the WSR-88D likely would have further delayed my decision to warn during this event.

The advantages of the PAR exceed any limitations that I experienced.  The only minor limitation is the need for radar operators to adjust to the much faster influx of radar data.  I think this will primarily be an initial challenge that warning forecasters will overcome quickly.  In fact, I felt comfortable with the increased radar data flow after a few days of use.  Overall, my experiences using the PAR were very positive, and I hope that this technology will eventually be implemented in NWS field offices.

Another radar system we experimented with was the CASA radar network.  The CASA radars also proved to be a robust network that provided advantages to warning forecasters during experimental warning operations.  As is the case in most NWS County Warning Areas (CWAs), including the Memphis CWA, sampling low-level storm characteristics at extended distances from the radar is very difficult or impossible with the current WSR-88D network due to radar spacing and the curvature of the earth.  This greatly limits warning forecasters' ability to observe important low-level storm features at large distances from the radar.  The greatest advantage that I see to implementing the CASA radar network is the ability to limit gaps in low-level radar coverage.  This will provide warning forecasters with the improved radar information necessary for longer lead times and improved warnings.  In addition, enhanced low-level radar information allows warning forecasters to better define geographically where the greatest storm threats exist.  While participating in the HWT, I experienced these benefits firsthand.  The CASA network would likely be best implemented as a complement to the PAR radar network.  By locating CASA radars in between the traditional PAR (current WSR-88D) sites, the system would become very robust and most beneficial to the NWS warning process.

The HWT also tested the capabilities and practicality of probabilistic warnings.  Probabilistic warnings appear to present the greatest challenge of all the new techniques tested.  Researchers envision probabilistic warnings eventually replacing current NWS warning practices.  As a forecaster using the new probabilistic warning technique, I found the process difficult to employ and likely confusing to the public.  The primary limiting factors of probabilistic warnings, in my opinion, include (1) quantifying specific threats and expressing those threats in a proper manner to the public, (2) warning forecaster workload issues, and (3) public response problems associated with different threat percentages.  The NWS mission statement clearly reflects the important role that severe weather warnings play in protecting life and property.  The primary reason that the NWS issues severe weather warnings is to encourage the public to take actions required to protect themselves from dangerous weather.  I believe that eliciting public response to probabilistic warnings will be a significant challenge since every person's threat threshold is different.  Probabilistic warnings may create confusion and limit public response to warnings, and should primarily be available only to very high-end users (if they can understand the process and find it beneficial).

Overall the HWT was a great opportunity for me to evaluate potential NWS technology of the future.  Collaboration between researchers and operational forecasters is a great way to share ideas, provide feedback, and get useful technology into NWS field offices.  The HWT and similar collaborative experiments hopefully will continue into the future.  I look forward to the eventual release of new and improved technology into the field.  Finally, I would like to thank those involved for giving me the opportunity to participate in this experiment and hope to again be involved in similar projects in the future.

Jonathan Howell (NWS Memphis TN – Week 4 Participant)

Forecaster Thoughts – David Hotz (2008 Week 3)

First, thanks for the opportunity to provide feedback on the various research programs. Your work is crucial to the future success of the National Weather Service.

I appreciate your willingness to listen to my viewpoints about the CASA, PAR, and PROBWARN programs. Even though we did not see eye-to-eye on some of the issues related to PROBWARN, I do believe we must explore innovative ways of expanding our services to the EMAs. I feel the future of the local WFOs will be to expand our short-term warning and forecast programs to our customers. Our daily relationships with our core customers must be closer, either face-to-face or through up-to-date technologies.

I commend you and your staff for your dedication, hard work, and hospitality. Thanks again for allowing me to participate.

David Hotz (NWS Morristown TN – Week 3 Participant)

Forecaster Thoughts – Bill Rasch (2008 Week 2)

Coming into this experiment, I truly did not know what to expect. I did my homework as best I could but was still concerned that I had not learned all the acronyms correctly and would, in the long run, not be able to add much to the experiment. By the end of the first day, however, any worries I had were put to rest by the great crew at the EWP, who made me feel comfortable and sure I was prepared for the battle of the coming week.

Where to start…let me start with CASA. Being from a Western Region office that has noticeable radar data gaps, these were the folks I drifted to immediately, as I wanted to learn more about their system. After simulations and a presentation of the CASA data, I was quickly blown away. I felt as if I was watching some different radar data world when viewing the data. To me it made the returns appear as if they were living things. The high resolution and rapid refreshing of the data made it appear as if you were watching weather in High Definition. All I could think of was to fill our radar gaps with this type of data. There is no doubt in my mind that additional lives and property would be saved if this system were added to the NWS operational forecaster's suite of tools. I was also very impressed by what the CASA team had developed with their adaptive scanning strategies. I could see no flaw in this technique during the time I was involved with CASA.

From an operational aspect, the most challenging thing that I could think of regarding CASA was how this data would be presented to NWS forecasters. In my opinion, the fewer software sources a forecaster has, the more efficient they will be in the long run. Since the CASA data comes from the lowest levels of the atmosphere, would this data be “appended” to 88D data? Or would it actually be a different source that forecasters would view during an event? I guess I would have to see how this is done, but to me, if it can somehow be morphed together with the 88D data, it would be better. But, of course, I sure would like to just have this problem. We’ll see what time brings!

On to the PAR. I found it incredible what the PAR folks had accomplished in taking a military radar and converting it into something that closely resembled 88D data (better, of course). Like the CASA, I was very impressed but at the same time disappointed, knowing that this technology was out there and NWS forecasters could not use it right now. With the rapid refreshing of the data, viewing the data again had an HD type of effect.

I was lucky enough to be in front of the PAR during a cold pool weak tornado event in the Norman area. Probably most notable during this event, and while viewing archived data, was what I would call “brain overload”. This was a common feeling among the other forecasters at the experiment when I was there. The rapid scanning delivered so much more data than we were accustomed to from the 88D that one tended to tire more quickly. I think operational forecasters may get used to this, but maybe not, and it may add to fatigue compared to viewing the current 88D data set. A possible simple solution is for the software to dynamically adjust to each user's preference for this effect. Of course, the more data we have the better, but real-time viewing of this data will probably differ from forecaster to forecaster. Or, as they become more familiar with the data, they could probably become more adapted to it.

On to probabilistic warnings. Now this was something I had to prepare for but really had no idea what to expect. After getting used to the warning software, I was very surprised that this technique of warning was not as challenging for me as I thought it would be. As a matter of fact, it was really fun and I enjoyed the challenge. I don’t know why; maybe it was the fact that it was a different type of challenge for me. I do suspect that the skills to produce these types of warnings will vary from forecaster to forecaster, but that will always be present for operational forecasters. There are still numerous unanswered questions regarding this technique, but I think the EWP group is on the right track toward solving them, along with the questions that have not popped up yet.

As far as the unanswered questions, there are numerous, but here are just a few I can remember. Exactly how does verification occur? At what threat value do probabilistic warnings begin being issued? It would likely be useful for users to know when storms have “no threat” at all. Does this mean we need to provide probabilistic forecasts for nearly all thunderstorms? How much automation can be provided to the process with the help of climatology, near-storm conditions, algorithm output, etc.?

In summary, my experience at the EWP was totally fulfilling. It was a privilege for me to partake in the experiment and interact with the friendly, intelligent, and very professional folks at the NWC. In my opinion, getting NWS operational forecasters involved in this process (not just MICs, WCMs, and SOOs) is a great way to go. I really felt that my input was appreciated and would be taken seriously to possibly improve everything that we were involved with. I have high confidence these experiments will result in something beneficial to the NWS and its users. Thanks to everyone that I met. I only wish I could come back again, but at the same time, I realize how important it would be for others to be involved in this process.

Bill Rasch (NWS Billings MT – Week 2 Participant)

Forecaster Thoughts – Mike Cammarata (2008 Week 1)

Here is a summary of my experience in the 2008 EWP Spring Experiment:

  • I was most impressed with CASA data
    • High resolution of storm scale features and rapid refresh of data
    • Adaptive scan strategy
    • Overcomes horizon problem
    • Overlapping radar coverage
    • Attenuation was a problem
    • Noisy data was also a problem but it appears that efforts to mitigate this were working
    • Wish we could have used the CASA network in real time
  • Was not as impressed with the PAR data (did not see much difference in comparison to the 88D)… but must remember that an operational PAR will have better spatial resolution.
    • Got to use the PAR in real time for a supercell outbreak
    • Rapid rate of incoming data was both an advantage and a challenge…was difficult to keep up with the incoming data
    • FSI was useful with the PAR during this event but slowed the system down

  • Rapid rate of incoming data will be a significant challenge for forecasters
    • Will have to learn/be trained on how to selectively interrogate data
    • Will require more help from algorithms
    • Forecasters will fatigue more quickly

  • Poor system performance and lack of familiarity with WDSSII made it difficult for me to focus on evaluating the data
    • Would be better if data could be viewed on AWIPS (perhaps AWIPS II)
    • Familiar procedures and color curves would help even more
    • The system had trouble keeping up with the incoming data (performance slowed considerably)

  • I feel that interviews rather than surveys would be the best for getting feedback. Would lead to better (quality), more (quantity), and more targeted (specific/focused) communication.

  • During the real time event there was a lot of commentary from individuals in the area of the workstation (PAR) and SA display. I found this to be somewhat distracting.

  • Everyone that I interacted with during the evaluation was extremely helpful. I am thankful for and appreciate everyone’s hard work and helpfulness.

  • Not sure what to say about gridded probabilistic warnings. I think this is a direction we need to go, but the approach during this evaluation was very subjective. Ultimately this has to be much more objective to get consistency between forecasters and events. Both users and forecasters will need to have a better understanding of what these warnings mean. That said, I thought the software was a good tool for drawing the warnings.

I am thankful for the opportunity to participate.

Mike Cammarata (NWS Columbia SC – Week 1 Participant)

Forecaster Thoughts – David Blanchard (2008 Week 1)

Monday: 04/28/08

Started the day with a weather briefing and overview of the warning program. This was followed by a tour of the National Weather Center, which houses both OU and NOAA components.

A severe weather event was already in progress in the eastern portion of Virginia and North Carolina. Instead of getting formal training on the probabilistic warning software, we jumped right in to take advantage of the situation—especially since it was obvious that the severe weather was moving offshore and out of our data domain. Afterwards, we reverted back to the more formal training of the WDSSII software.

The software is pretty amazing in that we can view data from any radar in real time using standard tilt sequences, CAPPIs, and cross sections. It is similar to GR2AE but can combine multiple radars.

Tuesday: 04/29/08

After the briefing (no convective/severe weather expected today), we broke into groups. I worked on a CASA case. Volume scan data was updating at less than 1-minute intervals. Also, the data resolution was higher than WSR-88D data and the nearness of the storm to the radars allowed an exceptionally detailed view of the evolution of the supercell, hook echo, TVS, and TC. We were able to see the “debris ball” and weak echo center of the tornado circulation. The “knobology” of the WDSSII system, however, was frustrating and we spent too much time trying to understand how to view the data instead of analyzing the data.

We viewed and analyzed a second case using PAR data. This data was similar in azimuthal and radial resolution to the WSR-88D data but its temporal resolution was much higher with volume updates every 45 s or so. This was a low-topped, low-CAPE tropical environment and most of the relevant data was contained in the lowest two to three tilts so it wasn’t necessary to step through an entire sequence.

Both the CASA and PAR cases released a torrent of information at us, and it became evident that more automation would be required to free the warning meteorologist from mundane tasks so that they can focus on the meteorology and science of the evolving situation.

The next case was a ProbWarn situation for a severe thunderstorm that was capable of producing both large hail and tornadoes. The goal was to assign threat areas and probabilities and update as required. This case used WSR-88D data so the data flow was more typical of an operational warning environment. The “knobology” of the software again got in the way of the science.

All forecasters agreed that this would be an easier task if the radar data were integrated into AWIPS/D2D so that we could use a more familiar environment. Maybe next year.

Wednesday: 04/30/08

Once again no significant convective weather is expected across the CONUS today with the possible exception of late evening initiation in western Nebraska. We break into groups for additional training on both CASA and PAR cases.

The PAR case is done with Les Lemon as the facilitator. It is a near-tropical environment with weak shear and only modest CAPE. Most likely threat is hail. Data volumes update frequently and this makes it easy to see the development of high reflectivity cores aloft. The cross section tool is also useful once I get the hang of how it works. It becomes fairly easy to monitor the upper levels of the storm and to issue “warnings” for large hail. Using WSR-88D data would result in 4.5–6 minute update times for volumes and it would be easy to miss important details in the evolution of these storms. I did, however, miss the strong surface winds and possible microburst because I was focused on viewing the higher tilt sequences for hail signatures.

The CASA case was a southward moving squall line with very strong winds located some distance behind the initial gust front. We were able to resolve the evolution of various “swirlies” on the leading edge of the convection, some of which developed moderate rotational velocities. None, however, had significant updrafts overhead (i.e., the updrafts tilted upshear which is not atypical for a mature squall line) and would not be considered tornadoes. We agreed that a high wind warning was appropriate.

Behind the squall, a Rear Inflow Jet (RIJ) was developing. The nearness of the radars to the convection allowed a detailed examination using RHI cuts through the system, and we were able to see a classic RIJ structure behind the convection underneath the mesoscale anvil shield. The RIJ was quasi-horizontal as it flowed under the anvil shield but tilted down sharply at the back edge of the convective line, impacting the surface underneath the convective line. The RIJ was likely a significant source of the strong surface winds associated with the squall line.

In summary, the rapid update of both the PAR and CASA allowed us to monitor storm evolution in a way that the WSR-88D cannot. In addition, the four-radar CASA network allowed at least one radar—and often two or three—to have a close look at the system so that we were able to resolve small-scale features.

Late in the evening convection developed in extreme eastern Wyoming and western Nebraska but it was too late in the evening to use it as a ProbWarn case.

Thursday: 05/01/08

The convective outlook for today was favorable and plans were set up for an IOP in the evening. Additional cases were viewed this afternoon and we reviewed a CASA case from a few weeks ago. This year’s data is much less noisy than data collected last year. The case we examined developed a convective line with “book-end vortices” that were well resolved in both reflectivity and velocity.

We went into IOP mode around 5 p.m. with both CASA and PAR. The southernmost storm was just barely in the northeast lobe of the CASA array and it became obvious that we would not be able to follow the evolution of the system with these radars. We switched to PAR which was ideally suited to view the storms.

Because the storms formed on the dry line they were oriented NE–SW. Initially this meant the northern storms were to our north and the southern storm was located to our west. It was close enough to the radar that it was partially in the “cone of silence”, which is much larger for PAR than for WSR-88D. As the storm moved to the northeast, we had to update the viewing sector so that we could continue to monitor the storm. This sectoring problem will go away once the PAR has all four faces running; for now, only one face is available, limiting the viewing angle to 90 degrees.

Later, all storms were located to our northeast and were essentially along the same radials from the radar resulting in velocity range folding (i.e., the “purple haze”). We also had some velocity dealiasing problems with some of the higher velocities in the mesocyclone. Consequently, it became difficult at times to get clean velocities from the storms. KTLX did a much better job on some of the storms. We also viewed KVNX for the northern storms since it had a better viewing angle.

In addition, the data refresh rate overwhelmed the WDSSII software and we struggled to view the data. In hindsight, this was almost certainly a result of having two WGs (WDSSII-GUI) running on the same workstation, which quickly consumed the available memory and cache.

Mike and I worked with Pam and Les on the PAR. Because this was a live case, nobody yet knew the outcome. It was useful to have Les looking at the data, making suggestions on where to focus our attention, and pointing out features in the data that might be important.

Summary: an excellent case within the PAR domain but software issues prevented us from fully utilizing the data. If PAR and CASA data were integrated into AWIPS/D2D this would not be an issue since we would be using familiar software. WDSSII is research software and is not always adequate or appropriate for real time viewing of massive amounts of data.

Friday: 05/02/08

Last day of Week 1, and we spent much of it in a round-table discussion of what worked, what didn’t work, possible ways to mitigate the problems, and general suggestions. We also reviewed some of the data from last night’s supercells plus the cold frontal squall line that marched across the state overnight. The squall line moved through both the CASA network and within range of the PAR (but because of sectoring issues, the PAR had to choose whether to look at the northern end or southern end of the line).

We challenged the systems with the tremendous amounts of data flowing into the software and fully stressed them to the point that they were difficult to use. Our comments to the facilitators made it clear that some changes may be required, the most obvious being to run fewer instances of WG on a workstation and to load fewer windows. Instead, spread the WGs and windows across multiple workstations to reduce the load on any one machine.

I’m excited about the future possibilities of PAR and CASA radars in an operational and warning environment. The improved spatial coverage offered by CASA and the increased temporal updates from both CASA and PAR means we will be better able to monitor the rapid evolution of severe storms. CASA also holds tremendous promise of filling in areas that have poor coverage at this time. That, of course, includes much of the western United States.

I’m less certain of the ProbWarn experiment. I’m a long time advocate of probabilistic warnings and was eager to try to issue warnings using this tool and philosophy. It is still in its early stages and much needs to be done. One of the biggest issues is how to calibrate probabilities for various threats. Different forecasters will issue different probabilities for the same threat. Another issue, albeit minor, is to distinguish these probabilities—which are really threat probabilities—from our traditional warnings. There will be some evolution in how the software works, how forecasters issue threats, and how to calibrate these. There will also be substantial training required of forecasters since this will be a paradigm shift in how we issue warnings to the public.
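The calibration problem Blanchard raises is usually quantified with standard verification measures such as the Brier score. The sketch below is a generic illustration with made-up numbers, not anything from the EWP or ProbWarn software; it simply shows how two forecasters' probabilities for the same events could be compared objectively:

```python
def brier_score(probs, outcomes):
    """Mean squared difference between forecast probabilities (0-1) and
    observed outcomes (0 or 1); lower is better."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

# Hypothetical example: two forecasters assign tornado probabilities to the
# same five storms; 1 = a tornado was confirmed, 0 = none reported.
outcomes     = [1, 0, 1, 0, 0]
forecaster_a = [0.9, 0.2, 0.7, 0.1, 0.3]   # confident and mostly right
forecaster_b = [0.6, 0.5, 0.5, 0.4, 0.5]   # hedges near 50% on everything

score_a = brier_score(forecaster_a, outcomes)   # 0.048
score_b = brier_score(forecaster_b, outcomes)   # 0.214
```

Scoring archived cases this way, and feeding the results back to forecasters, is one conceivable route toward the between-forecaster consistency the post calls for.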

David Blanchard (NWS Flagstaff AZ – Week 1 Participant)

Forecaster Thoughts – Patrick Burke (2008 Shakedown Week)

Experimental Warning Program Blog Entry: 4/23/08

Patrick Burke

Between 3 and 5 pm today, Angelyn Kolodziej and I ran through an archived event, using PAR data as the basis for mock severe weather warnings. We both noted the utility of rapid-update radar data in catching the onset of low-level mesocyclogenesis. It was also enlightening to interrogate storms as a team viewing the same data. This forced us to state our reasoning aloud, and resulted in a verbal exchange of conceptual models and warning philosophies. I felt that we arrived at more accurate and timely warning decisions than either one of us could have accomplished alone. When the 1.5 hour simulation concluded, it was fairly obvious from my five plus years of NWS warning experience that we had issued more warnings over a small area than we would have issued using WSR-88D data. Angelyn and I suspect the PAR scan strategy captured certain features that tipped the scales toward issuing a warning, and which may have fallen between traditional WSR-88D volume scans.

Shortly after 5 pm, we had intended to begin probabilistic warning operations, but paused to observe the supercell that came up to the south and east of Norman. All eyes were fixed on the live updating PAR data and the SADS. Oklahoma City television crews delivered video of a low hanging wall cloud with rising scud, but weak rotation. The circulation never quite tightened up, and the storm had trouble maintaining supercell structure for any length of time.

The Norman storm was at the southeast extent of an extensive cluster of multicell thunderstorms that spread northward across Oklahoma through mid evening. Between 6 and 8 pm, Mike Magsig worked this activity from one of the probabilistic warning desks. Meanwhile, at the second desk, I shifted my attention to an ongoing high-end severe weather event in north Texas. A long-lived, high-precipitation supercell moved eastward into the Fort Worth WSR-88D domain. This storm expanded in size, forming a classic bow echo anchored by a broad mesocyclone at the northern end. An exceptionally intense rear inflow jet presented 90 to 100 knot ground-relative velocities at times. Out ahead of this complex, another large supercell formed and approached the southern sides of Fort Worth. This storm quickly took on the appearance of a classic, tornadic supercell. Eventually, the bow echo overtook the tornadic storm in the vicinity of the KFWS radar. With the advantage of near range sampling at low levels, the radar detected several small-scale vortices along the leading edge of the storm outflow.

The variety of storm modes and storm motions in the two operational domains fostered a productive discussion between Mike, Greg Stumpf, and me. Mike had difficulty drawing probabilistic grids for the transient multicell hail storms taking place in Oklahoma. One potential approach may be to outline a broad area of low probability severe hail, and then embed shorter duration, higher probability warnings for particular cores that show some persistence or organization. Whatever the warning philosophy, the actual grid preparation could benefit from some type of automated routine or suite of routines for defining a threat area. For instance, a tool that outlines the 55 dBZ contour with attention to echo overhang might be a good starting point for drawing a hail threat area.
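The contour-based first guess described above can be sketched in a few lines. This is purely an illustration on a synthetic grid: the 55 dBZ threshold comes from the post, but the two-cell buffer and the crude shift-based dilation are placeholder choices, not EWP settings, and a real tool would also weigh echo overhang:

```python
import numpy as np

def hail_threat_area(refl, thresh_dbz=55.0, buffer_cells=2):
    """Flag grid cells at or above a reflectivity threshold, then pad the
    region outward by a few cells to form a first-guess hail threat area."""
    threat = refl >= thresh_dbz
    # crude 4-neighbor dilation: OR the mask with shifted copies of itself
    for _ in range(buffer_cells):
        padded = np.pad(threat, 1, mode="constant")
        threat = (padded[1:-1, 1:-1] | padded[:-2, 1:-1] | padded[2:, 1:-1]
                  | padded[1:-1, :-2] | padded[1:-1, 2:])
    return threat

# synthetic reflectivity field with a single 60 dBZ core
refl = np.zeros((20, 20))
refl[8:11, 8:11] = 60.0
mask = hail_threat_area(refl)   # True inside the padded threat area
```

A forecaster-facing tool would presumably draw the polygon around `mask` as the starting threat area, leaving the forecaster to adjust it and attach probabilities.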

The Texas storms brought up a host of even more complex issues, including personal tornado warning thresholds expressed as a percent chance of tornado, detailed spatial resolution of tornado threats (e.g. high probability surrounding a TVS and lower probabilities along the RFD gust front), and how to emulate longer lead-time information in a probabilistic way (e.g. drawing a 2-hour probability swath to mimic a special weather statement that WFO Fort Worth issued to raise awareness in the metropolitan area).

Much of our conversation stemmed from large scale design issues, using examples from the evening’s data to explore probabilistic warning strategies. In general, we concluded that in designing a probabilistic warning system, researchers may begin with an idealized philosophy, and then incorporate forecaster preferences that have been gained through experience. Many of these preferences will hopefully become evident throughout the course of the EWP spring activities.

Patrick Burke (WFO OUN Forecaster, EWP Weekly Coordinator-in-Training)
