Summary – 29 May 2008

Debrief of the event with Eric / Patrick (Team 1) and Kevin / Mark (Team 2).

Patrick and Eric started off with a low-probability tornado warning early on — just looking at tornado threats. Patrick would have issued a Tornado Warning around the time that the actual NWS warning came out. He felt that issuing the pre-warning low probabilities was very natural. Kevin says that this mirrors what happens in a real office as the forecasters discuss how confident they feel about issuing a warning.

The two teams did hand off one storm from one group to another around 2200-2215 UTC.

Playing back the tornado threat areas overlaid on the NWS Tornado Warnings: Eric believes that the grids make it easier to think in a storm-based mode. Kevin mentions that the NWS tornado warnings are probably for the entire storm (including hail / wind threat).

Note: 2257 there may have been a time error on the southern set of storms. Check data later.

There is an interesting example around 0000 UTC of a threat area at the border of three CWAs. We are also noting differences in how big the NWS polygons are drawn from CWA to CWA.

Patrick notes that he would be OK with some level of automation — especially for hail threats, and especially if we are issuing warnings for different threat types. He wouldn’t want the algorithm issuing the warning, but would like the guidance that he could tweak and then issue.

Patrick notes that the low-probability threats they issued were based largely on the environmental conditions that the storms were developing in.

Eric thought that the workload was pretty heavy — he took over nine threats from Patrick halfway through the event. Patrick thought that the load was not too different from an event at the NWSFO.

Kevin says that the warningContourSource should either (a) advect with your warning or (b) overlay an outline of your cone shape that shrinks with time; either would help with storm management. There were too many circles, and it was hard to tell which belonged to which storm.

Mark would like to see the circles (initial threat areas) advect as well.
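
The advection idea above boils down to translating each threat area along its storm motion vector between display updates, optionally shrinking it with time (the "shrinking cone" outline). A minimal sketch of that idea — the `ThreatArea` fields and the shrink rate are illustrative, not part of the WDSSII API:

```python
from dataclasses import dataclass

@dataclass
class ThreatArea:
    x_km: float       # centroid position, east-west
    y_km: float       # centroid position, north-south
    radius_km: float  # size of the circular threat area
    u_kmh: float      # storm motion, eastward component
    v_kmh: float      # storm motion, northward component

def advect(area: ThreatArea, minutes: float,
           shrink_per_hr_km: float = 0.0) -> ThreatArea:
    """Move the threat area with the storm motion vector; optionally
    shrink its radius with time (the 'shrinking cone' display idea)."""
    hours = minutes / 60.0
    return ThreatArea(
        x_km=area.x_km + area.u_kmh * hours,
        y_km=area.y_km + area.v_kmh * hours,
        radius_km=max(0.0, area.radius_km - shrink_per_hr_km * hours),
        u_kmh=area.u_kmh,
        v_kmh=area.v_kmh,
    )
```

Grouping threat areas (as Eric requested) would then just mean applying the same motion update to a list of these objects at once.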

Eric let one expire by accident — he would like to see a situational awareness tool that draws the forecaster's eye to the expiration.

Eric would like to see a way to group the threat areas together to change the direction on multiple areas at once.

Eric says that he really likes the ability to show low pre-warning probabilities, and Kevin agrees with this. It would be useful for downstream users. Eric likes that it lets the forecaster focus on the meteorology and science and divorce it from the policy.

Patrick expected the knobology to be difficult to work with, but found that it didn’t seem much different from working in the NWS office. He thought he was able to manage the same workload in WDSSII (with some experience) as in the NWSFO.

Travis Smith (EWP Backup Weekly Coordinator, 27-30 May)

Tags: None

Live Blog – 29 May 2008 (7:50pm)

Making a note of this for the ROC, looking at velocity data quality issues: Patrick noted some velocity dealiasing failures on KUEX at a crucial decision-making point about 30 minutes ago. He stated that this happened several times on KFDX data yesterday (last night) as well.

Travis Smith (Gridded Warning Cognizant Scientist)

Tags: None

Week 4 Summary: 19-23 May 2008

This week, visiting full-time participants included Steve Hodanish (WFO Pueblo, CO), Jonathon Howell (WFO Memphis, TN), Ryan Knutsvig (WFO Elko, NV), Steve Rogowski (WFO Sterling, VA), and Dave Patrick (Environment Canada, Winnipeg). Other participants this week included Mike Magsig (WDTB), Paul Schlatter (WDTB), Don Rude (U. Virginia), Jerry Brotzge (OU) and an NSSL support team of Kevin Manross, Kiel Ortega, Kristin Kuhlman, Pam Heinselman, Dave Preignitz, Angelyn Kolodziej, and Greg Stumpf. Travis Smith (NSSL) was the Weekly Coordinator.

Monday, May 19:

There was very little threat of significant severe weather anywhere in the CONUS today, which made it an ideal training day for our visitors. The afternoon consisted of general orientation, map discussion, CASA & PAR orientation, and WDSSII training. In the evening, the forecasters participated in probabilistic warning guidance training on some real-time data SW of Brownsville, TX (issuing warning guidance for Mexico!)

Tuesday, May 20:

PAR and CASA playback cases in the afternoon, followed by a probabilistic warning IOP starting around 4pm CDT. The area of interest was the Carolinas and Georgia. A detailed summary of the post-event discussion for this case can be found in the blog entry “20 May 2008 – Summary”.

Wednesday, May 21:

Similar to yesterday – PAR and CASA playback cases in the afternoon, followed by an IOP around 6pm for NE Colorado. A detailed summary of the post-event discussion for this case can be found in the blog entry “21 May 2008 – Summary”.

Thursday, May 22:

HP Supercell event in Oklahoma, outside the PAR domain, but we worked it anyway to try to get the forecasters a real-time event.

Friday, May 23, Summary Discussion

CASA – no real-time operations this week.

  • Dave: really like the assimilated data display, saw it as advantageous to look at velocity data as vectors as opposed to the radial velocity display.
  • SteveH: 2D Wind display was a bonus.
  • Several forecasters like the idea of a 2D wind field presentation.
  • SteveH: would like to see data out to 60 km, or some way to see the big picture.
  • SteveR: not confident in the sectored scanning.
  • Dave: wants to make sure he can see low-level boundaries.
  • Need a visual cue for the cross-sections.
  • Dave: cross-section is overkill.
  • SteveH: can do the same thing better with a rapid sector scan.
  • Dave: would like RHI capabilities.

PAR

  • SteveH: 100% wind/hail for HP supercell was a no-brainer, tornado was not so clear. Noisy, range-folded. Reflectivity was excellent, however, and FSI cross-sections were very useful. Thinks it will be a huge improvement once the bugs are worked out. Was most impressed by the pulse storm playback case – can see core development aloft (rather than by luck with the 88D).
  • Jonathan – good tool to update warnings as well. Can see more variation in intensity, see microbursts, etc.
  • No one uses detection algorithms in their offices, except for the gridded MESH. SteveH: can better interpret the raw data than algorithms. Need to be able to trust the algorithms (like MESH).
  • Dave: likes algorithms, if they are trustworthy.
  • Everyone liked the pulse storm case – “Pulse storms are a pain…!” – SteveH.
  • Jonathan: can see between volume scans.
  • This could help with longevity of warnings.
  • “The fire hose is coming, but that’s why they teach the fireman how to hold the hose!” – Steve H
  • Jonathan – not going to be a big change, things will look the same as the 88D (same conceptual models).
  • 3D will be valuable in the future – need to be integrated into operational system – lots of things need to be learned about it.
  • Scan strategies? SteveH: does it need to be adaptive? It already scans really quickly.
  • There are trade-offs with high-resolution temporal versus spatial sampling.
  • SteveR: higher resolution at lower tilts would be nice.
  • They would like to show some simulated cases to their staff. Could do in FSI. Have other training priorities, but could be worked in.

Probabilistic warning guidance:

  • Ryan: new way of thinking. More sophisticated users will love it.
  • SteveH: thinks it is straightforward for forecasters.
  • Jonathan – protecting life is the primary objective; need to be clear on call to action (yes/no answer for most users)
  • SteveR: we could do both methods –
  • SteveH: forecasters need training on what the probabilities mean, but training the public is even more important.
  • SteveH doesn’t like the future trend prediction. Didn’t know what to do with it (forecasting the short-term intensity variations is very difficult)
  • Need organizational software

May 30 playback case:

  • Mark is very comfortable with a high level of automation, even for low-end warnings, so long as the forecaster has final approval. Canadian way of doing things.
  • Eric thought the workload for the prob warn playback case was about right.

General final thoughts:

  • SteveR: exceeded expectations
  • SteveH: tour of SPC would be nice
  • All: Good pace – they were here to work, so no problems.
  • SteveH: more training on probabilistic warning tools. Had a good experience.
  • Dave: was action filled.
  • Jonathan: send out WDSSII to offices beforehand.
  • Jonathan: had a very good experience
  • They would come back next year.
  • SteveH: wants 30-minute training sessions.

Travis Smith (EWP Weekly Coordinator, 19-23 May)

Tags: None

Live Blog – 22 May 2008 (8:32pm)

Calling a wrap on operations w/ ongoing storm in Dewey/Custer Co. so forecasters can do the PAR survey. SHAVE has some 3 inch hail reports from this cell.

SAD Observation: Some lady just won a bunch of money on Deal Or No Deal!

Travis Smith (EWP Weekly Coordinator, 19-23 May)

Tags: None

Live Blog – 22 May 2008 (7:02pm)

Forecasters are issuing PAR-based warnings using the Prob Warning tool. The way to interpret the data is that they are encircling the storms of interest with the polygon tool and selecting either “Severe Thunderstorm” or “Tornado” warning.

Dealiasing is an issue, but not just on the PAR.

Storm is moving into higher dew point air.

Travis Smith (EWP Weekly Coordinator, 19-23 May)

Tags: None

Live Blog – 22 May 2008 (6:33pm)

Low HWT staffing today, so Live Blogging is slow…

PAR IOP is up, with all forecasters participating. SteveH is also doing Prob Warn.

Gate-to-gate right now has good continuity that cannot be seen in the KTLX velocity data.

Travis Smith (EWP Weekly Coordinator, 19-23 May)

Tags: None

Summary – 21 May 2008

SteveR — working with the new software and probabilities was easier tonight. Would be easier to combine wind/hail in some instances. Maybe a way to “paint” on the warning rather than having slider bars. Call to action based on probabilities — higher probs communicate stronger language. Forces you to think more about how the threat is changing/evolving with time.
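
SteveR's point about probability-driven call-to-action language amounts to a threshold lookup from probability to wording. A toy sketch of that mapping — the thresholds and phrases here are invented for illustration, not any NWS or WDSSII standard:

```python
def call_to_action(prob_pct: float) -> str:
    """Map a warning probability (percent) to call-to-action wording.
    Thresholds and phrases are illustrative only."""
    if prob_pct >= 70:
        return "Take cover now!"
    if prob_pct >= 30:
        return "Be prepared to take cover."
    if prob_pct >= 10:
        return "Stay alert; conditions may change quickly."
    return "Monitor the situation."
```

For most users this collapses the probability back into the yes/no answer Jonathan mentions, while keeping the full number available for more sophisticated users.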

Jonathan — worked with SteveR, similar comments. Felt more comfortable. Had fewer threats tonight, so that may have contributed.

SteveH — knew what the threats were since it was in his backyard. Did a lot more analysis tonight even though he was working by himself. Updated every five minutes, 2 threats per storm, 3 storms. Was pretty easy. Could work 7-8 storms if he was more comfortable with it. Storms moving the same direction made it easier. Could work a lot more if it was the only thing he had to do.

Dave — finds it easier to work with ellipses than polygons. Really important to have a good measurement of the speed. Tonight we had development and decay of storms, which would make it more challenging. No need to fine-tune areas when the storms are more “pulse” in nature.

Mike — the larger your polygon, the less you have to do… it modulates the workload.

Ryan — would like to make a “flexible ellipse” that could be stretched at one end. Would really like to see a preview.

Jonathan – WarnGen-style warning would be nice.

SteveR — would like the probability ticks every 5%. Several people agree.

SteveH — maybe tornado every 1%

Probabilities:

SteveH — when storms start rotating, you should put out a low-end probability

Jonathan — how will the public react to these low probabilities?

SteveR — how do we verify in this new paradigm? Verification is a huge workload at his office as directed by his region.

Jonathan — would like independent verification.

Discussion turned to the value of SHAVE-type verification and independent verification.

Another question: should GPRA goals directly guide the probability threshold at which people should take action? Probably a WAS*IS question.

Travis Smith (EWP Weekly Coordinator, 19-23 May)

Tags: None