Summary – 29 May 2008

Debrief of the event with Eric / Patrick (Team 1) and Kevin / Mark (Team 2).

Patrick and Eric started off with a low-probability tornado warning early on, looking only at tornado threats. Patrick would have issued a Tornado Warning around the time that the actual NWS warning came out. He felt that issuing the pre-warning low probabilities was very natural. Kevin says that this mirrors what happens in a real office as forecasters discuss how confident they feel about issuing a warning.

The two teams handed off one storm from one group to the other around 2200-2215 UTC.

Playing back the tornado threat areas overlaid on the NWS Tornado Warnings: Eric believes that the grids make it easier to think in a storm-based mode. Kevin mentions that the NWS tornado warnings are probably for the entire storm (including hail / wind threat).

Note: at 2257 UTC there may have been a time error on the southern set of storms. Check the data later.

There is an interesting example around 0000 UTC of a threat area at the border of three CWAs. We are also noting differences in how big the NWS polygons are drawn from CWA to CWA.

Patrick notes that he would be OK with some level of automation — especially for hail threats, and especially if we are issuing warnings for different threat types. He wouldn’t want the algorithm issuing the warning, but would like the guidance that he could tweak and then issue.

Patrick notes that the low-probability threats they issued were based largely on the environmental conditions that the storms were developing in.

Eric thought that the workload was pretty heavy; he took over nine threats from Patrick halfway through the event. Patrick thought that the load was not too different from an event at the NWSFO.

Kevin says that the warningContourSource should either (a) advect with your warning or (b) overlay an outline of your cone shape that shrinks with time; either would help with management. There were too many circles, and it was hard to tell which belonged to which storm.

Mark would like to see the circles (initial threat areas) advect as well.
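
Both suggestions boil down to translating the threat area along the storm-motion vector between updates (the "cone" variant would also shrink the outline as the valid time runs out). A minimal sketch of the advection step, in Python with made-up names (this is not WDSSII or warningContourSource code), might look like:

```python
import math
from dataclasses import dataclass

@dataclass
class ThreatArea:
    """Hypothetical elliptical threat area (illustration only, not a WDSSII structure)."""
    lat: float        # center latitude (deg)
    lon: float        # center longitude (deg)
    major_km: float   # semi-major axis (km)
    minor_km: float   # semi-minor axis (km)
    u_kt: float       # eastward storm motion (kt)
    v_kt: float       # northward storm motion (kt)

KM_PER_NM = 1.852          # kilometers per nautical mile
KM_PER_DEG_LAT = 111.0     # rough kilometers per degree of latitude

def advect(area: ThreatArea, minutes: float) -> ThreatArea:
    """Translate the threat area along its storm-motion vector.

    A flat-earth approximation is adequate for the few tens of km a
    storm moves between warning updates.
    """
    hours = minutes / 60.0
    dx_km = area.u_kt * KM_PER_NM * hours   # eastward displacement
    dy_km = area.v_kt * KM_PER_NM * hours   # northward displacement
    km_per_deg_lon = KM_PER_DEG_LAT * math.cos(math.radians(area.lat))
    return ThreatArea(
        lat=area.lat + dy_km / KM_PER_DEG_LAT,
        lon=area.lon + dx_km / km_per_deg_lon,
        major_km=area.major_km,
        minor_km=area.minor_km,
        u_kt=area.u_kt,
        v_kt=area.v_kt,
    )

# Example: a threat area moving toward the ENE at ~34 kt, advected 15 minutes ahead.
area = ThreatArea(lat=35.2, lon=-106.0, major_km=30.0, minor_km=15.0, u_kt=30.0, v_kt=15.0)
print(advect(area, 15.0))
```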

Eric let one expire by accident; he would like to see a situational awareness tool that draws the forecaster's eye to the expiration.

Eric would like to see a way to group the threat areas together to change the direction on multiple areas at once.

Eric says that he really likes the ability to show low pre-warning probabilities, and Kevin agrees with this. It would be useful for downstream users. Eric likes that it lets the forecaster focus on the meteorology and science and divorce it from the policy.

Patrick thought that the knobology would be difficult to work with, but found that it didn't seem too much different from working in the NWS office. He thought he was able to manage the same workload in WDSSII (with some experience) as in the NWSFO.

Travis Smith (EWP Backup Weekly Coordinator, 27-30 May)

Summary – 28 May 2008

Today’s operations were quite successful. After the 1 pm briefing, our teams of two forecasters worked archive cases with PAR and CASA so that everyone had seen both data platforms within the two-day period. Emphasis then shifted to the Probabilistic Warning program, beginning with an introductory presentation, then moving to the Hazardous Weather Testbed to practice the knobology of issuing warnings with WDSSII. The forecasters practiced using data from a hail-producing thunderstorm in far southwest Texas.

When operations began at 2200 UTC, attention moved up into New Mexico, within a tornado watch box where moist low-level upslope flow, veering wind profiles, and an impinging broad jet stream supported supercell structures. Team 1 (Brad and Mark) was assigned a hail-producing storm near Los Alamos, where the SHAVE project was able to make verification calls in real time. Team 2 (Kevin/Eric) jumped onto a rightward-moving supercell in a rural area ENE of Albuquerque. The event was particularly well defined for Team 2, who followed this storm and a second supercell that developed in close proximity for the entire 3.5-hour IOP. They maintained high tornado and hail probabilities with the lead supercell until it became HP, at which time the tornado probabilities decreased somewhat. The second supercell followed the path of the first, and the team carried lower tornado probabilities until a well-defined RFD caused the probabilities to spike just at the end of operations.

Team 1 issued hail swaths on a series of storms that initiated in the foothills near Los Alamos. SHAVE verified these with marginally severe hail and one golf-ball report. After this area was overturned by a train of cells, the team shifted operations to southeast New Mexico. Much like Team 2, Team 1 was able to handle probabilistic threats for hail on two different storms, with a short-lived tornado probability on the easternmost storm. The activity was just becoming elevated, with a weakening trend, when operations ended at 0130 UTC. The groups then gathered with our two ProbWarn cognizant scientists, Kristine and Travis, for a post-event discussion. We will take a fresh look at this event during the first part of our briefing on Thursday.

Patrick Burke (EWP Weekly Coordinator, 27-30 May)

Summary – 27 May 2008

Our four visiting forecasters arrived ready to work, and the day promised a fair chance of Oklahoma-based operations. After a quick tour of the National Weather Center, the group convened in the NSSL Development Lab for orientation, followed by a map discussion. Participants were trained on the PAR and CASA data platforms before moving to the Hazardous Weather Testbed around 2100 UTC. Software training for WDSSII used real-time data from a supercell occurring near Altus, OK.

Kevin (WFO OUN) and Eric (WFO AFG) then jumped onto the PAR workstation to dissect the Altus supercell at long range. Brad (WFO SEA) and Mark (Environment Canada – Winnipeg) practiced WDSSII using KFDR data on the same supercell, and were on standby for possible CASA operations. By 2300 UTC, though, it became clear that thunderstorms would propagate southward into northwest Texas, as stable air emanating from a second storm complex had overspread the CASA domain.

By 2330 UTC, real-time operations ended, and both groups of forecasters turned to archive events. The pace slowed, allowing more time for discussion of data strengths and weaknesses. The PAR data, in particular, spurred some interesting ideas as to what forecasters would ideally like to receive from a radar system. The CASA participants expressed some difficulty operating in a small domain using multiple radars. They also noted that the KTLX 88D better sampled one occurrence of strong straight-line winds, simply owing to viewing angle. They were impressed, however, by the temporal and spatial resolution of the CASA data, which captured many interesting storm- and sub-storm-scale features.

Patrick Burke (EWP Weekly Coordinator, 27-30 May)

Summary – 22 May 2008

There was an HP supercell event in western Oklahoma, outside the PAR domain, but we worked it anyway to try to get the forecasters a real-time PAR event.

Otherwise, there was an outbreak of clustered supercells in central Kansas along I-70, as advertised by the SPC moderate (MDT) risk. Since we had the storm in central Oklahoma, we did not work a ProbWarn IOP on the KS storms. Many storm chasers saw many fast-moving tornadoes.

A repeat performance is expected over KS and NW OK on Friday, 23 May; however, we do not conduct operations on Fridays. In fact, it looks quite active through 26 May, the Memorial Day weekend, but we are off for the holiday.

Greg Stumpf (EWP Operations Coordinator)

Summary – 21 May 2008

SteveR — working with the new software and probabilities was easier tonight. It would be easier to combine wind/hail in some instances. Maybe a way to “paint” on the warning rather than using slider bars. Call to action based on probabilities: higher probs communicate stronger language. Forces you to think more about how the threat is changing/evolving with time.

Jonathan — worked with SteveR, similar comments. Felt more comfortable. Had fewer threats tonight, so that may have contributed.

SteveH — knew what the threats were since it was in his backyard. Did a lot more analysis tonight even though he was working by himself. Updated every five minutes, 2 threats per storm, 3 storms. It was pretty easy. Could work 7-8 storms if he were more comfortable with it. Storms moving in the same direction made it easier. Could work a lot more if it were the only thing he had to do.

Dave — finds it easier to work with ellipses than polygons. Really important to have a good measurement of the storm speed. Tonight we had development and decay of storms, which makes it more challenging. No need to fine-tune areas when the storms are more “pulse” in nature.

Mike — the larger your polygon, the less you have to do; it modulates the workload.

Ryan — would like to make a “flexible ellipse” that could be stretched at one end. Would really like to see a preview.

Jonathan – WarnGen-style warning would be nice.

SteveR — would like the probability ticks every 5%. Several people agree.

SteveH — maybe tornado ticks every 1%.

Probabilities:

SteveH — when storms start rotating, you should put out a low-end probability

Jonathan — how will the public react to these low probabilities?

SteveR — how do we verify in this new paradigm? Verification is a huge workload at his office as directed by his region.

Jonathan — would like independent verification.

Discussion turned to the value of SHAVE-type verification and independent verification.

Another question: should GPRA goals directly guide the probability threshold at which people should take action? Probably a WAS*IS question.

Travis Smith (EWP Weekly Coordinator, 19-23 May)

Summary – 20 May 2008

Wrapped up operations at 0030 UTC.

Ryan/SteveR started off with hail threats — it looks like one “threat area” equaled four NWS warnings at one point. A lot of the NWS warnings are county-shaped. They had a couple of tornado threat areas with low probabilities that matched an NWS tornado warning.

SteveH/Dave/Jonathan — started drawing big polygons and then, as they got more comfortable with the software, began narrowing down their threat areas. They ended up with six threat areas. SteveH thinks the display needs some color changes: the threat areas overlay the data, so they need to be contours instead of solid blocks of color. They could have had better continuity. Jonathan believes the workload was too heavy to keep up with radar analysis.

SteveR wonders about data management with PAR and CASA data rates.

SteveH – current NWS warnings at his office are separated by threat type (sectorized).

Jonathan – combining threat types into a single warning (Tornado and Severe Thunderstorm) is easier to manage.

Ryan – liked the display of current hazards separated by type. Had to be more precise with updates to keep the storm in the current threat area polygon.

SteveR – felt more like a grid manager than a meteorologist making scientific decisions.

Ryan – would like storm motion first guess in the polygon.

SteveR – feels as fatigued now as in a real warning situation, even though it is just an experiment. SteveH nods in agreement.

Greg asks: what if you could just add a couple of features to WarnGen to include the probability and motion uncertainty? SteveR likes the idea.

SteveH would like to be forced to re-issue a warning every 20 minutes.

Ryan / SteveR believe that the limitations in the threat areas are caused primarily by the software and not by the science.

Mike M. would be comfortable with algorithm guidance providing a first guess for tornadoes (based on meso location) and hail. Greg would not automate the tornado threat.

Ryan thinks it might be good to combine the threats into a general “probability of severe weather” as a first step instead of Tornado/Wind/Hail.
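
For illustration only, if the three per-threat probabilities were already in hand, one simple way to collapse them into a single "probability of any severe weather" is the union probability under an independence assumption (an assumption made only for this sketch, not something defined by the experiment):

```python
def prob_any_severe(p_tor: float, p_wind: float, p_hail: float) -> float:
    """Probability of at least one threat verifying, assuming (for
    illustration only) that the three threat types are independent."""
    return 1.0 - (1.0 - p_tor) * (1.0 - p_wind) * (1.0 - p_hail)

# Example: 10% tornado, 40% wind, 60% hail -> about 78% "any severe".
print(round(prob_any_severe(0.10, 0.40, 0.60), 2))
```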

The discussion wandered into the realm of forecaster workload across their entire spectrum of duties and wrapped up.

Travis Smith (EWP Weekly Coordinator, 19-23 May)

Summary – 19 May 2008

We examined some storms SW of Brownsville for the PWG training. No IOP this evening — any data collected are just for training purposes. No strong convection anywhere in the CONUS today.

At the end of the training, conversation ensued about the role of automated algorithm guidance in the warning decision-making process, how to calibrate probabilistic information for different threat types, and the value of advecting warning grids.

Travis Smith (EWP Weekly Coordinator, 19-23 May)

Summary – 15 May 2008

We ran a ProbWarn mini-IOP on a supercell (and a few other storms) crossing the Rio Grande near Eagle Pass, TX. The storm had an unconfirmed report of a tornado from law enforcement after it crossed into the US, but a chaser on the scene whom we called from the HWT could not confirm it. This forced an interesting decision by our forecasters on how much confidence to put in the law enforcement report for their initial tornado probabilities. What was more certain to the forecasters was that the large hail threat was very high, given the well-defined bounded weak echo region (below). Later, a 2.5″ diameter measured hail report was received from near Eagle Pass, along with downed trees.

Kevin Scharfenberg (EWP Backup Weekly Coordinator, 12-16 May)

Summary – 14 May 2008

Today we ran a ProbWarn IOP for the area of west to central Texas, primarily using KMAF, KDFX, and KSJT, as well as the SC_Multi gridded data. We ‘sectorized’, putting Dan M. and Ron P. on the northern part and Dave H. and Dan P. on the southern. Storms had begun about an hour prior to our starting time of 2130 UTC.

The southern group spent most, if not all, of their time on storms forming off of “Old Faithful” on the Mexican side of the border west of Eagle Pass, TX, and moving into US territory. These storms trained and gave Dan P. and Dave some difficulty, primarily in how best to handle trends, given that some storms would increase in severe potential while others would die out.

Dan M. and Ron worked on at least one storm that transitioned from an HP supercell with primarily a hail threat to a tornado threat.

Dan P. sneaks a peek at the other team’s warnings…

We ran until 7:30 pm and had some brief discussion before letting the forecasters take a look at some convection that moved into the CASA domain around 8 pm.

A few comments regarding the IOP follow:

  • It had been considered a couple of times that sectorizing based on threat type would be an interesting exercise, i.e., one forecaster analyzes/warns on hail, while the other focuses on tornado/wind threats.
  • Workload issues were again brought up. Some had a hard time keeping up with multiple threats on the same storm (effectively tripling their warning issuance). Even those who were keeping up were losing situational awareness due to a “round robin” type of approach, where the forecaster went from storm to storm to nudge each threat and move on. The main loss of S.A. in this case was in the vertical structure of the storm.
  • A comment was made regarding a forecaster worrying about, or not feeling completely comfortable with, providing the trend info. Perhaps a better tool for providing this info would help. Dan M. suggests starting with something like the GFE temporal editor.
  • Other software suggestions were a transparency slider for the ProbGrid output, and an option to sync all the threat grids for the same storm to that storm’s motion.

In the discussion that followed the exercise, some time was spent considering “exceedance thresholds” for different threats. For example, issuing a high probability for hail, but a lower probability for hail larger than [golf ball, baseball, etc.]. This adds another degree of freedom which A) allows the forecaster to provide greater detail, but B) adds another task for the warning forecaster and (potentially) increases workload.
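
To make the exceedance-threshold idea concrete, here is a small sketch with made-up numbers: one threat area carrying separate probabilities of exceeding several hail sizes, with the consistency constraint that the probabilities cannot increase with size.

```python
# Hypothetical exceedance probabilities for one threat area (made-up values).
# Keys are hail-size thresholds in inches; values are P(hail >= threshold).
hail_exceedance = {
    0.75: 0.70,   # severe hail criterion
    1.75: 0.40,   # golf ball
    2.75: 0.10,   # baseball
}

def is_consistent(probs: dict) -> bool:
    """Exceedance probabilities must be non-increasing as the threshold grows."""
    values = [probs[size] for size in sorted(probs)]
    return all(a >= b for a, b in zip(values, values[1:]))

assert is_consistent(hail_exceedance)
# Implied probability of hail between golf-ball and baseball size:
print(round(hail_exceedance[1.75] - hail_exceedance[2.75], 2))   # 0.3
```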

Other concerns were the potential (and likely) inconsistency between forecasters in their probabilities. This should be an interesting discussion during the debrief.

Eve was impressed at the ability to “Virtual Storm Chase”

Kevin Manross (EWP Weekly Coordinator, 12-16 May)

Addendum:

Here is the first attempt at an “accumulated” ProbGrid for yesterday’s event. Later, I’ll divide these grids by storm type and compare them to Rotation Tracks, Hail Tracks, LSRs, and NWS warnings.
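
The accumulation itself is simple to sketch: at each grid point, keep the maximum probability issued at any time during the event, which yields a swath comparable to Rotation Tracks or Hail Tracks. A minimal numpy illustration with placeholder data (the array shapes and values are made up, not the actual ProbGrid format):

```python
import numpy as np

# Stack of (time, y, x) probability grids from successive updates; random
# placeholder values stand in for the probabilities actually issued.
rng = np.random.default_rng(0)
prob_grids = rng.uniform(0.0, 1.0, size=(12, 50, 60))

# "Accumulated" grid: the maximum probability assigned to each grid point
# at any time during the event, i.e. a probability swath for the whole IOP.
accumulated = prob_grids.max(axis=0)

print(accumulated.shape)         # (50, 60)
print(float(accumulated.max()))  # peak probability anywhere on the grid
```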

Greg Stumpf (EWP Operations Coordinator)

Summary – 13 May 2008

Well, we tried really hard to get something in CASA, but, alas, the atmosphere didn’t want to cooperate with us.

We had Dan P. and Dave H. work with CASA, while Dan M. and Ron P. again worked with PAR. This configuration was suggested so that we would have an experienced user at each station during real-time ops. (These same forecasters worked the same respective stations last night.)

Regarding the weather scenario: we had a few attempts at initiation in the extreme eastern part of the CASA domain, but that was the best we saw. Storms initiated along the cold front to the NE of the OKC area, and the PAR remained focused on those storms for the duration of the evening. These storms moved very slowly, remaining anchored to the front. There were a number of severe warnings, and one storm near Prague had some broad low-level rotation (which tightened up from time to time) but never drew a warning.

One of the interesting things we tried tonight was to issue ProbWarn threat areas using PAR data (Dan M. is an old pro with both). It should be an interesting case to review in that sense.

As I type (~0110z), we have legitimate echo on the edge of the CASA domain.

Speaking of CASA, Dan P. offers part of a discussion (while waiting for *anything* to happen in CASA) regarding three-body scatter spikes (TBSS). Given CASA’s low-level area of focus, a TBSS will rarely, if ever, be seen on this network. The TBSS, of course, is an often-used indicator of severe hail.

Here is a snapshot of our PAR/PROBWARN exercise

And this is the best we could get for CASA (after 3 hours of staring intently at the screen). Actually, this also shows the 3DVAR Wind analysis product (on the left).

Kevin Manross (EWP Weekly Coordinator, 12-16 May)
