Live Blog – 2 June 2008 (7:19pm)

Participants rotate (clockwise). Jon is now on AWIPS using FSI amongst other things. Chris has the PW mouse and has opted to start fresh to get more familiar with the PW tool. George is now in the assistant chair.

Liz Quoetone (EWP Weekly Coordinator, 2-6 June)

Tags: None

Live Blog – 2 June 2008 (7:06pm)

SHAVE is operating on our storms. Golfball hail in the easternmost storm. This storm is undergoing some mergers (the area is getting larger) and turning a little to the right, so the threat area is updated accordingly.

Jon suggests that the tracking feature translate (as in WarnGen) to better help forecasters see whether they have located the same feature in previous frames.

Liz Quoetone (EWP Weekly Coordinator, 2-6 June)

Tags: None

Outlook – 2 June 2008

All undergoing training/familiarization today with CASA/WDSSII/Prob Warn. Will do PAR overview tomorrow.

Looks like nothing in the local testbed domains (CASA/PAR) tonight, and perhaps nothing until Thursday. We will have a PW IOP tonight over whatever area looks best, either new storms going up along the Rockies or the long-lived MCS/bow echo moving through southeast MO.

LizQ

Tags: None

Week 5 Summary: 27-30 May 2008

In week 5, the Experimental Warning Program reached out to Canada and the Pacific Northwest, bringing in meteorologists with unique perspectives on thunderstorms and warning operations. Our forecaster/evaluators included Brad Colman, Meteorologist in Charge at the Seattle, Washington NWSFO; Eric Stevens, Science and Operations Officer at the Fairbanks, Alaska NWSFO; and Mark Melsness of Environment Canada in Winnipeg. Adding some local experience to the group, we had Kevin Brown, Senior Forecaster at the Norman, Oklahoma NWSFO. And I am Patrick Burke, General Forecaster at the Norman NWSFO; I served as Weekly Coordinator, but also as an evaluator for Thursday’s operations.

Although the Memorial Day Holiday shortened Week 5 to three and a half days, the group was able to work on all three experimental data platforms, including plenty of live data – particularly for probabilistic warnings.

Tuesday initially showed some promise for a Central OK intensive operations period (IOP), so we ran with a game plan to complete PAR and CASA training. The group sat down for the first time in front of WDSSII to practice data interrogation using a live supercell near Altus, OK. Kevin and Eric then viewed PAR data for about an hour, with this storm just near the edge of the domain. Unfortunately, stable air overspread central OK, and all the thunderstorms propagated away from the PAR and CASA domain. Participants then turned to archive cases to round out the evening.

Wednesday brought an opportunity for the groups to trade places on the PAR and CASA archives before moving smoothly into Probabilistic Warnings for the remainder of the day. Upslope flow pushed mid- to upper-50s dewpoints onto the parched high plains of eastern New Mexico, while mid- and upper-level winds showed a gradual increase downstream from a trough over southern California. The situation proved favorable for severe storms. Coordinating with the SHAVE project to find relatively dense verification swaths, Brad and Mark issued probabilistic warnings for hail on multiple storms, and eventually one low-probability warning for tornadoes. Meanwhile, Kevin and Eric inherited a long-lived, eastward-moving supercell which paralleled Interstate 40 from near Albuquerque to Tucumcari. The team issued probabilities for hail and tornadoes for this and a second cell which followed the same path. Both cells received traditional tornado warnings from the Albuquerque NWSFO. Kevin and Eric eventually added a probabilistic swath for severe hail when the lead supercell took on high-precipitation character with an extensive rear flank downdraft. It was very impressive to see how comfortable the teams became with issuing multiple threats for multiple storms within hours of first being introduced to the experiment.

Thursday presented the best opportunity yet this spring for participants to test probabilistic warnings during an outbreak of long-lived tornadic supercells. After our map discussion, which included a categorical High Risk for severe weather in Nebraska, we chose to put Kevin, Eric, and Mark straight to work on the Prob-Warn archive case; it is important to have as many forecasters as possible provide feedback on this one particular case so that meaningful statistics may be derived. We said goodbye to Brad, who left as planned so he could attend to other obligations. Thus, when we jumped into live Prob-Warn operations at 2130 UTC, Kevin was paired with Mark, and Eric with myself.

Both teams inherited severe storms already in progress, and though it was not the original intent, storms aligned such that it was beneficial for Team 1 to work within the Goodland NWSFO CWA, and Team 2 within the Hastings CWA. At one point this resulted in a unique opportunity to coordinate the passing of a probabilistic swath across CWA borders. Another storyline developed as the teams tested the workload by issuing probabilities for hail, tornadoes, and straight-line winds for each of three different storms, resulting in nine threat areas. Threats from one storm often overlapped those of another storm, and the teams took to giving their warnings meaningful names based on the location of the initial warning.

Operations were also enhanced by the Situation Display which showed live video streaming from storm chasers in both CWAs. The Hastings team worked a storm that appeared to be producing a significant tornado at Kearney, NE, while the Goodland team received occasional tornado reports from Sheridan to Rooks Counties. The Goodland storms found deeper moisture and began producing more significant tornadoes near Jewell and Beloit, KS, just after the IOP ended. We allowed enough time to hold a short debriefing late that evening while replaying the event through WDSSII on the Situation Display.

Friday allowed us to hold a more thorough round table discussion of the Thursday event, and also reflect on previous days’ events. Despite the holiday-shortened week, forecasters gained experience with PAR, CASA, and Probabilistic Warnings, and worked one of the most data-productive events of the season on Thursday. Before concluding for the week, we were treated to a brown bag lunch at which our participants presented the following:

Mark Melsness: “Precipitable Water Verification of a Ground-Based Sounder using RAOBs at Winnipeg.”

Eric Stevens: “Impact of Snow Cover on October Surface Temperatures in Fairbanks” (a.k.a. “Falling off the Cliff”)

Kevin Brown: “Influence of Radar Beam Ducting on Warning Decisions”

PAR Discussion:

· DISPLAY

o Multi-panel (even greater than 4) would be useful

o For looping, would like to select from a range of time resolutions, since 1-min updates are not as important 60 minutes in the past. Perhaps a hybrid loop that transitions from 5-min resolution to 1-min resolution as it nears the present (see the sketch after this list).

o Interlaced 0.5 deg data is good

· DATA

o Would like improved azimuthal resolution (beam width) at long range

o Like the opportunity to scan even faster in 45-deg sectors

· ANALYSIS

o Tuesday’s Elk City supercell split was recognizable in PAR earlier than in the 88D

o For other features, KFDR was best simply because of location

o Very useful for intensity trends

o Numerous features might prompt warnings from an 88D perspective, but many are transient. Felt it was a luxury to watch them evolve and warn only on those that showed a consistent upward trend in intensity. Will need much training on this concept to avoid a dramatic increase in false alarm rate

o Could combine PAR with Prob-Warn thinking; a good approach could be a broad area of lower-probability warning with pinpointed, short-duration higher-probability warnings for these transient features.
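
As a rough illustration of the hybrid loop idea above, here is a minimal Python sketch of how the frame times for such a loop could be generated. The 15-minute changeover point and all names are assumptions for the example, not anything specified by the group or implemented in the display.

```python
from datetime import datetime, timedelta

def hybrid_loop_times(now, loop_minutes=60, fine_window=15,
                      coarse_step=5, fine_step=1):
    """Frame times for a hybrid radar loop: coarse spacing for older data,
    fine spacing near the present."""
    times = []
    minutes_back = loop_minutes
    while minutes_back > 0:
        times.append(now - timedelta(minutes=minutes_back))
        # Use 1-min spacing inside the fine window, 5-min spacing before it.
        step = fine_step if minutes_back <= fine_window else coarse_step
        minutes_back -= step
    times.append(now)  # most recent frame
    return times

# A 60-minute loop ending at 19:19 gets 5-min frames for the older portion
# and 1-min frames for the final 15 minutes (25 frames in total).
frames = hybrid_loop_times(datetime(2008, 6, 2, 19, 19))
print([t.strftime("%H:%M") for t in frames])
```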

CASA Discussion:

· DISPLAY

o Please add contours of wind speed to 3DVAR product

o 3D isosurface of wind speed, looped in time, would be useful

o Four-Dimensional Storm Investigator could be used in lieu of RHI

· DATA

o Adaptive scanning was too hard to follow. Forecasters preferred the 2.0° scan, which covers a full 360°. Greg Stumpf mentioned that WDSSII has a merger that can put all the data into a 3D grid, update only those portions (sectors) of the grid with live data, and then time-to-space displace the older data. The 3DVAR product can do the same (a sketch of the displacement idea follows this list).

o Dual-pol could be useful for non-precipitation returns, such as smoke or volcanic ash, during “big bubble no trouble” conditions (i.e., “severe clear”)

o Some were overwhelmed by the resolution, especially the spatial resolution. Need 88D-like data side by side to maintain awareness of the big picture.

· ANALYSIS

o Eric noted that in Alaska, CASA would be more useful as a gap filler under the 88D beam over population centers, rather than as an overlapping adaptive network. He gave the example of Delta Junction, AK, where there are “competing” mountain-valley circulations (Chinook vs. Bora) which the CASA radar could help diagnose.

o Could be useful for Olympic venues (although Vancouver will be using mobile radars).
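
The merger and 3DVAR behavior Greg described belongs to WDSSII itself; the snippet below is only a hypothetical sketch of the time-to-space displacement step, assuming a 2-D field on a regular grid (rows increasing northward) and a single storm-motion vector. The function and parameter names are invented for illustration.

```python
import numpy as np

def time_to_space_displace(field, age_s, u_ms, v_ms, grid_m):
    """Shift an older radar field to its estimated present position.

    The field is translated by (storm motion) x (data age), rounded to whole
    grid cells; data shifted off the grid is dropped and vacated cells become
    NaN. A real merger would also interpolate and blend sectors; this shows
    only the displacement idea.
    """
    dx = int(round(u_ms * age_s / grid_m))  # eastward shift in grid cells
    dy = int(round(v_ms * age_s / grid_m))  # northward shift in grid cells
    out = np.full_like(field, np.nan, dtype=float)
    src = field[max(0, -dy):field.shape[0] - max(0, dy),
                max(0, -dx):field.shape[1] - max(0, dx)]
    out[max(0, dy):max(0, dy) + src.shape[0],
        max(0, dx):max(0, dx) + src.shape[1]] = src
    return out

# Example: a sector scanned 4 minutes ago, with the storm moving east at
# 15 m/s on a 500-m grid, is nudged 7 cells east before display.
older = np.arange(100.0).reshape(10, 10)
print(time_to_space_displace(older, age_s=240, u_ms=15.0, v_ms=0.0, grid_m=500.0))
```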

PROBABILISTIC WARNING Discussion:

· DISPLAY

o Need an alert system for warnings nearing expiration, or perhaps for storms bleeding outside the warning swath. Something akin to the AVNFPS could be ideal.

· DATA

o Would like options for a more advanced approach to swath creation, such as applying different motion uncertainty and buffer widths to the left versus the right of the storm track (see the sketch after this list)

o Would be nice to be able to multi-select several warnings and group-adjust variables like the motion vector.

· ANALYSIS/OPERATIONS

o Need to add information about intensity (e.g. Hail Size, Wind Magnitude)

o Workload is an issue; there were varying degrees of comfort handling multiple threats on multiple storms. Could let algorithms assist, especially with hail, which tends to be the easiest threat to detect. Forecasters want the ability to QC any warning the algorithm suggests before sending it to the public

o Like the ability to issue low-probability threats, as this conveys trends in forecaster thinking more continuously than legacy warnings do

o Gridded probabilities offer the ability to derive output with many grades of sophistication for various users

· Related Discussion

o Greg Stumpf: What if a storm begins to turn right after issuing a storm-based polygon? Do you issue a new polygon? Does it overlap the old? Do you cancel the old polygon?

o Patrick: The gridded warning concept is preferred, as it allows you to nudge the motion vector when needed.

o Patrick related his experience taking over warnings on the 5/24/08 Oklahoma supercell. With two tornado warnings in effect for the same county, each labeled directionally (e.g., Eastern Noble County), he cancelled the western warning (essentially, the threat was advecting east out of the first warning and into the second). A television station only picked up on the cancellation headline, and not the continuation of the eastern warning mentioned in the text, so he had to issue another SVS one minute later to reiterate it. This was a good example of a non-meteorological factor affecting warning decisions. He feels the PW system would have handled this situation much better.
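
To make the swath-creation requests above more concrete, here is a minimal, hypothetical sketch of one way a probabilistic threat swath could be built: a circular threat area is advected along a motion vector, and the swath half-width grows at different rates on the left and right of the track to express asymmetric motion uncertainty. The names and the simple linear-growth model are assumptions for illustration, not the WDSSII warning tool’s actual method.

```python
import math

def threat_swath(x0, y0, radius_km, u_kmh, v_kmh, duration_h,
                 left_growth_kmh, right_growth_kmh, steps=12):
    """Return (x, y) vertices outlining a swath for an advected circular threat.

    The center moves with (u_kmh, v_kmh); the half-width grows at separate
    rates left and right of the track, e.g. to leave extra room for a
    right-deviating supercell.
    """
    speed = math.hypot(u_kmh, v_kmh)
    lx, ly = -v_kmh / speed, u_kmh / speed       # unit vector to the left of motion
    left_edge, right_edge = [], []
    for i in range(steps + 1):
        t = duration_h * i / steps
        cx, cy = x0 + u_kmh * t, y0 + v_kmh * t  # advected threat center
        half_left = radius_km + left_growth_kmh * t
        half_right = radius_km + right_growth_kmh * t
        left_edge.append((cx + lx * half_left, cy + ly * half_left))
        right_edge.append((cx - lx * half_right, cy - ly * half_right))
    return left_edge + right_edge[::-1]          # closed polygon outline

# Example: storm moving east-northeast at ~41 km/h for one hour, with the
# swath widening faster on the right flank than on the left.
swath = threat_swath(0.0, 0.0, 10.0, 40.0, 10.0, 1.0,
                     left_growth_kmh=5.0, right_growth_kmh=15.0)
```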

TRAVEL & EXPERIMENT LOGISTICS:

· Organizers made the experience very easy.

· Enjoyable.

· Nice mix of lecture and hands-on experience.

Patrick Burke (EWP Weekly Coordinator, 27-30 May)

Tags: None

Forecaster Thoughts – Kevin Brown (2008 Week 5)

I felt quite fortunate to be able to look at real-time CASA/PAR data sets. Although the amount of time and coverage of echoes was fairly limited, being able to see the rapid updates in real time was valuable. Along with the higher resolution of the CASA data, the increased frequency of volume scans from both CASA and PAR appears to be a challenge for operational forecasters. The faster updates do not allow a lot of time for base data interrogation/interpretation, so forecasters will need to be more selective in what data to interrogate. This is primarily a training issue, to varying degrees, for each forecaster.

On two different days, we were able to work with probabilistic warnings in real time, and from an operational forecaster’s perspective, I see great utility in this program. Currently, it can be quite difficult to get the overall thinking of the warning forecaster(s) across to users and partners. There are shades of uncertainty that cannot be conveyed with the warn/no-warn concept. Being able to issue probabilistic information should provide much more useful guidance to our partners and more sophisticated users. Conveying information probabilistically will allow some of our more advanced users to “get into the head of the warning forecaster”. During our probabilistic operations we mainly dealt with discrete supercells, and after a minimal amount of time, we became somewhat proficient at issuing single and even multiple threat probabilities. However, I could see it being more challenging with squall lines and LEWP events.

I enjoyed the time I spent in the EWP, and am grateful for being able to work with such talented scientists.

Kevin Brown (WFO Norman OK – Week 5 Participant)

Tags: None

Summary – 29 May 2008

Debrief of the event with Eric / Patrick (Team 1) and Kevin / Mark (Team 2).

Patrick and Eric started off with a low-probability tornado warning early on, just looking at tornado threats. Patrick would have issued a Tornado Warning around the time that the actual NWS warning came out. He felt that issuing the pre-warning low probabilities was very natural. Kevin says that this mirrors what happens in a real office as the forecasters discuss how confident they are feeling about issuing a warning.

The two teams did hand off one storm from one group to another around 2200-2215 UTC.

Playing back the tornado threat areas overlaid on the NWS Tornado Warnings: Eric believes that the grids make it easier to think in a storm-based mode. Kevin mentions that the NWS tornado warnings are probably for the entire storm (including hail / wind threat).

Note: Around 2257 UTC there may have been a time error on the southern set of storms. Check the data later.

There is an interesting example around 0000 UTC of a threat area at the border of three CWAs. We are also noting differences in how big the NWS polygons are drawn from CWA to CWA.

Patrick notes that he would be OK with some level of automation, especially for hail threats, and especially if we are issuing warnings for different threat types. He would not want the algorithm issuing the warning, but would like guidance that he could tweak and then issue.

Patrick notes that the low-probability threats they issued were based largely on the environmental conditions that the storms were developing in.

Eric thought that the workload was pretty heavy — he took over nine threats halfway through the event from Patrick. Patrick thought that the load was not too different than an event at the NWSFO.

Kevin says that the warningContourSource should either (a) advect with your warning or (b) overlay an outline of your cone shape that shrinks with time; either would help with management. There were too many circles, and it was hard to tell which belonged to which storm.

Mark would like to see the circles (initial threat areas) advect as well.

Eric let one expire by accident; he would like to see a situational awareness tool that draws the forecaster’s eye to the expiration.

Eric would like to see a way to group the threat areas together to change the direction on multiple areas at once.

Eric says that he really likes the ability to show low pre-warning probabilities, and Kevin agrees with this. It would be useful for downstream users. Eric likes that it lets the forecaster focus on the meteorology and science and divorce it from the policy.

Patrick thought that the knobology would be difficult to work with, but found that it did not seem much different from working in the NWS office. He thought he was able to manage the same workload in WDSSII (with some experience) as in the NWSFO.

Travis Smith (EWP Backup Weekly Coordinator, 27-30 May)

Tags: None