The implementation of the Supplemental Adaptive Intra-Volume Low-Level Scan (SAILS) on the 88D radars presented a problem for the WDSS-II ingestor ldm2netcdf, because the ingestor relied on VCP definitions stored in XML configuration files. Those XML files defined which elevation corresponded to each tilt. However, SAILS can insert a supplemental 0.5-degree scan into the existing VCP at any time. With no changes, ldm2netcdf would incorrectly label the new 0.5-degree tilt as the next tilt expected by its VCP XML file.
To solve this problem, ldm2netcdf now processes Message 5 in the Level-II data stream (the RDA Volume Coverage Data) to map each incoming tilt to the correct elevation. The new 0.5-degree elevations get correctly labeled and saved just like any other 0.5-degree tilt.
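The idea can be sketched in a few lines of Python (hypothetical names; this is not the ldm2netcdf source): tilt labels come from the elevation list carried with the volume itself, as in Message 5, rather than from a static per-VCP table, so an inserted SAILS scan is labeled by what it actually is.

```python
# Sketch (hypothetical, not WDSS-II code): label incoming tilts using the
# per-volume elevation list (as carried in Level-II Message 5) instead of
# a static VCP lookup table.

def label_tilts(message5_elevs, incoming_tilts):
    """Pair each incoming tilt with the elevation Message 5 reports for it.

    message5_elevs: elevation angles (degrees) in scan order, including
                    any SAILS insert, e.g. [0.5, 0.9, 1.3, 0.5, 1.8, ...]
    incoming_tilts: per-tilt payloads in arrival order.
    """
    return list(zip(message5_elevs, incoming_tilts))

# A volume with a SAILS 0.5-degree scan inserted mid-volume:
elevs = [0.5, 0.9, 1.3, 0.5, 1.8, 2.4]
tilts = ["t0", "t1", "t2", "t3_sails", "t4", "t5"]
labeled = label_tilts(elevs, tilts)

# The SAILS tilt is labeled 0.5, not the 1.8 that a static
# VCP table would have predicted as the next expected tilt:
assert labeled[3] == (0.5, "t3_sails")
```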
Algorithms listening to 0.5-degree elevations will be notified of these new tilts as usual. Algorithms that listen to all tilts will insert them into the constantly updating virtual volume as the latest data for that elevation. So, with this change to ldm2netcdf, downstream algorithms such as w2qcnndp, w2vil, and w2merger handle the SAILS tilts transparently.
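The virtual-volume behavior can be illustrated with a minimal sketch (hypothetical structure, not the WDSS-II implementation): the volume is keyed by elevation angle, and the newest tilt at each angle simply replaces the previous one, so a SAILS scan refreshes the 0.5-degree entry.

```python
# Sketch (hypothetical, not WDSS-II code): a virtual volume keyed by
# elevation angle, where the newest tilt at each angle replaces the
# older one -- a SAILS scan simply refreshes the 0.5-degree entry.

virtual_volume = {}  # elevation (degrees) -> most recent tilt

def insert_tilt(elevation, tilt):
    virtual_volume[elevation] = tilt  # newest data wins

for elev, tilt in [(0.5, "scan-A"), (1.5, "scan-B"), (0.5, "sails-C")]:
    insert_tilt(elev, tilt)

# The SAILS tilt has replaced the earlier 0.5-degree scan,
# and the volume still holds one tilt per elevation:
assert virtual_volume[0.5] == "sails-C"
assert sorted(virtual_volume) == [0.5, 1.5]
```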
If you do not want the SAILS elevation to be inserted into the data stream, you can specify the ‘-e’ option on the command line of ldm2netcdf to separate out the extra SAILS tilts. The SAILS tilts will then be saved into a separate directory, such as Reflectivity_SAILS or AliasedVelocity_SAILS. We do not recommend this, as you would essentially be throwing away the extra information.
Finally, we took this opportunity to eliminate some outdated command-line options in ldm2netcdf. First is the ‘-D’ option for dealiasing: the dealiasing code in ldm2netcdf is very old, and the dealias2d command provides much better results. Second is the ‘-c’ option for compositing, since w2vil does a much better job of creating composites.
The new changes are being tested and will be rolled out when all the kinks are worked out.
In the past, creating a volumetric product in WDSS-II required knowing the VCP used by the radar. This was not a problem for users working with data from the WSR-88D network, but those using data from outside that network needed a few extra steps, including creating a “fake” VCP file that contained the levels at which the radar had scanned.
However, the WSR-88D network recently added two new concepts to its scanning strategies. The Automated Volume Scan Evaluation and Termination (AVSET) concept allows a site to skip higher-elevation scans when no storms are detected. The Supplemental Adaptive Intra-Volume Low-Level Scan (SAILS) gives radars in the 88D network the capability of adding a supplemental 0.5-degree scan at any time.
While AVSET and SAILS have many advantages, their combination has made using the VCP of a radar to help build volumetric products unreliable. Therefore, rather than depending on the VCP to build virtual volumes, we have taken the VCP dependence out of all of our products. This means that when working with data from outside of the WSR-88D network, including data from outside the US, users no longer need to create these “fake” VCP files, nor does the VCP need to be defined in the data. Users simply need to be sure that an appropriate expiry time for each scan is specified (using the ExpiryInterval attribute in the netcdf files) to ensure that old data ages off in a timely fashion.
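The expiry-based approach can be sketched as follows (hypothetical names, illustrating the role of the ExpiryInterval attribute mentioned above): the virtual volume keeps the newest unexpired scan at each elevation, with no reference to any VCP.

```python
# Sketch (hypothetical, not WDSS-II code): a VCP-free virtual volume
# where scans age off via a per-scan expiry interval, in the spirit of
# the ExpiryInterval netcdf attribute.

def current_volume(scans, now):
    """Keep the newest unexpired scan at each elevation.

    scans: list of (elevation_deg, scan_time_s, expiry_interval_s, data),
           in arrival order.
    now:   current time in seconds.
    """
    volume = {}
    for elev, t, expiry, data in scans:
        if now - t < expiry:      # scan has not yet aged off
            volume[elev] = data   # newest arrival at this elevation wins
    return volume

scans = [
    (0.5, 100.0, 300.0, "old-0.5"),   # too old by now=500; ages off
    (1.5, 250.0, 300.0, "mid-1.5"),   # still fresh
    (0.5, 400.0, 300.0, "new-0.5"),   # replaces the old 0.5 scan
]
vol = current_volume(scans, now=500.0)
assert vol == {0.5: "new-0.5", 1.5: "mid-1.5"}
```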
One of the best ways to improve the quality of the Rotation Tracks products is to apply spatial QC using hysteresis and temporal QC using Multiple Hypothesis Tracking.
Unfortunately, this used to be quite slow. An hour of azimuthal shear data covering the CONUS could take as much as two hours to process. Therefore, it was used only in research studies and off-line, but not to produce the post-event rotation tracks that you can download from http://ondemand.nssl.noaa.gov/
w2accumulator’s ‘-Q’ option now supports two highly optimized special cases. You can specify that the number of hypotheses is 1 (meaning keep only the best track, and not bother with the second-best, third-best, etc.) or that the algorithm should retain all potential tracks (by specifying -1 for the number of hypotheses). These are the values you are most likely to want, and with them the algorithm runs 20x faster. Yup, you can now process an hour of data in about 6 minutes.
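Why are these two cases so much cheaper? A toy 1-D sketch (hypothetical, not the w2accumulator MHT code) shows the single-hypothesis case: each track greedily commits to its nearest detection, so no alternative association histories ever need to be scored or carried.

```python
# Sketch (hypothetical, not the w2accumulator MHT code): single-hypothesis
# association.  Each track greedily grabs its nearest detection; leftover
# detections start new tracks.  No alternative hypotheses are scored,
# which is what makes the one-hypothesis case so fast.

def associate_greedy(tracks, detections, max_dist=5.0):
    """One-hypothesis association on 1-D positions."""
    unused = list(detections)
    for track in tracks:
        if not unused:
            break
        best = min(unused, key=lambda d: abs(d - track[-1]))
        if abs(best - track[-1]) <= max_dist:
            track.append(best)      # commit: the best match is kept
            unused.remove(best)     # and never reconsidered
    tracks.extend([[d] for d in unused])  # unmatched detections -> new tracks
    return tracks

tracks = [[10.0], [50.0]]
tracks = associate_greedy(tracks, [12.0, 49.0, 80.0])
assert tracks == [[10.0, 12.0], [50.0, 49.0], [80.0]]
```

With k hypotheses for 1 < k < infinity, the algorithm must instead rank and carry k alternative association histories per track, which is where the extra cost comes from.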
(Chart: CPU time, in microseconds, for each setting of the -Q option.)
You used to have only the first two options for -Q available. Now, you have two more, and these two “special” values are highly optimized.
What’s the impact of these options? (Open the images in different tabs in your browser and switch between them so that you can see the differences between the last two images more readily)
For more details about MHT-QC and its application to rotation tracks products, please see these scientific articles:
w2qcnndp is a WDSS-II algorithm that employs polarimetric moments to perform quality control on weather radar reflectivity data. Single-pol QC (via w2qcnn) has lots of problems distinguishing bioscatter from light precipitation, but polarimetric moments (the variance of Zdr, especially) help w2qcnndp outperform w2qcnn at removing bioscatter. If you are using w2qcnn on US weather radar data from after the polarimetric upgrade, you should definitely switch to w2qcnndp.
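The key polarimetric cue can be illustrated with a minimal sketch (hypothetical thresholds; the real w2qcnndp uses a neural network over many moments): bioscatter produces a much noisier Zdr field than precipitation, so high local Zdr variance flags non-meteorological echo.

```python
# Sketch (hypothetical thresholds, not the w2qcnndp neural network):
# bioscatter has an erratic Zdr field, precipitation a smooth one, so
# local Zdr variance separates the two.

def zdr_variance(window):
    """Population variance of Zdr values in a local window."""
    mean = sum(window) / len(window)
    return sum((z - mean) ** 2 for z in window) / len(window)

def looks_like_bioscatter(zdr_window, var_threshold=4.0):
    """Illustrative single-moment check; the real QC combines many cues."""
    return zdr_variance(zdr_window) > var_threshold

rain_zdr = [0.4, 0.6, 0.5, 0.7, 0.5]    # smooth Zdr: precipitation
bird_zdr = [6.0, -1.0, 4.5, 0.2, 7.5]   # erratic Zdr: bioscatter
assert not looks_like_bioscatter(rain_zdr)
assert looks_like_bioscatter(bird_zdr)
```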
w2qcnndp used to have lots of problems with electronic interference and with sun spikes. Recently, we added two modules to w2qcnndp to address these (they are on by default; use -m “-sunstrobe -electronicinterference” to turn them off). Here are a couple of examples to show the impact of these modules.
First, electronic interference:
w2qcnndp used to have problems with sun spikes (sun strobes) that were connected to valid echoes, but this has been improved:
For more details about w2qcnndp, please refer to the following two scientific articles: