Introduction
Living on the Colorado Front Range in summer means dealing with a chance of thunderstorms almost every afternoon, some of which can become severe.
In this post I’ll go over how I mapped the severe thunderstorm risk outlook from the NOAA Storm Prediction Center, using R and leaflet.
Data
The day 1 (i.e., today) severe thunderstorm risk (Convective Outlook) can be found on the NOAA Storm Prediction Center website. This website provides a map where you can toggle a bunch of layers on/off, but you cannot zoom in/out to get a more detailed view of where the warning areas fall relative to smaller cities. Depending on where the risk falls, it can sometimes be difficult to tell exactly where the boundaries are, or what category your home town is in. My goal was to view the data in an interactive map where I can zoom in/out and see more detail. There are two ways of doing this:
View the data in Google Earth: Above the map there is an icon you can click to download a KMZ file that can be opened in Google Earth.
Download the data as shapefiles (a link is provided above the map next to the KML download) and process them into a map.
In this post I am focusing on working with the shapefiles and creating an interactive map using R and leaflet.
Downloading the data
The shapefiles can be downloaded manually by clicking the link on the website, but the link changes whenever the forecast is updated. I wanted to do this programmatically and automatically find the link for the latest forecast without having to copy and paste it each time I run the code. I experimented with using SelectorGadget to isolate the link, but found it easier to create a list of all the links on the page using the rvest (Wickham 2022) package and then find the one pointing to the shapefile (it ends in .shp.zip).
Code
base_url <- "https://www.spc.noaa.gov"
# Define a function that returns all hyperlinks on a webpage
get_all_links <- function(page_url){
  # read the html from the page
  html <- rvest::read_html(page_url)
  # return a character vector of all the hyperlinks on the page
  rvest::html_attr(rvest::html_elements(html, "a"), "href")
}
links <- get_all_links(page_url=paste0(base_url,"/products/outlook/day1otlk.html"))
# find the link for the shapefile (the only one that ends in 'shp.zip')
shp_link <- links[which(stringr::str_ends(links,'shp.zip'))]
shp_url <- paste0(base_url,shp_link)
print(paste0('The latest shapefile as of ', Sys.time(), ' is ', shp_url))
[1] "The latest shapefile as of 2023-07-06 10:20:49 is https://www.spc.noaa.gov/products/outlook/archive/2023/day1otlk_20230706_1300-shp.zip"
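Scraping like this can break silently if the page layout ever changes, so it may be worth adding a quick sanity check before moving on. Here is a minimal sketch (the error message is just illustrative):
Code
# stop early if the scrape did not find exactly one shapefile link
if (length(shp_link) != 1) {
  stop("Expected exactly one link ending in 'shp.zip', found ", length(shp_link))
}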
Code
# filename of shapefile
shp_fname <- basename(shp_url)
#print(shp_fname)
# base filename (remove *-shp.zip*) to use to load files later
basefname <- stringr::str_remove(shp_fname,"-shp.zip")
#print(basefname)
Now that we have the link for the latest forecast, we can download the zip file and unzip it.
The unzipped folder contains shapefiles for the tornado, wind, and hail threats, but I will focus on just the categorical risk for severe thunderstorms (this is what you have probably seen in the weather forecast on the news). The shapefile I am interested in ends in cat.shp.
Then we can use the sf (Pebesma 2018) package to read the shapefile into R.
Code
# destination to save downloaded file to (create the data folder if needed)
dir.create(file.path('.','data'), showWarnings = FALSE)
dest_file <- file.path('.','data',shp_fname)
# download the zip file containing shapefiles
download.file(url=shp_url,destfile = dest_file, method="curl",quiet = TRUE)
# unzip into a separate folder using base filename
unzip(dest_file,exdir = file.path('.','data',basefname) )
# read the categorical risk shapefile into R with the sf package
dat <- sf::st_read(file.path('.','data',basefname,paste0(basefname,'_cat.shp')))
Reading layer `day1otlk_20230706_1300_cat' from data source
`/Users/andy/Projects/Andy_Pickering_Portfolio/posts/Storm_Prediction_Center/data/day1otlk_20230706_1300/day1otlk_20230706_1300_cat.shp'
using driver `ESRI Shapefile'
Simple feature collection with 4 features and 8 fields
Geometry type: POLYGON
Dimension: XY
Bounding box: xmin: -123.85 ymin: 24.169 xmax: -67.471 ymax: 49.561
Geodetic CRS: WGS 84
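As an aside, if you want to see the other hazard layers mentioned above, you can list every shapefile in the unzipped folder with base R. A quick sketch (the exact file names depend on the forecast you downloaded):
Code
# list all shapefiles in the unzipped folder; besides the categorical
# risk (_cat.shp) there are layers for the tornado, wind, and hail threats
list.files(file.path('.','data',basefname), pattern = "\\.shp$")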
Examine the object extracted from the shapefile:
Code
class(dat)
[1] "sf" "data.frame"
Code
dat
Simple feature collection with 4 features and 8 fields
Geometry type: POLYGON
Dimension: XY
Bounding box: xmin: -123.85 ymin: 24.169 xmax: -67.471 ymax: 49.561
Geodetic CRS: WGS 84
DN VALID EXPIRE ISSUE LABEL LABEL2
1 2 202307061300 202307071200 202307061244 TSTM General Thunderstorms Risk
2 3 202307061300 202307071200 202307061244 MRGL Marginal Risk
3 4 202307061300 202307071200 202307061244 SLGT Slight Risk
4 5 202307061300 202307071200 202307061244 ENH Enhanced Risk
stroke fill geometry
1 #55BB55 #C1E9C1 POLYGON ((-92.46012 48.6844...
2 #005500 #66A366 POLYGON ((-109.69 47.18, -1...
3 #DDAA00 #FFE066 POLYGON ((-105.19 45.5, -10...
4 #FF6600 #FFA366 POLYGON ((-102.51 39.91, -9...
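Since part of my motivation was telling exactly what category a given town falls in, we can also query the sf object directly. Here is a sketch using sf::st_intersects with an example point on the Front Range (the coordinates are purely illustrative; substitute your own longitude and latitude):
Code
# build a point (longitude, latitude) in the same CRS as the outlook (WGS 84)
pt <- sf::st_sfc(sf::st_point(c(-105.08, 40.57)), crs = 4326)
# logical vector indicating which risk polygons contain the point
hits <- sf::st_intersects(dat, pt, sparse = FALSE)[, 1]
# risk categories covering that location
dat$LABEL2[hits]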
Mapping the data
Now that we have the shapefile converted into an R object, we can make our map. I’ll be using the leaflet (Cheng, Karambelkar, and Xie 2023) package to create a nice interactive map.
- One caveat is that the number of categories is not constant; it can vary (up to 5 categories) depending on the forecast. So we need to use a for loop to plot whichever categories are present (there is a row in the sf object for each risk category present in the forecast).
- We now have an interactive map (try zooming in/out and hovering over the different areas with the cursor!).
- NOTE: this map shows the forecast at the time this post was rendered, not when you are reading it.
Code
#| fig-cap: Map Showing Severe Weather Prediction Risk
# extract bounding box values from the shapefile to set map bounds
bb <- as.list(sf::st_bbox(dat))
# make base map
m <- leaflet() %>%
  addTiles()
# add a polygon layer for each risk category present in the forecast
for (i in seq_len(nrow(dat))){
  m <- addPolygons(map = m, data = dat$geometry[i],
                   color = dat$fill[i],
                   label = dat$LABEL2[i])
}
m <- m %>%
setMaxBounds(lng1 = bb$xmin,lng2 = bb$xmax,lat1 = bb$ymin, lat2=bb$ymax) %>%
addLegend(labels=dat$LABEL2,colors = dat$fill)
m
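Because leaflet maps are htmlwidgets, you could also save the map as a standalone HTML file to share or revisit later. A minimal sketch (the file name is arbitrary):
Code
# save the interactive map as a self-contained HTML file
htmlwidgets::saveWidget(m, file = "day1_outlook_map.html", selfcontained = TRUE)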
SessionInfo
Code
R version 4.2.3 (2023-03-15)
Platform: x86_64-apple-darwin17.0 (64-bit)
Running under: macOS Big Sur ... 10.16
Matrix products: default
BLAS: /Library/Frameworks/R.framework/Versions/4.2/Resources/lib/libRblas.0.dylib
LAPACK: /Library/Frameworks/R.framework/Versions/4.2/Resources/lib/libRlapack.dylib
locale:
[1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8
attached base packages:
[1] stats graphics grDevices datasets utils methods base
other attached packages:
[1] htmltools_0.5.5 stringr_1.5.0 rvest_1.0.3 leaflet_2.1.2
[5] sf_1.0-13
loaded via a namespace (and not attached):
[1] Rcpp_1.0.10 compiler_4.2.3 pillar_1.9.0 class_7.3-22
[5] tools_4.2.3 digest_0.6.31 jsonlite_1.8.5 evaluate_0.21
[9] lifecycle_1.0.3 tibble_3.2.1 pkgconfig_2.0.3 rlang_1.1.1
[13] DBI_1.1.3 cli_3.6.1 rstudioapi_0.14 crosstalk_1.2.0
[17] curl_5.0.1 yaml_2.3.7 xfun_0.39 fastmap_1.1.1
[21] e1071_1.7-13 xml2_1.3.4 httr_1.4.6 dplyr_1.1.2
[25] knitr_1.43 generics_0.1.3 vctrs_0.6.2 htmlwidgets_1.6.2
[29] classInt_0.4-9 grid_4.2.3 tidyselect_1.2.0 glue_1.6.2
[33] R6_2.5.1 fansi_1.0.4 rmarkdown_2.22 selectr_0.4-2
[37] magrittr_2.0.3 ellipsis_0.3.2 units_0.8-2 renv_0.17.3
[41] KernSmooth_2.23-21 utf8_1.2.3 stringi_1.7.12 proxy_0.4-27