NEWS.md
* `get_storms` and `get_storm_data` have been rewritten to use package crul's asynchronous features. This makes little difference for `get_storms` (it may even be slightly slower), but the difference for `get_storm_data` should be very noticeable. The NHC archives limit requests to 80 per 10 seconds; both functions now send 4 links every 0.5 seconds to stay under this limit. Timeout issues should no longer occur, so the options `rrricanes.http_attempts` and `rrricanes.http_timeout` have been removed. The primary cause of long processing times is now scraping, particularly of the fstadv products; the amount of data in these products and their unstructured nature require a number of parsing rules. This can probably be simplified in future releases. (#94)
* `load_storm_data` now accepts `readr::read_csv` parameters.
* `Key` variable added to `discus` dataframes. `Key` will be NA for all cyclones >= 2005. Should not be <= 2006. (#80)
* `Adv` variable removed from `posest` dataframes; position estimates do not have advisory numbers. (#81)
* `Adv` variable removed from `update` dataframes; updates do not have advisory numbers. (#84)
* `Key` added to `get_public` dataframes. (#85)
* `Key` added to `get_update` dataframes. (#86)
* `get_fstadv`: forecast hours 48 and 72 only have 34 and 50kt wind fields; hours 96 and 120 have none. (#89)
* `gis_advisory`: typically includes current and past track data, forecast track data, forecast cone (margin of error) and wind radius data.
* `gis_breakpoints`: list of breakpoints typically used for watch/warning areas, though this is not a requirement.
* `gis_latest`: retrieves the latest GIS products for all active storms.
* `gis_outlook`: retrieves the latest tropical weather outlook in shapefile format.
* `gis_prob_storm_surge`: probabilistic storm surge; a polygon dataset for psurge and esurge products with various criteria.
* `gis_windfield`: wind radius datasets.
* `gis_wsp`: wind speed probabilities.
* `gis_download`: downloads the URLs returned by the functions above (see the sketch after this list).
* `shp_to_df` added to convert lines and polygons spatial dataframes to dataframes. Points dataframes can be converted with `tibble::as_data_frame` (target the `@data` object).
* pkgdown documentation site added.
* `load_storm_data` directly returns dataframes, and retrieval by basin and years has been removed in favor of importing complete product datasets. Documentation has also been added to the website on using data.world as a third option: `load_storm_data` returns complete datasets, while data.world lets users write custom queries to retrieve data. (#76)
* `al_prblty_stations`, `cp_prblty_stations` and `ep_prblty_stations` may be removed in a future release. (#46)
* `rrricanes.http_sleep` added to control the time to sleep between multiple HTTP requests.
* `get_fstadv`, `get_prblty`, `get_wndprb`, `tidy_fstadv`, `tidy_wr`, `tidy_fcst` and `tidy_fcst_wr`.
* `tidy_fcst` and `tidy_fcst_wr` would err if not all forecast periods were available for a cyclone. The functions now analyze the dataframe to determine which forecast fields exist, then tidy based on the result. (#73)
* Changed name from Hurricanes to rrricanes.
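As a rough illustration of how the new GIS helpers fit together, here is a minimal sketch; the `key` and `advisory` arguments to `gis_advisory()`, the example storm Key, and the layer name are assumptions for illustration, not confirmed signatures or outputs.

```r
library(rrricanes)

# Assumed: gis_advisory() takes a storm Key (and, here, an advisory number)
# and returns URLs to the advisory GIS packages for that storm.
adv_urls <- gis_advisory(key = "AL092017", advisory = "20")

# gis_download() downloads the URLs returned by the gis_* functions,
# returning spatial dataframes.
adv_gis <- gis_download(adv_urls)

# shp_to_df() converts lines/polygons spatial dataframes to plain dataframes;
# points layers can instead be converted with tibble::as_data_frame(x@data).
# The layer name below is hypothetical.
fcst_track <- shp_to_df(adv_gis$forecast_track_lin)
```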
get_storm_data can now be chained to other commands and returns a list of dataframes.
load_storm_data accesses pre-scraped datasets from the GitHub repo rrricanesdata and returns the requested products. This makes getting data much quicker. It should not be relied on for the most immediate data on current storms, though it should be fairly up to date; the original scraping functions can be used when immediate data access is needed.
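A minimal sketch of the two access paths, assuming `load_storm_data()` takes a product name (and, per the notes above, forwards extra arguments to `readr::read_csv`) and that `get_storms()` takes `years` and `basins`; treat these argument names as assumptions.

```r
library(rrricanes)

# Fast path: pre-scraped datasets from the rrricanesdata GitHub repo.
# May lag slightly behind the very latest advisories.
fstadv <- load_storm_data("fstadv")

# Immediate path: scrape the NHC archives directly (slower, but current).
current_al <- get_storms(years = 2017, basins = "AL")
```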
`saffir` returns the abbreviated Saffir-Simpson classification of tropical cyclones.
`status_abbr_to_str` converts storm status abbreviations (e.g., TD, TS, HU) to strings.
`twoal` and `twoep` parse tropical weather outlook XML files, giving the current status, if any, of areas of interest in either basin.
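A short sketch of these helpers; the inputs assumed here (wind speeds in knots for `saffir()`, a status abbreviation for `status_abbr_to_str()`, and no arguments for `twoal()`) are assumptions based on the descriptions above.

```r
library(rrricanes)

# Abbreviated Saffir-Simpson classification from wind speed (assumed knots).
saffir(c(30, 65, 120))

# Expand a status abbreviation to its full string.
status_abbr_to_str("TS")

# Latest tropical weather outlook for the Atlantic basin (assumed no arguments).
outlook_al <- twoal()
```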
* `tidy_fstadv`, `tidy_wr`, `tidy_fcst` and `tidy_fcst_wr` have been added to replace the now-removed `fstadv_split()` (see the sketch after this list).
* `rrricanes.http_timeout` and `rrricanes.http_attempts` added to give users more control over HTTP timeouts and retry attempts. The default is 3 attempts, with no more than 5 permitted.
* `get_storms` generated an `xpath_element` error on some Linux distros. Corrected. (#67)
* `get_storm_data`: replaced the `css` parameter in `rvest::html_nodes` calls with the `xpath` parameter. Some products (notably `get_prblty`) do not have a "pre" tag but are plain text documents (not HTML); `scrape_contents` was modified to return the full contents if no "pre" tag exists. Tested `get_discus` and `get_public`; no errors generated. (#68)
* Retrieve all storms for a given year (>= 1998) and access data from a given storm's history, including "current" storm position, structure details, forecasts, and discussions.
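As mentioned in the first item above, the `tidy_*` functions replace `fstadv_split()`. A minimal sketch, assuming each takes the wide `fstadv` dataframe returned by `get_fstadv()`; the comments describe my reading of what each split contains, not documented behaviour.

```r
library(rrricanes)

# Wide fstadv dataframe for one storm; the link below is a placeholder.
fstadv <- get_fstadv("<storm archive link>")

fstadv_core <- tidy_fstadv(fstadv)   # current position/structure data
fstadv_wr   <- tidy_wr(fstadv)       # wind radius data
fstadv_fcst <- tidy_fcst(fstadv)     # forecast positions
fstadv_fwr  <- tidy_fcst_wr(fstadv)  # forecast wind radii
```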
This release should be considered beta. While I’ve made every effort to ensure quality, there may be an issue here or there. I will work on developing QA/QC scripts as time permits.
Please send any issues or questions to: https://github.com/timtrice/Hurricanes/issues.
Use `get_storm_data` to access one or multiple products for a specific storm (see the sketch after the product descriptions below).
* `discus`: Not parsed, but contains technical information on the cyclone, development tendencies and forecast model tendencies.
* `fstadv`: Contains the meat of the data: current storm information, forecast information, wind and sea data. `fstadv_split()` can be used to break the wide dataframe into multiple relational dataframes.
* `posest`: Contains the current position estimate for a given storm. Usually issued during threats to land; not issued for all storms. Not parsed.
* `prblty`: Strike probabilities for given locations prior to 2006 (see Wind Speed Probabilities for >= 2006).
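A sketch of pulling multiple products for one storm, as referenced above; the `products` argument name, the column names (`Name`, `Link`), and the example storm are assumptions used only for illustration.

```r
library(rrricanes)
library(dplyr)

# Assumed flow: list storms for a year, filter to one storm, then pass its
# archive link to get_storm_data() with the products wanted.
ike <- get_storms(years = 2008, basins = "AL") %>%
  filter(Name == "Ike") %>%   # "Name" column assumed
  pull(Link) %>%              # "Link" column assumed
  get_storm_data(products = c("discus", "fstadv"))

# A list of dataframes, one per requested product.
names(ike)
```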