As the clean-up continues on the eastern seaboard, I wanted to follow up on Monday’s post on tracking Hurricane Sandy with Open Data with a couple of other R-based data applications spawned by the storm.
Josef Fruehwald created an R script to tap into local weather sensors to keep track of air pressure, wind speed and rainfall near his home and for other affected locations. Check out the plunge in barometric pressure as the centre of the storm passed near Atlantic City on Monday night, and then over Philadelphia a couple of hours later:
The greater scale of destruction in Atlantic City compared to Philadelphia is reflected in this comparison of wind speeds over the same period:
Josef used R to download the data from Weather Underground station records (like this one) in XML format, and then used the ggplot2 package to create the charts. You can find the complete R script at RPubs: Analysis of a Philadelphia Weather Station Data during Hurricane Sandy.
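The general approach can be sketched in a few lines of R. This is not Josef's actual script (see the RPubs link for that); it's a minimal, self-contained illustration that parses a small inline XML snippet, shaped like a hypothetical station feed, and plots pressure over time with ggplot2. The element names (`observation`, `time`, `pressure_mb`) are assumptions for the example, not the real Weather Underground schema.

```r
# Minimal sketch: parse station observations from inline XML, then plot.
library(XML)      # xmlParse / xpathSApply for XML extraction
library(ggplot2)  # plotting

# Hypothetical sample data in the shape of a station observation feed
obs_xml <- '
<observations>
  <observation><time>2012-10-29 18:00</time><pressure_mb>970</pressure_mb></observation>
  <observation><time>2012-10-29 20:00</time><pressure_mb>952</pressure_mb></observation>
  <observation><time>2012-10-29 22:00</time><pressure_mb>960</pressure_mb></observation>
</observations>'

doc <- xmlParse(obs_xml)
obs <- data.frame(
  time     = as.POSIXct(xpathSApply(doc, "//observation/time", xmlValue)),
  pressure = as.numeric(xpathSApply(doc, "//observation/pressure_mb", xmlValue))
)

# Pressure trace over time, as in the barometric-pressure chart above
ggplot(obs, aes(time, pressure)) +
  geom_line() +
  labs(x = "Time", y = "Barometric pressure (mb)")
```

In the real script, the data frame would be built from the downloaded station XML rather than an inline string, but the parse-then-plot shape is the same.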
Twitter’s Ed Chen took a different look at the effects of Sandy: how the storm knocked out power around the NY region, and the race by the utilities to restore it. Where did he find such data? Tweets from Twitter users, of course! Geolocated tweets (from smartphones, presumably) along the lines of “I’ve just lost power!”, “Still no power here!” and “The power is back on!” gave Ed the information he needed to use R to create an animation over time of the power going out and coming back on across NYC.
The above is just a snapshot; visit Ed Chen’s blog for an animated version, with a slider to see the changes during the night of the storm.
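The core idea can be sketched without Twitter access at all. The snippet below is an assumed illustration, not Ed's code: it classifies a few hypothetical tweets as outage or restoration reports by keyword and bins them by hour, which is the kind of table each frame of such an animation would draw from.

```r
# Minimal sketch: classify hypothetical power-related tweets by keyword,
# then count outage vs. restoration reports per hour.
tweets <- data.frame(
  time = as.POSIXct(c("2012-10-29 20:05", "2012-10-29 21:40", "2012-10-30 09:15")),
  text = c("I've just lost power!", "Still no power here!", "The power is back on!"),
  stringsAsFactors = FALSE
)

# "back on" signals restoration; the other power mentions signal an outage
tweets$status <- ifelse(grepl("back on", tweets$text, ignore.case = TRUE),
                        "restored", "out")

# Hourly counts of each status; an animation would render one bin per frame,
# plotting each tweet at its geolocated coordinates
counts <- table(cut(tweets$time, "1 hour"), tweets$status)
```

A real pipeline would also need the tweets' latitude and longitude to place each point on the map, but the classify-and-bin step is the heart of the time dimension.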