Summary. Having up-to-date, location-specific weather and travel information in a single point of reference for the daily commute can be a real time saver and convenience. This project uses real-time and forecast data from the Transport for London Unified API and the MET Office DataPoint weather API to build a Plotly Dash dashboard. The dashboard provides up-to-date travel information for several designated routes as well as real-time and forecast weather information for the local area. The dashboard is deployed online using Heroku.
Skills Used:
The steps followed to create a weather and travel app that would be useful to the end user were:
json_normalize(json_data['SiteRep']['DV']['Location']['Period'][0]['Rep'])
This yielded the weather forecast for the first period in the data retrieved, the appearance of which is shown in Figure 1.
Figure 1: Format of data displayed for first period.
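As a point of reference, a minimal sketch of how the forecast JSON might have been retrieved and flattened is given below. The URL follows the documented DataPoint pattern for 3-hourly site-specific forecasts, while the location ID and API key are placeholders rather than the project's actual values.
import requests
from pandas import json_normalize

location_id = '350001'  # placeholder site ID
api_key = 'your-datapoint-api-key'  # placeholder key
url = ('http://datapoint.metoffice.gov.uk/public/data/val/wxfcs/all/json/'
       f'{location_id}?res=3hourly&key={api_key}')

json_data = requests.get(url).json()  # decode the response into a dictionary
forecast_df = json_normalize(json_data['SiteRep']['DV']['Location']['Period'][0]['Rep'])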
The columns show the forecast data in terms of different variables, with the rows representing different times in the current three-hour period. To decode the column variables used, the contents of
json_normalize(json_data['SiteRep']['Wx']['Param'])
were used, as shown in Figure 2.
Figure 2: Column names and their meanings.
The units were added to the variable descriptions, and the column names of the dataframe shown previously were then renamed accordingly with
.rename(columns=weatherheader.set_index('name')['$'], inplace=True)
Weather codes stored as a dict allowed the codes in the 'Weather Type' variable of the dataframe to be translated into meaningful string descriptions; a sketch of both steps is given below.
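A minimal sketch of these two steps, assuming the dataframes above and a small subset of the DataPoint significant-weather codes (variable names and exact column labels are illustrative, not the project's exact code):
# Append units to each description, e.g. 'Temperature (C)', then rename the
# forecast columns from their short codes (e.g. 'T') to those descriptions.
weatherheader['$'] = weatherheader['$'] + ' (' + weatherheader['units'] + ')'
forecast_df.rename(columns=weatherheader.set_index('name')['$'], inplace=True)

# Illustrative subset of DataPoint significant-weather codes; replace() leaves
# any codes not present in the dict unchanged.
weather_codes = {'0': 'Clear night', '1': 'Sunny day', '3': 'Partly cloudy (day)',
                 '7': 'Cloudy', '8': 'Overcast', '12': 'Light rain', '15': 'Heavy rain'}
forecast_df['Weather Type'] = forecast_df['Weather Type'].replace(weather_codes)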
Transport for London's Unified API [2] was used to retrieve transport data. From the list of available requests on their website, the 'Gets the list of arrival predictions for the given stop point id' query seemed the most useful for the end user. However, this required the NaPTAN ID codes for each stop point of interest as input. These were found using the following steps:
which allow data in JSON format to be accessed; an illustrative sketch of such a request is given below. A similar procedure was followed to retrieve the relevant tube data.
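As an illustration only (the NaPTAN IDs and credentials used in the project are not reproduced here), a request for arrival predictions might look roughly like this:
import requests

stop_point_id = '490000001A'  # placeholder NaPTAN ID
params = {'app_key': 'your-tfl-app-key'}  # placeholder credentials

# 'Arrival predictions for the given stop point id' request from the Unified API
response = requests.get(f'https://api.tfl.gov.uk/StopPoint/{stop_point_id}/Arrivals',
                        params=params)
bus_json = response.json()  # decode the JSON response into Python objects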
The data from these requests was investigated in more detail to determine its structure and identify how to extract the elements of interest. The JSON output was decoded into a dictionary and then turned into a dataframe with json_normalize. The result was a dataframe with variables listed in columns and rows representing the different bus numbers and their direction (inbound or outbound with respect to central London). To identify the buses and directions of travel of interest, the dataframe was filtered on lineID (bus number) and destination (indicating direction of travel); a loop through the remaining rows then yielded each of the subsequent arrival times (a sketch follows the tube example below). For the tube data, a general status of the line of interest was identified using
json_result[0]['lineStatuses'][0]['statusSeverityDescription'] # status as string description
json_result[0]['lineStatuses'][0]['reason'] #reason
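The bus-arrival filtering and loop described above might look roughly like the sketch below; the bus number, destination string and variable names are illustrative assumptions, and the field names (lineId, destinationName, timeToStation, expectedArrival) are those typically returned by the arrivals request.
from pandas import json_normalize

arrivals = json_normalize(bus_json)  # one row per predicted arrival

# Filter on bus number and destination (values here are illustrative).
mask = (arrivals['lineId'] == '73') & arrivals['destinationName'].str.contains('Victoria')
of_interest = arrivals[mask].sort_values('timeToStation')

# Loop through the remaining rows to collect each subsequent arrival time.
arrival_times = [row['expectedArrival'] for _, row in of_interest.iterrows()]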
Plotly Dash [3] seemed an attractive choice for creating the dashboard for this project. Its key advantage is that the whole dashboard can be written in Python. Having retrieved the data and extracted the relevant information, the main task was to define the layout in app.layout. Much of the data is currently presented as text, which seemed the most sensible way to get things up and running, and interactivity is unlikely to be important for the end user. However, some of Dash's interactive capabilities were investigated by presenting the numeric weather information in a line graph, with the variable of interest selectable from a drop-down menu and shown on the y axis, and time (subsequent forecast periods) on the x axis. A slider was added to enable the user to move through subsequent three-hour forecast windows. The current format of the dashboard is shown in Figure 3, and a minimal layout sketch follows the figure. The dashboard was set up to refresh the information displayed on page reload.
Figure 3: Dashboard layout.
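A minimal sketch of this kind of layout and callback, assuming Dash 2.x imports and a hypothetical get_forecast_period helper that returns one three-hour window as a dataframe (component IDs and column names are likewise illustrative):
from dash import Dash, dcc, html, Input, Output
import plotly.express as px

app = Dash(__name__)
server = app.server  # the underlying Flask server, exposed for deployment

numeric_vars = ['Temperature (C)', 'Wind Speed (mph)', 'Precipitation Probability (%)']

app.layout = html.Div([
    html.H2('Weather and travel dashboard'),
    dcc.Dropdown(id='variable', value=numeric_vars[0],
                 options=[{'label': v, 'value': v} for v in numeric_vars]),
    dcc.Graph(id='forecast-graph'),
    dcc.Slider(id='period', min=0, max=4, step=1, value=0),  # three-hour windows
])

@app.callback(Output('forecast-graph', 'figure'),
              Input('variable', 'value'), Input('period', 'value'))
def update_graph(variable, period):
    df = get_forecast_period(period)  # hypothetical helper returning one forecast window
    return px.line(df, x='time', y=variable)

if __name__ == '__main__':
    app.run_server(debug=True)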
Heroku [4] was used to deploy the app on the web. This required a free Heroku account, and for Python 3.6, pipenv and Postgres to be installed locally. The Heroku command line interface was installed on Linux using
sudo snap install heroku --classic
Running
heroku login
prompts for the login details. The app was deployed to Heroku using
heroku create
git push heroku master
Different numbers of instances of the app can then be run using
heroku ps:scale web=number_of_instances
Heroku recognises that it is dealing with a Python app from the existence of a requirements.txt file. Requirements for the app are managed using pipenv. For debugging and development, a local version of the app can be run using
heroku local web
which uses the local Procfile to find what to run (an example is given at the end of this section). The app is then available at http://localhost:5000. To push changes to Heroku, the following is used:
git add .
git commit -m "Demo"
git push heroku master
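For reference, the Procfile mentioned above typically contains a single web process entry for a Dash app; the module name (app) and the exposed server variable are assumptions, with gunicorn being the conventional choice of web server for Heroku deployments:
web: gunicorn app:server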