Behavioral Design and the Air Quality Dashboard

by Chloe Brown

July 17, 2019

Air pollution in Pittsburgh has come a long way since the days when smoke could blot out the midday sun. Yet for many in the region, clean air remains a concern. Following the Christmas Eve fire at the Clairton Coke Works, the Allegheny County Health Department approached CountyStat about developing a new dashboard to make the department’s hourly air quality data more accessible to the general public.

But while the data was already publicly available, developing such a tool still posed several challenges. First, the topic itself is complex. Fitting all the context of environmental regulations, evaluation periods, and health risks for each pollutant into one slim dashboard would be no easy feat. Second, the data from the county’s monitors only shows what has happened, rather than predicting what will occur. We wanted viewers who were used to seeing AirNow forecasts on the Health Department’s website to be able to understand why numbers might be different. And finally, we needed one tool to serve the needs of many types of users, from researchers and advocates who focus on these issues every day, to a concerned parent who wants to check the air quality before taking their child to the park.

As CountyStat set out to create an air quality dashboard that would be engaging, easy to use, and accessible to all, we used principles of behavioral design. In practice, this broke down into two goals for the final product: (1) to reduce the cognitive load, or amount of mental effort it takes for a person to find out what they want to know; and (2) to make the user experience pleasant enough to encourage people to interact with it again and again. After all, what good is a tool that no one wants to use?

Below are the lessons we learned – and the strategies they inspired – along the way.

 

  1. Leverage scales to make numbers easier to understand.

A problem with the original air quality data was that each pollutant was reported in different units and posed health risks at different thresholds. If our goal was to make something user-friendly, but we forced people to first learn what “μg/m³” means before they could find out whether the air today is good or bad… we wouldn’t exactly be off to a great start.

Luckily, the EPA has created a tool to solve this exact issue. The Air Quality Index (AQI) is a scale from 0 to 500 that standardizes air pollution across types; zero is good, 500 is bad, and 100 typically marks the point at which some groups may start experiencing health effects. To make things even easier, the scale is broken into six categories that correspond with simple definitions and recommendations for action.

What this meant for CountyStat:

Translate the data: Since the monitor readings we use are all reported in the original units of each pollutant, we wrote a function to convert this data to AQI automatically using the EPA’s Technical Assistance Document (a delightful summer read).

Be clear about limitations: Although the “Today” dashboard applies the AQI to hourly data, the official AQI can’t actually be calculated until the end of the day. This is partly because engineers must review each hourly reading to make sure the equipment is operating correctly, but also because some types of pollution are evaluated by averaging multiple hours. To make sure readers had multiple points of entry to this context, we included a field about how the final AQI is calculated in the description of each pollutant, notes in the pop-ups beside each chart, and a full explainer in the FAQs tab.
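For the curious, the translation step described above can be sketched in code. This is a minimal illustration of the EPA’s piecewise-linear AQI formula for a single pollutant (PM2.5); the breakpoints are those the EPA published at the time of writing, and the function and variable names are our own, not CountyStat’s actual implementation:

```python
# Illustrative sketch of the EPA's piecewise-linear AQI formula for PM2.5.
# Breakpoints are from the Technical Assistance Document in effect at the
# time; check the current revision before reusing them.
PM25_BREAKPOINTS = [
    # (conc_lo, conc_hi, aqi_lo, aqi_hi, category) -- µg/m³ and index values
    (0.0, 12.0, 0, 50, "Good"),
    (12.1, 35.4, 51, 100, "Moderate"),
    (35.5, 55.4, 101, 150, "Unhealthy for Sensitive Groups"),
    (55.5, 150.4, 151, 200, "Unhealthy"),
    (150.5, 250.4, 201, 300, "Very Unhealthy"),
    (250.5, 500.4, 301, 500, "Hazardous"),
]

def pm25_to_aqi(concentration):
    """Convert a PM2.5 concentration (µg/m³) to an (AQI, category) pair."""
    c = round(concentration, 1)  # the EPA truncates to one decimal place;
                                 # we round here for simplicity
    for c_lo, c_hi, i_lo, i_hi, category in PM25_BREAKPOINTS:
        if c_lo <= c <= c_hi:
            # Linear interpolation within the matching breakpoint band
            aqi = (i_hi - i_lo) / (c_hi - c_lo) * (c - c_lo) + i_lo
            return round(aqi), category
    raise ValueError(f"PM2.5 concentration {c} is off the AQI scale")
```

For example, a reading of 35.4 µg/m³ sits exactly at the top of the “Moderate” band and converts to an AQI of 100.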

 

  2. Use subtraction as a spotlight.

There is a huge amount of information contained in this dashboard, but there could have been more. Choosing what to include can be a balancing act. With each new layer of complexity added to a tool, we risk raising the cognitive load required to use it. Even the perception that something will be difficult to understand can decrease user engagement.

Given the genuine complexity of the air quality data, it was deeply important to do everything we could to avoid overwhelming the viewer. This meant only including what was most important, creating a clear hierarchy of information, and keeping as much white space as possible.

What this meant for CountyStat:

Limit to three pollution types: Early drafts of the dashboard included data about six pollutants, including carbon monoxide, nitrogen dioxide, and coarse particulate matter. However, none of these types have had an AQI score above “Moderate” for the past ten years. We excluded these options to keep the view simple (though air quality enthusiasts can still check them out using the EPA’s long-term data visualization tool).

Show one pollutant at a time: We chose to focus the users’ attention on one pollutant at a time, rather than squeezing in everything at once. This allowed us to designate spaces for certain purposes, such as descriptions, hourly data, and the map of most recent readings. The dashboard viewing mode defaults to PM2.5 because it frequently has the highest AQI value.

Interactivity creates depth without mess: To add background about the data without crowding out the focal points of the dashboard, we hid additional information in the question mark icons. This makes context accessible, but only to those who want it. This strategy carried over to the FAQs section as well. To avoid creating a wall of text, users see only the answer to the most recently selected question.

 

  3. Colors add context to data.

Color can be a powerful design tool to help users intuitively understand what’s important. Yet as the old Spider-Man adage holds, with great power comes great responsibility. A non-strategic use of color can increase the cognitive load. It’s difficult to make sense of brightly colored data, because part of our energy must go into putting on mental sunglasses. Similarly, using too many colors at once, or the same color to mean different things, can make a dashboard harder to read.

What this meant for CountyStat:

Choosing a palette: We’d originally started designing with a blue-to-brown color scale – a nice concept for air quality, but potentially confusing since it conflicts with the colors the EPA uses for the AQI. While the EPA’s palette is an easy-to-understand rainbow scale, the colors are all hyper-saturated, making them visually exhausting. To make our dashboard easier on the eyes but still consistent with the EPA’s color scheme, we adjusted the saturation, brightness, and hue of each category to find a happy medium.
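As a rough illustration of that tuning, here is one way to programmatically soften a palette. The hex values below are the EPA’s published AQI category colors; the adjustment factors and the `soften` helper are illustrative, not the exact values we settled on:

```python
import colorsys

# The EPA's published AQI category colors (fully saturated).
EPA_AQI_COLORS = {
    "Good": "#00E400",
    "Moderate": "#FFFF00",
    "Unhealthy for Sensitive Groups": "#FF7E00",
    "Unhealthy": "#FF0000",
    "Very Unhealthy": "#8F3F97",
    "Hazardous": "#7E0023",
}

def soften(hex_color, saturation_factor=0.7, value_factor=1.0):
    """Scale a color's saturation (and optionally brightness) in HSV space,
    keeping the hue so the palette stays recognizable."""
    r, g, b = (int(hex_color[i:i + 2], 16) / 255 for i in (1, 3, 5))
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    r, g, b = colorsys.hsv_to_rgb(h, s * saturation_factor,
                                  min(v * value_factor, 1.0))
    return "#{:02X}{:02X}{:02X}".format(
        round(r * 255), round(g * 255), round(b * 255))

# A muted version of the full palette, one softened swatch per category.
muted_palette = {name: soften(color) for name, color in EPA_AQI_COLORS.items()}
```

Working in HSV rather than RGB keeps each category’s hue intact, so the softened palette still reads as the familiar EPA rainbow.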


Color elsewhere: In addition to limiting the total number of colors, each color should only mean one thing in a dashboard. Going with a rainbow palette meant that most colors were already spoken for and adding new ones could be risky. Therefore, we used color very sparingly for anything that wasn’t communicating an AQI category. The rest of the dashboard is in grayscale, with one muted shade of teal to draw attention to some interactive features.

Keeping it accessible: Roughly 5-8% of men and less than 1% of women are affected by some form of colorblindness. These rates may seem small, but given current population estimates for our county, that could mean that as many as 53,000 people have difficulty telling colors apart. Since accessibility was a main goal of this project, we wanted to make sure that we were designing a tool for them too. While developing our color palette, we ran screenshots through an online colorblindness simulator to ensure they were as accessible as possible.
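That figure is a back-of-envelope estimate along these lines (the population total and sex split below are rough assumptions for Allegheny County, not official census numbers):

```python
# Back-of-envelope estimate of county residents with some form of
# colorblindness. Population and sex split are rough assumptions.
population = 1_220_000
male_share = 0.49
men = population * male_share
women = population * (1 - male_share)

# Upper-bound prevalence rates from the text: ~8% of men, ~1% of women.
affected = men * 0.08 + women * 0.01
print(f"~{affected:,.0f} residents")  # on the order of 50,000 people
```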

We also tried to build in backup signifiers wherever color was communicating something important. For example, in the bar chart showing hourly AQI and the scatter plot of daily values, each mark’s position conveys the same information about the AQI value. Furthermore, clicking any piece of colorful data on the charts will bring up a pop-up that includes a text version of the AQI value and category. While redundancies can often clutter a visualization, both of these features were elements already included in the graphs. This meant that the task was less about adding unnecessary features and more about ensuring that the ones already in use would be clear for someone who has trouble seeing colors.

Behavioral design is all about creating an environment that subtly nudges the user towards a desired action. And while the air quality data itself may have posed certain challenges, the “desired action” in this case was remarkably straightforward. People are already motivated to find out more about their air quality, and the data was already out in the open. Our job was simply to take stock of the barriers separating folks from this data and, in true Pittsburgh fashion, build a bridge to connect them.