UX Design Intern at ICRI, Intel, London
MSc Dissertation at UCL Interaction Centre
Supervised by Lorna Wall and Han Pham
Data Visualization, Mobile App Design, Research
Findings from literature review & user interviews
Nitrogen dioxide (NO2) lacks visibility, yet it is a particular problem in London: the UK has the highest levels of NO2 in Europe (Pollock, 2014). Participants were not aware of the severity of London's pollution problem.
Participants could tell that air quality was bad from visual cues, such as photographs. With visual representations, information is more engaging, more accessible, more persuasive, and easier to recall (NeoMam Studios, 2014). Comparison is key to helping people understand air quality.
Design and Implementation
I learned Xcode and Swift from scratch, then designed and implemented an iPhone app in a month.
InstaNO2: take photos to explore air quality.
The app was designed as a camera app, letting people explore the current NO2 level at their location through an existing habitual behaviour: taking photos.
A photo-based air quality visual representation
The app visualized air quality data using the photograph as a medium, displaying numeric air quality data on top of the photo with a green (healthy) or red (unhealthy) colour block.
Beyond displaying information, the app also adjusted the photo's appearance based on the air quality data: if the air quality was healthy, the photo appeared brighter, with blue sky and a clear view; if unhealthy, the photo appeared darker. In this way, the app made invisible pollutants visible and air quality data more explicit.
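The reading-to-overlay mapping can be sketched as below. This is an illustrative reconstruction, not the original InstaNO2 code; the 200 µg/m³ healthy/unhealthy cut-off is an assumed threshold for the example.

```swift
import Foundation

// Illustrative sketch, not the original app code.
// The 200 µg/m³ cut-off is an assumption for this example.
enum AirQuality: String {
    case healthy   = "green"   // shown as a green colour block
    case unhealthy = "red"     // shown as a red colour block
}

/// Categorise a numeric NO2 reading (µg/m³) for the photo overlay.
func category(forNO2 no2: Double) -> AirQuality {
    return no2 < 200 ? .healthy : .unhealthy
}
```

The overlay then draws the raw number alongside the colour block, so users get both the precise value and an at-a-glance judgement.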
By applying photo filters, the app enabled users to compare the current air quality with that of other locations and dates, and to talk about and share air quality data instantly. The photo-based air quality data could be shared on social media, on a dedicated photo-collection website, and on the user's My Places stream, where users could check real-time air quality at places they had been.
#1 Take a photo to explore air quality
#2 Make the invisible visible via photographs
#3 Explore and compare air quality data using photo filters
#4 Talk about and share air quality data instantly
#5 Motivate people to check air quality
#6 Extensibility
The app is easy to use: participants used it just as they would any other camera app.
“People like taking photos. It’s easy to do.” (E3)
“I like the camera interface. Users don’t have to learn a new interface.” (E4)
Photographs visualized how air pollution could personally affect the viewer.
“I think colour is enough to represent the current air quality. But photos made it more real, and give you a sense of how good or bad it might affect you. When I see the current picture applied with a poor air quality filter, I feel dirty and a little uncomfortable, and I want to move away from that place soon.” (E5)
It's easy to tell the difference between the first and second images, because one is labelled healthy and the other unhealthy; it's difficult to tell the difference between the second and third. With numbers and red colours alone, the difference between 300 μg/m³ and 400 μg/m³ could not be felt. The photographs and photo filters, however, visualized the difference between the numbers by adjusting the appearance of the photos, enabling participants to see the severity of air pollution and making the air quality data on top of the photos understandable.
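One way a filter could make 300 μg/m³ look visibly different from 400 μg/m³ is a continuous brightness mapping rather than a binary healthy/unhealthy switch. The sketch below is an illustration, not the original implementation; the 0–500 µg/m³ range and the darkening curve are assumptions.

```swift
import Foundation

// Illustrative sketch: scale photo brightness continuously with NO2,
// so two "unhealthy" readings still look visibly different.
// The 0–500 µg/m³ range and 0.6 darkening span are assumptions.
func brightnessFactor(forNO2 no2: Double) -> Double {
    let clamped = min(max(no2, 0), 500)
    return 1.0 - 0.6 * (clamped / 500)  // 1.0 (clear) down to 0.4 (dark)
}
```

Applied as, for example, a Core Image brightness or exposure adjustment, a 400 μg/m³ photo would render noticeably darker than a 300 μg/m³ one, even though both carry a red overlay.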