URBAN ENVIRONMENTAL VISUALIZATION
We develop urban data visualization applications to communicate urban environmental data to citizens. Citizens are generally aware only of city-scale mean pollutant concentration or temperature values, but often do not understand the variable environmental quality conditions present in the urban fabric. However, these variable environmental gradients in cities can strongly impact the well-being of citizens. We develop immersive urban visualization strategies aiming to effectively reveal the complex environmental gradients present in cities and empower citizens to make smart decisions in their everyday lives. We utilize map-based and augmented reality technologies to compile urban environmental data and make it accessible to citizens through intuitive and interactive graphic layers.
We have developed the Navigating Urban Environments (NUE) mobile app to display on-site, high-spatiotemporal-resolution air quality data. NUE is a user-centered app that enables direct engagement with the surrounding environment. Its information visualization corresponds to real-time data acquired from urban Geographic Information Systems (GIS) and MUST sensing networks. The NUE app offers two AR environmental visualization modes: a first-person view and a map view. In the first-person view, the location services of the user's mobile device are accessed, and by referencing the mobile GIS platform, geotagged environmental data are overlaid on the camera view as graphic filters. The map view is enabled by accessing the user's geolocation and using the Google Maps API, over which environmental gradients are overlaid. The local weather data for the app visualization are acquired from the MUST sensing kits, as well as through queries to online web services such as OpenWeatherMap and AQICN. From these diverse sources, real-time data on temperature, humidity, and air quality are collected. Furthermore, using GeoNames geocoding, the locations of different urban intersections are stored. These intersections are then identified with markers, and their spatially distributed air quality values are displayed. This behavior is activated when the user points a smartphone towards a nearby street intersection, which allows the app to draw a marker over the area displaying street names as well as the relevant environmental data of that intersection. Thus, the app allows the user to associate air quality parameters with the urban fabric. The collected environmental data are stored in an online database where users can query aggregated, geotagged air quality information.
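The data-gathering step described above can be sketched as follows. This is a minimal sketch, not the NUE implementation: the API tokens are placeholders, the response fields assume the public AQICN geolocated feed and the OpenWeatherMap current-weather endpoint, and the combined record format is hypothetical.

```python
import json
import urllib.request

AQICN_TOKEN = "YOUR_AQICN_TOKEN"        # placeholder tokens (hypothetical)
OWM_KEY = "YOUR_OPENWEATHERMAP_KEY"

def aqicn_url(lat, lon):
    """AQICN geolocated feed for the nearest monitoring station."""
    return f"https://api.waqi.info/feed/geo:{lat};{lon}/?token={AQICN_TOKEN}"

def owm_url(lat, lon):
    """OpenWeatherMap current conditions (temperature, humidity)."""
    return ("https://api.openweathermap.org/data/2.5/weather"
            f"?lat={lat}&lon={lon}&units=metric&appid={OWM_KEY}")

def fetch_json(url):
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)

def geotagged_record(lat, lon):
    """One combined, geotagged record suitable for overlay on a map or camera view."""
    aq = fetch_json(aqicn_url(lat, lon))
    wx = fetch_json(owm_url(lat, lon))
    return {
        "lat": lat, "lon": lon,
        "aqi": aq["data"]["aqi"],            # composite air quality index
        "temperature_c": wx["main"]["temp"],
        "humidity_pct": wx["main"]["humidity"],
    }
```

Records of this shape can then be tagged with a GeoNames-resolved intersection name and written to the online database for later queries.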
The Virtual Urban Environments (VUE) mobile app is an augmented reality app that aims to complement the currently available city-scale environmental data. During its initial launch, the VUE app enabled a spatiotemporal visualization of air quality across all Seoul districts. VUE is an immersive experience that makes citizens aware of their immediate microclimate while highlighting the interaction between environmental and urban parameters. The VUE app can be displayed in any space where the designated Seoul (or other city) markers are visible. These markers are composed of various GIS and remote sensing layers of Seoul and trigger the app to overlay air quality data over the city markers. Historical air quality data (for Seoul, from 1986 to 2016) can be viewed by interacting with a slider and dropdown menu. For 2016 in Seoul, we made an hourly visualization to convey both the seasonal and the diurnal variability of the air quality data. Data for PM10, PM2.5, O3, SO2, NO2, and CO are made available. Furthermore, the app enables real-time data to be visualized over the city map. The setup aims to encourage the viewer to correlate the historical and real-time air quality data while viewing the complexity of urban and geographic layers. The VUE app can be downloaded from the Apple App Store, and the data can be visualized with smartphones or tablets. AR devices are also made available in the exhibition space where the app has been installed. The air quality data displayed at the city scale have been gathered from queries to online web services such as OpenWeatherMap and AQICN, as well as from Seoul city weather station databases. These data are visualized alongside the air quality data gathered from MUST. The combination of weather station and pedestrian-level recordings aims to offer a holistic description of Seoul's local air quality characteristics.
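The hourly visualization of seasonal and diurnal variability rests on a simple aggregation: group the hourly readings by hour of day and by month, then average. A minimal sketch with synthetic data (the function and the sample values are illustrative, not Seoul measurements):

```python
from collections import defaultdict
from datetime import datetime, timedelta

def diurnal_and_seasonal_means(records):
    """records: iterable of (datetime, concentration) pairs for one pollutant.
    Returns mean concentration keyed by hour of day and by month."""
    by_hour = defaultdict(list)
    by_month = defaultdict(list)
    for ts, value in records:
        by_hour[ts.hour].append(value)
        by_month[ts.month].append(value)
    mean = lambda xs: sum(xs) / len(xs)
    return ({h: mean(v) for h, v in by_hour.items()},
            {m: mean(v) for m, v in by_month.items()})

# Illustrative synthetic year of hourly PM2.5 values (2016 was a leap year: 366 days)
start = datetime(2016, 1, 1)
records = [(start + timedelta(hours=i), 25 + (i % 24)) for i in range(24 * 366)]
diurnal, seasonal = diurnal_and_seasonal_means(records)
```

The resulting 24 diurnal means and 12 seasonal means are what a slider or dropdown menu would step through.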
Holographic Urban Environments (HUE) is an augmented reality mobile app that utilizes a smartglass headset to enable immersive environmental data visualization and analysis. The app uses an environmental visualization platform based on the Unity game engine and the Microsoft HoloLens smartglass headset. The app queries data in real time from online weather station databases and visualizes it as a particle simulation. The environmental data are mapped onto a 3D physical model that the HoloLens kit uses as a marker. The app is being developed with the aim of enabling a higher degree of interactivity with the environmental data through spatial hand gestures or voice activation by the user.
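The text does not specify how a queried concentration drives the particle simulation; one plausible approach is a clamped linear mapping from concentration to particle-emission count. A minimal sketch, with all range limits and counts hypothetical:

```python
def particle_count(concentration, c_min=0.0, c_max=150.0, max_particles=2000):
    """Linearly map a pollutant concentration onto a particle-emission count,
    clamped to the displayable range [0, max_particles]."""
    frac = (concentration - c_min) / (c_max - c_min)
    frac = max(0.0, min(1.0, frac))   # clamp out-of-range readings
    return round(frac * max_particles)
```

A per-frame update would pass the latest reading to this function and set the emitter's particle count accordingly, so denser particle clouds signal worse air quality.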
Within an urban block distance, air quality indicators can vary by a factor of more than eight. Air contaminant concentrations, and environmental phenomena in general, also vary strongly following diurnal and seasonal cycles. Thus, having access to high-spatiotemporal-resolution urban data for parameters such as air quality, as well as water quality, green cover, urban density, traffic, or crime, could radically transform the daily life of citizens and the livability of cities. The City Compass project focuses on the analysis (data processing, formatting, and complementing) and visualization (development of user-friendly smartphone and web applications) of complex urban health and environmental data in conjunction with the physical attributes of the city.
CITY COMPASS is an augmented reality mobile application (currently under development) designed to become a contemporary urban well-being guide. City Compass is an urban environmental and well-being data synthesizer that aims to empower citizens to make smart decisions in their everyday urban lives. Based on the specific urban interests of each individual, layers of distinct urban data can be selected and displayed in the app. Layers on the topics of urban health, the urban environment, and the physical attributes of the city are made available, and others can be added in the future. Urban data such as air quality, water quality, greenery, or crime rates will be showcased through map and first-person visualizations. The map view will enable citizens navigating the city to make on-the-fly decisions on the best jogging trajectories or biking routes, the best park for a family picnic, the best neighborhood for a walk, and so on.
CITY COMPASS MAP VIEW
The CITY COMPASS city map view will enable citizens to visualize detailed urban spatial gradients. Through marker-based tagging, data events (extremely high temperatures, for example) will be highlighted over the city map. The user will be able to click on them for further information, and to pan and zoom in and out to explore the data at various spatial scales. The gradient-based overlays, on the other hand, will provide an intuitive understanding of the variation of the chosen urban parameters across the city, enabling citizens to visualize in real time which area of the city is most polluted, which areas have the lowest pollen concentrations, or which picnic areas with a dense green canopy are least crowded, among others.
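Marker-based event tagging of this kind reduces to a threshold filter over geotagged readings. A minimal sketch, where the field names, threshold, and coordinates are illustrative rather than taken from the actual app:

```python
def tag_events(readings, threshold):
    """Flag geotagged readings that exceed a threshold so the map view
    can draw an event marker over them.
    readings: list of dicts with 'lat', 'lon', and 'value' keys."""
    return [r for r in readings if r["value"] > threshold]

# Illustrative temperature readings (degrees C) at two map locations
readings = [
    {"lat": 37.57, "lon": 126.98, "value": 41.5},  # extreme-heat event
    {"lat": 37.55, "lon": 126.99, "value": 29.0},  # unremarkable reading
]
hot_spots = tag_events(readings, threshold=38.0)
```

Each returned record carries the coordinates needed to place a clickable marker; per-parameter thresholds would be configured for each selectable data layer.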
CITY COMPASS FIRST PERSON VIEW
The CITY COMPASS first-person visualization will access the video camera on the user's smartphone and enable an augmented reality overlay of the urban field. A unique three-dimensional augmented reality experience will provide a location-specific experience of the selected urban parameters. Based on the urban parameters chosen by the user, overlays such as tree canopy densities or local pollutant emission sources will be spatially highlighted and overlaid on the user's camera view.