The University of Sheffield
The aim of the Citizens' Water Observatory of the EU-funded WeSenseIt project is to involve citizens in the observation and management of the water environment, including situation awareness in flooding (Lanfranchi et al., 2014).
This paper outlines work carried out within the visualisation Work Package of the WeSenseIt project. Augmented Reality (AR) allows virtual models to be overlaid in perspective view onto the existing [urban] landscape using mobile devices, so that the landscape can be experienced directly while on site (Lange, 2011). A semi-immersive AR app is being developed to visualise potential flood levels of known flood zones in real-time. Evidence of users' preference for immersive experiences has been reported recently (Gill et al., 2013). Mobile devices are omnipresent, which makes dissemination of our work almost instantaneous.
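The perspective overlay at the heart of such an AR app can be illustrated with a minimal pinhole-camera sketch. This is not the app's actual rendering code; the focal length, image size, and camera-aligned axes are illustrative assumptions only:

```python
def project_point(point, cam_pos, focal_px, cx, cy):
    """Project a 3D world point (metres, camera-aligned axes: x right,
    y down, z forward) onto the image plane with a pinhole model."""
    x, y, z = (p - c for p, c in zip(point, cam_pos))
    if z <= 0:
        return None  # behind the camera, not drawable
    u = cx + focal_px * x / z  # horizontal pixel coordinate
    v = cy + focal_px * y / z  # vertical pixel coordinate
    return (u, v)

# A virtual flood-level marker 10 m in front of the camera and 2 m below
# it, on a 640x480 image with an assumed 500 px focal length.
print(project_point((0.0, 2.0, 10.0), (0.0, 0.0, 0.0), 500.0, 320.0, 240.0))
```

The same projection, applied to every vertex of a virtual water plane, is what places the flood surface in correct perspective over the live camera view.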
The main objective of the proposed work is to raise public awareness of the reality and danger of local flooding by providing an on-site AR impression of the severity of a given flood warning or a user-specified flood level. This includes linking the app to the WeSenseIt water level sensors in Doncaster, or to a flood model based on these sensors, via a mobile internet connection, potentially providing real-time visualisation of near-future water levels in the area. Another goal is an experimental port to wearable technologies, in particular Google Glass, to provide a fully immersive experience. With the backing of major technology companies, wearable technologies are envisaged to play an increasing role in everyday life.
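The sensor link could be sketched as follows. The endpoint URL and JSON field name are hypothetical placeholders (the WeSenseIt sensor interface is not specified here); the conversion from a gauge reading to an overlay height is the generic stage-to-depth calculation:

```python
import json
from urllib.request import urlopen

# Hypothetical endpoint; the real WeSenseIt sensor interface may differ.
SENSOR_URL = "https://example.org/wesenseit/doncaster/level"

def fetch_reading(url=SENSOR_URL):
    """Poll the sensor over a mobile internet connection (assumed JSON API)."""
    with urlopen(url, timeout=5) as resp:
        return json.load(resp)["level_m"]

def flood_height_above_ground(reading_m, gauge_datum_m, ground_elevation_m):
    """Convert a gauge reading (stage in metres above the gauge datum)
    into the water height above local ground for the AR overlay."""
    water_surface = gauge_datum_m + reading_m
    return max(0.0, water_surface - ground_elevation_m)

# A 4.5 m stage on a gauge whose datum sits at 10.0 m AOD, viewed from
# ground at 12.5 m AOD, puts the water 2.0 m up the surrounding walls.
print(flood_height_above_ground(4.5, 10.0, 12.5))
```

Polling this at intervals would let sensor readings, or model forecasts derived from them, drive the visualised water level in near real-time.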
Challenges include demonstrating that AR can be used for flood visualisation on mobile devices, including calibration and maintenance of a consistent camera frame of reference within the environment, since no two users' viewpoints are ever the same. Another difficulty is effectively recognising and/or creating a 3D model of the surrounding buildings in a given flood zone so that the flood is visualised correctly. Models should be built on-site and in real-time using the app; once created, they can be uploaded for reuse by all users. General operation proceeds as follows: users download the app for free, take up an arbitrary position along a river and easily calibrate the app to the given vista. Using a suitable building model, the user is then free to navigate the vicinity and experience different flood levels. Alternatively, using a flood model, the WeSenseIt water sensors could directly control the visualisation in real-time.
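The role of the building model in "experiencing different flood levels" can be sketched in a few lines. The building names and elevations below are invented for illustration; the point is simply that a shared model lets the app decide per building whether, and how deep, it is flooded:

```python
def flooded_depths(buildings, flood_level_m):
    """Given building ground-floor elevations (metres AOD) and a flood
    level, return the water depth at each building; 0.0 means it stays dry."""
    return {name: max(0.0, flood_level_m - base)
            for name, base in buildings.items()}

# Hypothetical riverside models, e.g. reused from the shared upload pool.
models = {"mill": 11.0, "warehouse": 12.5, "bridge_house": 14.0}
print(flooded_depths(models, 12.5))
```

With sensor-driven operation, `flood_level_m` would simply be updated from the live readings instead of a user-chosen value.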
Flooding has become increasingly common in recent times (Yorkshire Water, 2014). The importance of this work is therefore to raise public awareness of potential river levels in the local area. The public will be quicker to respond in times of flooding, potentially reducing injury and saving lives. Overall, this work furthers the development of practical and life-saving tools to help local populations living in and around flood zones worldwide. The scope of this work goes beyond flood visualisation: adaptations to other water-related projects are possible, including planning and design, and renewable energy in and around water catchment areas.
The procedures to attain our goal include (i) investigation of potential AR technologies, such as the Vuforia Software Development Kit (SDK), used as the core technology for the app, (ii) testing different methods of target recognition for visual AR calibration, (iii) creating voice- and GUI-controlled tools for model building, (iv) creating an inbuilt tool for user-based model repositioning, (v) adding flood visualisation that takes the building model into account, and finally (vi) performing a case study of the app.
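User-based model repositioning, step (iv) above, amounts to re-anchoring a stored model in the user's current frame of reference. A minimal sketch, assuming a single matched anchor point (e.g. a bridge corner the user identifies) and a pure translation:

```python
def calibrate_offset(model_anchor, observed_anchor):
    """Translation (metres) that moves a stored model's anchor point onto
    the position observed in the user's current view."""
    return tuple(o - m for m, o in zip(model_anchor, observed_anchor))

def reposition(vertices, offset):
    """Apply the calibration offset to every vertex of the model."""
    return [tuple(v + d for v, d in zip(vert, offset)) for vert in vertices]

# A toy three-vertex model, shifted to where the user observes its anchor.
model = [(0.0, 0.0, 0.0), (4.0, 0.0, 0.0), (4.0, 3.0, 0.0)]
offset = calibrate_offset((0.0, 0.0, 0.0), (1.0, 0.0, 2.0))
print(reposition(model, offset))
```

A full implementation would also estimate rotation and scale, but the translation case shows the shape of the tool.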
AR's recent revival has given rise to many SDKs and devices for which applications remain largely undeveloped; Google Glass and the Oculus Rift are very recent examples. Many potential applications remain unexplored, and the proposed work addresses some of them. The challenge is to determine how best to use the available SDKs to meet the objectives of this work.
Calibration of AR is a technical challenge: when the mobile device moves, the camera image changes to such a degree that the app loses track of its original target. We present a novel solution based in part on a combination of voice- and tactile gesture-controlled commands; voice control is less cumbersome but sometimes unreliable, so depending on conditions the two modes of operation can be used interchangeably. Together, these tools provide a user-friendly and robust AR app for flood simulation in support of the project's main objective.
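Keeping the two input modes interchangeable is easiest when both feed one command table, so any action can be triggered by either voice or touch at any moment. A minimal sketch, with invented command names standing in for the app's actual vocabulary:

```python
def make_dispatcher(handlers):
    """Route commands through one handler table regardless of input mode,
    so voice and touch gestures remain fully interchangeable."""
    def dispatch(command, source):
        action = handlers.get(command)
        if action is None:
            return f"unrecognised ({source}): {command}"
        return action()
    return dispatch

# Hypothetical command set for the flood-simulation app.
handlers = {
    "raise water": lambda: "water level +0.5 m",
    "lower water": lambda: "water level -0.5 m",
    "add wall":    lambda: "wall tool armed",
}
dispatch = make_dispatcher(handlers)
print(dispatch("raise water", "voice"))  # speech-recogniser output
print(dispatch("raise water", "touch"))  # same command via a tap gesture
```

When ambient noise makes the speech recogniser unreliable, the touch path still reaches every handler, which is what makes the combined scheme robust.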
Work in progress relates to the WeSenseIt sensors, the experimental Google Glass port, and the case-study, as detailed by the objectives in the introduction.
Current progress supports the viability of the proposed work, with successful deployment of an app featuring accurate calibration and tracking of the visual field. The editing suite allows a user to construct building models in real-time. Preliminary results support the proof of concept as a prototype application.
2. Gill, L., Lange, E., Morgan, E., Romano, D. (2013). An analysis of usage of different types of visualisation media within a collaborative planning workshop environment. Environment and Planning B: Planning and Design 40(4), 742–754.
3. Lanfranchi, V., Wrigley, S. N., Ireson, N., Wehn, U., & Ciravegna, F. (2014). Citizens' Observatories for Situation Awareness in Flooding. Proceedings of the 11th International ISCRAM Conference, University Park, Pennsylvania, USA, May 2014. S.R. Hiltz, M.S. Pfaff, L. Plotnick, and P.C. Shih, eds.
4. Lange, E. (2011) 99 volumes later: We can visualise. Now what? Landscape and Urban Planning 100, 403-406. Special issue commemorating publication of the 100th volume of Landscape and Urban Planning.
5. Yorkshire Water (2014) River Don. "Our Plan for the River Don, full report." [online] Available at: http://www.yorkshirewater.com/your-water-services/local-improvements/cleaning-our-rivers/river-don.aspx [Accessed 24 Oct. 2014].