One of the main focuses for the Realities Lab is how the emerging fields of Virtual and Augmented Reality can be used to assist people with disabilities. This is the third post in a series on the research conducted in this area.
Virtual and Augmented Reality applications need a detailed model of the surrounding environment, as well as their own location within it, to provide users with immersive, interactive experiences. However, current techniques for building such models are often time-consuming and require manual correction as the area they represent changes and evolves. Because the model of the environment is typically completed before the application is put into use, it cannot benefit from new information detected by the location's tracking sensors. This paper proposes a solution that uses aggregate position data from an area to build that space's walkable mesh model. Participant locations were estimated from Bluetooth Low Energy beacon signals received by mobile devices, and the aggregated data set was then clustered and filtered to produce an estimate of the overall environment in the form of a two-dimensional navigation mesh. The experiments produced several meshes that conformed to the walkable pathways of distinct layouts in an academic building.
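The paper's exact pipeline is not reproduced here, but the core idea of the abstract (bin aggregate position samples into a grid and keep only densely visited cells as walkable) can be sketched roughly as follows. The function name, cell size, and visit threshold below are illustrative assumptions, not values from the paper:

```python
from collections import Counter

def walkable_cells(positions, cell_size=0.5, min_visits=3):
    """Estimate walkable 2-D grid cells from aggregate position samples.

    positions: iterable of (x, y) coordinates in metres, e.g. derived
    from Bluetooth Low Energy beacon signals seen by mobile devices.
    A cell counts as walkable once enough samples fall inside it,
    which filters out sparse localization noise off the real paths.
    """
    counts = Counter(
        (int(x // cell_size), int(y // cell_size)) for x, y in positions
    )
    return {cell for cell, n in counts.items() if n >= min_visits}

# Hypothetical samples along a straight hallway, plus one noisy outlier.
samples = [(0.1 * i, 0.25) for i in range(50)] + [(3.0, 5.0)]
mesh = walkable_cells(samples)
```

The resulting set of cells is a coarse occupancy estimate; a real navigation mesh would additionally connect adjacent walkable cells into polygons, but the density filtering step above is where crowd-sourced position data does the work.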
Full paper available at IEEE Computer Society Digital Library or upon request.