Big data for indoor crowd mobility


Current indoor localization solutions are predominantly used for navigation and simple user analytics. However, as the quality and availability of mobile sensors increase, new possibilities arise for generating high-quality localized data. Moreover, IoT sensing infrastructure and geographic information systems are becoming more common. Separately, these data streams already enable many novel solutions, but in combination they may uncover profound new knowledge. In this work, we outline how we bring big data from the crowd of navigating users together with building information data in order to analyze and predict human actions in indoor environments. For this purpose, multi-source data are acquired, assessed for usefulness, and processed in a big data pipeline. Combining constraints from multiple sources makes it possible to remove systematic differences, reject outliers, and suppress noise. Machine learning and artificial intelligence are employed to compress and summarize the data. Visualizing these data then helps to interpret how buildings are used. Furthermore, the data can be used to train realistic mobility models that predict how people will interact with each other and their environment. Together, the combination of new data and techniques dramatically increases what can be learned about a building.
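The cleaning step in the pipeline — removing systematic differences, rejecting outliers, and suppressing noise across sources — can be sketched in a minimal one-dimensional form. The sources, the MAD-based threshold, and the `fuse_position_fixes` helper below are illustrative assumptions, not the actual pipeline described in this work.

```python
import statistics

def fuse_position_fixes(fixes, outlier_threshold=3.0):
    """Fuse 1-D position estimates (metres) keyed by source name.

    Illustrates the three cleaning steps: systematic-difference
    removal, outlier rejection, and noise suppression.
    """
    # 1. Remove systematic differences: align each source to the
    #    overall median by subtracting its per-source offset.
    overall = statistics.median(x for xs in fixes.values() for x in xs)
    aligned = []
    for source, xs in fixes.items():
        offset = statistics.median(xs) - overall
        aligned.extend(x - offset for x in xs)

    # 2. Reject outliers using the median absolute deviation (MAD);
    #    1.4826 scales MAD to a standard-deviation equivalent.
    med = statistics.median(aligned)
    mad = statistics.median(abs(x - med) for x in aligned) or 1e-9
    kept = [x for x in aligned
            if abs(x - med) / (1.4826 * mad) < outlier_threshold]

    # 3. Suppress remaining noise by averaging the surviving fixes.
    return sum(kept) / len(kept)

# Hypothetical fixes from Wi-Fi, BLE, and pedestrian dead reckoning;
# the 50.0 m reading is a gross outlier that gets rejected.
fixes = {"wifi": [10.1, 9.9, 10.0],
         "ble": [12.0, 12.2, 11.8],
         "pdr": [10.05, 50.0]}
fused = fuse_position_fixes(fixes)
```

A production pipeline would of course operate on 2-D or 3-D fixes with timestamps and per-source uncertainty models, but the same align/reject/average pattern applies.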