Recognition of crowd behavior from mobile sensors with pattern analysis and graph clustering methods
Pages: 521 - 544
Daniel Roggen - Wearable Computing Laboratory, Gloriastrasse 35, ETH Zurich, CH-8092 Zurich, Switzerland (email)
Martin Wirz - Wearable Computing Laboratory, Gloriastrasse 35, ETH Zurich, CH-8092 Zurich, Switzerland (email)
Gerhard Tröster - Wearable Computing Laboratory, Gloriastrasse 35, ETH Zurich, CH-8092 Zurich, Switzerland (email)
Dirk Helbing - CLU E11, Clausiusstrasse 50, ETH Zurich, CH-8092 Zurich, Switzerland (email)
Mobile on-body sensing has distinct advantages for the analysis and understanding of crowd dynamics: sensing is not geographically restricted to a specific instrumented area, mobile phones offer on-body sensing and are already deployed on a large scale, and the rich set of sensors they contain allows the behavior of users to be characterized through pattern recognition techniques.
In this paper we present a methodological framework for the machine recognition of crowd behavior from on-body sensors, such as those in mobile phones.
The recognition of crowd behaviors opens the way to the acquisition of large-scale datasets for the analysis and understanding of crowd dynamics.
It also has practical safety applications by providing improved crowd situational awareness in cases of emergency.
The framework comprises: behavioral recognition on the user's mobile device, pairwise analysis of the activity relatedness of two users, and graph clustering to uncover, globally, which users participate in a given crowd behavior.
We illustrate this framework for the identification of groups of persons walking, using empirically collected data.
We discuss the challenges and research avenues for theoretical and applied mathematics arising from the mobile sensing of crowd behaviors.
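The three-stage pipeline described in the abstract can be sketched in miniature as follows. This is a simplified illustration, not the paper's actual method: it assumes that each user's on-body sensing stage yields a 1-D feature sequence (e.g. derived from accelerometer data), uses a Pearson correlation as a stand-in for the pairwise relatedness analysis, and uses connected components of the thresholded relatedness graph as a stand-in for the graph clustering step. The threshold value and function names are hypothetical.

```python
import itertools

import numpy as np


def pairwise_relatedness(a, b):
    """Pearson correlation of two feature sequences (stand-in relatedness score)."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float(np.mean(a * b))


def cluster_users(signals, threshold=0.7):
    """Link users whose relatedness exceeds `threshold`, then return the
    connected components of the resulting graph as candidate crowd groups.
    (A real system might use a more sophisticated graph clustering method.)"""
    n = len(signals)
    adj = {i: set() for i in range(n)}
    for i, j in itertools.combinations(range(n), 2):
        if pairwise_relatedness(signals[i], signals[j]) >= threshold:
            adj[i].add(j)
            adj[j].add(i)
    # Connected components via depth-first search.
    seen, groups = set(), []
    for start in range(n):
        if start in seen:
            continue
        stack, component = [start], set()
        while stack:
            u = stack.pop()
            if u in seen:
                continue
            seen.add(u)
            component.add(u)
            stack.extend(adj[u] - seen)
        groups.append(component)
    return groups


# Synthetic example: users 0 and 1 walk together (shared periodic signal
# plus individual noise), user 2 is unrelated.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 500)
walk = np.sin(2 * np.pi * 2.0 * t)
signals = [
    walk + 0.1 * rng.normal(size=t.size),
    walk + 0.1 * rng.normal(size=t.size),
    rng.normal(size=t.size),
]
print(cluster_users(signals))  # users 0 and 1 grouped, user 2 alone
```

In a deployed setting, the relatedness scores would be computed from behavior labels or features recognized on each phone, and the graph clustering would run on a server aggregating the pairwise results.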
Keywords: Crowd dynamics, crowd behavior recognition, mobile sensing, machine learning.
Mathematics Subject Classification: Primary: 68T10, 91-04, 91-02, 91D30, 62H30; Secondary: 91C20, 93A15, 93A30.
Received: December 2010;
Available Online: August 2011.