Darwin Phones: The Evolution of Sensing and Inference on Mobile Phones
We present Darwin, an enabling technology for mobile phone sensing that combines collaborative sensing and classification techniques to reason about human behavior and context on mobile phones. Darwin advances mobile phone sensing through the deployment of efficient but sophisticated machine learning techniques specifically designed to run directly on sensor-enabled mobile phones (i.e., smartphones). Darwin tackles three key sensing and inference challenges that are barriers to mass-scale adoption of mobile phone sensing applications: (i) the human burden of training classifiers, (ii) the ability to perform reliably in different environments (e.g., indoor, outdoor), and (iii) the ability to scale to a large number of phones without jeopardizing the "phone experience" (e.g., usability and battery lifetime). Darwin is a collaborative reasoning framework built on three concepts: classifier/model evolution, model pooling, and collaborative inference. To the best of our knowledge, Darwin is the first system that applies distributed machine learning techniques and collaborative inference concepts to mobile phones. We implement the Darwin system on the Nokia N97 and Apple iPhone. While Darwin represents a general framework applicable to a wide variety of emerging mobile sensing applications, we implement a speaker recognition application and an augmented reality application to evaluate the benefits of Darwin. We show experimental results from eight individuals carrying Nokia N97s and demonstrate that Darwin improves the reliability and scalability of the proof-of-concept speaker recognition application without additional burden to users.
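To give a flavor of the collaborative inference concept named above, the following is a minimal, hypothetical sketch (not the paper's actual algorithm): each phone produces local class likelihoods from its own classifier, and a joint decision is formed by fusing the per-phone evidence, here with a simple naive-Bayes-style sum of log-likelihoods. The function name, data layout, and fusion rule are illustrative assumptions.

```python
import math

def collaborative_inference(per_phone_likelihoods):
    """Fuse per-phone class likelihoods into one joint decision.

    per_phone_likelihoods: list of dicts, one per phone, mapping each
    candidate class (e.g., a speaker identity) to a local probability.
    Fusion rule (an assumption, not Darwin's exact method): sum the
    log-likelihoods across phones and return the highest-scoring class.
    """
    classes = per_phone_likelihoods[0].keys()
    joint = {c: sum(math.log(p[c]) for p in per_phone_likelihoods)
             for c in classes}
    return max(joint, key=joint.get)

# Three co-located phones each score two candidate speakers.
phone_scores = [
    {"alice": 0.70, "bob": 0.30},  # phone close to the speaker: confident
    {"alice": 0.55, "bob": 0.45},  # phone in a noisy spot: weak evidence
    {"alice": 0.60, "bob": 0.40},
]
print(collaborative_inference(phone_scores))  # -> alice
```

The intuition this sketch captures is the one the abstract relies on: a phone in a pocket or a noisy corner contributes weak evidence on its own, but pooling evidence across nearby phones yields a more reliable inference than any single device achieves.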
The continuing need to communicate has always pushed people to invent better and more efficient ways to convey messages, propagate ideas, and share personal information with friends and family. Social networking, for example, is the fastest-growing phenomenon of the Internet era, in which people communicate and share content with friends, family, and acquaintances. Recently, researchers have started investigating new ways to augment existing channels of communication and improve information exchange between individuals using the computational and sensing resources offered by sensor-enabled mobile phones (aka smartphones). These phones already utilize sensor data to filter relevant information (e.g., location-based services) or provide better user experiences (e.g., using accelerometer data to drive mobile phone sensing applications). However, information about a user's behavior (e.g., having a conversation) and personal context (e.g., hanging out with friends) is often provided manually by the user. This naturally leads to the following thoughts: what if the available sensors are further exploited to automatically infer various aspects of a person's life in ways that have not been done before? What if the characterization of the person's microcosmos could be seen as a new form of communication? We believe that as sensor-enabled mobile phones become commonplace, they can be used at a personal scale to enrich and support communication and collaboration, to measure and improve task performance, and to aid in the assessment of health and wellness.