The power and prevalence of modern smartphones have quickly made the relationship between people and their computers closer and stronger than ever. With an ever-expanding array of onboard sensors, smartphones are privy to behavioral information quite unlike the social cues that humans have evolved to read. My research seeks to understand the relationship between the behavioral information that smartphones can observe and the contextual state of the user as it relates to their experience.
Often there is a mismatch between the expectations a developer has for an intelligent system and those of the end-user. My research focuses on designing and evaluating interactive systems using semi-supervised learning and domain adaptation approaches that can learn to model a concept as the end-user perceives it, rather than relying solely on how the developer perceives it. This has important implications for research in self-tracking and affective computing, where it is important for the learned concept (e.g., activity, mood) to be aligned with the user's perception of it.
mHealth and self-tracking applications are a fast-growing area of research and development, and they show great potential to provide health services to many people who would otherwise have difficulty accessing them. Currently, I am collaborating with the Center for Behavioral Intervention Technology for my Segal Design Fellowship project to understand how context-aware technologies might help users track and manage their mental health.