Robots hold the promise of serving human needs, like helping older adults live independently at home or assisting drivers in avoiding crashes. For these robots to integrate seamlessly into people's lives, they must provide proactive assistance that is responsive to their human partners' needs. Often, these needs stem from underlying mental states like intent or awareness. Conversely, it is also useful for people to have an accurate mental model of their robot assistant's policy and knowledge. Mental states may be revealed implicitly through the actions agents take, such as gazing at a certain object or moving in a certain way. This talk describes research on developing collaborative robots that infer people's needs through interaction, adapt to people's individual preferences, and communicate their own models to make the interaction more explainable. These robots are evaluated in a range of human-robot interaction domains, such as manipulation and driving.
Bio: Dr. Henny Admoni is an Associate Professor in the Robotics Institute at Carnegie Mellon University, where she leads the Human And Robot Partners (HARP) Lab. Her research interests include human-robot interaction, assistive robotics, and nonverbal communication. She holds a PhD in Computer Science from Yale University and a joint BA/MA in Computer Science from Wesleyan University.