General Information
My group’s research interests include Wearable and Ubiquitous Computing, Artificial Intelligence, Machine Learning, Pattern Discovery, and Enabling Technology for people with disabilities. We develop mobile computing techniques that support users while they are on the go or engaged in face-to-face conversation. Our research draws on machine learning and human-computer interaction (HCI) techniques to create interfaces for everyday mobile environments, and we evaluate our systems with formal user studies. Our work influences commercial products, the most notable example being Google Glass (2010-2023).
A long-term focus is Symbiotic Intelligence, an approach to Artificial Intelligence in which the computer learns about human interactions in the world by discovering patterns in a user’s everyday
behavior. We use motion sensors, microphones, cameras, and brain interfaces to understand the
user’s context and infer knowledge about the world and the user.
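As an illustration of this kind of sensor-based context inference, the sketch below labels windows of accelerometer samples by activity using a nearest-centroid rule over simple per-axis statistics. The feature set, classifier, and all names here are hypothetical simplifications for exposition, not our deployed pipeline.

```python
import numpy as np

def extract_features(window):
    """Per-axis mean and standard deviation over a window of accelerometer samples."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

def train_centroids(windows, labels):
    """Average the feature vectors of each activity to form one centroid per label."""
    feats = np.array([extract_features(w) for w in windows])
    labels = np.array(labels)
    return {lab: feats[labels == lab].mean(axis=0) for lab in set(labels)}

def classify(window, centroids):
    """Assign the window to the activity whose centroid is nearest in feature space."""
    f = extract_features(window)
    return min(centroids, key=lambda lab: np.linalg.norm(f - centroids[lab]))
```

Even this toy version separates high-motion activities (large per-axis variance) from stationary ones, which is the intuition behind richer learned models over the same sensor streams.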
We leverage our knowledge of machine learning and human-computer interaction to create enabling technology for the Deaf community and people with disabilities. Our long-term research efforts, such as developing an American Sign Language (ASL) recognition system, focus on practical tasks that
have the potential for meaningful change. For example, our educational smartphone game Popsign
uses sign language recognition to help hearing parents of deaf children learn ASL so that they can
communicate with their children and help them acquire language skills. Lack of these skills correlates with physical abuse, poor mental health, and low educational and career outcomes. Our work
on Passive Haptic Learning helps wearers learn new manual tasks, such as playing piano and text
entry, and Passive Haptic Rehabilitation helps wearers recover from stroke and partial spinal cord
injuries. Similarly, we use our experience with wearable computing, machine learning, and HCI to create interfaces that aid communication with animals, focusing on communication with wild Atlantic spotted dolphins and working dogs.
As computing has grown popular and now affects most other academic disciplines, course sizes have become unsustainable with previous classroom techniques. My “Scaling Tools” team applies AI and HCI techniques to help students learn better in classes of 1000 students, whether online or in person. One example is PARQR, which suggests related posts on the educational forum Piazza while a student is composing a new post. PARQR has been shown to increase student satisfaction and reduce posts by up to 40%, as students can more easily find the information they need. After a demonstration to Piazza’s CEO, Piazza built its own version, which now serves millions of students.
We also investigate how to detect and correct uneven grading by teaching assistants, detect plagiarism, provide in-context learning support in coding platforms, and restrict unauthorized AI use in assignments.
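To illustrate the idea behind related-post suggestion, the sketch below ranks existing forum posts against a student’s draft using TF-IDF cosine similarity. This is a hypothetical simplification for exposition, not PARQR’s actual algorithm, and all function names are invented.

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Build simple TF-IDF vectors (dicts of term -> weight) for tokenized documents."""
    n = len(docs)
    df = Counter(t for d in docs for t in set(d))          # document frequency per term
    idf = {t: math.log(n / df[t]) for t in df}
    vecs = []
    for d in docs:
        tf = Counter(d)
        vecs.append({t: tf[t] * idf[t] for t in tf})
    return vecs, idf

def cosine(a, b):
    """Cosine similarity between two sparse term-weight dicts."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def suggest(draft_tokens, posts, k=3):
    """Return the indices of the k posts most similar to the draft."""
    vecs, idf = tfidf_vectors(posts)
    tf = Counter(draft_tokens)
    dvec = {t: tf[t] * idf.get(t, 0.0) for t in tf}        # unseen terms get zero weight
    ranked = sorted(range(len(posts)), key=lambda i: cosine(dvec, vecs[i]), reverse=True)
    return ranked[:k]
```

Surfacing similar existing posts while the student is still typing is what lets a tool like this prevent duplicate questions rather than merely deduplicate them afterward.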
PopSign ASL v1.0: An isolated American Sign Language dataset collected via smartphones
Symbiotic artificial intelligence: order picking and ambient sensing
Imitation of computer-generated sounds by wild Atlantic spotted dolphins (Stenella frontalis)
Examinator v4.0: Cheating Detection in Online Take-Home Exams
Passive Haptic Rehearsal for Augmented Piano Learning in the Wild